Incorporating Skew into RMS Surface Roughness Probability Distribution
Stahl, Mark T.; Stahl, H. Philip.
2013-01-01
The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS value of the surface, the mode. Using experimental data, we confirm that the Gaussian distribution overestimates the mode and that an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution in the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
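The skew effect described above can be sketched numerically; a minimal Python illustration with synthetic lognormal data standing in for skewed RMS roughness measurements (not the paper's data):

```python
import numpy as np

# Illustrative sketch (synthetic data, not the paper's measurements): for a
# right-skewed sample, the mean -- which a symmetric Gaussian fit reports as
# the most probable value -- overestimates the actual mode.
rng = np.random.default_rng(0)
roughness = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)  # skewed "RMS" data

gaussian_mode = roughness.mean()          # symmetric fit: mode == mean

counts, edges = np.histogram(roughness, bins=200)
empirical_mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])

print(gaussian_mode, empirical_mode)      # the mean exceeds the mode
```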
Application of random match probability calculations to mixed STR profiles.
Bille, Todd; Bright, Jo-Anne; Buckleton, John
2013-03-01
Mixed DNA profiles are being encountered more frequently as laboratories analyze increasing amounts of touch evidence. If it is determined that an individual could be a possible contributor to the mixture, it is necessary to perform a statistical analysis to allow an assignment of weight to the evidence. Currently, the combined probability of inclusion (CPI) and the likelihood ratio (LR) are the most commonly used methods to perform the statistical analysis. A third method, random match probability (RMP), is available. This article compares the advantages and disadvantages of the CPI and LR methods to the RMP method. We demonstrate that although the LR method is still considered the most powerful of the binary methods, the RMP and LR methods make similar use of the observed data, such as peak height, assumed number of contributors, and known contributors, whereas the CPI calculation tends to waste information and be less informative. © 2013 American Academy of Forensic Sciences.
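The information loss the abstract attributes to CPI is easy to see in a toy calculation; the allele frequencies below are hypothetical:

```python
# Toy sketch (assumed allele frequencies, not case data): the combined
# probability of inclusion (CPI) for a mixture is the product over loci of
# (sum of observed-allele frequencies)^2 -- it ignores peak heights and the
# assumed number of contributors, which is why it "wastes" information.
observed = {            # locus -> frequencies of the alleles seen in the mixture
    "D3S1358": [0.25, 0.21, 0.14],
    "vWA":     [0.28, 0.19],
}

cpi = 1.0
for locus, freqs in observed.items():
    cpi *= sum(freqs) ** 2

print(f"CPI = {cpi:.4f}")   # probability a random person is "included"
```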
Airborne Surface Profiling of Alaskan Glaciers
National Oceanic and Atmospheric Administration, Department of Commerce — This data set consists of glacier outline, laser altimetry profile, and surface elevation change data for 46 glaciers in Alaska and British Columbia, Canada,...
Sawosz, P; Kacprzak, M; Weigl, W; Borowska-Solonynko, A; Krajewski, P; Zolek, N; Ciszek, B; Maniewski, R; Liebert, A
2012-12-07
A time-gated intensified CCD camera was applied for time-resolved imaging of light penetrating an optically turbid medium. Spatial distributions of light penetration probability in the plane perpendicular to the axes of the source and the detector were determined at different source positions. Furthermore, visiting probability profiles of the diffuse reflectance measurement were obtained by convolution of the light penetration distributions recorded at different source positions. Experiments were carried out on homogeneous phantoms, on more realistic two-layered tissue phantoms based on a human skull filled with Intralipid-ink solution, and on cadavers. It was noted that the photon visiting probability profiles depend strongly on the source-detector separation, the delay between the laser pulse and the photon collection window, and the complex tissue composition of the human head.
Ryan, K; Williams, D Gareth; Balding, David J
2016-11-01
Many DNA profiles recovered from crime scene samples are of a quality that does not allow them to be searched against, nor entered into, databases. We propose a method for the comparison of profiles arising from two DNA samples, one or both of which can have multiple donors and be affected by low DNA template or degraded DNA. We compute likelihood ratios to evaluate the hypothesis that the two samples have a common DNA donor, and hypotheses specifying the relatedness of two donors. Our method uses a probability distribution for the genotype of the donor of interest in each sample. This distribution can be obtained from a statistical model, or we can exploit the ability of trained human experts to assess genotype probabilities, thus extracting much information that would be discarded by standard interpretation rules. Our method is compatible with established methods in simple settings, but is more widely applicable and can make better use of information than many current methods for the analysis of mixed-source, low-template DNA profiles. It can accommodate uncertainty arising from relatedness instead of or in addition to uncertainty arising from noisy genotyping. We describe a computer program GPMDNA, available under an open source licence, to calculate LRs using the method presented in this paper. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Surface glycosylation profiles of urine extracellular vesicles.
Directory of Open Access Journals (Sweden)
Jared Q Gerlach
Urinary extracellular vesicles (uEVs) are released by cells throughout the nephron and contain biomolecules from their cells of origin. Although uEV-associated proteins and RNA have been studied in detail, little information exists regarding uEV glycosylation characteristics. Surface glycosylation profiling by flow cytometry and lectin microarray was applied to uEVs enriched from urine of healthy adults by ultracentrifugation and centrifugal filtration. The carbohydrate specificity of lectin microarray profiles was confirmed by competitive sugar inhibition and carbohydrate-specific enzyme hydrolysis. Glycosylation profiles of uEVs and purified Tamm Horsfall protein were compared. In both flow cytometry and lectin microarray assays, uEVs demonstrated surface binding, at low to moderate intensities, of a broad range of lectins whether prepared by ultracentrifugation or centrifugal filtration. In general, ultracentrifugation-prepared uEVs demonstrated higher lectin binding intensities than centrifugal filtration-prepared uEVs consistent with lesser amounts of co-purified non-vesicular proteins. The surface glycosylation profiles of uEVs showed little inter-individual variation and were distinct from those of Tamm Horsfall protein, which bound a limited number of lectins. In a pilot study, lectin microarray was used to compare uEVs from individuals with autosomal dominant polycystic kidney disease to those of age-matched controls. The lectin microarray profiles of polycystic kidney disease and healthy uEVs showed differences in binding intensity of 6/43 lectins. Our results reveal a complex surface glycosylation profile of uEVs that is accessible to lectin-based analysis following multiple uEV enrichment techniques, is distinct from co-purified Tamm Horsfall protein and may demonstrate disease-specific modifications.
Surface tension profiles in vertical soap films
Adami, N.; Caps, H.
2015-01-01
Surface tension profiles in vertical soap films are experimentally investigated. Measurements are performed by introducing deformable elastic objects into the films. The shape adopted by those objects once set in the film is related to the surface tension value at a given vertical position by numerically solving the adapted elasticity equations. We show that the observed dependence of the surface tension on the vertical position is predicted by a simple model that takes into account the mechanical equilibrium of the films coupled to previous thickness measurements.
Gallo, J C; Thomas, E; Novick, G E; Herrera, R J
1997-01-01
DNA typing for forensic identification is a two-step process. The first step involves determining the profiles of samples collected at the crime scene and comparing them with the profiles obtained from suspects and the victims. In the case of a match that includes the suspect as the potential source of the material collected at the crime scene, the last step in the process is to answer the question: what is the likelihood that someone in addition to the suspect could match the profile of the sample studied? This likelihood is calculated by determining the frequency of the suspect's profile in the relevant population databases. The design of forensic databases and the criteria for comparison have been addressed by the NRC report of 1996 (National Research Council, 1996). However, the fact that geographical proximity, migrational patterns, and even cultural and social practices have effects on subpopulation structure establishes the grounds for further study into its effects on the calculation of probability of occurrence values. The issue becomes more relevant in the case of discrete polymorphic markers that show higher probability of occurrence in the reference populations, where several orders of magnitude difference between the databases may have an impact on the jury. In this study, we calculated G values for all possible pairwise comparisons of allelic frequencies in the different databases from the races or subpopulations examined. In addition, we analyzed a set of 24 unrelated Caucasian, 37 unrelated African-American, and 96 unrelated Sioux/Chippewa individuals for seven polymorphic loci (DQA1, LDLR, GYPA, HBGG, D7S8, GC, and D1S80). All three sets of individuals were sampled from Minnesota. The probabilities of occurrence for all seven loci were calculated with respect to nine different databases: Caucasian, Arabic, Korean, Sioux/Chippewa, Navajo, Pueblo, African American, Southeastern Hispanic, and Southwestern Hispanic. Analysis of the results demonstrated
Equilibrium drop surface profiles in electric fields
Mugele, F.; Buehrle, J.
2007-09-01
Electrowetting is becoming a more and more frequently used tool to manipulate liquids in various microfluidic applications. On the scale of the entire drop, the effect of electrowetting is to reduce the apparent contact angle of partially wetting conductive liquids upon application of an external voltage. Microscopically, however, strong electric fields in the vicinity of the three phase contact line give rise to local deformations of the drop surface. We determined the equilibrium surface profile using a combined numerical, analytical, and experimental approach. We find that the local contact angle in electrowetting is equal to Young's angle independent of the applied voltage. Only on the scale of the thickness of the insulator and beyond does the surface slope assume a value consistent with the voltage-dependent apparent contact angle. This behaviour is verified experimentally by determining equilibrium surface profiles for insulators of various thicknesses between 10 and 250 µm. Numerically and analytically, we find that the local surface curvature diverges algebraically upon approaching the contact line with an exponent -1<μ<0. We discuss the relevance of the local surface properties for dynamic aspects of the contact line motion.
Sticking probability for hydrogen atoms on the surface of liquid ⁴He
Energy Technology Data Exchange (ETDEWEB)
Zimmerman, D.S.; Berlinsky, A.J.
1983-03-01
A calculation is presented of the sticking probability for hydrogen atoms colliding with a liquid ⁴He surface. The calculation is based on a model potential for the H-liquid ⁴He interaction, which is used to derive both the bound and free atom wave functions and the linear H atom-ripplon coupling. Results are presented in terms of the energy- and angle-dependent sticking probability s(E,Θ) and the thermally averaged probability s(T), and comparison is made to the experimental result s(T) = 0.035 ± 0.00 for 0.18 < T < 0.27 K.
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
Surface sticking probabilities for sputtered atoms of Nb-93 and Rh-103
Weller, M. R.; Tombrello, T. A.
1979-01-01
Sticking probabilities for sputtered atoms of Nb-93 and Rh-103 incident on Al2O3 surfaces were measured using the backscattering of MeV heavy ions. In the limit of thick coverage of the collecting surface, the sticking probabilities integrated over the energy distribution of sputtered atoms are 0.97 ± 0.01 for Nb-93 and 0.95 ± 0.01 for Rh-103. In the limit of negligible areal coverage of the collector, the accuracy is lower; in this case the sticking probabilities are 0.97 (+0.03/−0.08) and 0.95 (+0.05/−0.08), respectively.
A note on the probability distribution function of the surface electromyogram signal.
Nazarpour, Kianoush; Al-Timemy, Ali H; Bugmann, Guido; Jackson, Andrew
2013-01-01
The probability density function (PDF) of the surface electromyogram (EMG) signals has been modelled with Gaussian and Laplacian distribution functions. However, a general consensus upon the PDF of the EMG signals is yet to be reached, because not only are there several biological factors that can influence this distribution function, but also different analysis techniques can lead to contradicting results. Here, we recorded the EMG signal at different isometric muscle contraction levels and characterised the probability distribution of the surface EMG signal with two statistical measures: bicoherence and kurtosis. Bicoherence analysis did not help to infer the PDF of measured EMG signals. In contrast, with kurtosis analysis we demonstrated that the EMG PDF at isometric, non-fatiguing, low contraction levels is super-Gaussian. Moreover, kurtosis analysis showed that as the contraction force increases the surface EMG PDF tends to a Gaussian distribution. Copyright © 2012 Elsevier Inc. All rights reserved.
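The kurtosis-based characterisation can be sketched with synthetic signals (a Laplacian stand-in for low-force EMG and a Gaussian stand-in for high-force EMG; not the recorded data):

```python
import numpy as np

# Sketch of the kurtosis comparison described above: a Laplacian signal
# (standing in for low-force EMG) is super-Gaussian (positive excess
# kurtosis), while a Gaussian signal has excess kurtosis near zero.
def excess_kurtosis(x):
    """Fisher's definition: 0 for a Gaussian."""
    z = x - x.mean()
    return (z**4).mean() / (z**2).mean() ** 2 - 3.0

rng = np.random.default_rng(1)
low_force  = rng.laplace(size=200_000)   # super-Gaussian stand-in
high_force = rng.normal(size=200_000)    # Gaussian stand-in

print(excess_kurtosis(low_force), excess_kurtosis(high_force))
```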
Bistatic-radar estimation of surface-slope probability distributions with applications to the moon.
Parker, M. N.; Tyler, G. L.
1973-01-01
A method for extracting surface-slope frequency distributions from bistatic-radar data has been developed and applied to the lunar surface. Telemetry transmissions from orbiting Apollo spacecraft were received on the earth after reflection from the lunar surface. The echo-frequency spectrum was related analytically to the probability distribution of lunar slopes. Standard regression techniques were used to solve the inverse problem of finding slope distributions from observed echo-frequency spectra. Data taken simultaneously at two wavelengths, 13 and 116 cm, have yielded diverse slope statistics.
Probability distribution of surface wind speed induced by convective adjustment on Venus
Yamamoto, Masaru
2017-03-01
The influence of convective adjustment on the spatial structure of Venusian surface wind and the probability distribution of its wind speed is investigated using an idealized weather research and forecasting model. When the initially uniform wind is much weaker than the convective wind, patches of both prograde and retrograde winds with scales of a few kilometers are formed during active convective adjustment. After the active convective adjustment, because the small-scale convective cells and their related vertical momentum fluxes dissipate quickly, the large-scale (>4 km) prograde and retrograde wind patches remain on the surface and in the longitude-height cross-section. This suggests the coexistence of local prograde and retrograde flows, which may correspond to those observed by Pioneer Venus below 10 km altitude. The probability distributions of surface wind speed V during the convective adjustment have a similar form in different simulations, with a sharp peak around ∼0.1 m s⁻¹ and a bulge developing on the flank of the probability distribution. This flank bulge is associated with the most active convection, which has a probability distribution peaking at a wind speed 1.5 times greater than the Weibull fitting parameter c during the convective adjustment. The Weibull distribution P(>V) = exp[−(V/c)^k], with the best-estimate coefficients of Lorenz (2016), is reproduced during convective adjustments induced by a potential energy of ∼7 × 10⁷ J m⁻², which is calculated from the difference in total potential energy between the initially unstable and neutral states. The maximum vertical convective heat flux magnitude is proportional to the potential energy of the convective adjustment in experiments with the initial unstable-layer thickness altered. The present work suggests that convective adjustment is a promising process for producing the observed wind structure, occasionally generating surface winds of ∼1 m s⁻¹ and retrograde wind patches.
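The Weibull exceedance form quoted above is straightforward to evaluate; the coefficients below are illustrative, not the Lorenz (2016) best estimates:

```python
import math

# The exceedance distribution quoted above, P(>V) = exp[-(V/c)^k], with
# illustrative (assumed) coefficients c and k.
def weibull_exceedance(v, c, k):
    """Probability that the surface wind speed exceeds v."""
    return math.exp(-((v / c) ** k))

c, k = 0.1, 2.0                       # assumed values for the sketch
print(weibull_exceedance(c, c, k))    # at V = c this is exp(-1) for any k
```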
Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z
2015-08-01
Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.
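The concomitant-contact statistic can be sketched as a conditional probability over a boolean contact matrix; the random matrix below stands in for the whisking simulations:

```python
import numpy as np

# Sketch of "concomitant contact": given a boolean whisks-by-whiskers contact
# matrix, estimate P(whisker j contacts | whisker i contacts). The random
# matrix below stands in for simulated whisking data (columns independent).
rng = np.random.default_rng(2)
contacts = rng.random((1000, 5)) < 0.3     # 1000 whisks, 5 whiskers

def concomitant(contacts, i, j):
    """P(whisker j makes contact in a whisk where whisker i does)."""
    whisks_with_i = contacts[contacts[:, i]]
    return whisks_with_i[:, j].mean()

p = concomitant(contacts, 0, 1)
print(p)   # independent columns -> close to the base rate 0.3
```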
Tang, Z; Brandt, A; Callahan, N B; Clayton, S M; Currie, S A; Ito, T M; Makela, M; Masuda, Y; Morris, C L; Pattie, R; Ramsey, J C; Salvat, D J; Saunders, A; Young, A R
2015-01-01
We report a measurement of the spin-flip probabilities for ultracold neutrons interacting with surfaces coated with nickel phosphorus. For 50 µm thick nickel phosphorus coated on stainless steel, the spin-flip probability per bounce was found to be β(NiP on SS) = (3.3 +1.8/−5.6) × 10⁻⁶. For 50 µm thick nickel phosphorus coated on aluminum, the spin-flip probability per bounce was found to be β(NiP on Al) = (3.6 +2.1/−5.9) × 10⁻⁶. For the copper guide used as reference, the spin-flip probability per bounce was found to be β(Cu) = (6.7 +5.0/−2.5) × 10⁻⁶. Nickel phosphorus coated stainless steel or aluminum provides a low-cost, mechanically robust, and non-depolarizing solution when UCN guides with a high Fermi potential are needed.
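A per-bounce spin-flip probability translates into a depolarization rate over many wall collisions; a minimal sketch using the central NiP-on-SS value (the bounce count is an assumption for illustration):

```python
# Sketch: with a per-bounce spin-flip probability beta, the polarization
# surviving N wall bounces is (1 - beta)**N, approximately exp(-N*beta).
# beta is the central NiP-on-SS value quoted above; n_bounces is assumed.
beta = 3.3e-6            # spin-flip probability per bounce (central value)
n_bounces = 10_000       # assumed number of wall collisions during storage

p_survive = (1.0 - beta) ** n_bounces
print(p_survive)         # fraction of neutrons still polarized
```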
Otero, Federico; Norte, Federico; Araneo, Diego
2018-01-01
The aim of this work is to obtain an index for predicting the probability of occurrence of a zonda event at surface level from sounding data at Mendoza city, Argentina. To accomplish this goal, surface zonda wind events were first identified with an objective classification method (OCM) considering only the surface station values. Once the dates and onset times of the events were obtained, the closest prior sounding for each event was used in a principal component analysis (PCA) to identify the leading patterns of the vertical structure of the atmosphere preceding a zonda wind event. These components were used to construct the index model. For the PCA, an entry matrix of temperature (T) and dew point temperature (Td) anomalies for the standard levels between 850 and 300 hPa was built. The analysis yielded six significant components explaining 94% of the variance, and the leading patterns of weather conditions favorable for the development of the phenomenon were obtained. A zonda/non-zonda indicator c can be estimated by multiple logistic regression on the PCA component loadings, determining a zonda probability index ĉ calculable from T and Td profiles that depends on the climatological features of the region. The index showed 74.7% efficiency. The same analysis was performed by adding surface values of T and Td from the Mendoza Aero station, increasing the index efficiency to 87.8%. The results revealed four significantly correlated PCs, with a major improvement in differentiating zonda cases and a reduction of the uncertainty interval.
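The index model is a logistic function of the PCA loadings; a minimal sketch with hypothetical coefficients (the fitted values are not given in the abstract):

```python
import numpy as np

# Minimal sketch of the index model: a logistic function of PCA component
# loadings gives the zonda probability. The coefficients here are
# illustrative placeholders, not the fitted values from the paper.
def zonda_index(pc_loadings, coeffs, intercept):
    """Logistic probability of a zonda event from PCA loadings."""
    z = intercept + np.dot(coeffs, pc_loadings)
    return 1.0 / (1.0 + np.exp(-z))

coeffs = np.array([0.8, -0.5, 0.3, 0.2])   # hypothetical, 4 significant PCs
pcs    = np.array([1.2, -0.4, 0.0, 0.5])   # loadings for one sounding
print(zonda_index(pcs, coeffs, intercept=-0.2))
```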
Surface segregation profile for Ni50Pd50(100)
DEFF Research Database (Denmark)
Christensen, Asbjørn; Ruban, Andrei; Skriver, Hans Lomholt
1997-01-01
A recent dynamical LEED study [G.N. Derry, C.B. McVey, P.J. Rous, Surf. Sci. 326 (1995) 59] reported an oscillatory surface segregation profile in the Ni50Pd50(100) system with the surface layer enriched by Pd. We have performed ab-initio total-energy calculations for the surface of this alloy system using the coherent potential approximation and obtain an oscillatory segregation profile, in agreement with experiments. We discuss the energetic origin of the oscillatory segregation profile in terms of effective cluster interactions. We include relaxation effects by means of the semi...
Inman, Keith; Rudin, Norah; Cheng, Ken; Robinson, Chris; Kirschner, Adam; Inman-Semerau, Luke; Lohmueller, Kirk E
2015-09-18
Technological advances have enabled the analysis of very small amounts of DNA in forensic cases. However, the DNA profiles from such evidence are frequently incomplete and can contain contributions from multiple individuals. The complexity of such samples confounds the assessment of the statistical weight of such evidence. One approach to account for this uncertainty is to use a likelihood ratio framework to compare the probability of the evidence profile under different scenarios. While researchers favor the likelihood ratio framework, few open-source software solutions with a graphical user interface implementing these calculations are available for practicing forensic scientists. To address this need, we developed Lab Retriever, an open-source, freely available program that forensic scientists can use to calculate likelihood ratios for complex DNA profiles. Lab Retriever adds a graphical user interface, written primarily in JavaScript, on top of a C++ implementation of the previously published R code of Balding. We redesigned parts of the original Balding algorithm to improve computational speed. In addition to incorporating a probability of allelic drop-out and other critical parameters, Lab Retriever computes likelihood ratios for hypotheses that can include up to four unknown contributors to a mixed sample. These computations are completed nearly instantaneously on a modern PC or Mac computer. Lab Retriever provides a practical software solution to forensic scientists who wish to assess the statistical weight of evidence for complex DNA profiles. Executable versions of the program are freely available for Mac OSX and Windows operating systems.
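At its core, a likelihood ratio compares the probability of the evidence under two hypotheses; a single-locus toy example with hypothetical allele frequencies (real tools such as Lab Retriever additionally model drop-out, drop-in, and multiple contributors):

```python
# Toy single-locus likelihood ratio (hypothetical allele frequencies).
# Hp: the suspect is the donor; Hd: an unknown person is the donor.
p_a, p_b = 0.10, 0.05          # assumed population frequencies of alleles A, B

p_evidence_given_hp = 1.0              # suspect's A/B genotype explains the peaks
p_evidence_given_hd = 2 * p_a * p_b    # a random A/B heterozygote

lr = p_evidence_given_hp / p_evidence_given_hd
print(lr)   # 100.0: the evidence is 100x more probable under Hp
```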
Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul Sf; Zhu, Tingshao
2015-01-01
Traditional offline assessment of suicide probability is time consuming, and it is difficult to convince at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has an advantage in its efficiency and its potential to reach hidden individuals, yet little research has focused on this specific field. The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying microblog users in China with high suicide possibility through profile and linguistic features extracted from Internet-based data. Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores in the participant sample, were labeled as high-risk individuals, respectively. Profile and linguistic features were fed into the two machine learning algorithms (SLR and RF) to train models that aim to identify high-risk individuals in general suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross validation, in which both training and test sets were generated under the stratified random sampling rule from the whole sample. Three classic performance metrics (precision, recall, F1 measure) and a specifically defined metric, "screening efficiency", were adopted to evaluate model effectiveness. Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening efficiency of most models varied from 1/4 to 1/2. Precision of the models was generally below 30%. Individuals in China with high suicide
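The three classic metrics mentioned above follow from a confusion-matrix count; the numbers below are illustrative, chosen to mirror the reported ~70% recall and sub-30% precision:

```python
# Precision, recall, and F1 from a toy confusion outcome (counts are
# illustrative, not study data: e.g. 70 of 100 high-risk users retrieved).
true_pos, false_pos, false_neg = 70, 170, 30

precision = true_pos / (true_pos + false_pos)
recall    = true_pos / (true_pos + false_neg)
f1        = 2 * precision * recall / (precision + recall)

print(precision, recall, f1)   # recall ~0.70 mirrors "over 70% retrieved"
```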
Control Surface Fault Diagnosis with Specified Detection Probability - Real Event Experiences
DEFF Research Database (Denmark)
Hansen, Søren; Blanke, Mogens
2013-01-01
Diagnosis of actuator faults is crucial for aircraft since loss of actuation can have catastrophic consequences. For autonomous aircraft the steps necessary to achieve fault tolerance are limited when only basic and non-redundant sensor and actuator suites are present. Through diagnosis that exploits analytical redundancies it is, nevertheless, possible to cheaply enhance the level of safety. This paper presents a method for diagnosing control surface faults by using basic sensors and hardware available on an autonomous aircraft. The capability of fault diagnosis is demonstrated obtaining desired levels of false alarms and detection probabilities. Self-tuning residual generators are employed for diagnosis and are combined with statistical change detection to form a setup for robust fault diagnosis. On-line estimation of test statistics is used to obtain a detection threshold and a desired...
PROFILE OF ACADEMICALLY BACKWARD STUDENTS AND PROBABLE CONTRIBUTING FACTORS: A QUALITATIVE ANALYSIS
Ubhale Ashish; Javadekar
2014-01-01
CONTEXT: Academic backwardness exists in almost every institute. In institutes providing professional education, the problem is on a larger scale, with greater seriousness and complications. For academic backwardness, multiple reasons, such as financial, family, and personal stressors, are contemplated. This study is an attempt to recognize the problem areas in these students, which will help us to take some preventive measures. AIMS: To find the probable causes of repeated failure in...
Profiles of US and CT imaging features with a high probability of appendicitis
Energy Technology Data Exchange (ETDEWEB)
Randen, A. van; Lameris, W. [University of Amsterdam, Department of Radiology, Academic Medical Center, Amsterdam (Netherlands); University of Amsterdam, Department of Surgery, Academic Medical Center, Amsterdam (Netherlands); Es, H.W. van [St Antonius Hospital, Department of Radiology, Nieuwegein (Netherlands); Hove, W. ten; Bouma, W.H. [Gelre Hospitals, Department of Surgery, Apeldoorn (Netherlands); Leeuwen, M.S. van [University Medical Centre, Department of Radiology, Utrecht (Netherlands); Keulen, E.M. van [Tergooi Hospitals, Department of Radiology, Hilversum (Netherlands); Hulst, V.P.M. van der [Onze Lieve Vrouwe Gasthuis, Department of Radiology, Amsterdam (Netherlands); Henneman, O.D. [Bronovo Hospital, Department of Radiology, The Hague (Netherlands); Bossuyt, P.M. [University of Amsterdam, Department of Clinical Epidemiology, Biostatistics, and Bioinformatics, Academic Medical Center, Amsterdam (Netherlands); Boermeester, M.A. [University of Amsterdam, Department of Surgery, Academic Medical Center, Amsterdam (Netherlands); Stoker, J. [University of Amsterdam, Department of Radiology, Academic Medical Center, Amsterdam (Netherlands)
2010-07-15
To identify and evaluate profiles of US and CT features associated with acute appendicitis. Consecutive patients presenting with acute abdominal pain at the emergency department were invited to participate in this study. All patients underwent US and CT. Imaging features known to be associated with appendicitis, and an imaging diagnosis, were prospectively recorded by two independent radiologists. A final diagnosis was assigned after 6 months. Associations between appendiceal imaging features and a final diagnosis of appendicitis were evaluated with logistic regression analysis. Appendicitis was assigned to 284 of 942 evaluated patients (30%). All evaluated features were associated with appendicitis. Imaging profiles were created after multivariable logistic regression analysis. Of 147 patients with a thickened appendix, local transducer tenderness, and peri-appendiceal fat infiltration on US, 139 (95%) had appendicitis. On CT, of 119 patients in whom the appendix was completely visualised and thickened, with peri-appendiceal fat infiltration and appendiceal enhancement, 114 (96%) had a final diagnosis of appendicitis. When at least two of these essential features were present on US or CT, sensitivity was 92% (95% CI 89-96%) and 96% (95% CI 93-98%), respectively. Most patients with appendicitis can be categorised within a few imaging profiles on US and CT. When two of the essential features are present the diagnosis of appendicitis can be made accurately. (orig.)
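The quoted sensitivity interval can be roughly checked with a normal-approximation confidence interval; n = 284 is taken from the abstract, though the paper's exact method may differ:

```python
import math

# Rough check of the quoted sensitivity CI: a normal-approximation interval
# around 92% sensitivity with n = 284 appendicitis patients (n taken from
# the abstract; the paper's exact interval method may differ).
p, n = 0.92, 284
half = 1.96 * math.sqrt(p * (1 - p) / n)
lo, hi = p - half, p + half
print(round(lo, 2), round(hi, 2))   # ~0.89 to ~0.95
```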
Exact analytical density profiles and surface tension
Indian Academy of Sciences (India)
to nonideality, which distinguish electrolyte from nonelectrolyte solutions. An example is provided by the excess surface tension for an air–water interface, which is determined by the excess particle density, and which was first calculated by Onsager and Samaras. Because of the discrepancy between the dielectric constants ...
Elfer, N.; Meibaum, R.; Olsen, G.
1995-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), has been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are most likely to cause a penetration. This determination can help the analyst select a shield design that is best suited to the predominant penetration mechanism. The analysis also suggests the most suitable parameters for development or verification testing. The SD_SURF programs offer the option of either FORTRAN programs or Microsoft-EXCEL spreadsheets and macros. The FORTRAN programs work with BUMPERII. The EXCEL spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs. Examples will be presented of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration.
Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei
2016-01-01
Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851
Teaching time-series analysis. II. Wave height and water surface elevation probability distributions
Whitford, Dennis J.; Waters, Jennifer K.; Vieira, Mario E. C.
2001-04-01
This paper describes the second of a two-part series of pedagogical exercises to introduce students to methods of time-series analysis. While these exercises are focused on the analysis of wind generated surface gravity waves, they are cross-disciplinary in nature and can be applied to other fields dealing with random signal analysis. Two computer laboratory exercises are presented which enable students to understand many of the facets of random signal analysis with less difficulty and more understanding than standard classroom instruction alone. The first pedagogical exercise, described in the previous article, uses mathematical software on which the students execute the manual arithmetic operations of a finite Fourier analysis on a complex wave record. The results are then compared to those obtained by a fast Fourier transform. This article, the second of this two-part pedagogical series, addresses analysis of a complex sea using observed and theoretical wave height and water surface elevation probability distributions and wave spectra. These results are compared to a fast Fourier transform analysis, thus providing a link back to the first exercise.
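The link between a wave spectrum, the surface elevation record, and its statistics can be sketched with a synthetic record built as a sum of sinusoids with random phases; the Gaussian-shaped toy spectrum below is an assumption for illustration, not the spectrum used in the exercises.

```python
import math
import random
import statistics

random.seed(0)
freqs = [0.05 + 0.00625 * k for k in range(40)]              # wave frequencies (Hz)
amps = [math.exp(-((f - 0.1) / 0.05) ** 2) for f in freqs]   # toy spectral shape
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in freqs]

dt, n = 0.1, 6000   # a 10-minute record sampled at 10 Hz
# Surface elevation: superposition of sinusoids with random phases
eta = [sum(a * math.cos(2.0 * math.pi * f * i * dt + p)
           for a, f, p in zip(amps, freqs, phases))
       for i in range(n)]

sigma = statistics.pstdev(eta)
Hs = 4.0 * sigma    # significant wave height estimated from elevation variance
```

For a record like this the elevation itself is near-Gaussian while wave heights approximately follow a Rayleigh distribution, which is the comparison the exercise asks students to make.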
Optimized Estimation of Surface Layer Characteristics from Profiling Measurements
Directory of Open Access Journals (Sweden)
Doreene Kang
2016-01-01
New sampling techniques such as tethered-balloon-based measurements or small unmanned aerial vehicles are capable of providing multiple profiles of the Marine Atmospheric Surface Layer (MASL) in a short time period. It is desirable to obtain surface fluxes from these measurements, especially when direct flux measurements are difficult to obtain. The profiling data differ from the traditional mean profiles obtained at two or more fixed levels in the surface layer, from which surface fluxes of momentum, sensible heat, and latent heat are derived based on Monin-Obukhov Similarity Theory (MOST). This research develops an improved method to derive surface fluxes and the corresponding MASL mean profiles of wind, temperature, and humidity with a least-squares optimization method using the profiling measurements. This approach allows the use of all available independent data. We use a weighted cost function based on the framework of MOST, with the cost being optimized using a quasi-Newton method. This approach was applied to seven sets of data collected from Monterey Bay. The derived fluxes and mean profiles show reasonable results. An empirical bias analysis is conducted using 1000 synthetic datasets to evaluate the robustness of the method.
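In the neutral-stability limit of MOST the least-squares idea reduces to a closed form: the wind profile is linear in ln z, so the friction velocity and roughness length follow from a simple regression. This is a sketch under that simplifying assumption; the paper's full method optimizes a weighted cost over stability-corrected profiles with a quasi-Newton solver.

```python
import math

KAPPA = 0.4  # von Kármán constant

def fit_log_profile(z, u):
    """Least-squares fit of the neutral log wind profile
    u(z) = (u*/kappa) * ln(z / z0), which is linear in x = ln z."""
    x = [math.log(zi) for zi in z]
    n = len(x)
    xbar, ubar = sum(x) / n, sum(u) / n
    a = (sum((xi - xbar) * (ui - ubar) for xi, ui in zip(x, u))
         / sum((xi - xbar) ** 2 for xi in x))   # slope = u*/kappa
    b = ubar - a * xbar                          # intercept = -(u*/kappa) ln z0
    return KAPPA * a, math.exp(-b / a)           # (u*, z0)

# Synthetic, noise-free profile with u* = 0.3 m/s and z0 = 1e-4 m
z = [2.0, 5.0, 10.0, 20.0, 50.0]
u = [0.3 / KAPPA * math.log(zi / 1e-4) for zi in z]
u_star, z0 = fit_log_profile(z, u)
```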
Energy Technology Data Exchange (ETDEWEB)
Toor, N.; Franz, E.; Liber, K. [Saskatchewan Univ., Saskatoon, SK (Canada); Han, X.; Martin, J. [Alberta Univ., Edmonton, AB (Canada); MacKinnon, M. [Syncrude Canada Ltd., Edmonton, AB (Canada)
2009-07-01
This presentation reported on a study that evaluated the potential for the biodegradation and associated reduction in aquatic toxicity of oil sands process-affected waters (OSPWs) using flow-through laboratory wetland microcosms. Changes in the composition of naphthenic acids (NAs) over a period of 52 weeks were also identified. OSPWs from Syncrude and Suncor were used in the experiments. The 2 types of OSPWs were enriched with nitrogen and phosphorus and had short and long hydraulic retention times (HRTs) of 40 and 400 days. HPLC/QTOF analysis was used to track changes in NA mixture profiles in each treatment over time. The biodegradation of NAs in Suncor OSPW was considerably faster than that of Syncrude OSPW. The biodegradation of NAs in both sources of OSPW was enhanced under longer HRTs, but the influence of nutrient addition was minimal. NAs that had the lowest degrees of cyclization and lowest carbon number were found to degrade faster, which is consistent with previous trends observed for aerobic microbial degradation of NAs using laboratory incubations. The 3 most persistent fractions of NA homologues were also identified within the NA mixture fingerprint, which may explain the lack of correlation between the mostly unchanged toxicological response as measured by Microtox and the 78 per cent reduction in total NA concentration over the study period.
Optical surface profiling of orb-web spider capture silks
Energy Technology Data Exchange (ETDEWEB)
Kane, D M; Joyce, A M; Staib, G R [Department of Physics, Macquarie University, Sydney, NSW 2109 (Australia); Herberstein, M E, E-mail: deb.kane@mq.edu.a [Department of Biological Sciences, Macquarie University, Sydney, NSW 2109 (Australia)
2010-09-15
Much spider silk research to date has focused on its mechanical properties. However, the webs of many orb-web spiders have evolved for over 136 million years to evade visual detection by insect prey. It is therefore a photonic device in addition to being a mechanical device. Herein we use optical surface profiling of capture silks from the webs of adult female St Andrews cross spiders (Argiope keyserlingi) to successfully measure the geometry of adhesive silk droplets and to show a bowing in the aqueous layer on the spider capture silk between adhesive droplets. Optical surface profiling shows geometric features of the capture silk that have not been previously measured and contributes to understanding the links between the physical form and biological function. The research also demonstrates non-standard use of an optical surface profiler to measure the maximum width of a transparent micro-sized droplet (microlens).
Toor, Navdeep S; Han, Xiumei; Franz, Eric; MacKinnon, Michael D; Martin, Jonathan W; Liber, Karsten
2013-10-01
The toxicity of oil sands process-affected waters (OSPW) from the Athabasca Oil Sands (AOS) in northern Alberta, Canada, is related to a relatively persistent group of dissolved organic acids known as naphthenic acids (NAs). Naphthenic acids are a complex mixture of carboxylic acids, with a general formula C(n)H(2n+Z)O2, where n indicates the carbon number and Z specifies the number of rings in the molecule. The present study is the first to evaluate the potential for the selective biodegradation of NAs and the associated reduction in aquatic toxicity of 2 OSPWs, maintained under 2 different hydraulic retention times and increased nutrient availability (nitrate and phosphate), using flow-through laboratory wetland microcosms over a 52-wk test period. High-performance liquid chromatography/quadrupole time of flight-mass spectrometry analysis was used to track the changes in NA mixture profiles, or "fingerprints," in each treatment over time. Based on first-order degradation kinetics, more rapid degradation was observed for NAs that had lower carbon numbers and fewer degrees of cyclization (NA congeners with carbon numbers 11-16 and Z series -2 to -4; half-lives between 19 and 28 wk). Within the NA mixture fingerprints, the 2 most persistent groups of homologues were also identified (NAs with carbon numbers 17-20 and Z series -6 to -12; half-lives between 37 and 52 wk). The persistence of this group of NAs may aid in explaining the residual chronic toxicological response as measured by the Microtox bioassay (effective concentration for 20%), after the degradation of the more labile fractions of NA mixtures in OSPW. © 2013 SETAC.
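The reported half-lives map directly onto first-order kinetics, C(t) = C0·exp(−kt) with t1/2 = ln 2 / k. A minimal sketch using the 19-week lower bound reported for the most labile congeners:

```python
import math

def first_order(c0, k, t):
    """Concentration after time t under first-order decay dC/dt = -k*C."""
    return c0 * math.exp(-k * t)

def half_life(k):
    """Half-life of a first-order process: t_1/2 = ln(2) / k."""
    return math.log(2) / k

# A congener with a 19-week half-life (lower end reported for
# carbon numbers 11-16, Z series -2 to -4)
k = math.log(2) / 19.0               # rate constant, per week
c52 = first_order(100.0, k, 52.0)    # percent remaining after the 52-wk study
```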
PROFFIT: Analysis of X-ray surface-brightness profiles
Eckert, Dominique
2016-08-01
PROFFIT analyzes X-ray surface-brightness profiles for data from any X-ray instrument. It can extract surface-brightness profiles in circular or elliptical annuli, using constant or logarithmic bin size, from the image centroid, the surface-brightness peak, or any user-given center, and provides surface-brightness profiles in any circular or elliptical sectors. It offers background map support to extract background profiles, can excise areas using SAO DS9-compatible (ascl:0003.002) region files to exclude point sources, provides fitting with a number of built-in models, including the popular beta model, double beta, cusp beta, power law, and projected broken power law, uses chi-squared or C statistic, and can fit on the surface-brightness or counts data. It has a command-line interface similar to HEASOFT’s XSPEC (ascl:9910.005) package, provides interactive help with a description of all the commands, and results can be saved in FITS, ROOT or TXT format.
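The single beta model named above has the standard form S(r) = S0·[1 + (r/rc)²]^(−3β + 1/2). The sketch below is a minimal stand-alone implementation of that formula, not PROFFIT's actual code:

```python
def beta_model(r, s0, rc, beta):
    """Single beta-model surface-brightness profile:
    S(r) = s0 * (1 + (r/rc)**2) ** (-3*beta + 0.5)."""
    return s0 * (1.0 + (r / rc) ** 2) ** (-3.0 * beta + 0.5)

# At r = rc the profile has dropped by a factor 2**(-3*beta + 0.5);
# beta = 2/3 is a commonly quoted value for cluster profiles.
s = beta_model(1.0, 1.0, 1.0, 2.0 / 3.0)
```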
Surface activity, lipid profiles and their implications in cervical cancer.
Directory of Open Access Journals (Sweden)
Preetha A
2005-01-01
Background: The profiles of lipids in normal and cancerous tissues may differ, revealing information about cancer development and progression. Lipids being surface active, changes in lipid profiles can manifest as altered surface activity profiles. Langmuir monolayers offer a convenient model for evaluating surface activity of biological membranes. Aims: The aims of this study were to quantify phospholipids and their effects on surface activity of normal and cancerous human cervical tissues, as well as to evaluate the role of phosphatidylcholine (PC) and sphingomyelin (SM) in cervical cancer using Langmuir monolayers. Methods and Materials: Lipid quantification was done using thin layer chromatography and phosphorus assay. Surface activity was evaluated using Langmuir monolayers. Monolayers were formed on the surface of deionized water by spreading tissue organic phase corresponding to 1 mg of tissue and studying their surface pressure-area isotherms at body temperature. The PC and SM contents of cancerous human cervical tissues were higher than those of the normal human cervical tissues. The roles of PC and SM were evaluated by adding varying amounts of these lipids to normal cervical pooled organic phase. Statistical analysis: Student's t-test (p < 0.05) and one-way analysis of variance (ANOVA) were used. Results: Our results reveal that the phosphatidylglycerol level in cancerous cervical tissue was nearly fivefold higher than that in normal cervical tissue. Also, PC and SM were found to be the major phospholipid components in cancerous and normal cervical tissues, respectively. The addition of either 1.5 µg DPPC or 0.5 µg SM per mg of tissue to the normal organic phase changed its surface activity profile to that of the cancerous tissues. Statistically significant surface activity parameters showed that PC and SM have remarkable roles in shifting the normal cervical lipophilic surface activity towards that of cancerous lipophilic surface activity.
Carbon nanotube oscillator surface profiling device and method of use
Popescu, Adrian [Tampa, FL; Woods, Lilia M [Tampa, FL; Bondarev, Igor V [Fuquay Varina, NC
2011-11-15
The proposed device is based on a carbon nanotube oscillator consisting of a finite-length outer stationary nanotube and a finite-length inner oscillating nanotube. Its main function is to measure changes in the characteristics of the motion of the carbon nanotube oscillating near a sample surface, and to profile the roughness of this surface. The device operates in a non-contact mode; thus it is a virtually wear-free and fatigue-free system. It is an alternative to the existing atomic force microscope (AFM) tips used to scan surfaces to determine their roughness.
Fatty acid methyl ester profiles of bat wing surface lipids.
Pannkuk, Evan L; Fuller, Nathan W; Moore, Patrick R; Gilmore, David F; Savary, Brett J; Risch, Thomas S
2014-11-01
Sebocytes are specialized epithelial cells that rupture to secrete sebaceous lipids (sebum) across the mammalian integument. Sebum protects the integument from UV radiation and maintains host microbial communities, among other functions. Native glandular sebum is composed primarily of triacylglycerides (TAG) and wax esters (WE). Upon secretion (mature sebum), these lipids combine with minor cellular membrane components, comprising total surface lipids. TAG and WE are further cleaved to smaller molecules through oxidation or host enzymatic digestion, resulting in a complex mixture of glycerolipids (e.g., TAG), sterols, unesterified fatty acids (FFA), WE, cholesteryl esters, and squalene comprising surface lipid. We were interested in whether fatty acid methyl ester (FAME) profiling of bat surface lipid could predict species specificity to the cutaneous fungal disease white nose syndrome (WNS). We collected sebaceous secretions from 13 bat species using Sebutape® and converted them to FAME with an acid-catalyzed transesterification. We found that Sebutape® adhesive patches removed ~6× more total lipid than Sebutape® indicator strips. Juvenile eastern red bats (Lasiurus borealis) had significantly higher 18:1 than adults, but 14:0, 16:1, and 20:0 were higher in adults. FAME profiles among several bat species were similar. We concluded that bat surface lipid FAME profiling does not provide a robust model for predicting species susceptibility to WNS. However, these results provide baseline data that can be used to assess lipid roles in future ecological studies, such as of life history, diet, or migration.
Lodder, W J; Schijven, J F; Rutjes, S A; de Roda Husman, A M; Teunis, P F M
2015-05-15
Numerous studies have reported quantitative data on viruses in surface waters generated using different methodologies. In the current study, the impact of the use of either cell culture-based or molecular-based methods in quantitative microbial risk assessment was assessed. Previously and newly generated data on the presence of infectious human enteroviruses (HEV) and enterovirus and parechovirus RNA were used to estimate distributions of virus concentrations in surface waters. Because techniques for the detection of infectious human parechoviruses (HPeV) in surface waters were not available, a 'Parallelogram Approach' was used to estimate their concentrations based on the ratio infectious HEV/HEV RNA. The obtained virus concentrations were then used to estimate the probability of exposure for children during recreation in such virus-contaminated surface waters. Human enterovirus cell culture/PCR ratios ranged from 2.3 × 10⁻³ to 0.28. This broad range of ratios indicates that care should be taken in assuming a fixed ratio for assessing the risk with PCR-based virus concentrations. The probabilities of exposure to both enteroviruses and parechoviruses were calculated, using our Parallelogram Approach for the calculation of infectious parechoviruses. For both viruses it was observed that the detection method significantly influenced the probability of exposure. Based on the calculated culture data, PCR data, and the ingestion volume, it was estimated that the mean probabilities of exposure, of recreating children, to surface water containing viruses were 0.087 (infectious enteroviruses), 0.71 (enterovirus particles), 0.28 (parechovirus particles) and 0.025 (calculated infectious parechoviruses) per recreation event. The mean probabilities of exposure of children recreating in surface water from which drinking water is produced to infectious enteroviruses were estimated for nine locations and varied between 1.5 × 10⁻⁴ and 0.09 per recreation event. In this study
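A common way to convert a virus concentration and an ingestion volume into a per-event exposure probability is to assume Poisson-distributed virus counts in the ingested volume, giving P = 1 − exp(−c·V). The concentration and volume below are illustrative values, not the study's data:

```python
import math

def p_exposure(conc_per_l, vol_l):
    """Probability of ingesting at least one virus particle, assuming
    Poisson-distributed counts in the ingested volume."""
    return 1.0 - math.exp(-conc_per_l * vol_l)

# Illustrative: 1 virus per 10 L of water, 30 mL ingested per recreation event
p = p_exposure(0.1, 0.03)
```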
Poom-Medina, José Luis; Álvarez-Borrego, Josué
2016-07-01
Theoretical relationships between the statistical properties of the surface slope and the statistical properties of the image intensity in remotely sensed images are derived, considering a non-Gaussian probability density function of the surface slope. Considering a variable detector line-of-sight angle, and assuming ocean waves moving along a single direction with the observer and the sun both in the vertical plane containing this direction, new expressions relating the variance of the image intensity to the variance of the surface slopes are derived using two different glitter functions. In this case, skewness and kurtosis moments are taken into account. In addition, new expressions relating the correlation functions of the image intensities and the surface slopes are analyzed numerically; for this case, only the skewness moments were considered. More changes in these statistical relationships are observed when the Rect function is used. The skewness and kurtosis values are in direct relation with the wind velocity on the sea surface.
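A skewed, near-Gaussian slope PDF of the kind considered here is often written as a Gram-Charlier expansion of the Gaussian, with Hermite-polynomial correction terms for skewness and excess kurtosis. The sketch below shows one common form of that expansion; it is not the paper's exact PDF.

```python
import math

def gram_charlier_pdf(x, sigma, skew=0.0, kurt=0.0):
    """Gram-Charlier A-series PDF: Gaussian times Hermite-polynomial
    corrections for skewness and excess kurtosis."""
    s = x / sigma
    gauss = math.exp(-0.5 * s * s) / (sigma * math.sqrt(2.0 * math.pi))
    h3 = s ** 3 - 3.0 * s              # probabilists' Hermite polynomial He3
    h4 = s ** 4 - 6.0 * s * s + 3.0    # He4
    return gauss * (1.0 + skew / 6.0 * h3 + kurt / 24.0 * h4)

# With skew = kurt = 0 this reduces to the Gaussian slope PDF
p0 = gram_charlier_pdf(0.0, 0.2)
```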
Zipser, Edward J.; Lutz, Kurt R.
1994-01-01
Reflectivity data from Doppler radars are used to construct vertical profiles of radar reflectivity (VPRR) of convective cells in mesoscale convective systems (MCSs) in three different environmental regimes. The National Center for Atmospheric Research CP-3 and CP-4 radars are used to calculate median VPRR for MCSs in the Oklahoma-Kansas Preliminary Regional Experiment for STORM-Central in 1985. The National Oceanic and Atmospheric Administration-Tropical Ocean Global Atmosphere radar in Darwin, Australia, is used to calculate VPRR for MCSs observed both in oceanic, monsoon regimes and in continental, break period regimes during the wet seasons of 1987/88 and 1988/89. The midlatitude and tropical continental VPRRs both exhibit maximum reflectivity somewhat above the surface and have a gradual decrease in reflectivity with height above the freezing level. In sharp contrast, the tropical oceanic profile has a maximum reflectivity at the lowest level and a very rapid decrease in reflectivity with height beginning just above the freezing level. The tropical oceanic profile in the Darwin area is almost the same shape as that for two other tropical oceanic regimes, leading to the conclusion that it is characteristic. The absolute values of reflectivity in the 0 to −20 °C range are compared with values in the literature thought to represent a threshold for rapid storm electrification leading to lightning, about 40 dBZ at −10 °C. The large negative vertical gradient of reflectivity in this temperature range for oceanic storms is hypothesized to be a direct result of the characteristically weaker vertical velocities observed in MCSs over tropical oceans. It is proposed, as a necessary condition for rapid electrification, that a convective cell must have its updraft speed exceed some threshold value. Based upon field program data, a tentative estimate for the magnitude of this threshold is 6-7 m/s for mean speed and 10-12 m/s for peak speed.
Saito, I.; Nakano, T.; Tamba, J.
2017-10-01
A calibration apparatus for contact surface thermometers was developed. Temperature of the upper surface of a copper cube of the calibration apparatus was used as reference surface temperature, which was estimated at around 50 °C, 100 °C, and 150 °C by not only two conventional industrial platinum resistance thermometers (IPRTs) but also five small-sized platinum resistance thermometers (SSPRTs) calibrated based on the International Temperature Scale of 1990 (ITS-90). These thermometers were inserted horizontally into the copper cube and aligned along the center axis of the copper cube. In the case of a no-load state without anything on the upper surface, the temperature profile inside the copper cube linearly decreased from the lower part to the upper surface, which suggests that the heat conduction inside the copper cube can be regarded as a one-dimensional steady state. On the other hand, in the case of a transient state just after the contact surface thermometer was applied to the upper surface, the temperature profile became a round shape. We obtained good agreement between the curvature of the temperature profiles and the results estimated by using an error function used for a one-dimensional transient heat conduction problem. The temperature difference between the temperature estimated by linear extrapolation using two IPRTs and that by extrapolation using the error function was within 0.2 °C in the transient state at around 150 °C. Over 10 min after the contact surface thermometer was applied, the temperature profile showed a linear shape again, which indicated that linear extrapolation using two IPRTs was adequate for estimating the reference surface temperature because the heat conduction state inside the copper cube came back to the one-dimensional steady state. Difference between the surface temperature and temperature detected by the contact surface thermometer was also observed after the contact surface thermometer touched on the
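The error-function behavior described above matches the textbook solution for one-dimensional transient conduction in a semi-infinite solid after a surface temperature step. A sketch of that standard form, with illustrative numbers for copper rather than the apparatus' calibration values:

```python
import math

def transient_profile(x, t, t_surface, t_initial, alpha):
    """Semi-infinite 1-D transient conduction after a step change of the
    surface temperature: T(x, t) = Ts + (Ti - Ts) * erf(x / (2*sqrt(alpha*t)))."""
    return t_surface + (t_initial - t_surface) * math.erf(
        x / (2.0 * math.sqrt(alpha * t)))

# Copper thermal diffusivity ~1.1e-4 m^2/s (assumed); a 150 °C block whose
# surface is pulled to 140 °C by the contact thermometer, 5 s after contact
T = transient_profile(0.01, 5.0, 140.0, 150.0, 1.1e-4)
```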
Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei
2012-01-01
Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with
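The benchmark numbers quoted above (MCC 0.423; accuracy 0.753, and so on) are residue-level confusion-matrix statistics. How such metrics are computed can be sketched as:

```python
import math

def metrics(tp, fp, tn, fn):
    """Standard binary-classification metrics from a confusion matrix."""
    acc = (tp + tn) / (tp + fp + tn + fn)
    prec = tp / (tp + fp)
    sens = tp / (tp + fn)            # sensitivity / recall
    spec = tn / (tn + fp)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        float(tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return acc, prec, sens, spec, mcc
```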
Space Debris Surfaces - Probability of no penetration versus impact velocity and obliquity
Elfer, N.; Meibaum, R.; Olsen, G.
1992-01-01
A collection of computer codes called Space Debris Surfaces (SD-SURF) has been developed to assist in the design and analysis of space debris protection systems. An SD-SURF analysis will show which obliquities and velocities are most likely to cause a penetration, helping the analyst select a shield design best suited to the predominant penetration mechanism. Examples of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration are presented.
IceBridge UAF Lidar Profiler L1B Geolocated Surface Elevation Triplets
National Aeronautics and Space Administration — The NASA IceBridge UAF Lidar Profiler L1B Geolocated Surface Elevation Triplets data set contains surface profiles of Alaska Glaciers acquired using the airborne...
Schröter, Sandra; Gibson, Andrew R.; Kushner, Mark J.; Gans, Timo; O’Connell, Deborah
2018-01-01
The quantification and control of reactive species (RS) in atmospheric pressure plasmas (APPs) is of great interest for their technological applications, in particular in biomedicine. Of key importance in simulating the densities of these species are fundamental data on their production and destruction. In particular, data concerning particle-surface reaction probabilities in APPs are scarce, with most of these probabilities measured in low-pressure systems. In this work, the role of surface reaction probabilities, γ, of reactive neutral species (H, O and OH) on neutral particle densities in a He–H2O radio-frequency micro APP jet (COST-μAPPJ) are investigated using a global model. It is found that the choice of γ, particularly for low-mass species having large diffusivities, such as H, can change computed species densities significantly. The importance of γ even at elevated pressures offers potential for tailoring the RS composition of atmospheric pressure microplasmas by choosing different wall materials or plasma geometries.
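In global models of this kind, the surface reaction probability γ typically enters through a first-order wall-loss rate; one common choice is Chantry's interpolation between diffusion-limited and surface-limited transport. The sketch below uses illustrative geometry and diffusivity values that are assumptions, not numbers taken from the paper:

```python
import math

def wall_loss_rate(gamma, d_coeff, lam, v_over_a, mass_amu, temp_k=300.0):
    """First-order wall-loss rate (1/s) for a neutral species with surface
    reaction probability gamma, after Chantry's interpolation:
    tau = Lambda^2 / D + (V/A) * 2*(2 - gamma) / (gamma * v_mean)."""
    kb = 1.380649e-23                          # Boltzmann constant, J/K
    m = mass_amu * 1.66053906660e-27           # particle mass, kg
    v_mean = math.sqrt(8.0 * kb * temp_k / (math.pi * m))  # mean thermal speed
    tau = lam ** 2 / d_coeff + v_over_a * 2.0 * (2.0 - gamma) / (gamma * v_mean)
    return 1.0 / tau

# Illustrative values for atomic H in a mm-scale jet (assumed): diffusion
# length 0.5 mm, D = 2e-4 m^2/s, volume-to-area ratio 0.25 mm, gamma = 1e-3
k = wall_loss_rate(gamma=1e-3, d_coeff=2e-4, lam=5e-4, v_over_a=2.5e-4,
                   mass_amu=1.0)
```

Because the surface-limited term scales as 1/γ, the loss rate of light, fast-diffusing species such as H is especially sensitive to the assumed γ, consistent with the sensitivity reported in the abstract.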
Penno, A; Arumugam, M; Antweiler, G; Laubert, T; Habermann, J; Bruch, H-P
2012-09-01
Spinal anesthesia causes sympathetic blockade which leads to changes in the local temperature of the skin surface due to hyperemia. These changes in skin temperature were used in a newly developed method for estimating the level of analgesia. A total of 11 patients who were scheduled for surgical procedures of the lower extremities with symmetrical spinal anesthesia were included in the clinical study. By means of an electronic digital multi-channel body temperature measurement device with eight high precision temperature sensors placed on defined dermatomes, patient skin temperature was continuously measured at 2 s intervals and documented before, during and for 45 min after spinal anesthesia. Simultaneously, a neurological pin-prick test was carried out at regular intervals every 2 min on the defined dermatomes to calculate the correlation between the effects of analgesia and corresponding changes in skin temperature. The analyzed correlations showed that there is a minimum of 1.05°C temperature difference before and after spinal anesthesia, especially on the lower extremities (foot, knee, inguinal) of patient dermatomes. The collected data of varying temperature differences were systematically evaluated using statistical software, which led to a deeper understanding of the interdependency between temperature differences at different dermatomes. These interdependencies of temperature differences were used to develop a systematic analgesia level measurement algorithm. The algorithm calculates the skin temperature differences at specified dermatomes to find the accurate level of analgesia and also to track the forward and reverse progression of analgesia. The developed mathematical method shows that it is possible to predict the level of analgesia with an accuracy of up to 95% after spinal anesthesia. Therefore, it can be concluded that systematic processing of skin temperature data collected at defined dermatomes can be used as a promising parameter for predicting
SURFACE BRIGHTNESS PROFILES OF DWARF GALAXIES. II. COLOR TRENDS AND MASS PROFILES
Energy Technology Data Exchange (ETDEWEB)
Herrmann, Kimberly A. [Penn State Mont Alto, 1 Campus Drive, Mont Alto, PA 17237 (United States); Hunter, Deidre A. [Lowell Observatory, 1400 West Mars Hill Road, Flagstaff, AZ 86001 (United States); Elmegreen, Bruce G., E-mail: kah259@psu.edu, E-mail: dah@lowell.edu, E-mail: bge@us.ibm.com [IBM T. J. Watson Research Center, 1101 Kitchawan Road, Yorktown Heights, NY 10598 (United States)
2016-06-01
In this second paper of a series, we explore the B − V , U − B , and FUV−NUV radial color trends from a multi-wavelength sample of 141 dwarf disk galaxies. Like spirals, dwarf galaxies have three types of radial surface brightness profiles: (I) single exponential throughout the observed extent (the minority), (II) down-bending (the majority), and (III) up-bending. We find that the colors of (1) Type I dwarfs generally become redder with increasing radius, unlike spirals which have a blueing trend that flattens beyond ∼1.5 disk scale lengths, (2) Type II dwarfs come in six different “flavors,” one of which mimics the “U” shape of spirals, and (3) Type III dwarfs have a stretched “S” shape where the central colors are flattish, become steeply redder toward the surface brightness break, then remain roughly constant beyond, which is similar to spiral Type III color profiles, but without the central outward bluing. Faint (−9 > M_B > −14) Type II dwarfs tend to have continuously red or “U” shaped colors and steeper color slopes than bright (−14 > M_B > −19) Type II dwarfs, which additionally have colors that become bluer or remain constant with increasing radius. Sm dwarfs and BCDs tend to have at least some blue and red radial color trend, respectively. Additionally, we determine stellar surface mass density (Σ) profiles and use them to show that the break in Σ generally remains in Type II dwarfs (unlike Type II spirals) but generally disappears in Type III dwarfs (unlike Type III spirals). Moreover, the break in Σ is strong, intermediate, and weak in faint dwarfs, bright dwarfs, and spirals, respectively, indicating that Σ may straighten with increasing galaxy mass. Finally, the average stellar surface mass density at the surface brightness break is roughly 1−2 M_⊙ pc^−2 for Type II dwarfs but higher at 5.9 M_⊙ pc^−2 or 27 M_⊙ pc^−2 for
Simulation of a Random Profile of the Sea Surface
Weber, V. L.
2017-09-01
We consider the problem of simulation of the slope field of a random surface profile, which is represented as a sum of a finite number of sinusoids with random phases. The behavior of the correlation function of the slopes is studied for equidistant and nonequidistant locations of the nodes of the model-field spectrum on the frequency axis. A new node-location method, which is based on the equalization of the amplitudes of the spectral components of the actual slope field and ensures maximum proximity of the correlation functions of the model and actual fields over the entire region of their definition, is proposed. Using this method, one can significantly reduce the number of the summed harmonics during the simulation of sea wind waves. The problem of fluctuations of the above-water irradiance is studied using the proposed slope-simulation method, and the results demonstrate the efficiency of its application.
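The model described above, a finite sum of sinusoids with independent random phases, can be sketched directly; the node frequencies and equalized amplitudes below are illustrative values only, not those derived in the paper.

```python
import numpy as np

def simulate_profile(x, amplitudes, omegas, rng=None):
    """One realization of a random sea-surface profile represented as a
    finite sum of sinusoids with independent uniform random phases, and
    the corresponding slope field (its x-derivative)."""
    if rng is None:
        rng = np.random.default_rng(0)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=len(omegas))
    eta = sum(a * np.cos(w * x + p) for a, w, p in zip(amplitudes, omegas, phases))
    slope = sum(-a * w * np.sin(w * x + p) for a, w, p in zip(amplitudes, omegas, phases))
    return eta, slope

x = np.linspace(0.0, 100.0, 1001)
eta, slope = simulate_profile(x, amplitudes=[0.5] * 8, omegas=np.linspace(0.3, 2.0, 8))
print(eta.shape, slope.shape)  # (1001,) (1001,)
```

Averaging products of such realizations over many random-phase draws yields the empirical correlation functions whose proximity to the actual field the node-location method optimizes.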
National Oceanic and Atmospheric Administration, Department of Commerce — Profile curvature was calculated from the bathymetry surface for each raster cell using the ArcGIS 3D Analyst "Curvature" Tool. Profile curvature describes the rate...
Near surface profiles of HONO: The vegetated surface as source and sink
Sörgel, M.; Held, A.
2012-04-01
The photolysis of HONO is an important primary OH radical source. The OH radical is the most important oxidizing agent, the so called "detergent" of the atmosphere. HONO formation pathways are still unclear (e.g. Sörgel et al., 2011). Nevertheless, the main pathways are believed to be heterogeneous. Thus, the surface is proposed to be a major source. Furthermore, soil emissions of HONO due to microbiological activity in soil (Su et al., 2011) have been proposed. Therefore, we measured gradients of HONO, NO, NO2 and O3 close to the surface (0.1 to 1.6 m above ground). We used an automated, programmable moving inlet to measure at 3 or 5 heights between 0.1 m and 1.6 m above the ground. HONO, O3, NO and NO2 were measured simultaneously. HONO was measured with a long path absorption photometer (LOPAP), O3 by UV absorption and NO and NO2 by chemiluminescence with photolytic conversion of NO2. The time resolution of an individual LOPAP measurement was 3 min, and a full profile was measured within 30 min. Additionally, profiles of temperature and relative humidity as well as leaf wetness and j(NO2) were measured. Measurements were conducted above a clearing at the Waldstein field site of the University of Bayreuth in the Fichtelgebirge Mountains in south-east Germany. Preliminary results are presented. For example, during the day the highest values were often measured close to the ground, indicating emission of HONO at the surface. This also indicates that the daytime formation of HONO is heterogeneous or the emissions are due to microbiological activity (Su et al., 2011). During the night, the lowest values were often measured at the surface indicating deposition. Thus, HONO emissions as well as HONO deposition have been observed. The profile data will be analyzed with respect to light intensity, NO2 availability, atmospheric stability and surface wetness in order to elucidate the driving forces behind emission and deposition, respectively. Sörgel, M; Regelin, E; Bozem
Directory of Open Access Journals (Sweden)
P. J. Sheridan
2012-12-01
Between June 2006 and September 2009, an instrumented light aircraft measured over 400 vertical profiles of aerosol and trace gas properties over eastern and central Illinois. The primary objectives of this program were to (1) measure the in situ aerosol properties and determine their vertical and temporal variability and (2) relate these aircraft measurements to concurrent surface and satellite measurements. The primary profile location was within 15 km of the NOAA/ESRL surface aerosol monitoring station near Bondville, Illinois. Identical instruments at the surface and on the aircraft ensured that the data from both platforms would be directly comparable and permitted a determination of how representative surface aerosol properties were of the lower column. Aircraft profiles were also conducted occasionally at two other nearby locations to increase the frequency of A-Train satellite underflights for the purpose of comparing in situ and satellite-retrieved aerosol data. Measurements of aerosol properties conducted at low relative humidity over the Bondville site compare well with the analogous surface aerosol data and do not indicate any major sampling issues or that the aerosol is radically different at the surface compared with the lowest flyby altitude of ~240 m above ground level. Statistical analyses of the in situ vertical profile data indicate that aerosol light scattering and absorption (related to aerosol amount) decrease substantially with increasing altitude. Parameters related to the nature of the aerosol (e.g., single-scattering albedo, Ångström exponent, etc.), however, are relatively constant throughout the mixed layer, and do not vary as much as the aerosol amount throughout the profile. While individual profiles often showed more variability, the median in situ single-scattering albedo was 0.93–0.95 for all sampled altitudes. Several parameters (e.g., submicrometer scattering fraction, hemispheric backscattering fraction, and
Nyman, Petter; Metzen, Daniel; Noske, Phil; Duff, Thomas; Sheridan, Gary
2015-04-01
This study quantifies topographic effects on microclimate and moisture dynamics in litter and near surface soil with the aim to improve spatial representation of fine surface fuel moisture content (FFMC) in mountainous terrain where forest fires typically operate. FFMC was monitored at 30-minute intervals using a novel field method for measuring moisture content of litter, providing unique data on the spatial-temporal variation in FFMC throughout a fire season. Moisture sensors were inserted into litter packs at sites on different slope aspects (North, South, West and East) and paired with manual measurement of gravimetric water content to relate sensor output to water content. Hydrochron sensors (or iButtons) were placed within the litter packs, measuring temperature at the interface between the litter layer and the soil. During the monitoring period the mean daily moisture content in the litter layer ranged from 0.07-1.30 kg kg^-1 on the north-facing slope and from 0.11-1.83 kg kg^-1 on the south-facing slope. The number of days during the fire season when the litter was below the fiber saturation point (~0.35 kg kg^-1) was 49 and 128 on the south and north aspects, respectively, highlighting the very large aspect-driven variation in FFMC and the need for spatially explicit data on microclimate. Differences in moisture content were caused by aspect-related variation in incoming radiation, which resulted in large temperature differences within the litter layer. On the warmest day of the monitoring period (38.9°C on 17 January), for example, the difference in litter temperature between the North and South aspects was 14°C. Differences in surface temperature were driven mainly by the systematic variation in vegetation cover, and hence shading, which emerges as a result of aspect (i.e. eco-hydrologic effect), and partly by the effects of slope orientation (i.e. geometric effect) on incoming radiation. Furthermore, the differences in FFMC due to evaporative demand were
Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.
2010-01-01
A technique has been developed to calculate the probability that any nearby lightning stroke is within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than the error ellipses centered on the stroke. This process takes the bivariate Gaussian probability density provided by the lightning location error ellipse for the most likely location of a lightning stroke and integrates it to get the probability that the stroke is inside any specified radius. This new facility-centric technique will be much more useful to the space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
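The facility-centric integration can be approximated numerically. The sketch below uses a Monte Carlo estimate in place of the analytic integration of the bivariate Gaussian; the coordinates, covariance, and kilometre units are illustrative assumptions.

```python
import numpy as np

def prob_within_radius(mu, cov, center, radius, n=200_000, seed=1):
    """Monte Carlo estimate of P(stroke within `radius` of `center`),
    with the stroke location drawn from the bivariate normal density
    defined by the lightning location error ellipse."""
    rng = np.random.default_rng(seed)
    pts = rng.multivariate_normal(mu, cov, size=n)
    d = np.hypot(pts[:, 0] - center[0], pts[:, 1] - center[1])
    return float(np.mean(d <= radius))

# Circular 0.5 km (1-sigma) error, most likely stroke location 1 km away.
p = prob_within_radius(mu=[1.0, 0.0], cov=[[0.25, 0.0], [0.0, 0.25]],
                       center=[0.0, 0.0], radius=1.0)
print(0.0 < p < 1.0)  # a proper probability; here roughly 0.4
```

For an elliptical (correlated) error distribution, only the covariance matrix changes; the distance test against the facility-centered radius is unaffected.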
How well Can We Classify SWOT-derived Water Surface Profiles?
Frasson, R. P. M.; Wei, R.; Picamilh, C.; Durand, M. T.
2015-12-01
The upcoming Surface Water Ocean Topography (SWOT) mission will detect water bodies and measure water surface elevation throughout the globe. Within its continental high resolution mask, SWOT is expected to deliver measurements of river width, water elevation and slope of rivers wider than ~50 m. The definition of river reaches is an integral step of the computation of discharge based on SWOT's observables. As poorly defined reaches can negatively affect the accuracy of discharge estimations, we seek strategies to break up rivers into physically meaningful sections. In the present work, we investigate how accurately we can classify water surface profiles based on simulated SWOT observations. We assume that most river sections can be classified as either M1 (mild slope, with depth larger than the normal depth), or A1 (adverse slope with depth larger than the critical depth). This assumption allows the classification to be based solely on the second derivative of water surface profiles, with convex profiles being classified as A1 and concave profiles as M1. We consider a HEC-RAS model of the Sacramento River as a representation of the true state of the river. We employ the SWOT instrument simulator to generate a synthetic pass of the river, which includes our best estimates of height measurement noise and geolocation errors. We process the resulting point cloud of water surface heights with the RiverObs package, which delineates the river center line and draws the water surface profile. Next, we identify inflection points in the water surface profile and classify the sections between the inflection points. Finally, we compare our limited classification of simulated SWOT-derived water surface profile to the "exact" classification of the modeled Sacramento River. With this exercise, we expect to determine if SWOT observations can be used to find inflection points in water surface profiles, which would bring knowledge of flow regimes into the definition of river reaches.
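Under the stated assumption, classification reduces to the sign of the second derivative of the water surface profile between inflection points. A minimal sketch follows; the synthetic noise-free profile and segment-wide averaging are illustrative simplifications of the RiverObs-based workflow.

```python
import numpy as np

def classify_profile(x, h):
    """Classify a water-surface profile segment as 'A1' (convex) or
    'M1' (concave) from the sign of its mean second derivative."""
    d2 = np.gradient(np.gradient(h, x), x)
    return "A1" if np.mean(d2) > 0 else "M1"

# Synthetic, noise-free segment standing in for a SWOT-derived profile.
x = np.linspace(0.0, 1000.0, 200)
backwater = 5.0 - 1e-3 * x - 2e-6 * x**2   # concave-down profile
print(classify_profile(x, backwater))       # M1
```

With real SWOT height noise, the profile would need smoothing before differentiation, since second derivatives amplify measurement error.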
Schijven, Jack F; Blaak, Hetty; Schets, Franciska M; de Roda Husman, Ana Maria
2015-01-01
The goal of this study was to determine the fate of ESBL-producing Escherichia coli (ESBL-EC) emitted from faecal sources in surface water, and the probability of human exposure through swimming. Concentrations of ESBL-EC were measured in recreational waters and in source waters, being water in
2005-01-01
Under funding from this proposal three in situ profile measurements of stratospheric sulfate aerosol and ozone were completed from balloon-borne platforms. The measured quantities are aerosol size resolved number concentration and ozone. The one derived product is aerosol size distribution, from which aerosol moments, such as surface area, volume, and extinction, can be calculated for comparison with SAGE III measurements and SAGE III derived products, such as surface area. The analysis of these profiles and comparison with SAGE III extinction measurements and SAGE III derived surface areas are provided in Yongxiao (2005), which comprised the research thesis component of Mr. Jian Yongxiao's M.S. degree in Atmospheric Science at the University of Wyoming. In addition, analysis continues on using principal component analysis (PCA) to derive aerosol surface area from the 9 wavelength extinction measurements available from SAGE III. This paper will present PCA components to calculate surface area from SAGE III measurements and compare these derived surface areas with those available directly from in situ size distribution measurements, as well as surface areas which would be derived from PCA and Thomason's algorithm applied to the four wavelength SAGE II extinction measurements.
On the extension of the wind profile over homogeneous terrain beyond the surface boundary layer
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekaterina; Brümmer, B.
2007-01-01
Analysis of profiles of meteorological measurements from a 160 m high mast at the National Test Site for wind turbines at Høvsøre (Denmark) and at a 250 m high TV tower at Hamburg (Germany) shows that the wind profile based on surface-layer theory and Monin-Obukhov scaling is valid up to ...
The effects of snow grain size profile on the Greenland ice sheet snow surface melt
庭野, 匡思; 青木, 輝夫; 的場, 澄人; 山口, 悟; 谷川, 朋範; 山崎, 哲秀; 朽木, 勝幸; 本山, 秀明
2013-01-01
In July 2012, extreme surface melt events occurred on the Greenland Ice Sheet (GrIS). Generally, surface melt is physically controlled by the surface energy balance, where net shortwave radiant flux is the main energy source for melt during summer. Although (optically equivalent) snow grain size profile affects near-infrared albedo and in turn net shortwave radiant flux, its qualitative impacts on the surface melt events is unclear. In the present study we investigated effects of snow grain s...
Deep learning for galaxy surface brightness profile fitting
Tuccillo, D.; Huertas-Company, M.; Decencière, E.; Velasco-Forero, S.; Domínguez Sánchez, H.; Dimauro, P.
2018-03-01
Numerous ongoing and future large area surveys (e.g. Dark Energy Survey, EUCLID, Large Synoptic Survey Telescope, Wide Field Infrared Survey Telescope) will increase by several orders of magnitude the volume of data that can be exploited for galaxy morphology studies. The full potential of these surveys can be unlocked only with the development of automated, fast, and reliable analysis methods. In this paper, we present DeepLeGATo, a new method for 2-D photometric galaxy profile modelling, based on convolutional neural networks. Our code is trained and validated on analytic profiles (HST/CANDELS F160W filter) and it is able to retrieve the full set of parameters of one-component Sérsic models: total magnitude, effective radius, Sérsic index, and axis ratio. We show detailed comparisons between our code and GALFIT. On simulated data, our method is more accurate than GALFIT and ~3000 times faster on GPU (~50 times faster when running on the same CPU). On real data, DeepLeGATo trained on simulations behaves similarly to GALFIT on isolated galaxies. With a fast domain-adaptation step using only 0.1-0.8 per cent of the size of the training set, our code easily reproduces the results obtained with GALFIT even in crowded regions. DeepLeGATo does not require any human intervention beyond the training step, rendering it much more automated than traditional profiling methods. The development of this method for more complex models (two-component galaxies, variable point spread function, dense sky regions) could constitute a fundamental tool in the era of big data in astronomy.
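For context, the one-component Sérsic model whose parameters DeepLeGATo retrieves can be written compactly. This sketch uses the common asymptotic approximation b_n ≈ 2n - 1/3 rather than the exact numerical solution, and evaluates only the 1-D radial form.

```python
import numpy as np

def sersic(r, I_e, r_e, n):
    """One-component Sersic surface-brightness profile I(r), with
    effective (half-light) radius r_e, brightness I_e at r_e, and
    Sersic index n (axis ratio omitted in this 1-D form)."""
    b_n = 2.0 * n - 1.0 / 3.0          # common asymptotic approximation
    return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

r = np.linspace(0.1, 10.0, 50)
profile = sersic(r, I_e=1.0, r_e=3.0, n=4.0)   # de Vaucouleurs-like n = 4
print(profile[0] > profile[-1])  # brightness falls outward -> True
```

A CNN regressor such as DeepLeGATo learns the inverse mapping: from a pixelized image of such a profile (plus noise and PSF) back to (magnitude, r_e, n, axis ratio).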
Seasonality in onshore normalized wind profiles above the surface layer
DEFF Research Database (Denmark)
Nissen, Jesper Nielsen; Gryning, Sven-Erik
2010-01-01
This work aims to study the seasonal difference in normalized wind speed above the surface layer as it is observed at the 160 m high mast at the coastal site Høvsøre at winds from the sea (westerly). Normalized and stability averaged wind speeds above the surface layer are observed to be 20 to 50...... is to reconstruct the seasonal signal in normalized wind speed and identify the physical process behind. The method proved reasonably successful in capturing the relative difference in wind speed between seasons, indicating that the simulated physical processes are likely candidates to the observed seasonal signal...... in normalized wind speed....
Plume and lithologic profiling with surface resistivity and seismic tomography
Energy Technology Data Exchange (ETDEWEB)
Watson, David B [ORNL; Doll, William E. [Battelle; Gamey, Jeff [Battelle; Sheehan, Jacob R [ORNL; Jardine, Philip M [ORNL
2005-03-01
Improved surface-based geophysical technologies that are commercially available provide a new level of detail that can be used to guide ground water remediation. Surface-based multielectrode resistivity methods and tomographic seismic refraction techniques were used to image to a depth of approximately 30 m below the surface at the Natural and Accelerated Bioremediation Research Field Research Center. The U.S. Department of Energy (DOE) established the research center on the DOE Oak Ridge Reservation in Oak Ridge, Tennessee, to conduct in situ field-scale studies on bioremediation of metals and radionuclides. Bioremediation studies are being conducted on the saprolite, shale bedrock, and ground water at the site that have been contaminated with nitrate, uranium, technetium, tetrachloroethylene, and other contaminants (U.S. DOE 1997). Geophysical methods were effective in imaging the high-ionic strength plume and in defining the transition zone between saprolite and bedrock zones that appears to have a significant influence on contaminant transport. The geophysical data were used to help select the location and depth of investigation for field research plots. Drilling, borehole geophysics, and ground water sampling were used to verify the surface geophysical studies.
Statistical Analysis of Surface Roughness and Dynamic Friction Profiles During Metalforming
Mates, Steven
2005-03-01
Laser confocal microscopy is used to image the surface roughness features of sheet metal before and after forming. This technique combines a statistically robust sampling protocol with fine-grained spatial resolution (approximately 100 nm) so that higher moments of the dynamic friction profiles and surface roughness profiles can be compared. These higher moments, including skew and kurtosis, are of interest because they characterize the extremes of the roughness distributions, which are thought to have a significant correlation with the overall friction behavior. Ultimately we seek an improved understanding of the relationship between surface roughness profiles, dynamic friction profiles, and metallurgical conditions in order to reliably predict the detailed friction behavior during actual metalforming operations.
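The higher moments named above follow directly from the measured height distribution. A minimal sketch using the standard surface-metrology definitions of Rq, Rsk, and Rku; the synthetic Gaussian profile is illustrative only.

```python
import numpy as np

def roughness_moments(z):
    """RMS roughness (Rq), skew (Rsk) and kurtosis (Rku) of a surface
    height profile, using the standard surface-metrology definitions."""
    z = np.asarray(z, dtype=float)
    z = z - z.mean()                 # heights relative to the mean line
    rq = np.sqrt(np.mean(z**2))
    rsk = np.mean(z**3) / rq**3      # skew: asymmetry of the height distribution
    rku = np.mean(z**4) / rq**4      # kurtosis: weight of extreme peaks/valleys
    return rq, rsk, rku

# A Gaussian test profile should give Rsk near 0 and Rku near 3.
rng = np.random.default_rng(0)
rq, rsk, rku = roughness_moments(rng.normal(0.0, 0.1, 10_000))
print(round(rq, 3), round(rsk, 2), round(rku, 2))
```

Comparing Rsk and Rku before and after forming quantifies how the process truncates peaks or deepens valleys, which is the correlation with friction behavior sought here.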
A 3D Laser Profiling System for Rail Surface Defect Detection.
Xiong, Zhimin; Li, Qingquan; Mao, Qingzhou; Zou, Qin
2017-08-04
Rail surface defects such as the abrasion, scratch and peeling often cause damages to the train wheels and rail bearings. An efficient and accurate detection of rail defects is of vital importance for the safety of railway transportation. In the past few decades, automatic rail defect detection has been studied; however, most developed methods use optic-imaging techniques to collect the rail surface data and are still suffering from a high false recognition rate. In this paper, a novel 3D laser profiling system (3D-LPS) is proposed, which integrates a laser scanner, odometer, inertial measurement unit (IMU) and global position system (GPS) to capture the rail surface profile data. For automatic defect detection, first, the deviation between the measured profile and a standard rail model profile is computed for each laser-imaging profile, and the points with large deviations are marked as candidate defect points. Specifically, an adaptive iterative closest point (AICP) algorithm is proposed to register the point sets of the measured profile with the standard rail model profile, and the registration precision is improved to the sub-millimeter level. Second, all of the measured profiles are combined together to form the rail surface through a high-precision positioning process with the IMU, odometer and GPS data. Third, the candidate defect points are merged into candidate defect regions using the K-means clustering. At last, the candidate defect regions are classified by a decision tree classifier. Experimental results demonstrate the effectiveness of the proposed laser-profiling system in rail surface defect detection and classification.
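The deviation-thresholding step of the pipeline can be sketched as follows. Note the simplifications: gap-based grouping of neighboring candidates stands in for the K-means clustering and decision-tree classification of the full system, and the threshold and synthetic defects are illustrative values.

```python
import numpy as np

def candidate_regions(measured, model, threshold, gap=2):
    """Mark profile points whose deviation from the standard rail model
    exceeds `threshold`, then merge nearby candidate points into
    candidate defect regions (index ranges along the profile)."""
    deviation = np.abs(measured - model)
    idx = np.flatnonzero(deviation > threshold)
    regions, start = [], None
    for i, j in zip(idx, np.append(idx[1:], -1)):
        start = i if start is None else start
        if j - i > gap or j == -1:     # gap in candidates, or last point
            regions.append((start, i))
            start = None
    return regions

model = np.zeros(100)                  # idealized standard rail profile
measured = model.copy()
measured[10:14] += 0.8                 # a synthetic scratch
measured[50] += 1.2                    # an isolated peeling point
print(candidate_regions(measured, model, threshold=0.5))  # [(10, 13), (50, 50)]
```

In the full system this runs after AICP registration of each laser profile against the model, so the deviations reflect defects rather than misalignment.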
Surface profile and stress field evaluation using digital gradient sensing method
Miao, C.; Sundaram, B. M.; Huang, L.; Tippur, H. V.
2016-09-01
Shape and surface topography evaluation from measured orthogonal slope/gradient data is of considerable engineering significance since many full-field optical sensors and interferometers readily output such data accurately. This has applications ranging from metrology of optical and electronic elements (lenses, silicon wafers, thin film coatings), surface profile estimation, wave front and shape reconstruction, to name a few. In this context, a new methodology for surface profile and stress field determination is advanced here, based on a recently introduced non-contact, full-field optical method called digital gradient sensing (DGS), capable of measuring small angular deflections of light rays, coupled with a robust higher-order finite-difference-based least-squares integration (HFLI) scheme in the Southwell configuration. The method is demonstrated by evaluating (a) surface profiles of mechanically warped silicon wafers and (b) stress gradients near growing cracks in planar phase objects.
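The integration of measured slope data into a profile can be illustrated in one dimension. This trapezoidal sketch is only a stand-in for the two-dimensional least-squares integration of orthogonal DGS slope maps; the quadratic test profile is an illustrative assumption.

```python
import numpy as np

def integrate_slopes(x, slope, z0=0.0):
    """Reconstruct a 1-D surface profile from measured slope data by
    trapezoidal integration, anchored at a known height z0."""
    dz = 0.5 * (slope[1:] + slope[:-1]) * np.diff(x)
    return np.concatenate([[z0], z0 + np.cumsum(dz)])

x = np.linspace(0.0, 1.0, 101)
z = integrate_slopes(x, 2.0 * x)        # slopes dz/dx = 2x
print(np.max(np.abs(z - x**2)) < 1e-9)  # recovers z = x^2 -> True
```

In two dimensions, simple path integration propagates noise anisotropically, which is why a least-squares scheme over both orthogonal slope fields is preferred in practice.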
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions, and dependence.
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance, one just needs to think… By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.
Yu, Han
2016-04-26
We demonstrate that diffraction stack migration can be used to discover the distribution of near-surface faults. The methodology is based on the assumption that near-surface faults generate detectable back-scattered surface waves from impinging surface waves. We first isolate the back-scattered surface waves by muting or FK filtering, and then migrate them by diffraction migration using the surface wave velocity as the migration velocity. Instead of summing events along trial quasi-hyperbolas, surface wave migration sums events along trial quasi-linear trajectories that correspond to the moveout of back-scattered surface waves. We have also proposed a natural migration method that utilizes the intrinsic traveltime property of the direct and the back-scattered waves at faults. For the synthetic data sets and the land data collected in Aqaba, where surface wave velocity has unexpected perturbations, we migrate the back-scattered surface waves with both predicted velocity profiles and natural Green's function without velocity information. Because the latter approach avoids the need for an accurate velocity model in event summation, both the prestack and stacked migration images show competitive quality. Results with both synthetic data and field records validate the feasibility of this method. We believe applying this method to global or passive seismic data can open new opportunities in unveiling tectonic features.
Submicron Surface Vibration Profiling Using Doppler Self-Mixing Techniques
Directory of Open Access Journals (Sweden)
Tânia Pereira
2014-01-01
Doppler self-mixing laser probing techniques are often used for vibration measurement with very high accuracy. A novel optoelectronic probe solution is proposed, based on off-the-shelf components, with a direct reflection optical scheme for contactless characterization of the target's movement. This probe was tested with two test-bench apparatus that assessed its precision performance, with a linear actuator at low frequency (35 µm, 5–60 Hz), and its dynamics, with disc-shaped transducers for small amplitude and high frequency (0.6 µm, 100–2500 Hz). The results, obtained from well-established signal processing methods for self-mixing Doppler signals, allowed the evaluation of vibration velocity and amplitudes with an average error of less than 10%. The impedance spectrum of the piezoelectric (PZ) disc target revealed a maximum of impedance (around 1 kHz) for minimal Doppler shift. A bidimensional scan over the PZ disc surface allowed the categorization of the vibration mode (0, 1) and explained its deflection directions. The feasibility of a laser vibrometer based on self-mixing principles and supported by tailored electronics able to accurately measure submicron displacements was, thus, successfully demonstrated.
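The velocity evaluation in such signal processing rests on the basic Doppler relation v = f_D·λ/2 for reflection at normal incidence. A minimal sketch; the 785 nm wavelength is an assumed typical laser-diode value, not taken from the paper.

```python
def velocity_from_doppler(f_doppler_hz, wavelength_m=785e-9):
    """Target velocity (m/s) from the measured self-mixing Doppler
    shift, v = f_D * lambda / 2 (normal incidence assumed)."""
    return f_doppler_hz * wavelength_m / 2.0

# A 1 kHz Doppler shift at 785 nm corresponds to about 0.39 mm/s.
print(velocity_from_doppler(1_000.0))
```

Integrating the velocity over time (with the direction recovered from the self-mixing fringe shape) then yields the submicron displacement profile.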
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando
Aspheric optical surface profiling based on laser scanning and auto-collimation
Xie, Hongbo; Jiang, Min; Wang, Yao; Pang, Xiaotian; Wang, Chao; Su, Yongpeng; Yang, Lei
2017-11-01
Nowadays the utilization of aspheric lenses has become more and more popular, enabling a highly increased degree of freedom for optical design while simultaneously improving the performance of optical systems. Fast and accurate surface profiling of these aspheric components is a real demand in the characterization and optimization of optical systems. In this paper, a novel and simple surface profiler instrument is designed and developed to fulfill the ever-increasing need for testing axially symmetric aspheric surfaces. The proposed instrument is implemented based on a unique mapping between the position and rotation angle of the reflective mirror in the optical path and the coordinate of the reflection point on the surface during rapid laser beam scanning. High accuracy of the proposed surface profiling method is ensured by a high-resolution grating guide rail, an indexing plate, and a position-sensitive detector based on laser auto-collimation and beam center-fitting. Testing the meridian line of both convex and concave surfaces has been experimentally demonstrated using the developed instrument. In comparison to results from conventional image measuring instruments and coordinate measuring machines, a coefficient of determination better than 0.99999 and an RMS error of less than 1.5 μm have been achieved, which validates the feasibility of this method. Analysis of the systematic error is beneficial to further improving the measurement accuracy. The presented instrument, which essentially builds on geometrical optics, provides a powerful tool to measure aspheric surfaces quickly and accurately with a stable structure and a simple algorithm.
Surface layer and bloom dynamics observed with the Prince William Sound Autonomous Profiler
Campbell, R. W.
2016-02-01
As part of a recent long term monitoring effort, deployments of a WETLabs Autonomous Moored Profiler (AMP) began in Prince William Sound (PWS) in 2013. The PWS AMP consists of a positively buoyant instrument frame, with a winch and associated electronics that profiles the frame from a park depth (usually 55 m) to the surface by releasing and retrieving a thin UHMWPE tether; it generally conducts a daily cast and measures temperature, salinity, chlorophyll-a fluorescence, turbidity, and oxygen and nitrate concentrations. Upward and downward looking ADCPs are mounted on a float below the profiler, and an in situ plankton imager is in development and will be installed in 2016. Autonomous profilers are a relatively new technology, and early deployments experienced a number of failures from which valuable lessons may be learned. Nevertheless, an unprecedented time series of the seasonal biogeochemical progression in the surface waters of the coastal Gulf of Alaska was collected in 2014 and 2015. The northern Gulf of Alaska has experienced a widespread warm anomaly since early 2014, and surface layer temperature anomalies in PWS were strongly positive during winter 2014. The spring bloom observed by the profiler began 2-3 weeks earlier than average, with surface nitrate depleted by late April. Although surface temperatures were still above average in 2015, bloom timing was much later, with a short vigorous bloom in late April and a subsurface bloom in late May that coincided with significant nitrate drawdown. In addition to the vernal blooms, wind-driven upwelling events led to several small productivity pulses that were evident in changes in nitrate and oxygen concentrations and chlorophyll-a fluorescence. As well as providing a mechanistic understanding of surface layer biogeochemistry, high frequency observations such as these put historical observations in context, and provide new insights into the scales of variability in the annual cycles of the surface ocean in the North
Directory of Open Access Journals (Sweden)
M. I. Belov
2017-01-01
Full Text Available Among remote-sensing techniques, the most efficient ones for detecting oil films on the water surface are laser reflection methods, based on recording radiation reflected from the water surface, and fluorescence methods, based on recording laser-induced fluorescence radiation of the water surface. Laser equipment, for example installed on a delivery aircraft, can be used regardless of the time of day in a fairly wide range of optical states of the Earth's atmosphere and can detect small-scale oil-product pollution. Laser reflection methods based on recording radiation reflected from the water surface allow detection of oil films at high altitudes of the delivery aircraft (and, respectively, with wideband spatial scanning of the water surface). The paper is concerned with the development of a laser reflection method to detect oil-product films on the sea surface using eye-safe laser radiation wavelengths. The eye-safety requirement makes it necessary to choose between the ultraviolet (0.18-0.38 μm) and near-infrared (over 1.4 μm) spectral ranges. The choice between these two options should be based on the efficiency of using the ultraviolet and near-infrared ranges for detection of oil films on the water surface. The results of mathematical modeling show that for an oil film thickness of more than 20 μm the detection probability is 100% for sounding wavelengths of both 0.355 μm and 1.54 μm. However, for thinner films of oil (the thickness of the oil films may be of units of μm or less) the situation is different. In laser sounding at a wavelength of 0.355 μm, the laser reflection method allows reliable detection of oil films with a thickness of at least 2 μm, with an appropriate probability of proper detection (more than 0.9) and false alarm rate (less than 0.002) for a relative measurement noise of no more than 5%. At the same time, in laser sounding at a wavelength
Park, Barratt; Krueger, Bastian C.; Meyer, Sven; Kandratsenka, Alexander; Wodtke, Alec; Schaefer, Tim
2017-06-01
The conversion of translational to rotational motion often plays a major role in the trapping of small molecules at surfaces, a crucial first step for a wide variety of chemical processes that occur at gas-surface interfaces. However, to date most quantum-state resolved surface scattering experiments have been performed on diatomic molecules, and very little detailed information is available about how the structure of non-linear polyatomic molecules influences the mechanisms for energy exchange with surfaces. In the current work, we employ a new rotationally-resolved 1+1' resonance-enhanced multiphoton ionization (REMPI) scheme to measure rotational distributions in formaldehyde molecules directly scattered from the Au(111) surface at incident kinetic energies in the range 0.3-1.2 eV. The results indicate a pronounced propensity to excite a-axis rotation (twirling) rather than b- or c-axis rotation (tumbling or cartwheeling), and are consistent with a rotational rainbow scattering model. Classical trajectory calculations suggest that the effect arises, to zeroth order, from the three-dimensional shape of the molecule (steric effects). The results have broad implications for the enhanced trapping probability of prolate and near-prolate molecules at surfaces.
DEFF Research Database (Denmark)
Nikunen, P.; Vattulainen, Ilpo Tapio; Ala-Nissila, T.
2001-01-01
(theta) to determine the locations of phase boundaries and find such data to be clearly time dependent during full spreading. We conclude that nonequilibrium effects seem to be an inherent feature in profile evolution studies of surface diffusion in all cases where ordering plays a prominent role. This warrants...
Ionic profiles close to dielectric discontinuities: Specific ion-surface interactions
Markovich, Tomer; Orland, Henri
2016-01-01
We study, by incorporating short-range ion-surface interactions, ionic profiles of electrolyte solutions close to a non-charged interface between two dielectric media. In order to account for important correlation effects close to the interface, the ionic profiles are calculated beyond mean-field theory, using the loop expansion of the free energy. We show how it is possible to overcome the well-known deficiency of the regular loop expansion close to the dielectric jump, and treat the non-linear boundary conditions within the framework of field theory. The ionic profiles are obtained analytically to one-loop order in the free energy, and their dependence on different ion-surface interactions is investigated. The Gibbs adsorption isotherm, as well as the ionic profiles are used to calculate the surface tension, in agreement with the reverse Hofmeister series. Consequently, from the experimentally-measured surface tension, one can extract a single adhesivity parameter, which can be used within our model to quan...
Bouquin, Alexandre Y. K.; Gil de Paz, Armando; Muñoz-Mateos, Juan Carlos; Boissier, Samuel; Sheth, Kartik; Zaritsky, Dennis; Peletier, Reynier F.; Knapen, Johan H.; Gallego, Jesús
2018-02-01
We present new spatially resolved surface photometry in the far-ultraviolet (FUV) and near-ultraviolet (NUV) from images obtained by the Galaxy Evolution Explorer (GALEX) and IRAC1 (3.6 μm) photometry from the Spitzer Survey of Stellar Structure in Galaxies (S4G). We analyze the radial surface brightness profiles μ FUV, μ NUV, and μ [3.6], as well as the radial profiles of (FUV ‑ NUV), (NUV ‑ [3.6]), and (FUV ‑ [3.6]) colors in 1931 nearby galaxies (z color distribution with those predicted by the chemo-spectrophotometric models for the evolution of galaxy disks of Boissier & Prantzos. The exponential disk component is best isolated by setting an inner radial cutoff and an upper surface brightness limit in stellar mass surface density. The best-fitting models to the measured scale length and central surface brightness values yield distributions of spin and circular velocity within a factor of two of those obtained via direct kinematic measurements. We find that at a surface brightness fainter than μ [3.6] = 20.89 mag arcsec‑2, or below 3 × 108 M ⊙ kpc‑2 in stellar mass surface density, the average specific star formation rate (sSFR) for star-forming and quiescent galaxies remains relatively flat with radius. However, a large fraction of GALEX Green Valley galaxies show a radial decrease in sSFR. This behavior suggests that an outside-in damping mechanism, possibly related to environmental effects, could be testimony of an early evolution of galaxies from the blue sequence of star-forming galaxies toward the red sequence of quiescent galaxies.
ANALYSIS OF THE SURFACE PROFILE AND ITS MATERIAL SHARE DURING THE GRINDING INCONEL 718 ALLOY
Directory of Open Access Journals (Sweden)
Martin Novák
2015-05-01
Full Text Available Grinding is still an important method for surface finishing. At FPTM JEPU, research dealing with this issue is conducted. Experiments are carried out by grinding various materials under different conditions, and selected components of the surface integrity are then evaluated. These include the roughness parameters Ra, Rm and Rz, the material ratio curve (Abbott-Firestone curve), and the obtained roundness. This article deals with grinding of the nickel alloy Inconel 718, where selected cutting conditions were used and subsequently the surface profile and the material ratio curve were measured and evaluated.
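The material ratio (Abbott-Firestone) curve evaluated in the experiments above can be computed from a measured profile by sorting heights: for each level from the highest peak down, it gives the percentage of the profile at or above that level. A minimal sketch on a synthetic profile (illustration only, not the authors' measurement code):

```python
import numpy as np

def material_ratio_curve(z, levels=100):
    """Abbott-Firestone curve: material (bearing) ratio in percent as a
    function of cutting depth, from the highest peak down to the deepest valley."""
    z = np.asarray(z, dtype=float)
    heights = np.linspace(z.max(), z.min(), levels)      # peak -> valley
    ratio = np.array([(z >= h).mean() * 100.0 for h in heights])
    return heights, ratio

# Synthetic ground-like profile: smooth waviness plus random roughness.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 2000)
z = 0.5 * np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

h, mr = material_ratio_curve(z)
# The ratio grows monotonically with depth and reaches 100% at the valley floor.
```

The curve's characteristic S-shape distinguishes plateau-like surfaces (steep drop near the top) from peaky ones, which is why it is used alongside Ra and Rz.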
Profiles of ocean surface heating (POSH): A new model of upper ocean diurnal warming
Gentemann, Chelle L.; Minnett, Peter J.; Ward, Brian
2009-07-01
Shipboard radiometric measurements of diurnal warming at the ocean surface and profiles through the diurnal thermocline were utilized to assess the temporal and vertical variability and to develop a new physics-based model of near-surface warming. The measurements and modeled diurnal warming were compared, with the goal of comprehensively evaluating differences between the data and model results. On the basis of these results, the diurnal model was refined while attempting to maintain agreement with the measurements. Simplified bulk models commonly do not provide information on the vertical structure within the warm layer, but this new model predicts the vertical temperature profile within the diurnal thermocline using an empirically derived function dependent on wind speed. The vertical profile of temperature provides both a straightforward methodology for modeling differences due to diurnal warming between measurements made at different depths (e.g., in situ measurements at various depths and measurements of the surface temperatures by satellite radiometers) and information on upper ocean thermal structure. Additionally, the model estimates of diurnal warming at the ocean surface are important for air-sea heat and gas flux calculations, blending satellite sea surface temperature fields, and air-sea interaction studies.
DEFF Research Database (Denmark)
Stangegaard, Michael; Wang, Zhenyu; Kutter, Jörg Peter
2006-01-01
conventional methods to determine biocompatibility such as cellular growth rate, morphology and the hydrophobicity of the surfaces. HeLa cells grown on polymethylmethacrylate (PMMA) or a SU-8 surface treated with HNO3-ceric ammonium nitrate (HNO3-CAN) and ethanolamine showed no differences in growth rate......, morphology or gene expression profiles as compared to HeLa cells grown in cell culture flasks. Cells grown on SU-8 treated with only HNO3-CAN showed almost the same growth rate (36 ± 1 h) and similar morphology as cells grown in cell culture flasks (32 ± 1 h), indicating good biocompatibility. However, more...... than 200 genes showed different expression levels in cells grown on SU-8 treated with HNO3-CAN compared to cells grown in cell culture flasks. This shows that gene expression profiling is a simple and precise method for determining differences in cells grown on different surfaces that are otherwise...
Energy Technology Data Exchange (ETDEWEB)
Berlinsky, A.J.; Morrow, M.; Jochemsen, R.; Hardy, W.N. (British Columbia Univ., Vancouver (Canada). Dept. of Physics)
1982-07-01
High resolution magnetic resonance studies of the 1420 MHz transition in atomic hydrogen confined in a liquid helium coated container have been performed at temperatures 0.06
Gopinath, Abhay; Lim, Andre; Nagarajan, Balasubramanian; Cher Wong, Chow; Maiti, Rajarshi; Castagne, Sylvie
2016-11-01
Mechanical surface treatments such as Shot Peening (SP) and Deep Cold Rolling (DCR) are used to introduce Compressive Residual Stress (CRS) at the surface and subsurface layers of aerospace components, respectively. This paper investigates the feasibility of introducing both surface and sub-surface compressive residual stress in Ti6Al4V material through a successive application of the two aforementioned processes, one after the other. CRS profiles of the individual processes were compared to those of the combined process to validate the feasibility. It was found that shot peening introduces surface compressive residual stress into the already deep cold rolled sample, resulting in both surface and sub-surface compressive residual stresses in the material. However, the drawback of such a combination is the increased surface roughness after shot peening a deep cold rolled sample, which can be critical, especially in compressor components. Hence, a new technology, Vibro-Peening (VP), may be used as an alternative to SP to introduce surface stress at reduced roughness.
Chi, Sheng; Lee, Shu-Sheng; Huang, Jen-Yu; Lai, Ti-Yu; Jan, Chia-Ming; Hu, Po-Chi
2016-04-01
As optical technologies progress, various commercial 3D surface contour scanners are now on the market. Most of them are used for reconstructing the surface profile of molds or mechanical objects larger than 50 mm×50 mm×50 mm, with a scanning system size of about 300 mm×300 mm×100 mm. Few optical systems have been commercialized for fast surface-profile scanning of small objects less than 10 mm×10 mm×10 mm in size. Therefore, a miniature optical system has been designed and developed in this research work for this purpose. Since the most commonly used scanning method for such systems is line-scan technology, we have developed a pseudo-phase-shifting digital projection technology by adopting projected fringes and a phase reconstruction method. A projector was used to project digital fringe patterns on the object, and the fringe intensity images of the reference plane and of the sample object were recorded by a CMOS camera. The phase difference between the plane and the object can be calculated from the fringe images, and the surface profile of the object was reconstructed using the phase differences. The traditional phase-shifting method is accomplished by using a PZT actuator or a precisely controlled motor to adjust the light source or grating, and this is one of the limitations for high-speed scanning. Compared with the traditional optical setup, we utilized a micro projector to project the digital fringe patterns on the sample. This reduced the phase-shifting processing time, and the controlled phase differences between the shifted phases became more precise. Besides, an optical path design based on a portable scanning device was used to minimize the size and reduce the number of system components. A screwdriver section of about 7 mm×5 mm×5 mm has been scanned and its surface profile successfully restored. The experimental results showed that the measurement area of our system can be smaller than 10 mm×10 mm, with the precision reaching
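The phase reconstruction step described above can be illustrated with the standard four-step phase-shifting formula (a simplified 1D sketch with synthetic fringes; the actual system's calibration and phase-to-height conversion are not modeled here):

```python
import numpy as np

def four_step_phase(I1, I2, I3, I4):
    """Wrapped phase from four fringe images shifted by pi/2 each:
    phi = atan2(I4 - I2, I1 - I3), computed per pixel, in (-pi, pi]."""
    return np.arctan2(I4 - I2, I1 - I3)

# Simulate fringes on a reference plane and on an object that adds a phase
# bump (in a real triangulation setup, proportional to surface height).
x = np.linspace(0.0, 4.0 * np.pi, 512)
bump = 0.8 * np.exp(-((x - 6.0) ** 2))           # hypothetical object phase
shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
ref = [np.cos(x + s) for s in shifts]             # reference-plane fringes
obj = [np.cos(x + bump + s) for s in shifts]      # object fringes

# Phase difference between object and reference encodes the bump.
dphi = four_step_phase(*obj) - four_step_phase(*ref)
dphi = np.angle(np.exp(1j * dphi))                # rewrap into (-pi, pi]
```

With the four frames shifted by π/2, I1−I3 = 2cosφ and I4−I2 = 2sinφ, so the arctangent recovers the wrapped phase exactly; the projector's role in the paper is to generate these shifts digitally instead of with a PZT actuator.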
DEFF Research Database (Denmark)
Lu, Kaiyuan; Rasmussen, Peter Omand; Ritchie, Ewen
2008-01-01
Accurate knowledge of the high frequency inductance profile plays an important role in many designs of sensorless controllers for Surface Mounted Permanent Magnet (SMPM) synchronous motors. This paper presents an AC+DC measurement method for determination of the d-axis and q-axis high frequency inductance profiles of SMPM synchronous motors. This method uses DC currents to set a desired magnetic working point on the motor laminations, and then superimposes balanced small AC signals to measure the incremental inductance. A special algorithm is used to decouple the cross-coupling effects between the d-axis and the q-axis, which allows a separate determination of the d, q inductance profiles as functions of the d, q currents. Experimental results on a commercial SMPM motor using the proposed method are presented in this paper.
The role of surface chemistry in the cytotoxicity profile of graphene.
Majeed, Waqar; Bourdo, Shawn; Petibone, Dayton M; Saini, Viney; Vang, Kieng Bao; Nima, Zeid A; Alghazali, Karrer M; Darrigues, Emilie; Ghosh, Anindya; Watanabe, Fumiya; Casciano, Daniel; Ali, Syed F; Biris, Alexandru S
2017-04-01
Graphene and its derivatives, because of their unique physical, electrical and chemical properties, are an important class of nanomaterials being proposed as foundational materials in nanomedicine as well as for a variety of industrial applications. A major limitation for graphene, when used in biomedical applications, is its poor solubility due to its rather hydrophobic nature. Therefore, chemical functionalities are commonly introduced to alter both its surface chemistry and biochemical activity. Here, we show that surface chemistry plays a major role in the toxicological profile of graphene structures. To demonstrate this, we chemically increased the oxidation level of pristine graphene and compared the corresponding toxicological effects along with those for graphene oxide. X-ray photoelectron spectroscopy revealed that pristine graphene had the lowest amount of surface oxygen, while graphene oxide had the highest, at 2.5% and 31%, respectively. Low and high oxygen functionalized graphene samples were found to have 6.6% and 24% surface oxygen, respectively. Our results showed a dose-dependent trend in the cytotoxicity profile, where pristine graphene was the most cytotoxic, with decreasing toxicity observed with increasing oxygen content. Increased surface oxygen also played a role in nanomaterial dispersion in water or cell culture medium over longer periods. It is likely that higher dispersity might result in graphene entering into cells as individual flakes ~1 nm thick rather than as more cytotoxic aggregates. In conclusion, changes in graphene's surface chemistry resulted in altered solubility and toxicity, suggesting that a generalized toxicity profile would be rather misleading. Copyright © 2016 John Wiley & Sons, Ltd.
The Three-Point Sinuosity Method for Calculating the Fractal Dimension of Machined Surface Profile
Zhou, Yuankai; Li, Yan; Zhu, Hua; Zuo, Xue; Yang, Jianhua
2015-04-01
The three-point sinuosity (TPS) method is proposed to calculate the fractal dimension of a surface profile accurately. In this method, a new measure, TPS, is defined to represent the structural complexity of fractal curves and is shown to follow a power law. Thus, the fractal dimension can be calculated from the slope of the fitted line in the log-log plot. The Weierstrass-Mandelbrot (W-M) fractal curves, as well as real surface profiles obtained by grinding, sand blasting and turning, are used to validate the effectiveness of the proposed method. The calculated values are compared to those obtained from the root-mean-square (RMS) method, the box-counting (BC) method and the variation method. The results show that the TPS method has the widest scaling region, the least fit error and the highest accuracy among the methods examined, which demonstrates that the fractal characteristics of fractal curves can be well revealed by the proposed method.
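The computational step the TPS method shares with the other estimators mentioned is a power-law fit: evaluate a scale-dependent measure, then take the slope of its log-log plot. The sketch below illustrates that fitting step on a Weierstrass-Mandelbrot curve, using a simple box-counting measure as a stand-in (the exact TPS measure is not reproduced here):

```python
import numpy as np

def weierstrass_mandelbrot(x, D=1.5, gamma=1.5, n_max=30):
    """W-M fractal profile with nominal fractal dimension D (1 < D < 2)."""
    n = np.arange(n_max)
    amps = gamma ** ((D - 2.0) * n)          # decaying amplitudes
    freqs = gamma ** n                        # growing frequencies
    return (amps[:, None] * np.cos(2 * np.pi * freqs[:, None] * x)).sum(axis=0)

def boxcount_dimension(x, z, sizes=(4, 8, 16, 32, 64, 128)):
    """Estimate the fractal dimension from the log-log slope of box counts;
    the TPS method applies the same slope-fitting idea to its own measure."""
    xs = (x - x.min()) / (x.max() - x.min())  # normalize into the unit square
    zs = (z - z.min()) / (z.max() - z.min())
    counts = []
    for s in sizes:
        boxes = set(zip((xs * s).astype(int).clip(0, s - 1),
                        (zs * s).astype(int).clip(0, s - 1)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return slope

x = np.linspace(0.0, 1.0, 20000)
z = weierstrass_mandelbrot(x, D=1.5)
D_est = boxcount_dimension(x, z)   # between 1 (smooth line) and 2 (plane-filling)
```

Any such estimator is only as good as its scaling region: the paper's point is that the TPS measure obeys the power law over a wider range of scales than RMS, box-counting, or variation measures, making the fitted slope more reliable.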
Cure, David; Weller, Thomas; Miranda, Felix A.
2011-01-01
In this paper, a comparison between Jerusalem Cross (JC) and Square Patch (SP) based Frequency Selective Surfaces (FSS) for low profile antenna applications is presented. The comparison is aimed at understanding the performance of low profile antennas backed by high impedance surfaces. In particular, an end loaded planar open sleeve dipole (ELPOSD) antenna is examined due to the various parameters within its configuration, offering significant design flexibility and a wide operating bandwidth. Measured data of the antennas demonstrate that increasing the number of unit cells improves the fractional bandwidth. The antenna bandwidth increased from 0.8% to 1.8% and from 0.8% to 2.7% for the JC and SP structures, respectively. The number of unit cells was increased from 48 to 80 for the JC-FSS and from 24 to 48 for the SP-FSS.
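Fractional bandwidth, the figure of merit quoted above, is the width of the operating band divided by its center frequency. A quick helper (the band-edge frequencies below are hypothetical, chosen only so the result matches the 2.7% figure; they are not the measured antenna's values):

```python
def fractional_bandwidth(f_low, f_high):
    """Fractional bandwidth in percent: band width over center frequency."""
    f_center = (f_low + f_high) / 2.0
    return 100.0 * (f_high - f_low) / f_center

# Hypothetical band edges: a 2.7% fractional bandwidth around 2.4 GHz
# corresponds to roughly a 65 MHz usable band.
fbw = fractional_bandwidth(2.3676e9, 2.4324e9)
```

At these narrow percentages, even the reported improvement from 0.8% to 2.7% triples the usable band, which is why adding FSS unit cells matters for a low-profile design.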
Surface profiling of normally responding and nonreleasing basophils by flow cytometry
DEFF Research Database (Denmark)
Kistrup, Kasper; Poulsen, Lars Kærgaard; Jensen, Bettina Margrethe
2012-01-01
Background Human basophils are granulocytes with the capacity to play important roles in allergy, for example by releasing histamine when activated by cross-linking of their high affinity IgE receptors (FcεRI). However, not all individuals have basophils responding with a histamine release after...... such activation and are therefore called nonreleasers. Intracellularly these basophils differ from responding cells by having a low level of the tyrosine kinase Syk, but studies of their surface profile are lacking. We have investigated the expression of FcεRI, HLA-DR, CD16, CD63, CD69, CD117, CD123, CD124, CD203......c, C3aR, C5aR, CCR3, FPR1, ST2, CRTH2 on anti-IgE responsive and nonreleasing basophils by flow cytometry, thereby generating a surface profile of the two phenotypes. Methods Fresh buffy coat blood (
A graphical solution in CATIA for profiling end mill tool which generates a helical surface
Teodor, V. G.; Baroiu, N.; Berbinschi, S.; Susac, F.; Oancea, N.
2017-08-01
The generation of a helical flute, which belongs to a helical cylindrical surface with constant pitch, can be made using end mill tools. Tools of this type are easier to make than side mills and represent a less expensive solution. End mill profiling may be done using the classical, analytically expressed theorems of surface enveloping, such as the Olivier theorem or the Nikolaev method. In this paper an algorithm is proposed, developed in the CATIA design environment, for profiling this type of tool. The proposed solution is intuitive, rigorous and fast due to the utilization of the graphical design environment's capabilities. Numerical examples are considered in order to validate the quality of this method.
Surface profiling of normally responding and nonreleasing basophils by flow cytometry
DEFF Research Database (Denmark)
Kistrup, Kasper; Poulsen, Lars Kærgaard; Jensen, Bettina Margrethe
2012-01-01
a maximum release using flow cytometry, basophils, defined as FcεRIα+CD3-CD14-CD19-CD56-, were analysed for surface expression of relevant markers. All samples were compensated and analysed in logicle display. All gates...... were set on FMO (fluorescence minus one) controls and results are given in geoMFI normalised to the FMO control. Releasers and nonreleasers were compared using an unpaired t test with Welch's correction. Results Twelve donors were included, three of which were nonreleasers and characterized in regard...... expression on nonreleasers (p = 0.0609). Conclusion Surface profiling using 15 different receptors revealed no significant difference between normal and nonreleasing basophils.
Dai, L.; Sorkin, V.; Zhang, Y. W.
2017-04-01
We perform molecular dynamics simulations to investigate molecular structure alteration and friction behavior of heterogeneous polymer (perfluoropolyether) surfaces using a nanoscale probing tip (tetrahedral amorphous carbon). It is found that, depending on the magnitude of the applied normal force, three regimes exist: the shallow depth-sensing (SDS), deep depth-sensing (DDS), and transitional depth-sensing (TDS) regimes; TDS lies between SDS and DDS. In SDS, the tip floats on the polymer surface and there is insignificant permanent alteration of the polymer structure due to largely recoverable atomic deformations, and the surface roughness profile can be accurately measured. In DDS, the tip plows through the polymer surface and there is significant permanent alteration of the molecular structure. In this regime, the lateral friction force rises sharply and fluctuates violently when overcoming surface pile-ups. In SDS, the friction can be described by a modified Amontons' law including the adhesion effect; meanwhile, in DDS, the adhesion effect is negligible but the friction coefficient is significantly higher. The underlying reason for the difference between these regimes rests upon the different contributions of the repulsive and attractive forces between the tip and polymer surfaces to the friction force. Our findings reveal important insights into lateral depth-sensing on heterogeneous polymer surfaces and may help improve the precision of depth-sensing devices.
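The friction-law contrast between the two regimes can be sketched as follows; the parameter values are hypothetical, chosen only to illustrate the functional form of a modified Amontons' law with an adhesion offset versus a plain proportional law with a larger coefficient:

```python
def friction_sds(normal_force, mu=0.2, f_adhesion=1.5):
    """Shallow depth-sensing regime: modified Amontons' law F = F0 + mu*N,
    where F0 is the adhesion contribution (finite friction even at N = 0)."""
    return f_adhesion + mu * normal_force

def friction_dds(normal_force, mu=0.9):
    """Deep depth-sensing regime: adhesion negligible, but a significantly
    higher friction coefficient from plowing through the polymer."""
    return mu * normal_force
```

The qualitative signatures match the abstract: only the adhesive (SDS) form predicts a nonzero friction force at zero load, while at high loads the plowing (DDS) form dominates because of its larger coefficient.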
Directory of Open Access Journals (Sweden)
Stephen R Griffiths
Full Text Available Thermal properties of tree hollows play a major role in survival and reproduction of hollow-dependent fauna. Artificial hollows (nest boxes) are increasingly being used to supplement the loss of natural hollows; however, the factors that drive nest box thermal profiles have received surprisingly little attention. We investigated how differences in surface reflectance influenced temperature profiles of nest boxes painted three different colors (dark-green, light-green, and white: total solar reflectance 5.9%, 64.4%, and 90.3% respectively) using boxes designed for three groups of mammals: insectivorous bats, marsupial gliders and brushtail possums. Across the three different box designs, dark-green (low reflectance) boxes experienced the highest average and maximum daytime temperatures, had the greatest magnitude of variation in daytime temperatures within the box, and were consistently substantially warmer than light-green boxes (medium reflectance), white boxes (high reflectance), and ambient air temperatures. Results from biophysical model simulations demonstrated that variation in diurnal temperature profiles generated by painting boxes either high or low reflectance colors could have significant ecophysiological consequences for animals occupying boxes, with animals in dark-green boxes at high risk of acute heat-stress and dehydration during extreme heat events. Conversely, in cold weather, our modelling indicated that there are higher cumulative energy costs for mammals, particularly smaller animals, occupying light-green boxes. Given their widespread use as a conservation tool, we suggest that before boxes are installed, consideration should be given to the effect of color on nest box temperature profiles, and the resultant thermal suitability of boxes for wildlife, particularly during extremes in weather. Managers of nest box programs should consider using several different colors and installing boxes across a range of both orientations and
Surface reflectance drives nest box temperature profiles and thermal suitability for target wildlife
Griffiths, Stephen R.; Rowland, Jessica A.; Briscoe, Natalie J.; Lentini, Pia E.; Handasyde, Kathrine A.; Lumsden, Linda F.; Robert, Kylie A.
2017-01-01
Thermal properties of tree hollows play a major role in survival and reproduction of hollow-dependent fauna. Artificial hollows (nest boxes) are increasingly being used to supplement the loss of natural hollows; however, the factors that drive nest box thermal profiles have received surprisingly little attention. We investigated how differences in surface reflectance influenced temperature profiles of nest boxes painted three different colors (dark-green, light-green, and white: total solar reflectance 5.9%, 64.4%, and 90.3% respectively) using boxes designed for three groups of mammals: insectivorous bats, marsupial gliders and brushtail possums. Across the three different box designs, dark-green (low reflectance) boxes experienced the highest average and maximum daytime temperatures, had the greatest magnitude of variation in daytime temperatures within the box, and were consistently substantially warmer than light-green boxes (medium reflectance), white boxes (high reflectance), and ambient air temperatures. Results from biophysical model simulations demonstrated that variation in diurnal temperature profiles generated by painting boxes either high or low reflectance colors could have significant ecophysiological consequences for animals occupying boxes, with animals in dark-green boxes at high risk of acute heat-stress and dehydration during extreme heat events. Conversely in cold weather, our modelling indicated that there are higher cumulative energy costs for mammals, particularly smaller animals, occupying light-green boxes. Given their widespread use as a conservation tool, we suggest that before boxes are installed, consideration should be given to the effect of color on nest box temperature profiles, and the resultant thermal suitability of boxes for wildlife, particularly during extremes in weather. Managers of nest box programs should consider using several different colors and installing boxes across a range of both orientations and shade profiles (i
An analysis of type F2 software measurement standards for profile surface texture parameters
Todhunter, L. D.; Leach, R. K.; Lawes, S. D. A.; Blateyron, F.
2017-06-01
This paper reports an in-depth analysis of ISO 5436 part 2 type F2 reference software for the calculation of profile surface texture parameters, covering the inputs, implementations and output results of the reference software developed by the National Physical Laboratory (NPL), the National Institute of Standards and Technology (NIST) and the Physikalisch-Technische Bundesanstalt (PTB). Surface texture parameters have been calculated for a selection of 17 test data files obtained from the type F1 reference data sets on offer from NPL and NIST. The calculated surface texture parameters show some disagreements between the software of the three National Metrology Institutes. These disagreements have been investigated further, and some potential explanations are given.
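The profile amplitude parameters that such type F2 softgauges compute include Ra and Rq (ISO 4287). A minimal, unfiltered sketch of these two definitions is given below; it illustrates the calculation only and is not the NPL/NIST/PTB implementation, which also handles filtering and sampling-length conventions:

```python
import numpy as np

def ra(profile_heights):
    """Arithmetic mean roughness Ra: mean absolute deviation of the
    profile from its mean line (simplified; no profile filtering)."""
    z = np.asarray(profile_heights, dtype=float)
    return float(np.mean(np.abs(z - z.mean())))

def rq(profile_heights):
    """Root-mean-square roughness Rq about the mean line."""
    z = np.asarray(profile_heights, dtype=float)
    return float(np.sqrt(np.mean((z - z.mean()) ** 2)))
```

Disagreements between reference implementations typically arise not in these core formulas but in filtering, end-effect handling and discretization choices around them.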
Mitusov, A. V.; Mitusova, O. E.; Wendt, J.; Dreibrodt, S.; Bork, H.-R.
2014-09-01
This article focuses on features of spatial distribution of colluvial (slope) deposits on a micro scale. These features were detected by the non-parametric rank correlation of Spearman (rS) between thickness of colluvial layers and morphometric variables (MVs) of the modern land surface. The strongest correlation was found between total thickness of colluvial layers and maximal catchment area (rS = 0.85). A negative correlation was observed between thicknesses of younger and older colluvial layers. Additionally, if young colluvial layers have a negative correlation with slope steepness (GA), relatively old buried colluvial layers have a positive correlation with GA. These facts indicate an inversion of the zones of actual matter accumulation due to transformation of the land surface in profile during long-term sedimentation. Vertical curvature (kv) characterises acceleration and deceleration of surface flow caused by the shape of the slope profile along flow lines. Based on this, it was expected that kv would have a direct impact on the accumulation of colluvium. However, in this study, the correlations between the thickness of colluvial deposits and kv were low. Functional relationships between colluvial accumulation and the shape of profiles along flow lines were reflected by correlations with GA. Based on these observations, it is assumed that the regional nature of surface flow velocity affects the shift between existing accumulation zones reflected by colluvial deposits and potential accumulation zones reflected by MVs. Signs of correlation coefficients between the thickness of colluvial deposits and curvatures reflect the tendency of increased colluvial depositions at three out of 12 local landforms of Shary's classification. These landforms are located in the valley bottom. The mean thickness of colluvial deposits at these three landforms was 167 ± 18.7 cm (error range = standard deviation); the other nine landforms show a mean thickness of 130.1 ± 34.1 cm.
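The Spearman rank correlation rS used throughout the study is the Pearson correlation of rank-transformed samples. A minimal sketch (illustrative only; assumes no tied ranks, which real layer-thickness data would require correcting for):

```python
import numpy as np

def spearman_rs(x, y):
    """Spearman rank correlation coefficient (no tie correction):
    rank both samples, then take the Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)  # ranks of x
    ry = np.argsort(np.argsort(y)).astype(float)  # ranks of y
    return float(np.corrcoef(rx, ry)[0, 1])
```

Applied to, e.g., colluvial layer thickness versus maximal catchment area, a value near +1 (the paper reports rS = 0.85) indicates a strong monotonic increase of thickness with catchment area.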
Validation Campaigns for Sea Surface Wind and Wind Profile by Ground-Based Doppler Wind Lidar
Liu, Zhishen; Wu, Songhua; Song, Xiaoquan; Liu, Bingyi; Li, Zhigang
2010-12-01
According to the research frame of the ESA-MOST DRAGON Cooperation Program (ID5291), Chinese partners from the Ocean Remote Sensing Institute of the Ocean University of China (ORSI/OUC) have carried out a series of campaigns for ground-based lidar validation and atmospheric observation. The ORSI/OUC Doppler wind lidar has been developed and deployed to accurately measure wind speed and direction over large areas in real time, an application useful for ADM-Aeolus VAL/CAL, aviation safety, weather forecasting and sports. The sea surface wind campaigns were successfully accomplished at the Qingdao sailing competitions during the 29th Olympic Games. The lidar was located at the seashore near the sailing field and made a horizontal scan over the sea surface, measuring the wind in real time and uploading the data to the local meteorological station every 10 minutes. In addition to the sea surface wind campaigns, the ORSI/OUC Doppler wind lidar was deployed for wind profile observations during the weather campaigns for China's Shenzhou 7 spacecraft landing zone in September 2008 in the Inner Mongolia steppe. The wind profile was tracked by the mobile Doppler lidar system to help predict the module's landing site. During these ground tests, the lidar was shown to provide independent and credible measurements of radial wind speed, wind profile, 3D wind vector, aerosol-backscattering ratio, aerosol extinction coefficient, extinction-to-backscatter ratio in the atmospheric boundary layer and troposphere, and sea surface wind vectors, making it an independent and very effective validation tool for the upcoming ADM-Aeolus project.
Rixen, M.; Ferreira-Coelho, E.; Signell, R.
2008-01-01
Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
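At its core, a hyper-ensemble of the kind described is a local linear combination of several model forecasts, with weights (and a bias term) fitted against observed drift over a training window. A minimal sketch, with hypothetical function names and synthetic data, not the NURC implementation:

```python
import numpy as np

def hyperensemble_weights(model_preds, observed):
    """Fit least-squares weights (plus a bias term) that combine
    several model forecasts into one hyper-ensemble estimate."""
    X = np.column_stack([np.ones(len(observed)), *model_preds])
    w, *_ = np.linalg.lstsq(X, observed, rcond=None)
    return w

def hyperensemble_predict(w, model_preds_new):
    """Apply fitted weights to new forecasts from the same models."""
    X = np.column_stack([np.ones(len(model_preds_new[0])), *model_preds_new])
    return X @ w
```

The bias term is what allows local bias removal; refitting the weights in a sliding window gives the "local correction" the abstract refers to.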
Afanas’ev, V. P.; Gryazev, A. S.; Efremenko, D. S.; Kaplya, P. S.; Kuznetcova, A. V.
2017-12-01
Precise knowledge of the differential inverse inelastic mean free path (DIIMFP) and differential surface excitation probability (DSEP) of tungsten is essential for many fields of materials science. In this paper, a fitting algorithm is applied for extracting the DIIMFP and DSEP from X-ray photoelectron spectra and electron energy loss spectra. The algorithm uses the partial intensity approach as a forward model, in which a spectrum is given as a weighted sum of cross-convolved DIIMFPs and DSEPs. The weights are obtained as solutions of the Riccati and Lyapunov equations derived from the invariant imbedding principle. The inversion algorithm utilizes a parametrization of the DIIMFPs and DSEPs on the basis of a classical Lorentz oscillator. Unknown parameters of the model are found by a fitting procedure that minimizes the residual between measured spectra and forward simulations. It is found that the surface layer of tungsten contains several sublayers with corresponding Langmuir resonances. The thicknesses of these sublayers are proportional to the periods of the corresponding Langmuir oscillations, as predicted by the theory of R.H. Ritchie.
Directory of Open Access Journals (Sweden)
Bin Qian
2014-06-01
Additive manufacturing of alumina by laser is a delicate process, and small changes in processing parameters can have consequences that are poorly controlled and understood. Real-time monitoring of temperature profiles, spectral profiles and surface morphologies was evaluated in an off-axial set-up for controlling the laser sintering of alumina ceramics. A real-time spectrometer and pyrometer were used for rapid monitoring of thermal stability during the laser sintering process. An active illumination imaging system successfully recorded the high-temperature melt pool and the surrounding area simultaneously. The captured images also showed how defects form and progress during the laser sintering process. All of these real-time monitoring methods show great potential for on-line quality control during laser sintering of ceramics.
Zhang, Hanlin; Ren, Yongjie; Liu, Changjie; Zhu, Jigui
2014-07-10
High-speed surface profile measurement with high precision is crucial for target inspection and quality control. In this study, a laser scanner based on a single-point laser triangulation displacement sensor and a high-speed rotating polygon mirror is proposed. The autosynchronized scanning scheme is introduced to alleviate the trade-off between the field of view and the range precision, which is the inherent deficiency of conventional triangulation. The lateral synchronized flying-spot technology has excellent characteristics, such as a programmable and larger field of view, high immunity to ambient light and secondary reflections, a high optical signal-to-noise ratio, and a minimal shadow effect. Owing to automatic point-to-point laser power control, high accuracy and superior data quality are possible when measuring objects with varying surface characteristics, even in demanding applications. The proposed laser triangulation scanner is validated using a laboratory-built prototype, and practical considerations for the design and implementation of the system are described, including a speckle noise reduction method and real-time signal processing. A method for rapid and accurate calibration of the laser triangulation scanner using lookup tables is also devised; the system calibration accuracy is generally better than ±0.025 mm. Experimental results are presented and show a broad application prospect for fast, precise surface profile measurement.
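The lookup-table calibration mentioned above can be illustrated as piecewise-linear interpolation between reference points measured against a trusted standard. The table values below are hypothetical, not from the paper:

```python
import numpy as np

# Hypothetical calibration table: raw sensor counts vs. certified
# displacement (e.g., obtained against a laser interferometer).
counts_ref = np.array([0.0, 100.0, 200.0, 300.0, 400.0])
disp_ref_mm = np.array([0.00, 0.98, 2.01, 3.02, 3.99])

def counts_to_mm(counts):
    """Convert raw triangulation readout to displacement by
    piecewise-linear interpolation in the lookup table."""
    return float(np.interp(counts, counts_ref, disp_ref_mm))
```

A dense enough table absorbs the residual nonlinearity of the triangulation geometry without requiring an explicit analytic sensor model.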
Temperature-dependent electronic decay profiles in CZT: probe of bulk and surface properties
Kessick, Royal; Maupin, Hugh; Tepper, Gary C.; Szeles, Csaba
2003-01-01
The electronic performance of CZT-based gamma radiation spectrometers is governed by a synergism of bulk and surface properties. Compensation is used to increase the bulk resistivity of Cd1-xZnxTe (x ~ 0.1), but the same electronic states that are introduced to increase the material resistivity can also trap charge and reduce the carrier lifetime. Electrical and mechanical surface defects introduced during or subsequent to crystal harvesting are also known to interfere with device performance. Using a contactless, pulsed-laser microwave cavity perturbation technique, electronic decay profiles were studied in high-pressure Bridgman CZT as a function of temperature. The electronic decay profile was found to depend very strongly on temperature and was modeled using a function consisting of two exponential terms with temperature-dependent amplitudes and time constants. The model was used to relate the observed temperature-dependent decay kinetics in CZT to specific trap energies. It was found that, at low temperatures, the electronic decay process is dominated by a deep trap with an energy of approximately 0.69 ± 0.1 eV from the band edge. As the temperature is increased, the charge trapping becomes dominated by a second trap with an energy of approximately 0.60 ± 0.1 eV from the band edge. Surface damage introduces additional charge traps that significantly alter the decay kinetics, particularly at low temperatures.
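The two-term exponential decay model described above can be written directly; the parameter values used in any example are illustrative, not fitted values from the paper:

```python
import numpy as np

def decay(t, a1, tau1, a2, tau2):
    """Two-exponential decay model for the microwave cavity
    perturbation signal: amplitudes (a1, a2) and time constants
    (tau1, tau2) are temperature dependent in the study."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)
```

In practice the four parameters would be fitted to each measured transient (e.g., by nonlinear least squares), and the temperature dependence of the extracted time constants is what maps onto the 0.69 eV and 0.60 eV trap energies.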
Characterizing land surface erosion from cesium-137 profiles in lake and reservoir sediments.
Zhang, Xinbao; Walling, Desmond E
2005-01-01
Recognition of the threat to the sustainable use of the earth's resources posed by soil erosion and associated off-site sedimentation has generated an increasing need for reliable information on global rates of soil loss. Existing methods of assessing rates of soil loss across large areas possess many limitations and there is a need to explore alternative approaches to characterizing land surface erosion at the regional and global scale. The downcore profiles of 137Cs activity available for numerous lakes and reservoirs located in different areas of the world can be used to provide information on land surface erosion within the upstream catchments. The rate of decline of 137Cs activity toward the surface of the sediment deposited in a lake or reservoir can be used to estimate the rate of surface lowering associated with eroding areas within the upstream catchment, and the concentration of 137Cs in recently deposited sediment provides a basis for estimating the relative importance of surface and channel, gully, and/or subsurface erosion as a source of the deposited sediment. The approach has been tested using 137Cs data from several lakes and reservoirs in southern England and China, spanning a wide range of specific suspended sediment yield. The results obtained are consistent with other independent evidence of erosion rates and sediment sources within the lake and reservoir catchments and confirm the validity of the overall approach. The approach appears to offer valuable potential for characterizing land surface erosion, particularly in terms of its ability to provide information on the rate of surface lowering associated with the eroding areas, rather than an average rate of lowering for the entire catchment surface.
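The core idea, recovering a surface lowering rate from the downcore decline of decay-corrected 137Cs activity, can be sketched under the common simplifying assumption that 137Cs falls off exponentially with soil depth with relaxation depth h0. This is a schematic of the approach, not the authors' full method:

```python
import numpy as np

def lowering_rate_cm_per_yr(years, activity, h0_cm):
    """Estimate the surface lowering rate R (cm/yr) of eroding areas.

    Assumes decay-corrected 137Cs activity in deposited sediment
    declines exponentially with deposition year because erosion
    strips an exponential 137Cs depth profile (relaxation depth
    h0_cm). Fitting ln(activity) vs. year gives slope b = -R / h0,
    so R = -b * h0_cm. All inputs here are hypothetical.
    """
    b = np.polyfit(np.asarray(years, float),
                   np.log(np.asarray(activity, float)), 1)[0]
    return float(-b * h0_cm)
```

The companion diagnostic in the paper, the absolute 137Cs concentration of recent sediment, then apportions the load between surface erosion (high 137Cs) and channel/gully/subsurface sources (low 137Cs).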
Mobile depth profiling and sub-surface imaging techniques for historical paintings—A review
Energy Technology Data Exchange (ETDEWEB)
Alfeld, Matthias, E-mail: matthias.alfeld@desy.de [University of Hamburg, Department of Chemistry, Martin-Luther-King Platz 6, D-20146 Hamburg (Germany); University of Antwerp, Department of Chemistry, Groenenbrogerlaan 171, B-2020 Antwerp (Belgium); Broekaert, José A.C., E-mail: jose.broekaert@chemie.uni-hamburg.de [University of Hamburg, Department of Chemistry, Martin-Luther-King Platz 6, D-20146 Hamburg (Germany)
2013-10-01
Hidden, sub-surface paint layers and features contain valuable information for the art-historical investigation of a painting's past and for its conservation for coming generations. The number of techniques available for the study of these features has been considerably extended in the last decades, and established techniques have been refined. This review focuses on mobile non-destructive sub-surface imaging and depth profiling techniques, which allow for the in-situ investigation of easel paintings, i.e. paintings on a portable support. Among the techniques discussed are X-ray radiography and infrared reflectography, which are long-established methods and have been in use for several decades. Their capabilities for element/species-specific imaging have been extended by the introduction of energy/wavelength-resolved measurements. Scanning macro-X-ray fluorescence analysis made it possible for the first time to acquire elemental distribution images in-situ, and optical coherence tomography allows for the non-destructive study of surface paint layers in virtual cross-sections. These techniques and their variants are presented next to other techniques, such as Terahertz imaging, Nuclear Magnetic Resonance depth profiling and established techniques for non-destructive testing (thermography, ultrasonic imaging and laser-based interference methods) applied in the conservation of historical paintings. The capabilities and limitations of the techniques are discussed alongside selected case studies. - Highlights: • All mobile sub-surface and depth-profiling techniques for paintings are reviewed. • The number of techniques available has increased considerably in recent years. • X-ray radiography and infrared reflectography are still the most used techniques. • Scanning macro-XRF and optical coherence tomography are becoming established. • Industrial non-destructive testing techniques support the preservation of paintings.
Compact Wideband and Low-Profile Antenna Mountable on Large Metallic Surfaces
DEFF Research Database (Denmark)
Zhang, Shuai; Pedersen, Gert F.
2017-01-01
This paper proposes a compact wideband and low-profile antenna mountable on large metallic surfaces. Six rows of coupled microstrip resonators with different lengths are printed on a Teflon block. The lengths of the microstrip resonators in different rows are gradually reduced along the end… resonance at the lowest frequency. A trapezoid-shaped capacitive-feed (C-fed) strip is utilized and also printed on the Teflon block to globally optimize the wideband impedance matching. The proposed antenna covers a relative bandwidth of 109% for VSWR
Estimation of the p-wave velocity profile of elastic real data based on surface wave inversion
Ponomarenko, A.V.; Kashtan, B.M.; Troyan, V.N.; Mulder, W.A.
2013-01-01
Recently, we proposed an analytical approach to invert for a smoothly varying near-surface P-wave velocity profile that has a squared slowness linearly decreasing with depth. The exact solution for such a velocity profile in the acoustic approximation can be expressed in terms of Airy functions and
Schutte, Robert J; Parisi-Amon, Andreina; Reichert, W Monty
2009-01-01
Cytokines, chemokines, and growth factors were assayed from the supernatants of monocytes and macrophages cultured on common biomaterials with a range of surface chemistries. TNF-alpha, MCP-1, MIP-1alpha, IL-8, IL-6, IL-1beta, VEGF, IL-1ra, and IL-10 were measured from monocyte/macrophage cultures at different stages of activation and differentiation seeded onto polyethylene, polyurethane, expanded polytetrafluoroethylene, polymethyl methacrylate, and a hydrogel copolymer of 2-hydroxyethyl methacrylate, 1-vinyl-2-pyrrolidinone, and polyethylene glycol acrylate in tissue culture polystyrene (TCPS) plates. Empty TCPS wells and organo-tin polyvinyl chloride served as "blanks" and positive controls, respectively. Results showed an overall increase in cytokine, chemokine, and growth factor production as monocytes are activated or differentiated into macrophages and that proinflammatory and anti-wound healing cytokines and chemokines dominate this profile. However, cytokine production was only modestly affected by the surface chemistry of these four stable and noncytotoxic biomaterials. © 2008 Wiley Periodicals, Inc.
Exploring the Plant–Microbe Interface by Profiling the Surface-Associated Proteins of Barley Grains
DEFF Research Database (Denmark)
Sultan, Abida; Andersen, Birgit; Svensson, Birte
2016-01-01
Cereal grains are colonized by a microbial community that actively interacts with the plant via secretion of various enzymes, hormones, and metabolites. Microorganisms decompose plant tissues by a collection of depolymerizing enzymes, including β-1,4-xylanases, that are in turn inhibited by plant xylanase inhibitors. To gain insight into the importance of the microbial consortia and their interaction with barley grains, we used a combined gel-based (2-DE coupled to MALDI-TOF-TOF MS) and gel-free (LC–MS/MS) proteomics approach complemented with enzyme activity assays to profile the surface-associated proteins and xylanolytic activities of two barley cultivars. The surface-associated proteome was dominated by plant proteins with roles in defense and stress responses, while the relatively less abundant microbial (bacterial and fungal) proteins were involved in cell-wall and polysaccharide degradation…
Directory of Open Access Journals (Sweden)
Akiyoshi Uezumi
2016-08-01
Skeletal muscle contains two distinct stem/progenitor populations. One is the satellite cell, which acts as a muscle stem cell, and the other is the mesenchymal progenitor, which contributes to muscle pathogeneses such as fat infiltration and fibrosis. Detailed and accurate characterization of these progenitors in humans remains elusive. Here, we performed comprehensive cell-surface protein profiling of the two progenitor populations residing in human skeletal muscle and identified three previously unrecognized markers: CD82 and CD318 for satellite cells and CD201 for mesenchymal progenitors. These markers distinguish myogenic and mesenchymal progenitors, and enable efficient isolation of the two types of progenitors. Functional study revealed that CD82 ensures expansion and preservation of myogenic progenitors by suppressing excessive differentiation, and CD201 signaling favors adipogenesis of mesenchymal progenitors. Thus, cell-surface proteins identified here are not only useful markers but also functionally important molecules, and provide valuable insight into human muscle biology and diseases.
Nokes, Mark A.; Flesher, Pamela; Borden, Peter; DeBusk, Damon K.; Lowell, John K.; Hill, Dale E.; Allen, Gary
1994-09-01
P-type wafers with oxygen concentration in two ranges near 30 ppma and 33 ppma, respectively, were processed through key thermal cycles. These processes were designed to denude the surface of oxygen, begin nucleation, and precipitate a portion of the oxygen in the bulk for intrinsic gettering. The samples were evaluated using nondestructive optical production profiling (OPP), and the results were compared with surface photovoltage (SPV) measurements and cleave-and-etch inspection. The denuded zone depth (DZ) and bulk microdefect density (BMD) measured by OPP gave reasonable correlation with the diffusion lengths determined by SPV. The OPP data also showed the same general trends as the cleave-and-etch data. The shallower DZ and higher BMD reported by OPP in contrast to cleave-and-etch, however, are presumably due to the greater sensitivity of OPP to small defects.
Sheridan P. J.; Andrews, E.; Ogren, J A.; Tackett, J. L.; Winker, D. M.
2012-01-01
Between June 2006 and September 2009, an instrumented light aircraft measured over 400 vertical profiles of aerosol and trace gas properties over eastern and central Illinois. The primary objectives of this program were to (1) measure the in situ aerosol properties and determine their vertical and temporal variability and (2) relate these aircraft measurements to concurrent surface and satellite measurements. Underflights of the CALIPSO satellite show reasonable agreement in a majority of retrieved profiles between aircraft-measured extinction at 532 nm (adjusted to ambient relative humidity) and CALIPSO-retrieved extinction, and suggest that routine aircraft profiling programs can be used to better understand and validate satellite retrieval algorithms. CALIPSO tended to overestimate the aerosol extinction at this location in some boundary layer flight segments when scattered or broken clouds were present, which could be related to problems with CALIPSO cloud screening methods. The in situ aircraft-collected aerosol data suggest extinction thresholds for the likelihood of aerosol layers being detected by the CALIOP lidar. These statistical data offer guidance as to the likelihood of CALIPSO's ability to retrieve aerosol extinction at various locations around the globe.
Directory of Open Access Journals (Sweden)
Qinyuan Deng
2017-10-01
A maskless lithography method for the rapid and cost-effective fabrication of micro-optics elements with arbitrary surface profiles is reported. A digital micro-mirror device (DMD) is applied to flexibly modulate the exposure dose according to the surface profile of the structure to be fabricated. Both the relationship between the grayscale levels of the DMD and the exposure dose on the surface of the photoresist, and the dependence of the exposure depth on the exposure dose, deviate from linearity (arising from the DMD and the photoresist, respectively) and cannot be systematically eliminated, which leads to complicated fabrication procedures and large fabrication errors. A method of compensating for the two nonlinear effects is proposed that can be used to accurately design the digital grayscale mask and ensure precise control of the surface profile of the structure to be fabricated. To test the reliability of this approach, several typical array elements with spherical, aspherical, and conic surfaces have been fabricated and tested. The root-mean-square (RMS) deviation between the measured and designed surface heights is about 0.1 μm. The proposed method of compensating for the nonlinear effects in maskless lithography can be directly used to control the grayscale levels of the DMD for fabricating structures with arbitrary surface profiles.
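The two-stage compensation described, inverting the depth-versus-dose curve and then the dose-versus-grayscale curve, can be sketched with piecewise-linear interpolation over measured calibration curves. All calibration values below are hypothetical, not data from the paper:

```python
import numpy as np

# Hypothetical measured response curves (both monotonic):
gray = np.array([0.0, 64.0, 128.0, 192.0, 255.0])   # DMD gray levels
dose = np.array([0.0, 30.0, 80.0, 160.0, 260.0])    # dose delivered at each gray level
dose_c = np.array([0.0, 50.0, 120.0, 200.0, 300.0]) # calibration doses (mJ/cm^2)
depth = np.array([0.0, 0.8, 2.2, 4.1, 6.5])         # developed depth (um) at each dose

def gray_for_depth(target_depth_um):
    """Compensate both nonlinearities: map the target exposure depth
    to the required dose (inverse of depth-vs-dose), then map that
    dose to the DMD gray level (inverse of dose-vs-gray)."""
    d = np.interp(target_depth_um, depth, dose_c)  # depth -> required dose
    return float(np.interp(d, dose, gray))         # dose  -> gray level
```

Evaluating this per pixel of the target surface profile yields the compensated digital grayscale mask.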
Ionic Liquid Coatings for Titanium Surfaces: Effect of IL Structure on Coating Profile.
Gindri, Izabelle M; Siddiqui, Danyal A; Frizzo, Clarissa P; Martins, Marcos A P; Rodrigues, Danieli C
2015-12-16
Dicationic imidazolium-based ionic liquids (ILs) having bis(trifluoromethylsulfonyl)imide (NTf2) and amino acid-based (methionine and phenylalanine) anionic moieties were synthesized and used to coat titanium surfaces using a dip-coating technique. Dicationic moieties with varying alkyl chains (8 and 10 carbons) and anions with distinct characteristics were selected to understand the influence of IL structural features on deposition profile. X-ray photoelectron spectroscopy (XPS) and atomic force microscopy (AFM) were used in this study to help elucidate intermolecular interactions within ILs as well as between ILs and TiO2 surfaces and to investigate IL coating morphology. Charge concentration on IL moieties, as well as the presence of functional groups that can interact via hydrogen bond, such as carboxylate and amino groups, were observed to influence the deposition profile. ILs containing amino acids as the anionic moiety were observed to interact strongly with TiO2, which resulted in more pronounced changes in Ti 2p binding energy. The higher hydrophobicity of the IL having NTf2 as the anionic moiety resulted in higher adhesion strength between the IL coating and TiO2.
Validation of Surface Skin Temperature and Moisture Profiles Using Satellite Data
Wu, Man Li C.; Schubert, Siegfried; Lin, Ching I.
1999-01-01
New validation techniques and metrics using satellite data have been developed to evaluate the quality of model-based estimates of surface skin temperature (Tg) and moisture profiles (q). The satellite data consist of clear-sky outgoing long-wave radiation (CLR), broadband radiances from 8 to 12 μm (RadWn), brightness temperature centered around 10.8 μm (Tbb), and total precipitable water (TPW) from microwave radiometry. We show that CLR can be used to diagnose Tg. Furthermore, by using a combination of CLR and RadWn from CERES-TRMM measurements and TPW from SSM/I, we are able to identify errors in the moisture profile. Finally, three-hourly Tbb from the International Satellite Cloud Climatology Project can be used to evaluate the amplitude and diurnal variation of Tg. For purposes of illustration, Tg and q are evaluated from runs with an early version of the Goddard Earth Observing System Data Assimilation System (GEOS-2). It is found that, in general, Tg is too cold in the winter hemisphere and q is too wet in the upper atmosphere. To address these deficiencies, several improvements have been implemented in GEOS-2, including a Land-Surface-Model, a Moist Turbulence Scheme, and the assimilation of new TOVS retrievals. Preliminary results indicate positive impacts from each of these implementations.
Non-contact automatic measurement of free-form surface profiles on CNC machines
Fan, Kuang-Chao; Wen, Kuang-Pu
1993-09-01
This paper describes the work to develop a non-contact automatic measurement system for free-form surfaces on a CNC machine tool or a coordinate measuring machine (CMM) and its CAD/CAM integration. A laser probe made by Keyence Co. (model LC-2220) was integrated into the CNC machine as the non-contact sensor. Measurement software has been developed for automatic surface tracing of any free-form profile. Data transfer to any commercially available CAD/CAM system for reverse engineering is also available via a DXF file. Extensive calibration work has been carried out on the systematic accuracy of the laser probe with respect to material color, surface slope, and edge detection of the workpiece, using an HP5528 laser interferometer system. Using the surface painting technique, the shape error of the copied object relative to its master piece was found to be within 30 micrometers, which is deemed adequate for the mold industry.
OCT-based profiler for automating ocular surface prosthetic fitting (Conference Presentation)
Mujat, Mircea; Patel, Ankit H.; Maguluri, Gopi N.; Iftimia, Nicusor V.; Patel, Chirag; Agranat, Josh; Tomashevskaya, Olga; Bonte, Eugene; Ferguson, R. Daniel
2016-03-01
The use of a Prosthetic Replacement of the Ocular Surface Environment (PROSE) device is a revolutionary treatment for military patients that have lost their eyelids due to 3rd degree facial burns and for civilians who suffer from a host of corneal diseases. However, custom manual fitting is often a protracted painful, inexact process that requires multiple fitting sessions. Training for new practitioners is a long process. Automated methods to measure the complete corneal and scleral topology would provide a valuable tool for both clinicians and PROSE device manufacturers and would help streamline the fitting process. PSI has developed an ocular anterior-segment profiler based on Optical Coherence Tomography (OCT), which provides a 3D measure of the surface of the sclera and cornea. This device will provide topography data that will be used to expedite and improve the fabrication process for PROSE devices. OCT has been used to image portions of the cornea and sclera and to measure surface topology for smaller contact lenses [1-3]. However, current state-of-the-art anterior eye OCT systems can only scan about 16 mm of the eye's anterior surface, which is not sufficient for covering the sclera around the cornea. In addition, there is no systematic method for scanning and aligning/stitching the full scleral/corneal surface and commercial segmentation software is not optimized for the PROSE application. Although preliminary, our results demonstrate the capability of PSI's approach to generate accurate surface plots over relatively large areas of the eye, which is not currently possible with any other existing platform. Testing the technology on human volunteers is currently underway at Boston Foundation for Sight.
Wei, Yi; Gadaria-Rathod, Neha; Epstein, Seth; Asbell, Penny
2013-01-01
Purpose. To provide standard operating procedures (SOPs) for measuring tear inflammatory cytokine concentrations and to validate the resulting profile as a minimally invasive objective metric and biomarker of ocular surface inflammation for use in multicenter clinical trials on dry eye disease (DED). Methods. Standard operating procedures were established and then validated with cytokine standards, quality controls, and masked tear samples collected from local and distant clinical sites. The concentrations of the inflammatory cytokines in tears were quantified using a high-sensitivity human cytokine multiplex kit. Results. A panel of inflammatory cytokines was initially investigated, from which four key inflammatory cytokines (IL-1β, IL-6, IFN-γ, and TNF-α) were chosen. Results with cytokine standards statistically satisfied the manufacturer's quality control criteria. Results with pooled tear samples were highly reproducible and reliable with tear volumes ranging from 4 to 10 μL. Incorporation of the SOPs into clinical trials was subsequently validated. Tear samples were collected at a distant clinical site, stored, and shipped to our Biomarker Laboratory, where a masked analysis of the four tear cytokines was successfully performed. Tear samples were also collected from a feasibility study on DED. Inflammatory cytokine concentrations were decreased in tears of subjects who received anti-inflammatory treatment. Conclusions. Standard operating procedures for human tear cytokine assessment suitable for multicenter clinical trials were established. Tear cytokine profiling using these SOPs may provide objective metrics useful for diagnosing, classifying, and analyzing treatment efficacy in inflammatory conditions of the ocular surface, which may further elucidate the mechanisms involved in the pathogenesis of ocular surface disease. PMID:24204044
DEFF Research Database (Denmark)
Chen, Jin; Cheng, Jiangtao; Shen, Wenzhong
2013-01-01
The aerodynamic performance of an airfoil is closely related to the continuity of its surface curvature, and airfoil profiles with better aerodynamic performance play an important role in the design of wind turbines. The surface curvature distribution along the chord direction and pressure distributio...
Arcinas, Arthur; Yen, Ten-Yang; Kebebew, Electron; Macher, Bruce A.
2009-01-01
Cell surface proteins have been shown to be effective therapeutic targets. In addition, shed forms of these proteins and secreted proteins can serve as biomarkers for diseases, including cancer. Thus, identification of cell surface and secreted proteins has been a prime area of interest in the proteomics field. Most cell surface and secreted proteins are known to be glycosylated and therefore, a proteomics strategy targeting these proteins was applied to obtain proteomic profiles from various thyroid cancer cell lines that represent the range of thyroid cancers of follicular cell origin. In this study, we oxidized the carbohydrates of secreted proteins and those on the cell surface with periodate and isolated them via covalent coupling to hydrazide resin. The glycoproteins obtained were identified from tryptic peptides and N-linked glycopeptides released from the hydrazide resin using 2-dimensional liquid chromatography-tandem mass spectrometry in combination with the gas phase fractionation. Thyroid cancer cell lines derived from papillary thyroid cancer (TPC-1), follicular thyroid cancer (FTC-133), Hürthle cell carcinoma (XTC-1), and anaplastic thyroid cancer (ARO and DRO-1) were evaluated. An average of 150 glycoproteins were identified per cell line, of which more than 57 percent are known cell surface or secreted glycoproteins. The usefulness of the approach for identifying thyroid cancer associated biomarkers was validated by the identification of glycoproteins (e.g. CD44, galectin 3 and metalloproteinase inhibitor 1) that have been found to be useful markers for thyroid cancer. In addition to glycoproteins that are commonly expressed by all of the cell lines, we identified others that are only expressed in the more well-differentiated thyroid cancer cell lines (follicular, Hürthle cell and papillary), or by cell lines derived from undifferentiated tumors that are uniformly fatal forms of thyroid cancer (i.e. anaplastic). Based on the results obtained, a
Development of Pseudorandom Binary Arrays for Calibration of Surface Profile Metrology Tools
Energy Technology Data Exchange (ETDEWEB)
Barber, S.K.; Takacs, P.; Soldate, P.; Anderson, E.H.; Cambie, R.; McKinney, W.R.; Voronov, D.L.; Yashchuk, V.V.
2009-12-01
Optical metrology tools, especially for short wavelengths (extreme ultraviolet and x-ray), must cover a wide range of spatial frequencies from the very low, which affects figure, to the important mid-spatial frequencies and the high spatial frequency range, which produces undesirable scattering. A major difficulty in using surface profilometers arises due to the unknown point-spread function (PSF) of the instruments [G. D. Boreman, Modulation Transfer Function in Optical and Electro-Optical Systems (SPIE, Bellingham, WA, 2001)] that is responsible for distortion of the measured surface profile. Generally, the distortion due to the PSF is difficult to account for because the PSF is a complex function that comes to the measurement via the convolution operation, while the measured profile is described with a real function. Accounting for instrumental PSF becomes significantly simpler if the result of measurement of a profile is presented in the spatial frequency domain as a power spectral density (PSD) distribution [J. W. Goodman, Introduction to Fourier Optics, 3rd ed. (Roberts and Company, Englewood, CO, 2005)]. For example, measured PSD distributions provide a closed set of data necessary for three-dimensional calculations of scattering of light by the optical surfaces [E. L. Church et al., Opt. Eng. (Bellingham) 18, 125 (1979); J. C. Stover, Optical Scattering, 2nd ed. (SPIE Optical Engineering Press, Bellingham, WA, 1995)]. The distortion of the surface PSD distribution due to the PSF can be modeled with the modulation transfer function (MTF), which is defined over the spatial frequency bandwidth of the instrument. The measured PSD distribution can be presented as a product of the squared MTF and the ideal PSD distribution inherent for the system under test. Therefore, the instrumental MTF can be evaluated by comparing a measured PSD distribution of a known test surface with the corresponding ideal numerically simulated PSD. The square root of the ratio of the
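The calibration relation described above (the measured PSD equals the squared MTF times the ideal PSD, so the MTF is the square root of their ratio) can be sketched numerically. This is a minimal illustration with synthetic arrays; the Gaussian instrument response and the 1/f² surface PSD are invented for the demonstration, not data from the paper:

```python
import numpy as np

def instrument_mtf(psd_measured, psd_ideal):
    """Estimate the instrumental MTF as the square root of the ratio of the
    measured PSD to the ideal (numerically simulated) PSD of a known test surface."""
    psd_measured = np.asarray(psd_measured, dtype=float)
    psd_ideal = np.asarray(psd_ideal, dtype=float)
    return np.sqrt(psd_measured / psd_ideal)

# Synthetic example: a Gaussian MTF attenuating an ideal 1/f^2 surface PSD.
f = np.linspace(0.01, 1.0, 256)            # spatial frequency axis
true_mtf = np.exp(-(f / 0.5) ** 2)         # assumed instrument response
psd_ideal = 1.0 / f ** 2                   # ideal surface PSD
psd_measured = true_mtf ** 2 * psd_ideal   # what the profiler would record

recovered = instrument_mtf(psd_measured, psd_ideal)
```

In this noiseless sketch the recovered MTF matches the assumed Gaussian exactly; with real data the ratio is only meaningful inside the instrument's spatial frequency bandwidth.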
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Exploring the Plant-Microbe Interface by Profiling the Surface-Associated Proteins of Barley Grains.
Sultan, Abida; Andersen, Birgit; Svensson, Birte; Finnie, Christine
2016-04-01
Cereal grains are colonized by a microbial community that actively interacts with the plant via secretion of various enzymes, hormones, and metabolites. Microorganisms decompose plant tissues by a collection of depolymerizing enzymes, including β-1,4-xylanases, that are in turn inhibited by plant xylanase inhibitors. To gain insight into the importance of the microbial consortia and their interaction with barley grains, we used a combined gel-based (2-DE coupled to MALDI-TOF-TOF MS) and gel-free (LC-MS/MS) proteomics approach complemented with enzyme activity assays to profile the surface-associated proteins and xylanolytic activities of two barley cultivars. The surface-associated proteome was dominated by plant proteins with roles in defense and stress-responses, while the relatively less abundant microbial (bacterial and fungal) proteins were involved in cell-wall and polysaccharide degradation and included xylanases. The surface-associated proteomes showed elevated xylanolytic activity and contained several xylanases. Integration of proteomics with enzyme assays is a powerful tool for analysis and characterization of the interaction between microbial consortia and plants in their natural environment.
Augustin, C. M.
2015-12-01
Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating the potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migration, material interactions, and atmospheric dispersion for leaks of various durations and volumes. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings, analyzed in Slovic and Weber's 2002 framework, that show a high-unknown, high-dread risk perception of leaks from a CCS site. Secondary findings are a
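The contrast drawn above between a rare acute leak and a more plausible chronic leak can be illustrated with a toy Monte Carlo sketch. The annual event probabilities below are hypothetical placeholders for illustration, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual per-site probabilities (placeholders, not study results):
p_acute = 1e-4        # large, acute leak
p_chronic = 5e-3      # localized, chronic leak
years, n_sites = 50, 100_000

# Probability a site experiences at least one event over the horizon,
# assuming independent years (a deliberate simplification).
acute_any = 1 - (1 - p_acute) ** years      # ~0.0050
chronic_any = 1 - (1 - p_chronic) ** years  # ~0.2217

# Monte Carlo check of the chronic case across many simulated sites.
events = rng.binomial(years, p_chronic, size=n_sites)
mc_estimate = (events > 0).mean()
```

Even with these toy numbers, the chronic-leak probability over a 50-year horizon is orders of magnitude larger than the acute one, which is the qualitative point of the abstract.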
Two-layer Hall-effect model with arbitrary surface-donor profiles: application to ZnO
Look, D. C.
2008-09-01
A complete two-layer Hall-effect model, allowing arbitrary donor and acceptor profiles, is presented and applied to the problem of conductive surface layers in ZnO. Temperature-dependent mobility and carrier concentration data in the temperature range of 20-320 K are fitted with an efficient algorithm easily implemented in commercial mathematics programs such as MATHCAD. The model is applied to two ZnO samples, grown by the melt (MLT) and hydrothermal (HYD) processes, respectively. Under the assumption of a "square" surface-donor profile, the fitted surface-layer thicknesses are 48 and 2.5 nm, respectively, for the MLT and HYD samples. The surface-donor concentrations are 7.6×10¹⁷ and 8.3×10¹⁸ cm⁻³, and the integrated surface-donor concentrations are 2.1×10¹² and 3.6×10¹² cm⁻². For an assumed Gaussian [N_D,s(0) exp(−z²/d_s²)] donor profile, the fitted values of d_s are nearly the same as those for the square profile. The values of N_D,s(0) are about 50% larger and the integrated donor-concentration values are about 15% larger, for both samples. As a surface-analysis tool, the Hall effect is extremely sensitive and applicable over a wide range of surface-layer conditions.
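The integrated donor concentration quoted above follows from integrating the assumed profile over depth; for the Gaussian case the closed form is N_D,s(0)·d_s·√π/2. A small numeric check of that formula (the inputs are illustrative round numbers of the same order as the MLT fit, not the paper's fitted values):

```python
import numpy as np

def gaussian_sheet_concentration(nd0, ds, n=200_000):
    """Midpoint-rule integral of N_D(z) = N_D(0) * exp(-z^2 / d_s^2) from the
    surface down to 10*d_s (the tail beyond that is negligible).
    Units: nd0 in cm^-3, ds in cm, result in cm^-2."""
    dz = 10 * ds / n
    z = (np.arange(n) + 0.5) * dz
    return np.sum(nd0 * np.exp(-(z / ds) ** 2)) * dz

nd0, ds = 7.6e17, 48e-7          # illustrative: 7.6e17 cm^-3, 48 nm (in cm)
numeric = gaussian_sheet_concentration(nd0, ds)
analytic = nd0 * ds * np.sqrt(np.pi) / 2
```

The numeric and analytic values agree, confirming the √π/2 factor that makes the Gaussian integrated concentration smaller than the naive N_D,s(0)·d_s product.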
White, Raymond E., III
1998-01-01
This final report uses ROSAT observations to analyze two different studies. These studies are: Analysis of Mass Profiles and Cooling Flows of Bright, Early-Type Galaxies; and Surface Brightness Profiles and Energetics of Intracluster Gas in Cool Galaxy Clusters.
Cure, David; Weller, Thomas; Miranda, Felix A.; Herzig, Paul
2011-01-01
In this paper, the impact of adding discrete capacitive loading along one dimension of a frequency selective surface for low profile antenna applications is presented for the first time. The measured data demonstrate comparable performance between a non-loaded and a capacitively-loaded FSS with a significant reduction in the number of cells and/or cell geometry size. Additionally, the provision of discrete capacitive loads reduces the FSS susceptibility to fabrication tolerances based on placement of a fixed grid capacitance. The bandwidth increased from 1.8% to 7.3% for a total antenna thickness of ≈λ/22, and from 1.5% to 9.2% for a thickness of ≈λ/40. The total antenna area for each case was reduced by 55% and 12%, respectively.
Homogenization of seismic surface wave profiling in highly heterogeneous improved ground
Lin, C.; Chien, C.
2012-12-01
Seismic surface wave profiling is gaining popularity in engineering practice for determining shear-wave velocity profiles since the two-station SASW (Spectral Analysis of Surface Waves) method was introduced. Recent developments in the multi-station approach (Multi-station Analysis of Surface Waves, MASW) have resulted in several convenient commercial tools. Unlike other geophysical tomography methods, the surface wave method is essentially a 1-D method assuming a horizontally-layered medium. Nevertheless, MASW is increasingly used to map lateral variation of S-wave velocity by multiple surveys, overlooking the effect of lateral heterogeneity. MASW typically requires a long receiver spread in order to have enough depth coverage, and the accuracy and lateral resolution of 2-D S-wave velocity imaging by surface waves is not clear. Many geotechnical applications involve lateral variation on a scale smaller than the geophone spread and the wavelength. For example, soft ground is often improved to increase strength and stiffness by methods such as jet grouting and stone columns, which result in heterogeneous ground with improved columns. Experimental methods (Standard Penetration Test, sampling and laboratory testing, etc.) used to assess such ground improvement are subject to several limitations, such as small sampling volume, long testing time, and high cost. It is difficult to assess the average property of the improved ground and the actual replacement ratio of ground improvement. The use of the seismic surface wave method for such a purpose seems to be a good alternative, but what MASW measures in such highly heterogeneous improved ground remains to be investigated. This study evaluated the feasibility of MASW in highly heterogeneous ground with improved columns and investigated the homogenization of shear wave velocity measured by MASW. Field experiments show that MASW testing in such a composite ground behaves similarly to testing in a horizontally layered medium. It seems to measure some sort
Stauffer, Ryan M.; Thompson, Anne M.; Oltmans, Samuel J.; Johnson, Bryan J.
2017-01-01
Much attention has been focused on the transport of ozone (O3) to the western U.S., particularly given the latest revision of the National Ambient Air Quality Standard to 70 parts per billion by volume (ppbv) of O3. This makes quantifying the contributions of stratosphere-to-troposphere exchange, local pollution, and pollution transport to this region essential. To evaluate free-tropospheric and surface O3 in the western U.S., we use self-organizing maps to cluster 18 years of ozonesonde profiles from Trinidad Head, CA. Three of nine O3 mixing ratio profile clusters exhibit thin laminae of high O3 above Trinidad Head. The high O3 layers are located between 1 and 6 km above mean sea level and reside above an inversion associated with a northern location of the Pacific subtropical high. Ancillary data (reanalyses, trajectories, and remotely sensed carbon monoxide) help identify the high O3 sources in one cluster, but distinguishing mixed influences on the elevated O3 in other clusters is difficult. Correlations between the elevated tropospheric O3 and surface O3 at high-altitude monitors at Lassen Volcanic and Yosemite National Parks, and Truckee, CA, are marked and long lasting. The temporal correlations likely result from a combination of transport of baseline O3 and covarying meteorological parameters. Days corresponding to the high O3 clusters exhibit hourly surface O3 anomalies of +5-10 ppbv compared to a climatology; the positive anomalies can last up to 3 days after the ozonesonde profile. The profile and surface O3 links demonstrate the importance of regular ozonesonde profiling at Trinidad Head.
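Self-organizing maps like the nine-cluster analysis described here group whole profiles by similarity on a fixed grid of nodes. A bare-bones 1-D SOM on synthetic profile shapes (the training schedule and the three invented profile families are assumptions for illustration, not the study's configuration):

```python
import numpy as np

rng = np.random.default_rng(42)

def train_som(data, n_nodes=9, epochs=50, lr=0.5, sigma=1.5):
    """Minimal 1-D self-organizing map: each node holds a prototype profile;
    the best-matching node and its grid neighbors move toward each sample,
    with learning rate and neighborhood width shrinking over epochs."""
    nodes = data[rng.choice(len(data), n_nodes, replace=False)].copy()
    idx = np.arange(n_nodes)
    for t in range(epochs):
        frac = t / epochs
        a, s = lr * (1 - frac), sigma * (1 - frac) + 0.5
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
            h = np.exp(-((idx - bmu) ** 2) / (2 * s ** 2))    # neighborhood weight
            nodes += a * h[:, None] * (x - nodes)
    return nodes

# Synthetic "profiles": three invented families of 20-level curves with noise.
z = np.linspace(0, 1, 20)
data = np.vstack([base + 0.05 * rng.standard_normal((50, 20))
                  for base in (z, 1 - z, np.sin(np.pi * z))])
prototypes = train_som(data)
labels = np.argmin(((data[:, None, :] - prototypes[None]) ** 2).sum(-1), axis=1)
```

Each prototype plays the role of a cluster-mean profile; assigning every sounding to its nearest prototype is what lets the study link individual ozonesonde launches to recurring profile shapes.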
Directory of Open Access Journals (Sweden)
A. Cherkasheva
2013-04-01
Current estimates of global marine primary production range over a factor of two. Improving these estimates requires accurate knowledge of chlorophyll vertical profiles, since they are the basis for most primary production models. At high latitudes, the uncertainty in primary production estimates is larger than globally, because phytoplankton absorption there shows specific characteristics due to low-light adaptation, and in situ data and ocean colour observations are scarce. To date, studies describing the typical chlorophyll profile based on the chlorophyll in the surface layer have not included the Arctic region, or, where it was included, the dependence of the profile shape on surface concentration was neglected. The goal of our study was to derive and describe the typical Greenland Sea chlorophyll profiles, categorized according to the chlorophyll concentration in the surface layer and further resolved by month. The Greenland Sea was chosen because it is known to be one of the most productive regions of the Arctic and is among the Arctic regions with the most chlorophyll field data available. Our database contained 1199 chlorophyll profiles from R/Vs Polarstern and Maria S. Merian cruises combined with data from the ARCSS-PP database (Arctic primary production in situ database) for the years 1957–2010. The profiles were categorized according to their mean concentration in the surface layer, and monthly median profiles within each category were then calculated. The category with surface-layer chlorophyll (CHL) exceeding 0.7 mg C m−3 showed values gradually decreasing from April to August. A similar seasonal pattern was observed when monthly profiles were averaged over all surface CHL concentrations. The maxima of all chlorophyll profiles moved from greater depths to the surface from spring to late summer. The profiles with the smallest surface values always showed a subsurface chlorophyll
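The categorize-then-take-monthly-medians procedure described above maps naturally onto a groupby. A sketch on a synthetic stand-in table (the column names, depth grid, and fake chlorophyll values are illustrative; only the 0.7 surface-concentration threshold and the April-August months echo the abstract):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic stand-in for a profile database: one row per (profile, depth level).
n_prof = 120
df = pd.DataFrame({
    "profile_id": np.repeat(np.arange(n_prof), 10),
    "month": np.repeat(rng.integers(4, 9, n_prof), 10),   # April-August
    "depth_m": np.tile(np.arange(0, 100, 10), n_prof),
    "chl": rng.gamma(2.0, 0.4, n_prof * 10),              # fake concentrations
})

# Mean surface-layer concentration per profile, then a CHL category.
surf = df[df.depth_m <= 10].groupby("profile_id").chl.mean().rename("surf_chl")
df = df.join(surf, on="profile_id")
df["category"] = np.where(df.surf_chl > 0.7, "high", "low")

# Monthly median profile within each category: one row per (category, month),
# one column per depth level.
median_profiles = (df.groupby(["category", "month", "depth_m"]).chl
                     .median().unstack("depth_m"))
```

Each row of `median_profiles` is a typical profile for one category in one month, which is exactly the product the study reports.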
Hauschild, Dirk
2017-02-01
Today, the use of laser photons for materials processing is a key technology in nearly all industries. Most applications use circular beam shapes with a Gaussian intensity distribution that is given by the resonator of the laser or by power delivery via optical fibre. These beam shapes can typically be used for material removal by cutting or drilling and for selective removal of material layers with ablation processes. In addition to the removal of material, it is possible to modify and improve the material properties when the dose of laser photons and the resulting light-material interaction address a defined window of energy and dwell-time. These process windows typically have dwell-times between µs and s because they rely on sintering, melting, thermal diffusion, or photon-induced chemical and physical reaction mechanisms. Using beam-shaping technologies, the laser beam profile can be adapted to the material properties, and the time-temperature and space-temperature envelopes can be modified to enable selective annealing or crystallization of layers or surfaces. In particular, control of the process energy inside the beam and at its edges opens a large area of laser applications that can be addressed only with an optimized spatial and angular beam profile, with down to sub-percent intensity variation, as used e.g. in immersion lithography tools with ArF laser sources. LIMO will present examples of new beam shapes and related material refinement processes, even on large surfaces, and give an overview of new mechanisms in laser material processing for current and coming industrial applications.
Sapriza-Azuri, G.; Gamazo, P. A.; Razavi, S.; Wheater, H. S.
2016-12-01
Earth system models are essential for the evaluation of the impact of climate change. At global and regional scales, General Circulation Models (GCM) and Regional Climate Models (RCM) are used to simulate climate change evolution. Hydrological Land Surface Models (HLSM) are used along with GCMs and RCMs (coupled or offline) to provide a better representation of the hydrological cycle. All these models typically share a common implementation of the energy and water balance in the soil, known as the Land Surface Model (LSM). In general, a standard soil configuration with a depth of no more than 4 meters is used in all LSMs that are commonly implemented in GCMs, RCMs and HLSMs. For moderate climate conditions, this depth is sufficient to capture the intra-annual variability in the energy and water balance. However, for cold regions and for long-term simulations, deeper subsurface layers are needed in order to allow the heat signal to propagate through the soil to deeper layers and hence to avoid erroneous near-surface states and fluxes. Deeper soil/rock configurations create longer system memories, and as such, particular care should be taken to define the initial conditions for the subsurface system. In this work we perform a sensitivity analysis of the main factors that affect the subsurface energy and water balance for LSMs in cold regions: depth of soil, soil parameters, initial conditions, and climate conditions for a warm-up period. We implement a 1D model using the Canadian Land Surface Scheme (CLASS) LSM for a study area in northern Canada where measurements of soil temperature profiles are available. Results suggest that an adequate representation of the heat propagation process in the soil requires simulating a soil depth greater than 25 meters. As for initial conditions, we recommend spinning up over a cycle of an average climate year and then using reconstructed climate time series with a length of more than 300 years.
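The need for a soil column deeper than 25 m can be made plausible with the classical damping-depth formula for a periodic surface temperature signal in a homogeneous half-space: the amplitude decays as exp(-z/D) with D = sqrt(2κ/ω). The thermal diffusivity below is an assumed textbook value, not a site measurement:

```python
import numpy as np

def damping_depth(kappa, period_s):
    """Damping depth D of a sinusoidal surface temperature signal in a
    homogeneous half-space: amplitude decays as exp(-z/D), D = sqrt(2*kappa/omega)."""
    omega = 2 * np.pi / period_s
    return np.sqrt(2 * kappa / omega)

kappa = 1e-6                  # assumed soil thermal diffusivity, m^2/s
year = 365.25 * 86400
for n_years in (1, 10, 100):  # annual vs decadal vs centennial forcing
    D = damping_depth(kappa, n_years * year)
    print(n_years, round(D, 1), round(4.6 * D, 1))  # 4.6*D: amplitude down to ~1%
```

The annual cycle damps out within roughly 15 m, but decadal and longer climate signals penetrate tens of meters, which is consistent with the paper's finding that a 4 m column is too shallow for long-term cold-region simulations.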
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Energy Technology Data Exchange (ETDEWEB)
Cascant, M.; Morecroft, D.; Boulif, K.; Vauche, L.; Yuste, H.; Castano, F.J. [Siliken, High efficiency solar cell pilot line, R and D department, Ciudad Politecnica de la Innovacion- UPV Camino de Vera 14, 46022 Valencia, (Spain); Bende, E.E. [ECN Solar Energy, Petten (Netherlands)
2012-09-15
Optimization of the Front Surface Field (FSF) for IBC cells is important for passivation, lowering series resistance and reducing UV light degradation. This work presents results for optimizing the FSF diffusion from an industrial perspective, focusing on optimizing the process flow to achieve excellent FSF performance whilst at the same time reducing the number of process steps. The ideal FSF profile is a compromise, since a lightly doped deep diffusion reduces recombination losses close to the cell surface where the light is captured, whilst increased doping reduces series resistance. This work investigates diffusing the FSF (1) at the beginning, (2) in the middle and (3) towards the end of the IBC process flow. The advantage of the first option is that the diffusion depth can be increased by subsequent thermal steps. However, a diffusion barrier is required to protect the FSF throughout the subsequent processing, which increases the number of process steps and results in increased costs. By placing the FSF diffusion later in the process flow it is possible to simplify the process, reducing the number of steps. Experimental results show excellent FSF diffusion passivation performance over 156 mm, with lifetime values of over 500 μs. Simulations confirm that high current generation can be achieved, with a short-circuit current of over 40 mA cm⁻².
Energy Technology Data Exchange (ETDEWEB)
Pu, Zhaoxia [Univ. of Utah, Salt Lake City, UT (United States)
2015-10-06
Most routine measurements from climate study facilities, such as the Department of Energy’s ARM SGP site, come from individual sites over a long period of time. While single-station data are very useful for many studies, it is challenging to obtain 3-dimensional spatial structures of atmospheric boundary layers that include prominent signatures of deep convection from these data. The principal objective of this project is to create realistic estimates of high-resolution (~ 1km × 1km horizontal grids) atmospheric boundary layer structure and the characteristics of precipitating convection. These characteristics include updraft and downdraft cumulus mass fluxes and cold pool properties over a region the size of a GCM grid column from analyses that assimilate surface mesonet observations of wind, temperature, and water vapor mixing ratio and available profiling data from single or multiple surface stations. The ultimate goal of the project is to enhance our understanding of the properties of mesoscale convective systems and also to improve their representation in analysis and numerical simulations. During the proposed period (09/15/2011–09/14/2014) and the no-cost extension period (09/15/2014–09/14/2015), significant accomplishments have been achieved relating to the stated goals. Efforts have been extended to various research and applications. Results have been published in professional journals and presented in related science team meetings and conferences. These are summarized in the report.
Directory of Open Access Journals (Sweden)
O. V. Karmanova
2014-01-01
The influence of the degree of dispersion of carbon black on the rheological characteristics and surface appearance of rubber mixtures based on ethylene-propylene rubber EPDM-50 was investigated. The effect of mixing time on the degree of dispersion of the carbon black and on the elastic-viscous and extrusion characteristics of the rubber compounds was determined. The tangent of the mechanical loss angle, tan δ, was used to evaluate the rheological and technological properties of the rubber compounds. A relationship between changes in tan δ values and the properties of rubber compounds during preparation of rubber/carbon black compositions was shown. On the curves of tan δ versus mixing time of rubber with filler, three main regions of change in the rheological and technological properties of the rubber compounds were identified. This makes it possible to monitor and adjust the mixing mode under real production conditions. An evaluation of mixing quality based on the surface appearance of unshaped profiles was conducted. The resulting patterns formed the basis for recommendations on selecting optimal mixing modes in the production and quality control of rubber compounds.
Wideband, Low-Profile, Dual-Polarized Slot Antenna with an AMC Surface for Wireless Communications
Directory of Open Access Journals (Sweden)
Wei Hu
2016-01-01
A wideband dual-polarized slot antenna loaded with an artificial magnetic conductor (AMC) is proposed for WLAN/WiMAX and LTE applications. The slot antenna mainly consists of two pairs of arrow-shaped slots along the diagonals of a square patch. Stepped microstrip feedlines are placed orthogonally to excite the horizontal and vertical polarizations of the antenna. To realize unidirectional radiation and a low profile, an AMC surface composed of 7 × 7 unit cells is placed at a distance of 0.09λ0 (λ0 being the wavelength in free space at 2.25 GHz) beneath the slot antenna. Both the dual-polarized slot antenna and the AMC surface were fabricated and measured. Experimental results demonstrate that the proposed antenna achieves, for both polarizations, a wide impedance bandwidth (return loss > 10 dB) of 36.7%, operating from 1.96 to 2.84 GHz. The isolation between the two input ports remains higher than 29 dB, whereas the cross-polarization levels stay below −30 dB across the entire frequency band. Front-to-back ratios better than 22 dB and a stable gain higher than 8 dBi are obtained over the whole band.
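The geometry and bandwidth figures quoted above follow from two lines of arithmetic: the free-space wavelength at 2.25 GHz sets the 0.09λ0 AMC spacing, and the 1.96-2.84 GHz band implies the fractional bandwidth. A quick check:

```python
# Free-space wavelength at the design frequency, AMC spacing, and fractional
# bandwidth implied by the band edges quoted in the abstract.
c = 299_792_458.0               # speed of light, m/s

f0 = 2.25e9
lam0 = c / f0                   # free-space wavelength, ~0.133 m
spacing = 0.09 * lam0           # slot-to-AMC distance, ~12 mm

f_lo, f_hi = 1.96e9, 2.84e9
frac_bw = 2 * (f_hi - f_lo) / (f_hi + f_lo)   # ~0.367, the quoted 36.7%
```

The 12 mm spacing makes concrete how "low profile" the 0.09λ0 figure is, and the band-edge arithmetic reproduces the quoted 36.7% bandwidth.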
Diethert, Alexander; Ecker, Katharina; Peykova, Yana; Willenbacher, Norbert; Müller-Buschbaum, Peter
2011-06-01
We present a method for tailoring the near-surface composition profiles of pressure-sensitive adhesive (PSA) films by exposure to atmospheres of different relative humidities (RHs). The statistical copolymer P(EHA-stat-20MMA), with a majority of ethylhexylacrylate (EHA) and a minority of methylmethacrylate (MMA), cast from a toluene-based solution, is chosen as a model system. The near-surface composition profile is probed with X-ray reflectivity. All probed samples show an enrichment of PMMA at the sample surface; however, the near-surface PMMA content strongly increases with increasing RH. The influence of the RH on the composition profile extends down to a depth of 50 nm. The surface tensions derived from contact angle measurements therefore do not show any measurable humidity dependence. In contrast, a strong influence is observed in a mechanical tack test with a smooth punch surface. This observation can be explained by considering the integrated PMMA content over an appropriate near-surface region and its resulting impact on the cavitation process. © 2011 American Chemical Society
GLOBAL PROPERTIES OF M31'S STELLAR HALO FROM THE SPLASH SURVEY. I. SURFACE BRIGHTNESS PROFILE
Energy Technology Data Exchange (ETDEWEB)
Gilbert, Karoline M. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195-1580 (United States); Guhathakurta, Puragra [UCO/Lick Observatory, Department of Astronomy and Astrophysics, University of California Santa Cruz, 1156 High Street, Santa Cruz, CA 95064 (United States); Beaton, Rachael L.; Majewski, Steven R.; Ostheimer, James C.; Patterson, Richard J. [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904-4325 (United States); Bullock, James; Tollerud, Erik J. [Center for Cosmology, Department of Physics and Astronomy, University of California at Irvine, Irvine, CA 92697 (United States); Geha, Marla C. [Astronomy Department, Yale University, New Haven, CT 06520 (United States); Kalirai, Jason S. [Space Telescope Science Institute, Baltimore, MD 21218 (United States); Kirby, Evan N. [California Institute of Technology, 1200 East California Boulevard, MC 249-17, Pasadena, CA 91125 (United States); Tanaka, Mikito; Chiba, Masashi, E-mail: kgilbert@astro.washington.edu [Astronomical Institute, Tohoku University, Aoba-ku, Sendai 980-8578 (Japan)
2012-11-20
We present the surface brightness profile of M31's stellar halo out to a projected radius of 175 kpc. The surface brightness estimates are based on confirmed samples of M31 red giant branch stars derived from Keck/DEIMOS spectroscopic observations. A set of empirical spectroscopic and photometric M31 membership diagnostics is used to identify and reject foreground and background contaminants. This enables us to trace the stellar halo of M31 to larger projected distances and fainter surface brightnesses than previous photometric studies. The surface brightness profile of M31's halo follows a power law with index -2.2 ± 0.2 and extends to a projected distance of at least ~175 kpc (~2/3 of M31's virial radius), with no evidence of a downward break at large radii. The best-fit elliptical isophotes have b/a = 0.94 with the major axis of the halo aligned along the minor axis of M31's disk, consistent with a prolate halo, although the data are also consistent with M31's halo having spherical symmetry. The fact that tidal debris features are kinematically cold is used to identify substructure in the spectroscopic fields out to projected radii of 90 kpc and investigate the effect of this substructure on the surface brightness profile. The scatter in the surface brightness profile is reduced when kinematically identified tidal debris features in M31 are statistically subtracted; the remaining profile indicates that a comparatively diffuse stellar component of M31's stellar halo exists to large distances. Beyond 90 kpc, kinematically cold tidal debris features cannot be identified due to small number statistics; nevertheless, the significant field-to-field variation in surface brightness beyond 90 kpc suggests that the outermost region of M31's halo is also composed to a significant degree of stars stripped from accreted objects.
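The power-law fit reported above can be sketched as a linear least-squares fit in log-log space, since Σ(R) ∝ R^α is a straight line there. The synthetic data below are hypothetical, not the paper's star-count measurements:

```python
import numpy as np

# Synthetic surface-brightness profile with a known power-law index
# (hypothetical values; the paper fits spectroscopically confirmed
# M31 RGB star counts, not this toy model).
rng = np.random.default_rng(0)
R = np.logspace(0.5, 2.2, 25)                 # projected radius, kpc
true_index = -2.2
sigma = 1e6 * R**true_index * rng.lognormal(0.0, 0.05, R.size)

# A power law Sigma ~ R^alpha is linear in log-log space,
# so alpha is simply the slope of a least-squares line.
alpha, log_amp = np.polyfit(np.log10(R), np.log10(sigma), 1)
print(f"fitted index: {alpha:.2f}")           # close to -2.2
```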
Choice of satellite-based CO2 product (XCO2, vertical profile) alters surface CO2 flux estimate
Liu, J.; Bowman, K. W.; Lee, M.; Henze, D. K.; Fisher, J. B.; Frankenberg, C.; Polhamus, A.
2011-12-01
The ACOS (Atmospheric CO2 Observations from Space) algorithm provides column-averaged CO2 products in units of dry-air mole fraction (XCO2) based on GOSAT radiances. However, XCO2 is derived from a linear transformation of the CO2 vertical profiles estimated by the ACOS retrieval algorithm. In theory, XCO2 vertical columns should provide no more information than the original CO2 profiles. However, the different sensitivities of CO2 profiles and XCO2 to transport errors can significantly alter surface CO2 flux estimates. Though it has been argued that XCO2 may be less sensitive to transport error than CO2 vertical profiles, no study so far has investigated the actual impact on surface CO2 flux estimation of the choice of observation format, which could have a significant impact on future satellite CO2 profile mission concepts. In this presentation, we will present the sensitivity of surface CO2 flux estimation to a suite of CO2 observation products, which includes CO2 vertical profiles, XCO2, and the lowest 3 levels of CO2 from CO2 vertical profiles. The CO2 observations are ACOS products covering July 2009 to June 2010. We will present both OSSE and real observation experiments. In the OSSE experiments, we will present both perfect-model experiments and experiments with model errors introduced by changing the planetary boundary layer height. For the real observations, we will show the annual and seasonal CO2 flux as a function of region for the three observation products. The accuracy of the CO2 flux estimation will be examined by comparing CO2 concentrations forced by the posterior CO2 fluxes to independent CO2 observations. The surface CO2 flux estimation framework is based on the GEOS-Chem adjoint model developed by the Carbon Monitoring Study flux pilot project.
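The linear transformation from a retrieved profile to XCO2 is, schematically, a pressure-weighted column average, XCO2 = hᵀx. The weights below are illustrative placeholders, not the actual ACOS pressure-weighting function or averaging kernel:

```python
import numpy as np

# Illustrative pressure-weighting of a CO2 profile into XCO2.
# The layer weights h are placeholders (equal-thickness layers), not the
# ACOS operator; the point is only that XCO2 = h^T x is a linear map.
p_edges = np.linspace(1000.0, 100.0, 11)   # hPa, 10 layers
h = np.abs(np.diff(p_edges))               # layer pressure thickness
h = h / h.sum()                            # normalized weights, sum to 1

x = np.full(10, 390.0)                     # ppm, background profile
x[0:3] += 5.0                              # boundary-layer enhancement

xco2 = h @ x
print(f"XCO2 = {xco2:.2f} ppm")            # 391.50 for these numbers
```

Note that a uniform profile maps to the same XCO2 value regardless of the weights, which is why transport errors that merely redistribute CO2 vertically affect XCO2 less than they affect the profile itself.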
Dangaria, Smit J.
2011-12-01
Stem/progenitor cells are a population of cells capable of providing replacement cells for a given differentiated cell type. We have applied progenitor cell-based technologies to generate novel tissue-engineered implants that use biomimetic strategies with the ultimate goal of achieving full regeneration of lost periodontal tissues. Mesenchymal periodontal tissues such as cementum, alveolar bone (AB), and periodontal ligament (PDL) are neural crest-derived entities that emerge from the dental follicle (DF) at the onset of tooth root formation. Using a systems biology approach we have identified key differences between these periodontal progenitors on the basis of global gene expression profiles, gene cohort expression levels, and epigenetic modifications, in addition to differences in cellular morphologies. On an epigenetic level, DF progenitors featured high levels of the euchromatin marker H3K4me3, whereas PDL cells, AB osteoblasts, and cementoblasts contained high levels of the transcriptional repressor H3K9me3. Secondly, we have tested the influence of natural extracellular hydroxyapatite matrices on periodontal progenitor differentiation. Dimension and structure of extracellular matrix surfaces have powerful influences on cell shape, adhesion, and gene expression. Here we show that natural tooth root topographies induce integrin-mediated extracellular matrix signaling cascades in tandem with cell elongation and polarization to generate physiological periodontium-like tissues. In this study we replanted surface topography instructed periodontal ligament progenitors (PDLPs) into rat alveolar bone sockets for 8 and 16 weeks, resulting in complete attachment of tooth roots to the surrounding alveolar bone with a periodontal ligament fiber apparatus closely matching physiological controls along the entire root surface. Displacement studies and biochemical analyses confirmed that progenitor-based engineered periodontal tissues were similar to control teeth and
Lievens, Hans; Vernieuwe, Hilde; Alvarez-Mozos, Jesús; De Baets, Bernard; Verhoest, Niko E C
2009-01-01
In the past decades, many studies on soil moisture retrieval from SAR demonstrated a poor correlation between the top-layer soil moisture content and observed backscatter coefficients, which has mainly been attributed to difficulties involved in the parameterization of surface roughness. The present paper describes a theoretical study, performed on synthetic surface profiles, which investigates how errors on roughness parameters are introduced by standard measurement techniques, and how they propagate through the commonly used Integral Equation Model (IEM) into a corresponding soil moisture retrieval error for some of the most commonly used SAR configurations. Key aspects influencing the error on the roughness parameterization, and consequently on soil moisture retrieval, are: the length of the surface profile, the number of profile measurements, the horizontal and vertical accuracy of profile measurements, and the removal of trends along profiles. Moreover, it is found that soil moisture retrieval with a C-band configuration is generally less sensitive to inaccuracies in roughness parameterization than retrieval with an L-band configuration.
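The two roughness parameters the IEM takes as input, RMS height s and correlation length l, can be estimated from a discretized height profile roughly as below. This is a sketch only; the trend removal and profile-length effects the paper analyzes are exactly where the estimation errors arise:

```python
import numpy as np

def roughness_params(z, dx):
    """Estimate RMS height s and correlation length l from one
    surface-height profile z sampled every dx meters."""
    x = np.arange(z.size) * dx
    z = z - np.polyval(np.polyfit(x, z, 1), x)   # remove linear trend
    s = np.sqrt(np.mean(z**2))                   # RMS height
    # Normalized autocorrelation; l is the lag where it first drops to 1/e.
    acf = np.correlate(z, z, mode="full")[z.size - 1:]
    acf = acf / acf[0]
    lag = np.argmax(acf < 1.0 / np.e)
    return s, lag * dx

# Sanity check on a sinusoidal "profile": RMS height = amplitude / sqrt(2)
dx = 0.01                                        # 1 cm sampling
z = 0.02 * np.sin(2 * np.pi * np.arange(1000) * dx / 0.5)
s, l = roughness_params(z, dx)
print(s, l)
```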
Chambers, J. E.; Cassen, P.
2002-01-01
We present 32 N-body simulations of planetary accretion in the inner Solar System, examining the effect of nebula surface density profile and initial eccentricities of Jupiter and Saturn on the compositions and orbits of the inner planets. Additional information is contained in the original extended abstract.
Food allergens profiling with an imaging surface plasmon resonance-based biosensor.
Rebe Raz, Sabina; Liu, Hong; Norde, Willem; Bremer, Maria G E G
2010-10-15
Food allergy is a growing health concern, which currently affects approximately 4% of adults and 8% of infants. For consumer protection purposes, food producers are required by law to disclose on the product label whether a major allergen is used during the production process. The commonly employed monitoring methods are highly laborious, time-consuming, and often expensive when screening for multiple allergens. Here, we utilize imaging surface plasmon resonance (iSPR) in combination with an antibody array for rapid, quantitative, and multianalyte food allergen detection. We demonstrate how the use of this technology provides a complete allergen profile within a short measurement time and with adequate sensitivity. The successful applicability of this approach is demonstrated by analyzing cookies and dark chocolate products from different manufacturers. Hazelnut content of the tested food products is also determined by enzyme-linked immunosorbent assay and is found to correlate well with the hazelnut content determined by iSPR. This newly developed method opens the door to automated and high-throughput allergen analysis, ultimately aiming at providing the consumer with safer food.
Directory of Open Access Journals (Sweden)
A. Pfister
1995-03-01
Full Text Available Atmospheric temperature and humidity fields, as well as information on other meteorological parameters, are nowadays retrieved from radiance measurements recorded by operational meteorological satellites. Up to now, the inversion procedures used have taken into account only crude information on the topography of the Earth's surface. However, the applied radiative transfer codes have to consider the Earth's surface as the lower boundary of the atmospheric model and, therefore, need a more precise mean elevation and a classification of the roughness of the Earth's surface. The influence of the topography of the Earth's surface on retrieved temperature profiles is studied by using a physico-statistical inversion method. An objective analysis is made of the more precise mean elevation and derivation of roughness parameters using a new high-resolution digital elevation model (DEM) with a resolution of 500 m × 500 m. By means of a geomorphological process and a newly developed topography rejection test, areas with a high surface roughness are localized and singled out. The influence of topography on the retrieved temperature profiles is illustrated by case studies. Changes are found predominantly in areas with a high variation of topography. Using the new high-resolution DEM and the topography rejection test, the geographical position of the calculated temperature profiles tends to be shifted towards areas with a small vertical variation of topography. The mean elevation determined by the new elevation model better characterizes the area observed. Hence, the temperature profiles can be calculated down to lower atmospheric levels. Furthermore, a guess profile better describing the atmospheric situation is selected by the more precise elevation. In addition, the temperature profiles obtained near the coast are improved considerably by the more precise determination of the surface properties 'sea' and 'land', respectively. Integration of an independent physical...
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
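One classic such event (the article's own three problems are not reproduced here) is a random permutation having no fixed point, a derangement: as n grows, its probability tends to 1/e ≈ 0.3679. A quick Monte Carlo check, assuming nothing beyond the standard library:

```python
import math
import random

# Probability that a random permutation of n items has no fixed point
# (a derangement) tends to 1/e as n grows -- a classic "1/e" event.
def derangement_fraction(n, trials, seed=1):
    rng = random.Random(seed)
    items = list(range(n))
    hits = 0
    for _ in range(trials):
        rng.shuffle(items)
        if all(items[i] != i for i in range(n)):
            hits += 1
    return hits / trials

p = derangement_fraction(n=12, trials=20000)
print(p, 1 / math.e)   # both near 0.3679
```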
Sayde, C.; Higgins, C. W.; Perdosa, R.; Mahaffee, W.; Selker, J. S.
2016-12-01
Most critical atmospheric processes reflect a balance between buoyancy and shear, typically measured with the Richardson number. The fine-scale motions associated with critical or near-critical values of the Richardson number are understudied because the location and timing of these events are not known a priori. To study these motions and quantify their importance for the transport of heat, momentum, and water vapor in the atmospheric boundary layer, a distributed measurement approach for temperature and wind speed is required. Here we present the results of 12.5 cm resolution distributed profiling of wind speed and temperature for the first 37 m of the surface boundary layer. Distributed Temperature Sensing (DTS) technology was employed to measure temperature every 5 s and 12.5 cm along two fiber-optic (FO) cables suspended from 37 m elevation to the ground by a blimp anchored above a vineyard in the Willamette Valley, Oregon. 3D-printed FO holders installed every 3 m along the suspended FO cables ensured a constant spacing of 7.5 cm between the two cables. The first FO cable was 0.9 mm in diameter and reported ambient air temperature. The second FO cable was embedded in a thin stainless steel tube (1.3 mm OD) continuously heated by an electrical current to provide continuous wind speed measurements every 12.5 cm along the heated cable. Analogous to a hot-wire anemometer, this approach is based on the principle of velocity-dependent heat transfer from a heated surface. The co-located wind speed and ambient temperature measurements are used to calculate the Richardson number with a spatial and temporal resolution of 12.5 cm and 5 s, respectively, for the first 37 m of the surface boundary layer. The equipment employed, including the heating system, which is available to all US scientists, was provided by CTEMPs.org thanks to generous grant support from the National Science Foundation under Grant Number EAR 0930061. Any opinions, findings, and conclusions or recommendations expressed in...
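The Richardson number calculation from such co-located profiles can be sketched as the gradient Richardson number Ri = (g/θ)(dθ/dz)/(du/dz)², here on synthetic linear profiles (using potential temperature; the profile values are hypothetical, not the field data):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s^-2

def richardson_profile(z, theta, u):
    """Gradient Richardson number Ri = (g/theta) * (dtheta/dz) / (du/dz)^2
    from co-located potential-temperature and wind-speed profiles."""
    dtheta_dz = np.gradient(theta, z)
    du_dz = np.gradient(u, z)
    return (G / theta) * dtheta_dz / du_dz**2

# Synthetic 12.5 cm resolution profiles over the first 37 m (matching the
# blimp-borne DTS geometry); linear gradients chosen for illustration only.
z = np.arange(0.0, 37.0, 0.125)
theta = 290.0 + 0.01 * z          # stably stratified: +0.01 K/m
u = 2.0 + 0.05 * z                # constant shear: 0.05 s^-1
ri = richardson_profile(z, theta, u)
print(ri[0])                      # (9.81/290)*0.01/0.05**2, about 0.135
```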
Integrated near-surface refraction and reflection profiling across the Carlsberg Fault, Denmark
Jorgensen, M. I.; Nielsen, L.; Thybo, H.; Fallesen, J.
2003-04-01
An integrated refraction and normal-incidence reflection seismic experiment has been conducted in order to resolve the near-surface part of the Carlsberg Fault in the easternmost part of the Danish basin. The primary objectives of the seismic experiment are to: 1) determine the fault structure; 2) image possible velocity contrasts across the fault; and 3) estimate how much the fault offsets the individual sedimentary layers at the different depth levels. The upper sedimentary strata in the study area consist of Cretaceous and Danian chalk and younger sediments dominated by sand and clay. The Carlsberg Fault is a NNW-SSE striking fault, which offsets the different sedimentary lithologies. It was probably created due to extensional stresses in a strike-slip system of the Sorgenfrei-Tornquist Zone, which is situated approximately 50 km east of the study area. Geodetic measurements indicate that the Carlsberg Fault may have been active during the last 100 years. The 1100 m long seismic reflection section, which was collected in 1995, shows a pronounced flower structure across the Carlsberg Fault, indicative of lateral movements along the fault plane. The seismic experiments were conducted in the SE part of Copenhagen, and urban noise was a major obstacle during collection of the refraction data in 2002. Nevertheless, both first arrivals and wide-angle reflections are prominent along the 3000 m long refraction line. From seismic travel time modelling we find that the P-wave velocity structure changes across the fault zone. The P-wave velocities in the chalk layers are relatively high (typically more than 3.0 km/s) compared to velocities from well log data of similar rock types elsewhere in the Danish area. The estimated velocity structure allows us to depth convert the reflection seismic sections. Vertical offsets of up to 90 m are observed for layers across the fault zone.
DEFF Research Database (Denmark)
Ruban, Andrei; Abrikosov, I. A.; Kats, D. Ya.
1994-01-01
We have calculated the electronic structure and segregation profiles of the (001) surface of random Cu-Ni alloys with varying bulk concentrations by means of the coherent potential approximation and the linear muffin-tin-orbitals method. Exchange and correlation were included within the local-density approximation. Temperature effects were accounted for by means of the cluster-variation method and, for comparison, by mean-field theory. The necessary interaction parameters were calculated by the Connolly-Williams method generalized to the case of a surface of a random alloy. We find the segregation profiles to be oscillatory with a strong preference for Cu to segregate towards the surface of the alloy.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the...
Agreeing Probability Measures for Comparative Probability Structures
P.P. Wakker (Peter)
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid...
Energy Technology Data Exchange (ETDEWEB)
Steinberger, R., E-mail: roland.steinberger@jku.at [Center for Surface and Nanoanalytics, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Celedón, C.E., E-mail: carlos.celedon@usm.cl [Institut für Experimentalphysik, Abteilung für Atom- und Oberflächenphysik, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Departamento de Física, Universidad Técnica Federico Santa María, Valaparaíso, Casilla 110-V (Chile); Bruckner, B., E-mail: barbara.bruckner@jku.at [Institut für Experimentalphysik, Abteilung für Atom- und Oberflächenphysik, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Roth, D., E-mail: dietmar.roth@jku.at [Institut für Experimentalphysik, Abteilung für Atom- und Oberflächenphysik, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Duchoslav, J., E-mail: jiri.duchoslav@jku.at [Center for Surface and Nanoanalytics, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Arndt, M., E-mail: martin.arndt@voestalpine.com [voestalpine Stahl GmbH, voestalpine-Straße 3, 4031 Linz (Austria); Kürnsteiner, P., E-mail: p.kuernsteiner@mpie.de [Center for Surface and Nanoanalytics, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); and others
2017-07-31
Highlights: • Investigation of the impact of residual gas prevailing in UHV chambers. • For some metals, detrimental oxygen uptake could be observed within the shortest time. • Totally different behaviors were found: no changes, solely adsorption, and oxidation. • The UHV residual gas may severely corrupt results obtained from depth profiling. • A well-considered data acquisition sequence is the key to reliable depth profiles. - Abstract: Depth profiling using surface-sensitive analysis methods in combination with sputter ion etching is a common procedure for thorough material investigations, where clean surfaces free of any contamination are essential. Hence, surface analytic studies are mostly performed under ultra-high vacuum (UHV) conditions, but the cleanness of such UHV environments is usually overrated. Consequently, the current study highlights the in-principle known impact of the residual gas on metal surfaces (Fe, Mg, Al, Cr and Zn) for various surface analysis methods, like X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy (AES) and low-energy ion scattering (LEIS). The investigations with modern, state-of-the-art equipment showed different behaviors for the metal surfaces in UHV during acquisition: (i) no impact for Zn, even after a long time, (ii) solely adsorption of oxygen for Fe, with slight and slow changes for Cr, and (iii) adsorption accompanied by oxide formation for Al and Mg. The efficiency of different countermeasures was tested, and the acquired knowledge was finally used for ZnMgAl-coated steel to obtain accurate depth profiles, which had previously exhibited serious artifacts when data acquisition was performed in an inconsiderate way.
DEFF Research Database (Denmark)
Maibach, Julia; Younesi, Reza; Schwarzburger, Nele
2014-01-01
The formation of surface and interface layers at the electrodes is highly important for the performance and stability of lithium ion batteries. To unravel the surface composition of electrode materials, photoelectron spectroscopy (PES) is highly suitable as it probes chemical surface and interface properties with high surface sensitivity. Additionally, by using synchrotron-generated hard x-rays as the excitation source, larger probing depths compared to in-house PES can be achieved. Therefore, the combination of in-house soft x-ray photoelectron spectroscopy and hard x-ray photoelectron spectroscopy (HAXPES) enables reliable and non-destructive depth profiling. Thus, detailed investigation of compositional gradients at electrode surfaces and interfaces from a sub-monolayer to several nanometer length scales can be performed. As this depth region is especially relevant for both electronic and ionic...
Stationary algorithmic probability
National Research Council Canada - National Science Library
Müller, Markus
2010-01-01
..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...
Directory of Open Access Journals (Sweden)
Siyuan He
2012-01-01
Full Text Available The range profiles of a two-dimensional (2-D) perfect electric conductor (PEC) ship on a wind-driven rough sea surface are derived by performing an inverse discrete Fourier transform (IDFT) on the wideband backscattered field. The rough sea surface is assumed to be a PEC surface. The backscattered field is computed by EM numerical simulation with frequencies sampled between 100 MHz and 700 MHz. Considering the strong coupling interactions between the ship and the sea, the complicated multipath effect on the range profile characteristics is fully analyzed based on the multipath imaging mechanisms. The coupling mechanisms can be explained by means of ray-theory prediction and numerical extraction of the coupling currents. A comparison of the range profile locations between ray-theory prediction and surface current simulation is implemented and analyzed in this paper. Finally, the influence of different sea states on the radar target signatures is examined and discussed.
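The IDFT step can be illustrated with a toy point-scatterer model over the same 100-700 MHz sweep (not the paper's full-wave ship-sea simulation): each scatterer at range R contributes a phase exp(-j4πfR/c), and the IDFT over frequency concentrates it at the corresponding range bin.

```python
import numpy as np

# Range profile of point scatterers from a stepped-frequency sweep.
# Illustrative point-target model; ranges are chosen so the peaks
# fall exactly on IDFT bins (bin k = 2*R*B/c).
c = 3e8
N = 256
B = 600e6                                # sweep bandwidth, 100-700 MHz
f = 100e6 + np.arange(N) * (B / N)
ranges_m = [5.0, 12.0]
field = sum(np.exp(-1j * 4 * np.pi * f * R / c) for R in ranges_m)

profile = np.abs(np.fft.ifft(field))     # range profile magnitude
peaks = np.sort(np.argsort(profile)[-2:])
print(peaks * c / (2 * B))               # recovered ranges: 5 m and 12 m
```

The range resolution here is c/(2B) = 0.25 m, which is why a wide bandwidth is needed to separate closely spaced multipath returns.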
Factual and cognitive probability
Chuaqui, Rolando
2012-01-01
This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...
Evaluating probability forecasts
Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo
2011-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...
Coupled ADCPs can yield complete Reynolds stress tensor profiles in geophysical surface flows
Vermeulen, B.; Hoitink, A.J.F.; Sassi, M.G.
2011-01-01
We introduce a new technique to measure profiles of each term in the Reynolds stress tensor using coupled acoustic Doppler current profilers (ADCPs). The technique is based on the variance method which is extended to the case with eight acoustic beams. Methods to analyze turbulence from a single
Near-Surface Seismic Profiling Across the Active Carlsberg Fault, Denmark
Jorgensen, M. I.; Nielsen, L.; Fallesen, J.; Thybo, H.
2002-12-01
An integrated near-surface normal-incidence and wide-angle seismic experiment has been conducted across the active Carlsberg Fault in the easternmost part of the Danish basin, just east of Copenhagen. The purpose of the seismic experiment is to: 1) determine the fault structure; 2) image possible seismic velocity contrasts across the fault; and 3) estimate how much the fault offsets the individual sedimentary layers at the different depth levels. The origin of the Carlsberg Fault is probably related to extensional stresses in a strike-slip system caused by movements in the Sorgenfrei-Tornquist Zone, which is a 20-50 km wide fault zone located approximately 50 km east of Copenhagen. In the study area, the upper sedimentary strata consist of Cretaceous and Danian chalk layers as well as younger sediments, which predominantly consist of sand and clay. The fault runs in an overall NNW-SSE direction, and it penetrates the various sedimentary strata. Geodetic measurements show that the fault has been active within the last 100 years. The normal-incidence data were collected along an 1100 m long line perpendicular to the strike of the fault with a shot spacing of 12 m and a receiver spacing of 6 m. The reflection image reveals a clear flower structure in the upper 400 ms of the section indicating that substantial horizontal movement has taken place along the Carlsberg Fault. This flower structure is relatively narrow at 350 ms depth, whereas it unfolds to a width of about 300 m in the uppermost layers. The wide-angle data were collected along a 2000 m long line with shot and receiver spacings of 100 m and 10 m, respectively. They provide good velocity control of the sedimentary layers and allow for depth conversion of the reflection seismic image. Furthermore the wide-angle data have the potential of providing back-scattered reflections from the fault planes. GPR measurements have been planned in order to constrain the very shallow and recent movements along the fault.
Probability distributions for the magnification of quasars due to microlensing
Wambsganss, Joachim
1992-01-01
Gravitational microlensing can magnify the flux of a lensed quasar considerably and therefore possibly influence quasar source counts or the observed quasar luminosity function. A large number of distributions of magnification probabilities due to gravitational microlensing for finite sources are presented, with a reasonable coverage of microlensing parameter space (i.e., surface mass density, external shear, mass spectrum of lensing objects). These probability distributions were obtained from smoothing two-dimensional magnification patterns with Gaussian source profiles. Different source sizes ranging from 10^14 cm to 5 × 10^16 cm were explored. The probability distributions show a large variety of shapes. Coefficients of fitted slopes for large magnifications are presented.
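The smoothing step described above (convolving a magnification pattern with a Gaussian source profile, then histogramming) can be sketched numerically. The map below is synthetic random data and the source size is an assumed pixel scale, not values from the paper:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1D normalized Gaussian kernel."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def smooth2d(img, sigma):
    """Separable Gaussian smoothing, mimicking convolution of a
    magnification map with a Gaussian source profile."""
    k = gaussian_kernel(sigma, radius=int(4 * sigma))
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

rng = np.random.default_rng(0)
mag_map = np.abs(rng.normal(1.0, 0.5, size=(128, 128)))  # toy map, not real data

smoothed = smooth2d(mag_map, sigma=3.0)  # sigma = assumed source size in pixels
hist, edges = np.histogram(smoothed, bins=40, density=True)
area = (hist * np.diff(edges)).sum()
print(smoothed.shape, round(area, 6))
```

Larger source sizes (larger sigma) wash out the caustic structure, narrowing the resulting magnification probability distribution, which is the qualitative effect the abstract describes.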
Al-Achi, Antoine; Baghat, Tushar; Chukwubeze, Onah; Dembla, Ishwin
2007-01-01
Knowledge of the physical characteristics of commercially available over-the-counter preparations can aid the compounding pharmacist in preparing medication. In this study, 15 over-the-counter products were studied with regard to their specific gravity, surface tension, pH, and rheologic profile. The specific gravities of all the products were greater than 1, with the exceptions of Nivea Lotion and rubbing alcohol, which were less than 1. The majority of the products had an acidic pH. With the exception of two products, Citrucel and Chloraseptic, all products demonstrated a surface tension value less than that of water (72.8 dynes/cm). Chloraseptic had the lowest Newtonian viscosity (1.27 cPs), whereas Vicks DayQuil had the highest (98.86 cPs). Citrucel exhibited dilatant-type flow; Suave Shampoo, herbal shampoo, Tangerine Tickle Herbal Shampoo, and Metamucil exhibited pseudoplastic flow; and the remaining non-Newtonian formulations showed plastic flow profiles.
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Efficient probability sequences
Regnier, Eva
2014-01-01
A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...
Efficient probability sequences
Regnier, Eva
2014-01-01
DRMI working paper A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
2011-10-01
LHC points, d(m) is machine zero at these points, and hence yields a zero response surface! In fact, for both Gaussian and Cauchy cases, we have tested that d(m) is close to machine zero even for 100,000 LHC points. Again, the curse of dimensionality is in action.
Modelling of composition and stress profiles in low temperature surface engineered stainless steel
DEFF Research Database (Denmark)
Jespersen, Freja Nygaard; Hattel, Jesper Henri; Somers, Marcel A. J.
2015-01-01
Thermochemical surface engineering by nitriding/carburizing of stainless steel causes a surface zone of expanded austenite, which improves the wear resistance of the stainless steel while preserving the stainless behavior. As a consequence of the thermochemical surface engineering, huge residual ...
Meng, Rui; Li, Ke; Chen, Zhe; Shi, Chen
2016-02-01
The effect of surface charges on the cellular uptake rate and drug release profile of tetrandrine-loaded poly(lactic-co-glycolic acid) (PLGA) nanoparticles (TPNs) was studied. A stabilizer-free nanoprecipitation method was used for the synthesis of TPNs. A typical layer-by-layer approach was applied for multi-coating the particles' surface, with poly(styrene sulfonate) sodium salt (PSS) as the anionic layer and poly(allylamine hydrochloride) (PAH) as the cationic layer. The modified TPNs were characterized by different physicochemical techniques such as a Zetasizer, scanning electron microscopy and transmission electron microscopy. The drug loading efficiency, release profile and cellular uptake rate were evaluated by high-performance liquid chromatography and confocal laser scanning microscopy. The resultant PSS/PAH/PSS/PAH/TPNs (4 layers) exhibited a spherical morphology with an average size of 160.3±5.165 nm and a zeta potential of -57.8 mV. The encapsulation efficiency and drug loading efficiency were 57.88% and 1.73%, respectively. Multi-layer coating of polymeric materials with different charges on the particles' surface could dramatically influence the drug release profile of TPNs (4 layers vs. 3 layers). In addition, variable layers of surface coating could also greatly affect the cellular uptake rate of TPNs in A549 cells within 8 h. Overall, by coating the particles' surface with these differently charged polymers, precise control of drug release as well as cellular uptake rate can be achieved simultaneously. Thus, this approach provides a new strategy for controllable drug delivery.
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Rostkier-Edelstein, D.; Hacker, J. P.
2009-09-01
A long-term goal of this work is to find an efficient system for probabilistic planetary boundary layer (PBL) nowcasting that can be deployed wherever surface observations are present. One approach showing promise is the use of a single column model (SCM) and ensemble filter (EF) data assimilation techniques. Earlier work showed that surface observations can be an important source of information with an SCM and an EF. Here we extend that work to quantify the deterministic and probabilistic skill of ensemble SCM predictions with added complexity. Although it is appealing to add additional physics and dynamics to the SCM model it is not immediately clear that additional complexity will improve the performance of a PBL nowcasting system based on a simple model. We address this question with regard to treatment of surface assimilation, radiation in the column, and also advection to account for realistic 3D dynamics (a timely WRF prediction). We adopt factor separation analysis to quantify the individual contribution of each model component to the deterministic and probabilistic skill of the system, as well as any beneficial or detrimental interactions between them. Deterministic skill of the system is evaluated through the mean absolute error, and probabilistic skill through the Brier Skill Score (BSS) and the area under the relative operating characteristic (ROC) curve (AUR). The BSS is further decomposed into both a reliability and resolution term to understand the trade-offs in different components of probabilistic skill. An alternative system based on climatological covariances and surface observations is used as a reference to assess the real utility of the flow-dependent covariances estimated with the ensemble system. In essence it is a dressing technique, whereby a deterministic 3D mesoscale forecast (e.g. WRF) is corrected with surface forecast errors and covariances computed from a distribution of available historical mesoscale forecasts. The adjusted profile
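The probabilistic scores named above can be illustrated with a minimal sketch. The forecasts and outcomes below are made up; the Brier Skill Score is computed against a climatological reference, as in the evaluation described:

```python
import numpy as np

def brier_score(p, o):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return np.mean((p - o) ** 2)

# Hypothetical event probabilities and observed outcomes (illustrative only)
forecasts = np.array([0.9, 0.8, 0.1, 0.3, 0.7, 0.2])
outcomes  = np.array([1,   1,   0,   0,   1,   0  ])

bs = brier_score(forecasts, outcomes)
climatology = np.full_like(forecasts, outcomes.mean())  # reference forecast
bs_ref = brier_score(climatology, outcomes)
bss = 1.0 - bs / bs_ref  # BSS > 0 means skill over the climatological reference
print(round(bs, 4), round(bss, 4))
```

A BSS near 1 indicates forecasts much sharper than climatology; a BSS below 0 would indicate the system performs worse than simply issuing the climatological frequency.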
Directory of Open Access Journals (Sweden)
Craig A Gedye
Full Text Available Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell
Oxygen boundary crossing probabilities.
Busch, N A; Silver, I A
1987-01-01
The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.
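The boundary-crossing probability discussed above can also be estimated directly by Monte Carlo. The sketch below simulates 1D Brownian particles with toy parameters (not the paper's model or its Volterra-equation solution) and counts the fraction reaching a fixed boundary within a time window:

```python
import random

def crossing_probability(n_paths=5000, n_steps=200, dt=0.01,
                         diffusion=1.0, boundary=1.0, seed=1):
    """Fraction of 1D Brownian paths (started at 0) that reach `boundary`
    within n_steps * dt time units. Toy stand-in for the first-passage
    (boundary-crossing) probability; all parameters are illustrative."""
    rng = random.Random(seed)
    step_sd = (2.0 * diffusion * dt) ** 0.5  # increment std for diffusion D
    crossed = 0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, step_sd)
            if x >= boundary:
                crossed += 1
                break
    return crossed / n_paths

p = crossing_probability()
print(p)
```

For these parameters the continuous-time reflection-principle value is about 0.62; the discrete simulation undershoots slightly because crossings between time steps are missed.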
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
In All Probability, Probability is not All
Helman, Danny
2004-01-01
The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.
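The distinction the abstract draws can be made concrete: a ticket's win probability is tiny regardless of strategy, but its expected value is what comparisons should rest on. A sketch with entirely made-up lottery numbers:

```python
# Hypothetical lottery (all numbers invented for illustration): expectation,
# not win probability, is the statistic that makes tickets comparable.
ticket_price = 2.0
jackpot = 5_000_000.0
p_win = 1 / 14_000_000  # assumed odds, roughly lottery-like

expected_value = p_win * jackpot - ticket_price
print(round(expected_value, 4))
```

Here the expected value is negative (about -1.64 per ticket for these assumed numbers), which the bare win probability alone does not reveal.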
Tumour control probability in cancer stem cells hypothesis.
Dhawan, Andrew; Kohandel, Mohammad; Hill, Richard; Sivaloganathan, Sivabal
2014-01-01
The tumour control probability (TCP) is a formalism derived to compare various treatment regimens of radiation therapy, defined as the probability that, given a prescribed dose of radiation, a tumour has been eradicated or controlled. In the traditional view of cancer, all cells share the ability to divide without limit and thus have the potential to generate a malignant tumour. However, an emerging notion is that only a sub-population of cells, the so-called cancer stem cells (CSCs), are responsible for the initiation and maintenance of the tumour. A key implication of the CSC hypothesis is that these cells must be eradicated to achieve cures; thus we define TCP_S as the probability of eradicating CSCs for a given dose of radiation. A cell surface protein expression profile, such as CD44high/CD24low for breast cancer or CD133 for glioma, is often used as a biomarker to monitor CSC enrichment. However, it is increasingly recognized that not all cells bearing this expression profile are necessarily CSCs, and in particular early generations of progenitor cells may share the same phenotype. Thus, due to the lack of a perfect biomarker for CSCs, we also define a novel measurable quantity TCP_CD+, the probability of eliminating or controlling biomarker-positive cells. Based on these definitions, we use stochastic methods and numerical simulations parameterized for the case of gliomas to compare the theoretical TCP_S and the measurable TCP_CD+. We also use the measurable TCP to compare the effect of various radiation protocols.
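A common closed form for TCP (the Poisson model with exponential cell survival, not the authors' stochastic simulations) makes the CSC point vivid: eradicating a small stem-cell subpopulation is far more probable than eradicating every tumour cell at the same dose. All parameters below are illustrative:

```python
import math

def tcp_poisson(n_cells, dose, alpha=0.35):
    """Poisson TCP with simple exponential survival S(D) = exp(-alpha * D):
    TCP = exp(-N * S(D)). alpha and the cell counts are assumed values."""
    surviving = n_cells * math.exp(-alpha * dose)
    return math.exp(-surviving)

# Small CSC subpopulation vs. the whole tumour, same 60 Gy dose
print(round(tcp_poisson(1e4, 60), 4), round(tcp_poisson(1e9, 60), 4))
```

With these assumed parameters, control of the 10^4-cell subpopulation is nearly certain while control of the full 10^9-cell tumour is less than 50%, which is why a TCP defined over CSCs can differ sharply from one defined over all biomarker-positive cells.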
Cantero, Francisco; Castro-Orgaz, Oscar; Garcia-Marín, Amanda; Ayuso, José Luis; Dey, Subhasish
2015-10-01
Is the energy equation for gradually-varied flow the best approximation for the free surface profile computations in river flows? Determination of flood inundation in rivers and natural waterways is based on the hydraulic computation of flow profiles. This is usually done using energy-based gradually-varied flow models, like HEC-RAS, that adopts a vertical division method for discharge prediction in compound channel sections. However, this discharge prediction method is not so accurate in the context of advancements over the last three decades. This paper firstly presents a study of the impact of discharge prediction on the gradually-varied flow computations by comparing thirteen different methods for compound channels, where both energy and momentum equations are applied. The discharge, velocity distribution coefficients, specific energy, momentum and flow profiles are determined. After the study of gradually-varied flow predictions, a new theory is developed to produce higher-order energy and momentum equations for rapidly-varied flow in compound channels. These generalized equations enable to describe the flow profiles with more generality than the gradually-varied flow computations. As an outcome, results of gradually-varied flow provide realistic conclusions for computations of flow in compound channels, showing that momentum-based models are in general more accurate; whereas the new theory developed for rapidly-varied flow opens a new research direction, so far not investigated in flows through compound channels.
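For a simple rectangular channel (not the compound sections studied above), the energy-based gradually-varied flow computation reduces to integrating dy/dx = (S0 - Sf)/(1 - Fr^2) with Manning friction. A minimal sketch with assumed channel parameters:

```python
# Assumed rectangular channel (illustrative values only)
Q, b, n, S0 = 10.0, 5.0, 0.03, 0.001   # discharge, width, Manning n, bed slope
g = 9.81

def dydx(y):
    """Gradually-varied flow slope dy/dx = (S0 - Sf) / (1 - Fr^2)."""
    A, P = b * y, b + 2 * y                    # area, wetted perimeter
    R = A / P                                  # hydraulic radius
    Sf = (n * Q / (A * R ** (2 / 3))) ** 2     # Manning friction slope
    Fr2 = Q ** 2 * b / (g * A ** 3)            # Froude number squared
    return (S0 - Sf) / (1 - Fr2)

# March the water-surface profile upstream from a control depth (M1 curve)
y, dx = 2.0, -10.0                             # start depth (m), step (m)
profile = [y]
for _ in range(100):
    y += dydx(y) * dx                          # explicit Euler step
    profile.append(y)
print(round(profile[-1], 3))
```

Starting above normal depth, the depth relaxes toward normal depth moving upstream, the classic backwater behaviour; the paper's point is that in compound channels the discharge and momentum treatment feeding Sf and Fr is where the real modelling choices lie.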
DEFF Research Database (Denmark)
Paulson, C.A.
1970-01-01
Analytical expressions which specify non-dimensionalized wind speed and potential temperature gradients as functions of stability are integrated. The integrated equations are tested against Swinbank's wind and temperature profiles measured at Kerang, Australia. It is found that a representation s...
Ma, Shuang; Yi, Shengzhen; Chen, Shenghao; Wang, Zhanshan
2014-11-01
The monochromatic energy multilayer Kirkpatrick-Baez microscope is one of the key diagnostic tools for research on inertial confinement fusion. It is composed of two orthogonal concave spherical mirrors with small curvature and aperture, and produces the image of an object by collecting X-rays in each orthogonal direction independently. Accurate measurement of the radius of curvature of the concave spherical mirrors is essential to achieving the design optical properties, including imaging quality, optical throughput and energy resolution. However, the radius of curvature of spherical optical surfaces with small curvature and aperture is difficult to measure by conventional methods, because the reflected intensity from the glass is too low for a reliable test. In this paper, we propose an improved optical-profiler measuring method to accomplish accurate measurement of the radius of curvature of the spherical optical surfaces with small curvature and aperture used in the monochromatic energy multilayer Kirkpatrick-Baez microscope. First, we use a standard super-smooth optical flat to calibrate the reference mirror before each experiment. Next, the deviation of the central position between the measurement area and the interference pattern is corrected using the theory of Newton's rings, and the zero-order fringe position is derived from the principle of interference, in which surface roughness has its minimum value at the position of zero optical path difference. Results measured by the optical profiler show low relative errors and high repeatability. Finally, an imaging experiment with the monochromatic energy multilayer Kirkpatrick-Baez microscope confirms the measurement accuracy of the radius of curvature.
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
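The best-known special case of a CPGF is the multinomial-logit log-sum G(u) = log Σᵢ exp(uᵢ), whose gradient is exactly the softmax choice probabilities. A small numerical sketch with made-up systematic utilities:

```python
import math

def cpgf_logit(u):
    """Multinomial-logit CPGF: the log-sum G(u) = log(sum_i exp(u_i)),
    computed with a max-shift for numerical stability."""
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probabilities(u):
    """Gradient of the logit CPGF, i.e. the softmax choice probabilities."""
    m = max(u)
    w = [math.exp(x - m) for x in u]
    s = sum(w)
    return [x / s for x in w]

utilities = [1.0, 2.0, 0.5]  # hypothetical systematic utilities
p = choice_probabilities(utilities)
print([round(x, 4) for x in p], round(sum(p), 6))
```

The gradient property can be checked by finite differences: perturbing one utility and differencing G reproduces that alternative's choice probability, which is the defining CPGF relationship the paper generalizes.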
Energy Technology Data Exchange (ETDEWEB)
Karam, J-C [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France); Grucker, J [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France); Boustimi, M [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France); Vassilev, G [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France); Reinhardt, J [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France); Mainos, C [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France); Bocvarski, V [Institute of Physics, Pregrevica, 11000-Zemun, Belgrade (Serbia and Montenegro); Robert, J [Laboratoire Aime Cotton, Bat. 505, Universite Paris-Sud, 91405-Orsay Cedex (France); Baudon, J [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France); Perales, F [Laboratoire de Physique des Lasers (UMR-CNRS 7538), Universite Paris 13, Avenue J B Clement, 93430-Villetaneuse (France)
2006-04-28
The interaction at mean distance (a few tens up to a few hundreds of a₀), i.e. in the van der Waals interaction range, between metastable nitrogen molecules, N₂* (A³Σu⁺), and the slit edges of a micro-slit copper grating depends on both the molecular orientation and the internuclear distance in the molecule. Such an interaction is able to induce rotational and vibrational transitions. Endo-energetic transitions (v → v + 1, v ranging from 5 to 10) are observed by means of a time-of-flight technique combined with an angular distribution measurement. By setting the grating plane at an angle with respect to the incident direction, different from that imposed by ideally planar slit walls, it is shown that the angular distribution of the inelastic process reveals a departure of the surface from an ideal plane. Assuming a regular evolution of the tangent plane along the surface profile, a mean wall profile can be derived from this distribution.
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
Directory of Open Access Journals (Sweden)
Laktineh Imad
2010-04-01
Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Tate, G. W.; Willett, S.; McQuarrie, N.; Goren, L.; Fox, M.
2013-12-01
While river profile analyses have long been used to evaluate the development of landforms, recent advances in analyzing drainage networks have significantly improved the ability to positively link stream profiles with surface uplift. In one such method, Perron and Royden (2012) define the value chi, an integral quantity based on the steady-state stream power equation which aids in determining the conformity of rivers and drainage basins to steady-state behavior. East Timor is an ideal location to test new methods using chi, as it is an active and unglaciated orogen with independent constraints of the deformational history through thermochronology and structural geology. We utilize the calculation of chi in our analyses of the drainage network to provide new constraints on the most recent uplift history of the island of Timor. Discontinuities in chi across drainage divides imply different steady state baselevel for hillslopes and therefore active migration of the divide. We confirm this by noting visible landslides in satellite images and asymmetries in hillslope steepness. Analyses of chi and elevation reveal in some locations that tributaries within a single basin have experienced distinctly different histories, documenting instances where previous river capture has occurred. In other locations the relationships between chi and elevation along single rivers denote spatial changes in surface uplift rate. Many of these observations from the drainage network correspond well to patterns of recent exhumation identified from thermochronologic analyses as well as structural constraints from field mapping and balanced cross-sections. Much of the fastest exhumation on the island (as indicated by zircon (U-Th)/He ages of 1.5-3.8 Ma and modeled exhumation rates of 1-3 mm/yr) is in the hinterland slate belt, which also contains the most stream profile remnants of paleo-capture events. Many locations of active river capture correspond well to independently constrained
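The chi quantity of Perron and Royden (2012) referenced above is the upstream integral χ(x) = ∫ (A₀/A(x'))^(m/n) dx' along the channel. The sketch below evaluates it on a synthetic drainage-area profile with an assumed concavity m/n = 0.45; none of the numbers come from the Timor study:

```python
def chi_profile(distances, areas, A0=1.0e6, mn=0.45):
    """Cumulative chi integral, chi(x) = sum of (A0/A)^(m/n) * dx upstream.
    A0 (reference area) and mn (concavity) are assumed illustrative values."""
    chi = [0.0]
    for i in range(1, len(distances)):
        dx = distances[i] - distances[i - 1]
        integrand = (A0 / areas[i]) ** mn
        chi.append(chi[-1] + integrand * dx)
    return chi

# Synthetic channel: distance upstream (m) and drainage area (m^2)
dist = [0, 500, 1000, 1500, 2000]
area = [5.0e7, 3.0e7, 1.5e7, 6.0e6, 2.0e6]

chis = chi_profile(dist, area)
print([round(c, 1) for c in chis])  # chi grows monotonically upstream
```

On real data, tributaries plotted in chi-elevation space collapse onto a common line when the basin is in steady state; offsets or kinks flag divide migration, capture events, or changes in uplift rate, which is how the analysis above is used.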
Buckland, Catherine; Bailey, Richard; Thomas, David
2017-04-01
Two billion people living in drylands are affected by land degradation. Sediment erosion by wind and water removes fertile soil and destabilises landscapes. Vegetation disturbance is a key driver of dryland erosion caused by both natural and human forcings: drought, fire, land use, grazing pressure. A quantified understanding of vegetation cover sensitivities and resultant surface change to forcing factors is needed if the vegetation and landscape response to future climate change and human pressure are to be better predicted. Using quartz luminescence dating and statistical changepoint analysis (Killick & Eckley, 2014) this study demonstrates the ability to identify step-changes in depositional age of near-surface sediments. Lx/Tx luminescence profiles coupled with statistical analysis show the use of near-surface sediments in providing a high-resolution record of recent system response and aeolian system thresholds. This research determines how the environment has recorded and retained sedimentary evidence of drought response and land use disturbances over the last two hundred years across both individual landforms and the wider Nebraska Sandhills. Identifying surface deposition and comparing with records of climate, fire and land use changes allows us to assess the sensitivity and stability of the surface sediment to a range of forcing factors. Killick, R and Eckley, IA. (2014) "changepoint: An R Package for Changepoint Analysis." Journal of Statistical Software, (58) 1-19.
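A single mean-shift changepoint of the kind identified above can be located by minimizing the pooled within-segment sum of squared deviations. This toy detector is far simpler than the changepoint R package cited in the abstract, and the series below is invented:

```python
def best_changepoint(x):
    """Return the split index k (1 <= k < len(x)) minimizing the total
    within-segment sum of squared deviations; a minimal single-changepoint
    detector in the spirit of (but much simpler than) changepoint's cpt.mean."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    costs = {k: sse(x[:k]) + sse(x[k:]) for k in range(1, len(x))}
    return min(costs, key=costs.get)

# Synthetic "depositional age" series with a step change at index 6 (made up)
series = [1.0, 1.1, 0.9, 1.0, 1.2, 0.8, 3.1, 3.0, 2.9, 3.2, 3.1]
print(best_changepoint(series))  # → 6
```

Real analyses like the one above use penalized multiple-changepoint searches (e.g. PELT) so the number of step-changes need not be known in advance.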
Time-kill profiles and cell-surface morphological effects of crude ...
African Journals Online (AJOL)
MK1201 mycelial extract on the viability and cell surface morphology of methicillin-susceptible Staphylococcus aureus (MSSA) and methicillin-resistant Staphylococcus aureus (MRSA). Methods: Time-kill assays were conducted by incubating test ...
A 3D Laser Profiling System for Rail Surface Defect Detection
Xiong, Zhimin; Li, Qingquan; Mao, Qingzhou; Zou, Qin
2017-01-01
Rail surface defects such as the abrasion, scratch and peeling often cause damages to the train wheels and rail bearings. An efficient and accurate detection of rail defects is of vital importance for the safety of railway transportation. In the past few decades, automatic rail defect detection has been studied; however, most developed methods use optic-imaging techniques to collect the rail surface data and are still suffering from a high false recognition rate. In this paper, a novel 3D las...
Rosas, Jorge
2017-09-26
The land surface temperature (LST) represents a critical element in efforts to characterize global surface energy and water fluxes, as well as being an essential climate variable in its own right. Current satellite platforms provide a range of spatial and temporal resolution radiance data from which LST can be determined. One of the most complete records of data comes via the Landsat series of satellites, which provide a continuous sequence that extends back to 1982. However, for much of this time, Landsat thermal data were provided through a single broadband thermal channel, making surface temperature retrieval challenging. To fully exploit the valuable time-series of thermal information that is available from these satellites requires efforts to better describe and understand the accuracy of temperature retrievals. Here, we contribute to these efforts by examining the impact of atmospheric correction on the estimation of LST, using atmospheric profiles derived from a range of in-situ, reanalysis, and satellite data. Radiance data from the thermal infrared (TIR) sensor onboard Landsat 8 was converted to LST by using the MODTRAN version 5.2 radiative transfer model, allowing the production of an LST time series based upon 28 Landsat overpasses. LST retrievals were then evaluated against in-situ thermal measurements collected over an arid zone farmland comprising both bare soil and vegetated surface types. Atmospheric profiles derived from AIRS, MOD07, ECMWF, NCEP, and balloon-based radiosonde data were used to drive the MODTRAN simulations. In addition to examining the direct impact of using various profile data on LST retrievals, randomly distributed errors were introduced into a range of forcing variables to better understand retrieval uncertainty. Results indicated differences in LST of up to 1 K for perturbations in emissivity and profile measurements, with the analysis also highlighting the challenges in modeling aerosol optical depth (AOD) over arid lands and
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
Difficulties related to Probabilities
Rosinger, Elemer Elad
2010-01-01
Probability theory is often used as if it had the same ontological status as, for instance, Euclidean geometry or Peano arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arXiv papers.
Indian Academy of Sciences (India)
casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...
Dynamic update with probabilities
Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld
2009-01-01
Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant
Elements of quantum probability
Kummerer, B.; Maassen, H.
1996-01-01
This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Frič, Roman; Papčo, Martin
2017-06-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
Probability of Detection Demonstration Transferability
Parker, Bradford H.
2008-01-01
The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.
Candida albicans shaving to profile human serum proteins on hyphal surface
Directory of Open Access Journals (Sweden)
Elvira eMarin
2015-12-01
Full Text Available Candida albicans is a human opportunistic fungus responsible for a wide variety of infections, either superficial or systemic. C. albicans is a polymorphic fungus and its ability to switch between yeast and hyphae is essential for its virulence. Once C. albicans obtains access to the human body, the host serum constitutes a complex environment of interaction with the C. albicans cell surface in the bloodstream. To draw a comprehensive picture of this relevant step in the host-pathogen interaction during invasive candidiasis, we have optimized a gel-free shaving proteomic strategy to simultaneously identify both human serum proteins coating C. albicans cells and fungal surface proteins. This approach was carried out with normal serum (NS) and heat-inactivated serum (HIS). We identified 214 human and 372 C. albicans unique proteins. The proteins identified in C. albicans included 147 described as located at the cell surface and 52 described as immunogenic. Interestingly, among these C. albicans proteins, we identified 23 GPI-anchored proteins, including Gpd2 and Pra1, which are involved in complement system evasion, and 7 other proteins that are able to attach plasminogen to the C. albicans surface (Adh1, Eno1, Fba1, Pgk1, Tdh3, Tef1 and Tsa1). Furthermore, 12 proteins identified at the C. albicans hyphal surface induced with 10% human serum were not detected in other hypha-induced conditions. The most abundant human proteins identified are involved in complement and coagulation pathways. Remarkably, with this strategy, all main proteins belonging to the complement cascades were identified on the C. albicans surface. Moreover, we identified immunoglobulins, cytoskeletal proteins, and metabolic proteins such as apolipoproteins, among others. Additionally, we identified more inhibitors of complement and coagulation pathways, some of them serpins (serine protease inhibitors), in HIS versus NS. On the other hand, we detected a higher amount of C3 at the C
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Krawczyk, M; Levy, J C S; Mercier, D
2003-01-01
Spin-wave excitations in ferromagnetic layered composites (AB⋯BA; A and B being different homogeneous ferromagnetic materials) are analysed theoretically, by means of the transfer matrix approach. The properties of multilayer spin-wave mode profiles are discussed in relation to multilayer characteristics, such as the filling fraction and the exchange or magnetization contrast; also, surface spin pinning conditions and dipolar interactions are taken into account. The interface conditions are satisfied by introducing an effective exchange field expressed by interface gradients of the exchange constant and the magnetization. This approach provides an easy way to find frequencies and amplitudes of standing spin waves in the multilayer. The developed theory is applied to interpretation of spin wave resonance (SWR) spectra obtained experimentally by Chambers et al in two systems: a bilayer Fe/Ni and a trilayer Ni/Fe/Ni, in perpendicular (to the multilayer surface) configuration of th...
Glauser, Gaetan; Schweizer, Fabian; Turlings, Ted C J; Reymond, Philippe
2012-01-01
The analysis of glucosinolates (GS) is traditionally performed by reverse-phase liquid chromatography coupled to ultraviolet detection after a time-consuming desulphation step, which is required for increased retention. Simpler and more efficient alternative methods that can shorten both sample preparation and analysis are much needed. To evaluate the feasibility of using ultrahigh-pressure liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (UHPLC-QTOFMS) for the rapid profiling of intact GS. A simple and short extraction of GS from Arabidopsis thaliana leaves was developed. Four sub-2 µm reverse-phase columns were tested for the rapid separation of these polar compounds using formic acid as the chromatographic additive. High-resolution QTOFMS was used to detect and identify GS. A novel charged surface hybrid (CSH) column was found to provide excellent retention and separation of GS within a total running time of 11 min. Twenty-one GS could be identified based on their accurate mass as well as isotopic and fragmentation patterns. The method was applied to determine the changes in GS content that occur after herbivory in Arabidopsis. In addition, we evaluated its applicability to the profiling of other Brassicaceae species. The method developed can profile the full range of GS, including the most polar ones, in a shorter time than previous methods, and is highly compatible with mass spectrometric detection. Copyright © 2012 John Wiley & Sons, Ltd.
Directory of Open Access Journals (Sweden)
G. Skorulski
2010-07-01
Full Text Available A theoretical and experimental method for optimizing the aluminium billet's contact surface during extrusion is presented in this paper. The theoretical assumptions, based on welding criteria, have been confirmed by experimental research. The measurement technique is shown as well. Experiments were made using plasticine as a substitute material, and several different variants have been investigated. The theory and experiments were used to optimize the model shape and may help in design and technology. The theory was tested experimentally using plasticine as a substitute material and a plexiglass die such that the velocity fields at the surfaces could be observed and measured during plastic flow, allowing the empirical coefficients in the mathematical formulation to be estimated. On the basis of the theory and experiments, an optimal billet contact surface was proposed.
Directory of Open Access Journals (Sweden)
Larissa Belov
2016-04-01
Full Text Available Extracellular vesicles (EV are membranous particles (30–1,000 nm in diameter secreted by cells. Important biological functions have been attributed to 2 subsets of EV, the exosomes (bud from endosomal membranes and the microvesicles (MV; bud from plasma membranes. Since both types of particles contain surface proteins derived from their cell of origin, their detection in blood may enable diagnosis and prognosis of disease. We have used an antibody microarray (DotScan to compare the surface protein profiles of live cancer cells with those of their EV, based on their binding patterns to immobilized antibodies. Initially, EV derived from the cancer cell lines, LIM1215 (colorectal cancer and MEC1 (B-cell chronic lymphocytic leukaemia; CLL, were used for assay optimization. Biotinylated antibodies specific for EpCAM (CD326 and CD19, respectively, were used to detect captured particles by enhanced chemiluminescence. Subsequently, this approach was used to profile CD19+ EV from the plasma of CLL patients. These EV expressed a subset (~40% of the proteins detected on CLL cells from the same patients: moderate or high levels of CD5, CD19, CD31, CD44, CD55, CD62L, CD82, HLA-A,B,C, HLA-DR; low levels of CD21, CD49c, CD63. None of these proteins was detected on EV from the plasma of age- and gender-matched healthy individuals.
de Kruif, Jan Kendall; Khoo, Jiyi; Bravo, Roberto; Kuentz, Martin
2013-03-01
Quality by design is an important concept, but only limited research has been invested in concentrated pharmaceutical suspensions. A need exists for novel analytical tools to thoroughly characterize the drug as well as its aggregated particle structure in suspension. This work focuses on lipid-based pharmaceutical suspensions for filling of capsules. A rheological approach, namely the fractal concept of flocculation, is introduced to the pharmaceutical field. The model drug mebeverine hydrochloride was first physicochemically analyzed. A special aim was to study the surface energy profiles using inverse gas chromatography as a critical characteristic for the suspension's rheological behavior. Suspensions were manufactured in laboratory process equipment while applying different homogenization speeds. Flow curves of the final suspensions were measured using a cone-and-plate rheometer. As a result, surface energy profiles revealed differences from one mebeverine lot to another. Different homogenization intensities greatly affected the viscosity and the Mooney model was able to predict experimental values as a function of the drug volume fraction. The fractal concept of flocculation characterized mebeverine in suspension and a slight increase of fractal dimension was noted when homogenization speed was increased. It was concluded that the introduced concepts have large potential for designing quality into concentrated pharmaceutical suspensions. Copyright © 2012 Wiley Periodicals, Inc.
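The Mooney model mentioned above relates suspension viscosity to the drug volume fraction. A minimal sketch follows; the Einstein coefficient and maximum packing fraction used here are generic textbook values for rigid spheres, not the parameters fitted in this work:

```python
import numpy as np

def mooney_relative_viscosity(phi, k=2.5, phi_max=0.63):
    """Mooney model for the relative viscosity of a suspension:
    eta_r = exp(k * phi / (1 - phi / phi_max)),
    with k the Einstein coefficient (2.5 for rigid spheres) and
    phi_max the maximum packing fraction (illustrative values)."""
    phi = np.asarray(phi, dtype=float)
    return np.exp(k * phi / (1.0 - phi / phi_max))

for phi in (0.05, 0.20, 0.40):
    print(f"phi = {phi:.2f}  ->  eta_r = {mooney_relative_viscosity(phi):8.2f}")
```

For small phi the Mooney form reduces to the Einstein dilute-suspension limit eta_r ≈ 1 + 2.5 phi, which provides a quick sanity check on any fitted parameters.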
Quantitative transfer of polar analytes on a solid surface to a liquid matrix in MALDI profiling.
Park, Kyung Man; Moon, Jeong Hee; Lee, Seong Hoon; Kim, Myung Soo
2016-12-01
In profiling of a specimen by matrix-assisted laser desorption ionization (MALDI) using a solid matrix, the solvent of the matrix solution extracts the analyte(s). A quantitative profiling cannot be achieved if the solvent evaporates before the complete extraction of the analyte. The extraction can become more quantitative when a liquid matrix dissolved in a solvent is used, which remains a liquid even after the evaporation of the solvent. To check this, the radii of an analyte circle (rA), a matrix solution drop (rD) and a liquid matrix (rM) remaining after the solvent evaporation were controlled. Three types of samples were prepared, case A (rA , rD matrix layer determined by MALDI was the same as the prepared amount inside the analyte circle. In case B, the analyte amount was the same as the amount inside the matrix circle. Only the analytes in contact with the liquid matrix layer, not more and not less, are transferred to the matrix layer. In case C, the analyte amount was greater than the amount inside the matrix circle, presumably because some of the analyte outside the matrix circle was dissolved by the solvent of the matrix solution. Copyright © 2016 John Wiley & Sons, Ltd.
Hoff, R. M.
2014-12-01
One research goal of the Deriving Information on Surface Conditions from COlumn and VERtically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) mission was to determine sufficient column profile measurements to relate column integrated quantities such as Aerosol Optical Depth to surface concentrations. I will review the relationship between AOD and PM2.5 at the surface. DISCOVER-AQ in Baltimore, the San Joaquin Valley, Houston and Denver revealed quite different conditions for determining this relationship. In each case, the surface reflectivity made determination of aerosol optical depth challenging, but upward looking columns of aerosol optical depth from sunphotometers provided confirmation of the AOD results from space. In Baltimore, AOD fields reflected PM2.5 concentrations well. In California, however, the low boundary layer heights and dominance of nitrate and organic aerosols made the AOD fields less predictive of PM2.5. In California and Colorado, hydration of the aerosol varied dramatically with aerosol type (especially smoke and dust) and revealed that without an understanding of the degree of aerosol hydration with aerosol composition, the relationship between AOD and PM2.5 will continue to be a challenge. Model predictions in the Baltimore-Washington study are relatively disappointing in helping define the needed physics between the optical and microphysical properties. An overview of the measurements from DISCOVER-AQ which will help define the needed information in a more general case in the future will be given.
Birch, Stacy L
2016-07-01
The purpose of the present study was to identify and characterize surface and phonological subgroups of readers among college students with a prior diagnosis of developmental reading disability (RD). Using a speeded naming task derived from Castles and Coltheart's subtyping study, we identified subgroups of readers from among college students with RD and then compared them on a number of component reading tasks. Most of our adults with RD showed a discrepancy in lexical versus sublexical reading skills. The majority of classified individuals were in the phonological dyslexia group, and this group's performance was worse than that of other groups on a range of reading-related tasks. Specifically, being relatively less skilled at reading nonwords compared to irregular words was associated with deficits in both sublexical and lexical tasks, and with unique deficits compared to the surface dyslexia group not only in an independent measure of phonological coding but also in spelling, rapid automatized naming, and speeded oral reading. The surface dyslexia group was small, and the pattern of results for these readers was not consistent with the predicted profile of a specific deficit in lexical and automatized reading processes. Our surface group did not show reduced skill in lexical mechanisms specifically, nor any unique deficit compared to the phonological group. These results seem more supportive of models of reading that place phonological processing impairments at the core of RD, with all other impairments being clearly subsidiary. © Hammill Institute on Disabilities 2014.
Sicart, J.; Litt, M.
2012-12-01
The turbulent fluxes remain poorly understood on tropical glaciers. Studies based on the bulk method have shown that sublimation can be high during the dry season, reducing the energy available for melting. However, uncertainties on the bulk method are large, especially when katabatic flows cause a wind speed maximum at low height. Wind and temperature data from an 8-level 6-m mast positioned at 5060 m asl in the ablation area of the Zongo Glacier, Bolivia (16°S), were collected during a one-month period in the dry season of 2007. Concomitant measurements of radiation fluxes and eddy covariance turbulent fluxes were conducted. The surface roughness lengths for temperature and momentum were calculated using the profile and the eddy covariance methods at the hourly time scale. The measurement period was characterized by low synoptic forcing conditions and katabatic wind prevailed at night and most of the day. Katabatic flows were often associated with a wind speed maximum at a height of about 2-3 m and with a strong temperature inversion. Near-neutral profiles were selected to avoid the presence of the katabatic wind speed maximum. Results indicate z0 values of about 3 mm and zT values of about 0.2 mm, in rough agreement with terrain observations. However the scatter in the zT values is large indicating large random errors. The relation between the ratio zT/z0 and the roughness Reynolds number is in rough agreement with the surface renewal model. However, this relation turns out to be mostly due to spurious self-correlation because of the shared variable z0 in zT/z0 and Re*. Finally, the random and systematic errors on the roughness lengths derived from the profile measurements were briefly investigated. The results emphasize the need of accurate measurements of the sensor heights to obtain unbiased roughness lengths.
Sicart, Jean-Emmanuel; Litt, Maxime; Ben Tahar, Vanessa
2013-04-01
The turbulent fluxes remain poorly understood on tropical glaciers. Studies based on the bulk method have shown that sublimation can be high during the dry season, reducing the energy available for melting. However, uncertainties on the bulk method are large, especially when katabatic flows cause a wind speed maximum at low height. Wind and temperature data from an 8-level 6-m mast positioned at 5060 m a.s.l. in the ablation area of the Zongo Glacier, Bolivia (16°S), were collected during a one-month period in the dry season of 2007. Concomitant measurements of radiation fluxes and eddy covariance turbulent fluxes were conducted. The surface roughness lengths for temperature (zT) and momentum (z0) were calculated using the profile and the eddy covariance methods at the hourly timescale. The measurement period was characterized by low synoptic forcing conditions and katabatic wind prevailed at night and most of the day. Katabatic flows were often associated with a wind speed maximum at a height of about 2-3 m and with a strong temperature inversion. Near-neutral profiles were selected to avoid the presence of the katabatic wind speed maximum. Results indicate z0 values of about 3 mm and zT values of about 0.2 mm, in rough agreement with terrain observations. However the scatter in the zT values is large indicating large random errors. The relation between the ratio zT/z0 and the roughness Reynolds number (Re*) is in rough agreement with the surface renewal model. However, this relation turns out to be mostly due to spurious self-correlation because of the shared variable z0 in zT/z0 and Re*. Finally, the random and systematic errors on the roughness lengths derived from the profile measurements were briefly investigated. The results emphasize the need of accurate measurements of the sensor heights to obtain unbiased roughness lengths.
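The profile method referred to in both abstracts rests, in near-neutral conditions, on the logarithmic wind law u(z) = (u*/κ) ln(z/z0). A minimal sketch of the fit, using synthetic noise-free mast data standing in for the Zongo measurements:

```python
import numpy as np

KAPPA = 0.4  # von Karman constant

def fit_log_profile(z, u):
    """Fit the neutral logarithmic wind profile u(z) = (u*/kappa) * ln(z / z0)
    by linear regression of u against ln(z); returns (u_star, z0)."""
    a, b = np.polyfit(np.log(z), u, 1)      # u = a * ln(z) + b
    u_star = KAPPA * a
    z0 = np.exp(-b / a)                     # height where u extrapolates to 0
    return u_star, z0

# Synthetic 8-level mast data generated with u* = 0.3 m/s and z0 = 3 mm
z = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 6.0])   # sensor heights, m
u_true_star, z0_true = 0.3, 0.003
u = (u_true_star / KAPPA) * np.log(z / z0_true)

u_star, z0 = fit_log_profile(z, u)
print(f"u* = {u_star:.3f} m/s, z0 = {z0 * 1000:.2f} mm")
```

Because z0 enters only through ln(z/z0), small errors in the assumed sensor heights z translate into multiplicative errors in the fitted z0, which is one way to see why the abstract stresses accurate height measurements.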
Toe clearance and velocity profiles of young and elderly during walking on sloped surfaces
Directory of Open Access Journals (Sweden)
Begg Rezaul K
2010-04-01
Full Text Available Abstract. Background: Most falls in older adults are reported during locomotion, and tripping has been identified as a major cause of falls. Challenging environments (e.g., walking on slopes) are potential interventions for maintaining balance and gait skills. The aims of this study were: (1) to investigate whether distributions of two important gait variables [minimum toe clearance (MTC) and foot velocity at MTC (VelMTC)] and locomotor control strategies are altered during walking on sloped surfaces, and (2) if altered, whether the alterations are maintained in both groups (young and elderly female groups). Methods: MTC and VelMTC data during walking on a treadmill at sloped surfaces (+3°, 0° and -3°) were analysed for 9 young (Y) and 8 elderly (E) female subjects. Results: MTC distributions were found to be positively skewed whereas VelMTC distributions were negatively skewed for both groups on all slopes. Median MTC values increased (Y = 33%, E = 7%) on the negative slope but decreased (Y = 25%, E = 15%) on the positive slope compared with MTC values on the flat surface (0°). Analysis of VelMTC distributions also indicated significant differences in 25th percentile (Q1) values in the elderly at all slopes. Conclusion: The young displayed a strong positive correlation between changes in MTC median and IQR (interquartile range) when walking on both slopes; however, this correlation was weak in the older adults, suggesting differences in the control strategies employed to minimize the risk of tripping.
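Because the MTC distributions are skewed, the study relies on medians and interquartile ranges rather than means. A small sketch of these summaries on synthetic, positively skewed MTC data (the lognormal sample below is illustrative only, not the study's data):

```python
import numpy as np

def mtc_summary(mtc):
    """Median, interquartile range (IQR) and moment-based skewness of an
    MTC sample; robust summaries (median, IQR) are preferred because
    MTC distributions are skewed."""
    mtc = np.asarray(mtc, dtype=float)
    q1, med, q3 = np.percentile(mtc, [25, 50, 75])
    centred = mtc - mtc.mean()
    skew = (centred**3).mean() / mtc.std()**3
    return med, q3 - q1, skew

rng = np.random.default_rng(1)
# Synthetic positively skewed MTC sample in mm (lognormal, as an illustration)
mtc = 10.0 + rng.lognormal(mean=1.0, sigma=0.5, size=2000)
med, iqr, skew = mtc_summary(mtc)
print(f"median = {med:.1f} mm, IQR = {iqr:.1f} mm, skewness = {skew:.2f}")
```

With real MTC data the same summaries would be computed per subject and per slope before comparing groups.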
NEXAFS Depth Profiling of Surface Segregation in Block Copolymer Thin Films
2010-01-01
filtered using a 450 nm syringe filter and spin-coated on silicon wafers. A Cee model 100CB spin coater was used at a rotational speed of 2000 rpm for 30 s...at the surface is of thermodynamic origin or of kinetic origin (that is, dependent on polarity of the solvent used for spin coating, solvent
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
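As a minimal illustration of the law of large numbers treated in such a first course, a fair-coin simulation shows the running sample mean settling toward the true probability p = 0.5:

```python
import numpy as np

rng = np.random.default_rng(42)
# Weak law of large numbers: the sample mean of fair-coin flips
# converges to the true probability p = 0.5 as n grows.
flips = rng.integers(0, 2, size=100_000)
for n in (100, 10_000, 100_000):
    print(f"n = {n:7d}: sample mean = {flips[:n].mean():.4f}")
```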
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
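The way probability enters is through the Born rule: measuring a qubit yields each basis outcome with probability equal to the squared magnitude of the corresponding amplitude. A minimal sketch in plain NumPy (no quantum library assumed):

```python
import numpy as np

def measure_probs(state):
    """Born rule: measurement probabilities are squared amplitude magnitudes."""
    state = np.asarray(state, dtype=complex)
    p = np.abs(state)**2
    return p / p.sum()   # renormalize to guard against rounding

# The Hadamard gate takes |0> to the equal superposition (|0> + |1>) / sqrt(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)
superposition = H @ ket0
print(measure_probs(superposition))   # -> [0.5 0.5]
```

A measurement of this state is thus a fair coin toss, even though the state itself evolved deterministically, which is exactly the kind of uncertainty the article examines.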
Elements of quantum probability
Kummerer, B.; Maassen, Hans
1996-01-01
This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of quantum coin tosses are discussed, closely related to V.F.R....
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
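The classical side of this correspondence, probability density as time spent on a section of the orbit, can be checked numerically for a harmonic oscillator, whose time-averaged position density is the arcsine law p(x) = 1/(π·sqrt(A² − x²)). A small sketch:

```python
import numpy as np

# Classical harmonic oscillator x(t) = A cos(wt): the fraction of time spent
# near a position x gives the density p(x) = 1 / (pi * sqrt(A^2 - x^2)).
A = 1.0
t = np.linspace(0.0, np.pi, 200_001)   # half a period covers every position
x = A * np.cos(t)
hist, edges = np.histogram(x, bins=50, range=(-A, A), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
analytic = 1.0 / (np.pi * np.sqrt(A**2 - centres**2))
# interior bins (away from the integrable singularities at x = +/-A)
err = np.max(np.abs(hist[5:-5] - analytic[5:-5]) / analytic[5:-5])
print(f"max relative error (interior bins): {err:.3f}")
```

The empirical time-occupation histogram matches the analytic density to within a fraction of a percent in the interior, illustrating the "probability as time on orbit" reading the abstract describes.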
Kang Min, Eden T.; Watt, Sook May; Sreemathy, Parthasarathy; Huang, Lei; Asundi, Anand
2013-06-01
The Chinese magic mirror is an ancient convex bronze mirror that reflects parallel light rays to form a unique image within the reflected patch of light by altering the reflected ray paths. Using Phase Measuring Reflectometry (PMR), surface irregularities in the micron range were found to be present on the mirror; these irregularities concentrate and disperse reflected light rays, giving rise to brighter and darker patches in the reflected image, forming a contrast that allows the unique pattern to be observed. To ascertain the location and nature of the surface defects, which take the form of indentations and raised platforms, other measurement techniques were employed. Reverse engineering then facilitated the exploration of reproducing an original Chinese magic mirror using the optical principles behind the mirror.
Walczyk, Wiktoria; Schönherr, Holger
2014-10-14
The interactions between argon surface nanobubbles and AFM tips on HOPG (highly oriented pyrolytic graphite) in water and the concomitant nanobubble deformation were analyzed as a function of position on the nanobubbles in a combined tapping mode and force-volume mode AFM study with hydrophilic and hydrophobic AFM tips. On the basis of the detailed analysis of force-distance curves acquired on the bubbles, we found that for hydrophobic tips the bubble interface may jump toward the tip and that the tip-bubble interaction strength and the magnitude of the bubble deformation were functions of vertical and horizontal position of the tip on the bubble and depended on the bubble size and tip size and functionality. The spatial variation is attributed to long-range attractive forces originating from the substrate under the bubbles, which dominate the interaction at the bubble rim. The nonuniform bubble deformation leads to a nonuniform underestimation of the bubble height, width, and contact angle in conventional AFM height data. In particular, scanning with a hydrophobic tip resulted in severe bubble deformation and distorted information in the AFM height image. For a typical nanobubble, the upward deformation may extend up to tens of nanometers above the unperturbed bubble height, and the lateral deformation may constitute 20% of the bubble width. Therefore, only scanning with a hydrophilic tip and no direct contact between the tip and the bubble may reduce nanobubble deformation and provide reliable AFM images that can be used to estimate adequately the unperturbed nanobubble dimensions. The deformation of the bubble shape and underestimation of the bubble size lead to the conclusion that the profile of surface nanobubbles is much closer than previously thought to a nearly flat bubble profile and hence that the Laplace pressure is much closer to the atmospheric pressure. Together with line pinning, this may explain the long nanobubble lifetimes observed previously. The
Gool, Elmar L; Stojanovic, Ivan; Schasfoort, Richard B M; Sturk, Auguste; van Leeuwen, Ton G; Nieuwland, Rienk; Terstappen, Leon W M M; Coumans, Frank A W
2017-10-01
Identification, enumeration, and characterization of extracellular vesicles (EVs) are hampered by the small size of EVs, their low refractive index, and the low numbers of antigens on their surface. We investigated the potential of a 48-multiplex surface plasmon resonance imaging (SPRi) system to perform EV phenotyping. Antigen surface density of 11 antigens was measured on the human breast cancer cell lines HS578T, MCF7, and SKBR3 and their EVs by use of both SPRi and the widely used flow cytometry (FCM). For cells, the SPRi and FCM signals for antigen exposure correlated (R² = 0.66 for HS578T, 0.78 for MCF7, and 0.60 for SKBR3 cells). With regard to EVs, SPRi detected 31 out of 33 tested antibody-EV pairs, whereas our flow cytometer detected 5 antibody-EV pairs because of high blank and isotype control signals. For HS578T-derived EVs, the SPRi and FCM signals correlated (R² = 0.98). However, on MCF7- and SKBR3-derived EVs, insufficient antigens were detected by our flow cytometer. To confirm that the SPRi responses correlated with mean antigen density on EVs, the SPRi responses of EVs were correlated with antigen density on parental cells as measured by FCM (R² = 0.77 for HS578T, 0.49 for MCF7, and 0.52 for SKBR3). SPRi responses correlate with mean antigen density. Moreover, SPRi detects lower antigen-exposure levels than FCM because SPRi measures an ensemble of EVs binding to the sensor surface, whereas FCM detects antigens on single EVs. © 2017 American Association for Clinical Chemistry.
Analysis of the silicone polymer surface aging profile with laser-induced breakdown spectroscopy
Wang, Xilin; Hong, Xiao; Wang, Han; Chen, Can; Zhao, Chenlong; Jia, Zhidong; Wang, Liming; Zou, Lin
2017-10-01
Silicone rubber composite materials have been widely used on high-voltage transmission lines to prevent pollution flashover. Aging of the silicone rubber surface degrades its service properties, causing loss of the anti-pollution ability. In this paper, laser-induced breakdown spectroscopy (LIBS), an analysis method that requires no sample preparation, can be conducted on site, and is suitable for nearly all types of materials, was used to analyze newly prepared and aged (out-of-service) silicone rubber composites. Scanning electron microscopy (SEM) and hydrophobicity tests showed LIBS to be nearly non-destructive for silicone rubber. Under the same LIBS testing parameters, a linear relationship was observed between ablation depth and the number of laser pulses. From the emission spectra, the elements present and their distribution in the samples along the depth direction, from the surface to the inner part, were acquired and verified against EDS results. This research showed that LIBS is suitable for detecting the aging-layer depth and element distribution of the silicone rubber surface.
Proteomic profiling of the surface-exposed cell envelope proteins of Caulobacter crescentus.
Cao, Yuan; Bazemore-Walker, Carthene R
2014-01-31
Biotinylation of intact cells, avidin enrichment of derivatized peptides, and shotgun proteomics were employed to reveal the composition of the surface-exposed proteome of the aquatic bacterium, Caulobacter crescentus. Ninety-one unique proteins were identified with the majority originating from the outer membrane, periplasm, and inner membrane, subcellular regions that comprise the Gram-negative bacterium cell envelope. Many of these proteins were described as 'conserved hypothetical protein' or 'hypothetical protein'; and so, the actual expression of these gene products was confirmed. Others did not have any known function or lacked annotation. However, this investigation of the Caulobacter surfaceome did reveal the unanticipated presence of a number of enzymes involved in protein degradation. The results presented here can provide a starting point for hypothesis-driven research projects focused on this bacterium in particular and centered on understanding Gram-negative cell architecture and outer membrane biogenesis broadly. The detected protein degradation enzymes anchored on or located within the outer membrane suggest that Caulobacter has nutrient sources larger than small molecules and/or further processes surface proteins once secreted to this location. Additionally, confirmation of outer membrane residency of those proteins predicted to be periplasmic or whose location prediction was not definitive could potentially elucidate the identities of Gram-negative specific anchorless surface proteins. This article is part of a Special Issue entitled: Trends in Microbial Proteomics. © 2013.
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
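The intermittent, step-like updating described above can be illustrated with a toy sketch (hypothetical code, not the authors' two-parameter change-point model): a stepwise nonstationary Bernoulli sequence is simulated, and a naive estimator holds its current estimate fixed until the recent empirical rate deviates from it by more than a threshold, producing the irregular steps the experiments report.

```python
import random

def simulate_bernoulli(segments, rng):
    """Stepwise nonstationary Bernoulli process from (p, length) segments."""
    return [int(rng.random() < p) for p, n in segments for _ in range(n)]

def stepwise_estimate(outcomes, threshold=0.15, window=20):
    """Toy intermittent estimator: hold the current estimate until the
    recent empirical rate deviates by more than `threshold`, then step."""
    estimate, estimates = 0.5, []
    for i in range(len(outcomes)):
        recent = outcomes[max(0, i - window + 1): i + 1]
        rate = sum(recent) / len(recent)
        if abs(rate - estimate) > threshold:
            estimate = rate          # a discrete step, not a gradual update
        estimates.append(estimate)
    return estimates

rng = random.Random(1)
outcomes = simulate_bernoulli([(0.2, 100), (0.8, 100)], rng)
estimates = stepwise_estimate(outcomes)
```

Plotting `estimates` against trial number yields a step function that lags the hidden change at trial 100 by roughly one window, a crude analogue of properties (a) and (e) in the abstract.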
Rachwał, Kamila; Boguszewska, Aleksandra; Kopcińska, Joanna; Karaś, Magdalena; Tchórzewski, Marek; Janczarek, Monika
2016-01-01
Rhizobium leguminosarum bv. trifolii is capable of establishing a symbiotic relationship with plants from the genus Trifolium. Previously, a regulatory protein encoded by rosR was identified and characterized in this bacterium. RosR possesses a Cys2-His2-type zinc finger motif and belongs to Ros/MucR family of rhizobial transcriptional regulators. Transcriptome profiling of the rosR mutant revealed a role of this protein in several cellular processes, including the synthesis of cell-surface components and polysaccharides, motility, and bacterial metabolism. Here, we show that a mutation in rosR resulted in considerable changes in R. leguminosarum bv. trifolii protein profiles. Extracellular, membrane, and periplasmic protein profiles of R. leguminosarum bv. trifolii wild type and the rosR mutant were examined, and proteins with substantially different abundances between these strains were identified. Compared with the wild type, extracellular fraction of the rosR mutant contained greater amounts of several proteins, including Ca(2+)-binding cadherin-like proteins, a RTX-like protein, autoaggregation protein RapA1, and flagellins FlaA and FlaB. In contrast, several proteins involved in the uptake of various substrates were less abundant in the mutant strain (DppA, BraC, and SfuA). In addition, differences were observed in membrane proteins of the mutant and wild-type strains, which mainly concerned various transport system components. Using atomic force microscopy (AFM) imaging, we characterized the topography and surface properties of the rosR mutant and wild-type cells. We found that the mutation in rosR gene also affected surface properties of R. leguminosarum bv. trifolii. The mutant cells were significantly more hydrophobic than the wild-type cells, and their outer membrane was three times more permeable to the hydrophobic dye N-phenyl-1-naphthylamine. The mutation of rosR also caused defects in bacterial symbiotic interaction with clover plants. Compared with
Directory of Open Access Journals (Sweden)
Kamila Rachwał
2016-08-01
Full Text Available Rhizobium leguminosarum bv. trifolii is capable of establishing a symbiotic relationship with plants from the genus Trifolium. Previously, a regulatory protein encoded by rosR was identified and characterized in this bacterium. RosR possesses a Cys2-His2-type zinc finger motif and belongs to Ros/MucR family of rhizobial transcriptional regulators. Transcriptome profiling of the rosR mutant revealed a role of this protein in several cellular processes, including the synthesis of cell-surface components and polysaccharides, motility, and bacterial metabolism. Here, we show that a mutation in rosR resulted in considerable changes in R. leguminosarum bv. trifolii protein profiles. Extracellular, membrane, and periplasmic protein profiles of R. leguminosarum bv. trifolii wild type and the rosR mutant were examined, and proteins with substantially different abundances between these strains were identified. Compared with the wild type, extracellular fraction of the rosR mutant contained greater amounts of several proteins, including Ca2+-binding cadherin-like proteins, a RTX-like protein, autoaggregation protein RapA1, and flagellins FlaA and FlaB. In contrast, several proteins involved in the uptake of various substrates were less abundant in the mutant strain (DppA, BraC, and SfuA). In addition, differences were observed in membrane proteins of the mutant and wild-type strains, which mainly concerned various transport system components. Using atomic force microscopy imaging, we characterized the topography and surface properties of the rosR mutant and wild-type cells. We found that the mutation in the rosR gene also affected surface properties of R. leguminosarum bv. trifolii. The mutant cells were significantly more hydrophobic than the wild-type cells, and their outer membrane was three times more permeable to the hydrophobic dye N-phenyl-1-naphthylamine. The mutation of rosR also caused defects in bacterial symbiotic interaction with clover plants.
Profiling wrist pulse from skin surface by Advanced Vibrometer Interferometer Device
Lee, Hao-Xiang; Lee, Shu-Sheng; Hsu, Yu-Hsiang; Lee, Chih-Kung
2017-02-01
With global trends in population aging, the need to reduce and prevent the onset of cardiovascular disease has drawn great attention. The traditional cuff-based upper-arm sphygmomanometer is still the standard method to retrieve blood pressure information for diagnostics. However, this method is not easily adopted by patients and is not comfortable enough for long-term monitoring. In order to correlate the beating profile of the arterial pulse on the wrist skin, an Advanced Vibrometer Interferometer Device (AVID) is adopted in this study to measure the vibration amplitude of the skin and compare it with blood pressure measured from the upper arm. The AVID system can measure vibration and remove directional ambiguity by using a circular-polarization interferometer technique with two orthogonally polarized light beams. The displacement resolution of the system is nearly 1.0 nm, and its accuracy is experimentally verified. Using an optical method to quantify the wrist pulse provides a means to perform cuff-less, noninvasive, and continuous measurement. In this paper, the correlation between the amplitude of skin vibration and the actual blood pressure is studied. The success of this method could potentially lay the foundation for blood pressure monitoring systems based on optical approaches.
Toolan, Daniel T W; Barker, Robert; Gough, Tim; Topham, Paul D; Howse, Jonathan R; Glidle, Andrew
2017-02-01
A new approach is described herein, where neutron reflectivity measurements that probe changes in the density profile of thin films as they absorb material from the gas phase have been combined with a Love wave based gravimetric assay that measures the mass of absorbed material. This combination of techniques not only determines the spatial distribution of absorbed molecules, but also reveals the amount of void space within the thin film (a quantity that can be difficult to assess using neutron reflectivity measurements alone). The uptake of organic solvent vapours into spun cast films of polystyrene has been used as a model system with a view to this method having the potential for extension to the study of other systems. These could include, for example, humidity sensors, hydrogel swelling, biomolecule adsorption or transformations of electroactive and chemically reactive thin films. This is the first ever demonstration of combined neutron reflectivity and Love wave-based gravimetry and the experimental caveats, limitations and scope of the method are explored and discussed in detail. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Birabrata Nayak
2016-10-01
Full Text Available In the present research, a meaty textured soybean was prepared by solid-state fermentation using Rhizopus oligosporus and dried Agaricus mushroom. The textural profile of the fermented soybean was optimized, modelled and validated by comparing the product with poultry meat. Under the optimum conditions, the thickness of the solid substrate, inoculum volume, and quantity of Agaricus mushroom powder were measured to be 1.12 cm, 5.92% (v/w), and 4.84% (w/w), respectively. The final product is found to possess hardness 538.11 g, cohesiveness 0.41, springiness 0.39, gumminess 314.85 g, chewiness 79.43 g and resilience 0.45. There is an increase in absorbable isoflavone (daidzein) and antioxidant activity, with lower carbohydrate and saturated fat content, due to fermentation of soybean with R. oligosporus. The developed product possesses good nutritional (17.4% protein and 15.12% total fiber) and functional (3.9 g/100 g daidzein; antioxidant activity 3.9 mMTR) quality with a low calorific value of 212.10 kCal/100 g, and it can be considered a good "meat analogue" having the nutritional and nutraceutical richness of fermented soybeans and mushroom.
Surface-seismic imaging for nehrp soil profile classifications and earthquake hazards in urban areas
Williams, R.A.; Stephenson, W.J.; Odum, J.K.
1998-01-01
We acquired high-resolution seismic-refraction data on the ground surface in selected areas of the San Fernando Valley (SFV) to help explain the earthquake damage patterns and the variation in ground motion caused by the 17 January 1994 magnitude 6.7 Northridge earthquake. We used these data to determine the compressional- and shear-wave velocities (Vp and Vs) at 20 aftershock recording sites to 30-m depth (Vs30 and Vp30). Two other sites, located next to boreholes with downhole Vp and Vs data, show that we imaged very similar seismic-velocity structures in the upper 40 m. Overall, high site response appears to be associated with low Vs in the near surface, but there can be a wide range of site amplifications for a given NEHRP soil type. The data suggest that for the SFV, if the Vs30 is known, we can determine whether the earthquake ground motion will be amplified above a factor of 2 relative to a local rock site.
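The Vs30 value used above for NEHRP site classification is conventionally the time-averaged shear-wave velocity over the top 30 m, Vs30 = 30 / Σ(d_i / Vs_i), i.e. 30 m divided by the vertical travel time through the upper 30 m. A minimal sketch of that standard formula (the layer profile below is hypothetical, not from the study):

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.

    layers: list of (thickness_m, vs_m_per_s) tuples from the surface down;
    only the uppermost 30 m contributes to the travel time.
    """
    remaining = 30.0
    travel_time = 0.0
    for thickness, vs in layers:
        d = min(thickness, remaining)   # clip the layer at 30 m depth
        travel_time += d / vs
        remaining -= d
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("profile shallower than 30 m")
    return 30.0 / travel_time

# Hypothetical two-layer profile: 10 m of slow soil over stiffer sediment.
print(vs30([(10, 200), (30, 500)]))  # 30 / (10/200 + 20/500) ≈ 333.3 m/s
```

Because slow near-surface layers dominate the travel time, a thin low-Vs layer pulls Vs30 down sharply, which is consistent with the association between low near-surface Vs and high site response noted in the abstract.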
Directory of Open Access Journals (Sweden)
Ming-Hui Yang
2016-01-01
Full Text Available The microenvironment of neuron cells plays a crucial role in regulating neural development and regeneration. Hyaluronic acid (HA) biomaterial has been applied in a wide range of medical and biological fields and plays important roles in neural regeneration. PC12 cells have been reported to be capable of endogenous NGF synthesis and secretion. The purpose of this research was to assess the effect of HA biomaterial combined with PC12 cell conditioned media (PC12 CM) in neural regeneration. Using SH-SY5Y cells as an experimental model, we found that supplementing with PC12 CM enhanced HA function in SH-SY5Y cell proliferation and adhesion. Through RP-nano-UPLC-ESI-MS/MS analyses, we identified increased expression of HSP60 and RanBP2 in SH-SY5Y cells grown on the HA-modified surface with cotreatment of PC12 CM. Moreover, we also identified factors that were secreted from PC12 cells and may promote SH-SY5Y cell proliferation and adhesion. Here, we proposed a biomaterial surface enriched with neurotrophic factors for nerve regeneration application.
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
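The experiment-first approach the article advocates is easy to demonstrate in a few lines; the following sketch (hypothetical, not from the article) compares empirical coin-toss frequencies with the theoretical value of 0.5, so students can watch the estimate settle as trials accumulate:

```python
import random

def experimental_probability(trials: int, seed: int = 0) -> float:
    """Estimate P(heads) for a fair coin by repeated experiment."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(trials))
    return heads / trials

# The empirical estimate approaches the theoretical value 0.5
# as the number of trials grows (law of large numbers).
for n in (10, 100, 10_000):
    print(n, experimental_probability(n))
```

Running the loop makes the point of the article concrete: small experiments scatter widely around 0.5, while large ones cluster tightly, grounding the formula-based answer in observed events.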
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Lectin binding profiles of SSEA-4 enriched, pluripotent human embryonic stem cell surfaces
Directory of Open Access Journals (Sweden)
Shin Soojung
2005-07-01
Full Text Available Abstract Background Pluripotent human embryonic stem cells (hESCs) have the potential to form every cell type in the body. These cells must be appropriately characterized prior to differentiation studies or when defining characteristics of the pluripotent state. Some developmentally regulated cell surface antigens identified by monoclonal antibodies in a variety of species and stem cell types have proven to be side chains of membrane glycolipids and glycoproteins. Therefore, to examine hESC surfaces for other potential pluripotent markers, we used a panel of 14 lectins, which were chosen based on their specificity for a variety of carbohydrates and carbohydrate linkages, along with stage specific embryonic antigen-4 (SSEA-4), to determine binding quantitation by flow cytometry and binding localization in adherent colonies by immunocytochemistry. Results Enriching cells for SSEA-4 expression increased the percentage of SSEA-4 positive cells to 98–99%. Using enriched high SSEA-4-expressing hESCs, we then analyzed the binding percentages of selected lectins and found a large variation in binding percentages ranging from 4% to 99% binding. Lycopersicon esculentum (tomato) lectin (TL), Ricinus communis agglutinin (RCA), and Concanavalin A (Con A) bound to SSEA-4 positive regions of hESCs and with similar binding percentages as SSEA-4. In contrast, we found Dolichos biflorus agglutinin (DBA) and Lotus tetragonolobus lectin (LTL) did not bind to hESCs, while Phaseolus vulgaris leuco-agglutinin (PHA-L), Vicia villosa agglutinin (VVA), Ulex europaeus agglutinin (UEA), Phaseolus vulgaris erythro-agglutinin (PHA-E), and Maackia amurensis agglutinin (MAA) bound partially to hESCs. These binding percentages correlated well with immunocytochemistry results. Conclusion Our results provide information about types of carbohydrates and carbohydrate linkages found on pluripotent hESC surfaces. We propose that TL, RCA and Con A may be used as markers that are associated with the
Energy Technology Data Exchange (ETDEWEB)
Hyppoenen, M.; Walden, J.A.
1996-12-31
The design, construction and measurements of a computer-controlled system applicable to flux measurements of a scalar quantity by the gradient technique are described. Accuracy requirements for the measured variables which are used for flux calculations are considered, together with some practical aspects concerning data storage and control. The construction includes the hardware and the data acquisition, sample intake, and temperature measurement systems. The measurements comprise laboratory tests of the temperature probes and the hardware as well as field tests over wheat and grassland of temperature, wind speed, and ozone (O{sub 3}), carbon dioxide (CO{sub 2}) and nitrous oxide (N{sub 2}O) concentration profiles. The hardware takes care of most of the operation and only the necessary part is done by the software. The data acquisition system is flexible, accepting the input of digital and/or analog signals. It also controls the whole system, storing all the data in a single data file. The sample intake unit is designed to take continuous samples into the monitors as well as grab samples into the canisters. Samples can be selected from one to four levels with no dead volumes in the sampling tubes. The temperature measurement system is constructed using a pair of temperature probes, Pt-100, which are connected to the same signal processing card in order to remove the offset of the electronic components as well as the bias associated with single probes. This ensures the accuracy of the probes down to 0.005 deg C. According to the field measurements, the relative error limits for the sensible heat fluxes varied from 7 to 20% in an unstable atmospheric situation. For the ozone flux, the error limits varied from 20 to 100%, indicating a much poorer accuracy of the monitor compared to the temperature probes. (orig.) 16 refs.
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
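For the simplest RUM, the multinomial logit, the CPGF is the familiar log-sum-exp function, and, as the abstract states, differentiating it with respect to the utilities recovers the choice probabilities. A sketch under that assumption (the numerical gradient is for illustration only; analytically the gradient is the softmax):

```python
import math

def logit_cpgf(utilities):
    """Log-sum-exp generating function of the multinomial logit model."""
    m = max(utilities)  # shift for numerical stability
    return m + math.log(sum(math.exp(u - m) for u in utilities))

def choice_probabilities(utilities, eps=1e-6):
    """Recover choice probabilities as the (numerical) gradient of the CPGF."""
    base = logit_cpgf(utilities)
    probs = []
    for j in range(len(utilities)):
        bumped = list(utilities)
        bumped[j] += eps            # forward-difference in utility j
        probs.append((logit_cpgf(bumped) - base) / eps)
    return probs

p = choice_probabilities([1.0, 2.0, 0.5])
# the gradient reproduces the softmax probabilities and sums to 1
```

The same gradient construction applies to any CPGF with the paper's stated properties; only the closed form of the generating function changes (e.g. nested or cross-nested logit).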
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Contents: Basic Notions (Sample Space and Events; Probabilities; Counting Techniques); Independence and Conditional Probability (Independence; Conditioning; The Borel-Cantelli Theorem); Discrete Random Variables (Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation); Generating Functions, Branching Processes, Random Walk Revisited (Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk); Markov Chains (Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity); Continuous Random Variables (Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations); Distributions in the General Case; Simulation; Distribution F...
Directory of Open Access Journals (Sweden)
Kumar Sukhdeo
Full Text Available Colon cancer is a deadly disease affecting millions of people worldwide. Current treatment challenges include management of disease burden as well as improvements in detection and targeting of tumor cells. To identify disease state-specific surface antigen signatures, we combined fluorescent cell barcoding with high-throughput flow cytometric profiling of primary and metastatic colon cancer lines (SW480, SW620, and HCT116. Our multiplexed technique offers improvements over conventional methods by permitting the simultaneous and rapid screening of cancer cells with reduced effort and cost. The method uses a protein-level analysis with commercially available antibodies on live cells with intact epitopes to detect potential tumor-specific targets that can be further investigated for their clinical utility. Multiplexed antibody arrays can easily be applied to other tumor types or pathologies for discovery-based approaches to target identification.
Liu, Hongxia; Hu, Ying; Qi, Shihua; Xing, Xinli; Zhang, Yuan; Yang, Dan; Qu, Chengkai
2015-06-01
Organochlorine pesticides (OCPs) in rivers along the profile from the Sichuan Basin to Aba Prefecture were analyzed to assess possible health risks to adults and children who use the rivers as a source of drinking water. OCP concentrations in surface water ranged from 22.29 to 274.28 ng·L⁻¹. Compared with other published data around the world, OCP levels in this study were moderate. Among all OCPs, hexachlorobenzene (HCB) and hexachlorocyclohexanes (HCHs) were the predominant compounds. Higher concentrations of OCPs were attributed to proximity to the agricultural fields of the Sichuan Basin, current OCP inputs, and long-range atmospheric transport from abroad. The varied spatial patterns of OCPs along the profile might be affected by the usage and physicochemical properties of the pesticides, in addition to the adjacent geographical environment. The health risk assessment indicated that most OCPs had little impact on human health according to the acceptable risk level for carcinogens (10⁻⁶) recommended by the US EPA. However, carcinogenic effects caused by heptachlor, aldrin, HCB, and α-HCH might occur via drinking water. The risk of negative impacts caused by OCPs is much higher for children than for adults.
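Carcinogenic risk screening of the kind mentioned above is typically done with the US EPA chronic-daily-intake formula, Risk = CDI × SF, where CDI = (C × IR × EF) / (BW × AT). A sketch with generic textbook defaults; the concentration and slope factor below are hypothetical illustrations, not values from this study:

```python
def cancer_risk(conc_ng_per_L, slope_factor, ingestion_L_per_day=2.0,
                exposure_years=70, body_weight_kg=70, lifetime_years=70):
    """Illustrative US EPA-style lifetime cancer risk from drinking water.

    Risk = CDI * SF, where CDI is the chronic daily intake in mg/kg/day.
    Parameter defaults are generic adult screening values, not those
    used in the study.
    """
    conc_mg_per_L = conc_ng_per_L * 1e-6          # ng/L -> mg/L
    cdi = (conc_mg_per_L * ingestion_L_per_day * exposure_years) / (
        body_weight_kg * lifetime_years)
    return cdi * slope_factor

# Hypothetical: 50 ng/L of a pesticide with oral slope factor
# 4.5 (mg/kg/day)^-1; compare against the 1e-6 acceptable level.
risk = cancer_risk(50, 4.5)
print(risk > 1e-6)
```

Because children have lower body weight and higher intake per kilogram, plugging child parameters into the same formula yields a larger CDI, which is the arithmetic behind the abstract's conclusion that risks are higher for children.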
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Energy Technology Data Exchange (ETDEWEB)
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
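The contrast between the empirical cdf, which assigns zero probability beyond the largest observation, and a shape-based extrapolation can be sketched as follows. The exponential-tail fit here is an illustrative scheme under that assumption, not one of the weighted estimators evaluated in the paper:

```python
import math
import random

def empirical_tail(sample, t):
    """Empirical estimate of P(X > t): the fraction of observations above t."""
    return sum(x > t for x in sample) / len(sample)

def exponential_tail(sample, t):
    """Extrapolate P(X > t) beyond the data by fitting an exponential tail
    to the upper tenth of the sample (a crude illustration of shape-based
    extrapolation)."""
    k = max(10, len(sample) // 10)
    upper = sorted(sample)[-k:]
    u = upper[0]                              # tail threshold
    scale = sum(x - u for x in upper) / k     # MLE of the exponential scale
    return (k / len(sample)) * math.exp(-(t - u) / scale)

rng = random.Random(7)
sample = [rng.expovariate(1.0) for _ in range(1000)]
t = max(sample) + 1.0
# Beyond the data the empirical cdf returns exactly 0, while the
# fitted tail still yields a small positive probability.
```

The Monte Carlo comparison in the paper is essentially a systematic version of this contrast, run across several sampling distributions and a wider family of estimators.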
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Sun, Yingying; Yanagisawa, Masahiro; Kunimoto, Masahiro; Nakamura, Masatoshi; Homma, Takayuki
2017-09-01
The internal structure of self-assembled monolayers (SAMs) such as 3-aminopropyltriethoxysilane (APTES) fabricated on a glass substrate is difficult to characterize and analyze at the nanometer level. In this study, we employed surface-enhanced Raman spectroscopy (SERS) to study the internal molecular structure of APTES SAMs. The sample APTES SAMs were deposited with Ag nanoparticles to enhance the Raman signal and to obtain finer structural information, which was supported by density functional theory calculations. In addition, in order to carry out high-resolution analysis, especially in the vertical direction, a fine piezoelectric positioner was used to control the depth scanning with a step of 0.1 nm. We measured and distinguished the vertical Raman intensity variations of specific groups in APTES, such as Ag/NH2, CH2, and Si–O, with high resolution. The interfacial bonds at the two interfaces, Ag-APTES and APTES-SiO2, were identified. Moreover, the orientation of APTES molecules was shown, from frequency shifts, to be inhomogeneous.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Huygens' foundations of probability
Freudenthal, Hans
It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Probably Almost Bayes Decisions
DEFF Research Database (Denmark)
Anoulova, S.; Fischer, Paul; Poelt, S.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...
Univariate Probability Distributions
Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.
2012-01-01
We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…
Indian Academy of Sciences (India)
The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics, Resonance – Journal of Science Education, Volume 3, Issue 4, April 1998, pp 103-112. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112
Probability Theory Without Tears!
Indian Academy of Sciences (India)
Probability Theory Without Tears! S Ramasubramanian. Book Review, Resonance – Journal of Science Education, Volume 1, Issue 2, February 1996, pp 115-116. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116
African Journals Online (AJOL)
Willem Scholtz
internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.
Indian Academy of Sciences (India)
important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the … deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible …
Indian Academy of Sciences (India)
On Randomness and Probability: How to Mathematically Model Uncertain Events. Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India. Resonance – Journal of Science Education, Volume 1, Issue 2.
A nonparametric method for predicting survival probabilities
van der Klaauw, B.; Vriend, S.
2015-01-01
Public programs often use statistical profiling to assess the risk that applicants will become long-term dependent on the program. The literature uses linear probability models and (Cox) proportional hazard models to predict duration outcomes. These either focus on one threshold duration or impose
Frič, Roman; Papčo, Martin
2010-12-01
Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1, ν1) ≤ (μ2, ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n; Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.
Gong, Yuanzheng; Seibel, Eric J.
2017-01-01
Rapid development in the performance of sophisticated optical components, digital image sensors, and computing capabilities, along with decreasing costs, has enabled three-dimensional (3-D) optical measurement to replace more traditional methods in manufacturing and quality control. The advantages of 3-D optical measurement, such as noncontact operation, high accuracy, rapid operation, and the ability for automation, are extremely valuable for inline manufacturing. However, most current optical approaches are suited to exterior rather than internal surfaces of machined parts. A 3-D optical measurement approach based on machine vision is proposed for the 3-D profile measurement of tiny complex internal surfaces, such as internally threaded holes. To capture the full topographic extent (peak to valley) of threads, a side-view commercial rigid scope is used to collect images at known camera positions and orientations. A 3-D point cloud is generated with multiview stereo vision using linear motion of the test piece; this is then repeated after a rotation of the test piece to form additional point clouds. Registration of these point clouds into a complete reconstruction uses a proposed automated feature-based 3-D registration algorithm. The resulting 3-D reconstruction is compared with x-ray computed tomography to validate the feasibility of our proposed method for future robotically driven industrial 3-D inspection.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Tumour control probability in cancer stem cells hypothesis.
Directory of Open Access Journals (Sweden)
Andrew Dhawan
The tumour control probability (TCP) is a formalism derived to compare various treatment regimens of radiation therapy, defined as the probability that, given a prescribed dose of radiation, a tumour has been eradicated or controlled. In the traditional view of cancer, all cells share the ability to divide without limit and thus have the potential to generate a malignant tumour. However, an emerging notion is that only a sub-population of cells, the so-called cancer stem cells (CSCs), are responsible for the initiation and maintenance of the tumour. A key implication of the CSC hypothesis is that these cells must be eradicated to achieve cures; we therefore define TCP_S as the probability of eradicating CSCs for a given dose of radiation. A cell surface protein expression profile, such as CD44high/CD24low for breast cancer or CD133 for glioma, is often used as a biomarker to monitor CSC enrichment. However, it is increasingly recognized that not all cells bearing this expression profile are necessarily CSCs, and in particular early generations of progenitor cells may share the same phenotype. Thus, due to the lack of a perfect biomarker for CSCs, we also define a novel measurable TCP_CD+, the probability of eliminating or controlling biomarker-positive cells. Based on these definitions, we use stochastic methods and numerical simulations, parameterized for the case of gliomas, to compare the theoretical TCP_S and the measurable TCP_CD+. We also use the measurable TCP to compare the effect of various radiation protocols.
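The TCP formalism referred to above is, in its simplest form, a Poisson model: if an expected number N of clonogenic cells survives treatment, TCP = exp(-N). A minimal sketch for the stem-cell compartment under fractionated radiotherapy is below, using linear-quadratic (LQ) cell survival; all parameter values are illustrative assumptions, not the paper's glioma parameterization.

```python
import math

# Poisson TCP for the cancer-stem-cell compartment. Parameters are
# illustrative assumptions, not values from the paper.
alpha, beta = 0.2, 0.02  # LQ radiosensitivity parameters (Gy^-1, Gy^-2)
n_csc = 1.0e5            # assumed initial number of cancer stem cells

def tcp(total_dose, dose_per_fraction=2.0):
    """TCP_S = exp(-N * SF): probability that no CSC survives."""
    n_fx = total_dose / dose_per_fraction
    # LQ survival fraction after n_fx fractions of size d each.
    sf = math.exp(-n_fx * (alpha * dose_per_fraction
                           + beta * dose_per_fraction ** 2))
    return math.exp(-n_csc * sf)

for d in (40, 60, 80):
    print(f"TCP_S at {d} Gy: {tcp(d):.3f}")
```

The characteristic sigmoid dose response emerges directly: TCP is near zero at low dose and climbs steeply toward one once the expected number of surviving CSCs drops below order unity.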
Giffard, Philip M; Su, Jiunn-Yih; Andersson, Patiyan; Holt, Deborah C
2017-01-01
The microbiome of built environment surfaces is impacted by the presence of humans. In this study, we tested the hypothesis that analysis of surface swabs from clinic toilet/bathroom facilities yields results correlated with sexually transmitted infection (STI) notifications from the corresponding human populations. We extended a previously reported study in which surfaces in toilet/bathroom facilities in primary health clinics in the Australian Northern Territory (NT) were swabbed and then tested for nucleic acid from the STI agents Chlamydia trachomatis, Neisseria gonorrhoeae and Trichomonas vaginalis. This was in the context of assessing the potential for such nucleic acid to contaminate specimens collected in such facilities. STIs are notifiable in the NT, thus allowing comparison of swab and notification data. An assumption in the design was that while absolute built-environment loads of STI nucleic acids will be a function of patient traffic density and facility cleaning protocols, the relative loads of STI nucleic acids from the different species will be largely unaffected by these processes. Another assumption was that the proportion of swabs testing positive for STIs provides a measure of surface contamination. Accordingly, "STI profiles" were calculated: the proportions that each of the three STIs of interest contributed to the summed STI-positive swabs or notifications. Three comparisons were performed, using swab data from clinics in remote Indigenous communities, clinics in small-to-medium towns, and a single urban sexual health clinic. These data were compared with time- and place-matched STI notifications. There were significant correlations between swab and notification data for both the remote Indigenous and the regional data. For the remote Indigenous clinics the p values ranged from 0.041 to 0.0089, depending on data transformation and p value inference method. Further, the swab data appeared to strongly indicate the known higher relative prevalence of gonorrhoeae
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular, there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations, which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations… This reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures…
Waste Package Misload Probability
Energy Technology Data Exchange (ETDEWEB)
J.K. Knudsen
2001-11-20
The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize the fuel-handling events that occurred at nuclear power plants; the categories are based on whether FAs were damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
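The event-count-over-demands estimate described above can be sketched in a few lines. The counts below are placeholders for illustration, not the Framatome ANP 2001a data; the "rule of three" line shows one common way to bound a probability when no events have been recorded.

```python
# Frequency-based point estimate of the FA misload probability.
# Both counts are placeholder assumptions, not the report's data.
misload_events = 5        # assumed number of recorded misload events
fa_movements = 2_000_000  # assumed total number of FA movements

p_misload = misload_events / fa_movements
print(f"point estimate: {p_misload:.2e} per FA movement")

# For an event category with zero recorded occurrences, the "rule of
# three" gives an approximate 95% upper bound on the true probability.
p_upper_zero = 3 / fa_movements
print(f"95% upper bound given zero events: {p_upper_zero:.2e}")
```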
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of the possibilistic vs. the probabilistic viewpoint. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Superpositions of probability distributions.
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
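A textbook instance of such a variance superposition, offered here as a sketch rather than the paper's own construction: smearing the variance v of a zero-mean Gaussian with an inverse-gamma distribution yields a Student-t, whose heavy tails show up as positive excess kurtosis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, nu = 200_000, 10.0  # sample size and degrees of freedom (illustrative)

# Smear the variance v with an inverse-gamma distribution: drawing
# v ~ InvGamma(nu/2, nu/2) and then x ~ N(0, v) makes the marginal of x
# a Student-t with nu degrees of freedom.
v = 1.0 / rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
x = rng.normal(0.0, np.sqrt(v))

# Excess kurtosis: 0 for a Gaussian, 6/(nu - 4) = 1 for t with nu = 10.
excess_kurtosis = np.mean(x**4) / np.mean(x**2) ** 2 - 3.0
print(f"excess kurtosis: {excess_kurtosis:.2f}")
```

The non-zero excess kurtosis is the fingerprint of the mixture: no single Gaussian, whatever its variance, can reproduce it.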
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Measurement uncertainty and probability
National Research Council Canada - National Science Library
Willink, Robin
2013-01-01
... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; 4.1 The Working Group of 1980; 4.2 From classical repetition to practica...
Structural Minimax Probability Machine.
Gu, Bin; Sun, Xingming; Sheng, Victor S
2017-07-01
Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the two classes in binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of second-order cone programming problems. Moreover, we extend the linear SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed into the support vector machine and the maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
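The "probabilistic accuracy bound" mentioned above comes from the original (non-structural) MPM: minimizing ||Σ₁^{1/2}w|| + ||Σ₂^{1/2}w|| subject to wᵀ(μ₁ - μ₂) = 1 yields the worst-case margin κ and the per-class accuracy lower bound α = κ²/(1 + κ²). The sketch below uses a general-purpose constrained optimizer rather than the second-order cone programs of the paper, and the class means and covariances are made-up illustrative values.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed class statistics (illustrative only, not from the paper).
mu1, mu2 = np.array([0.0, 0.0]), np.array([3.0, 3.0])
S1 = np.array([[1.0, 0.2], [0.2, 1.0]])
S2 = np.array([[1.0, -0.1], [-0.1, 1.0]])

# ||S^{1/2} w|| = ||L^T w|| where S = L L^T (Cholesky factorization).
A1, A2 = np.linalg.cholesky(S1), np.linalg.cholesky(S2)
d = mu1 - mu2

def objective(w):
    return np.linalg.norm(A1.T @ w) + np.linalg.norm(A2.T @ w)

cons = {"type": "eq", "fun": lambda w: w @ d - 1.0}
res = minimize(objective, x0=d / (d @ d), constraints=cons)
w = res.x
kappa = 1.0 / objective(w)           # worst-case margin
alpha = kappa**2 / (1.0 + kappa**2)  # lower bound on per-class accuracy
print(f"worst-case accuracy bound: {alpha:.3f}")
```

SMPM generalizes exactly this bound by replacing each single (mean, covariance) pair with a finite mixture of such pairs, one per structural cluster.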
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Vijeesh, V.; Prabhu, K. N.
2016-03-01
In the present investigation, an Al-14 wt.% Si alloy was solidified against copper, brass, and cast iron chills to study the effect of Ce melt treatment on casting/chill interfacial heat flux transients and on the casting surface profile. The heat flux across the casting/chill interface was estimated using an inverse modelling technique. On addition of 1.5% Ce, the peak heat flux increased by about 38%, 42%, and 43% for the copper, brass, and cast iron chills, respectively. The effect of Ce addition on casting surface texture was analyzed using a surface profilometer. The surface profiles of the casting and chill surfaces clearly indicated the formation of an air gap at the periphery of the casting. The arithmetic average of the profile departure from the mean line (Ra) and the arithmetic mean of the absolute departures of the waviness profile from the centre line (Wa) were found to decrease on Ce addition. The interfacial gap widths formed at the periphery for the unmodified and Ce-treated casting surfaces were about 35 µm and 13 µm, respectively. The enhancement in heat transfer on Ce addition was attributed to the lowering of the surface tension of the liquid melt. The gap width at the interface was used to determine the variation of the heat transfer coefficient (HTC) across the chill surface after the formation of a stable solid shell. The HTC was found to decrease along the radial direction for the copper and brass chills and to increase along the radial direction for the cast iron chill.
Yan, X. L.; Coetsee, E.; Wang, J. Y.; Swart, H. C.; Terblans, J. J.
2017-07-01
Polycrystalline Ni/Cu multilayer thin films consisting of eight alternating layers of Ni and Cu were deposited on a SiO2 substrate by means of electron beam evaporation in a high vacuum. Concentration-depth profiles of the as-deposited multilayered Ni/Cu thin films were determined with Auger electron spectroscopy (AES) in combination with Ar+ ion sputtering, under various bombardment conditions, with the samples either stationary or, in some cases, rotating. The Mixing-Roughness-Information depth (MRI) model used to fit the concentration-depth profiles accounts for the interface broadening of the experimental depth profiling. The interface broadening incorporates the effects of atomic mixing, surface roughness, and the information depth of the Auger electrons. The roughness values extracted from the MRI model fitting of the depth profiling data agree well with those measured by atomic force microscopy (AFM). The ion-sputtering-induced surface roughness during the depth profiling was accordingly quantitatively evaluated from the fitted MRI parameters for the sample rotation and stationary conditions. The depth resolutions of the AES depth profiles were derived directly from the values determined by the fitting parameters in the MRI model.
Cockburn, Darrell W; Clarke, Anthony J
2011-05-01
One industrial process for the production of cellulosic ethanol and/or value-added products involves exposing the cellulose content of plant materials by steam explosion in the presence of strong acid, followed by its neutralization and subsequent digestion with a cocktail of cellulolytic enzymes. These enzymes typically have activity optima at slightly acidic or neutral pH, so generating enzymes that are more active and tolerant under more acidic conditions would help to reduce associated costs. Here, we describe the engineering of cellulase A from Cellulomonas fimi as a model, replacing residues identified as potentially influencing the pH-activity profile of the enzyme based on sequence alignments and analysis of the known three-dimensional structures of other CAZy family 6 glycoside hydrolases, with the aim of lowering its pH optimum. Twelve specific residues and a sequence of eight were identified, and a total of 30 mutant enzymes were generated. In addition to being replaced with natural amino acids, some of the identified residues were substituted with cysteine and subsequently oxidized to cysteinesulfinate. Of the four single amino acid replacements that produced enhancements of activity at acidic pH, three involved the removal of charged groups from the surface of the enzyme. The generation of double mutations provided mixed results, but the combination of Glu407 → Ala and Tyr321 → Phe replacements had an additive effect on the enhancement, reaching a total activity that was 162% of the wild-type level. This study thus illustrates the utility of altering the surface charge properties of the family 6 glycoside hydrolases to enhance activity at low pH, and thereby an avenue for further protein engineering.
Riffel, R. A.; Storchi-Bergmann, T.; Riffel, R.; Davies, R.; Bianchin, M.; Diniz, M. R.; Schönell, A. J.; Burtscher, L.; Crenshaw, M.; Fischer, T. C.; Dahmer-Hahn, L. G.; Dametto, N. Z.; Rosario, D.
2018-02-01
We present and characterize a sample of 20 nearby Seyfert galaxies selected for having BAT 14-195 keV luminosities L_X ≥ 10^41.5 erg s^-1, redshift z ≤ 0.015, being accessible for observations with the Gemini Near-Infrared Field Spectrograph (NIFS) and showing extended [O III]λ5007 emission. Our goal is to study Active Galactic Nucleus (AGN) feeding and feedback processes from near-infrared integral-field spectra, which include both ionized (H II) and hot molecular (H2) emission. This sample is complemented by nine other Seyfert galaxies previously observed with NIFS. We show that the host galaxy properties (absolute magnitudes M_B and M_H, central stellar velocity dispersion, and axial ratio) show a similar distribution to those of the 69 BAT AGN. For the 20 galaxies already observed, we present surface mass density (Σ) profiles for H II and H2 in their inner ˜500 pc, showing that H II emission presents a steeper radial gradient than H2. This can be attributed to the different excitation mechanisms: ionization by AGN radiation for H II and heating by X-rays for H2. The mean surface mass densities are in the range (0.2 ≤ Σ_HII ≤ 35.9) M⊙ pc^-2 and (0.2 ≤ Σ_H2 ≤ 13.9) × 10^-3 M⊙ pc^-2, while the ratios between the H II and H2 masses range between ˜200 and 8000. The sample presented here will be used in future papers to map AGN gas excitation and kinematics, providing a census of the mass inflow and outflow rates and power, as well as their relation with the AGN luminosity.
Directory of Open Access Journals (Sweden)
Shinan Qian
2011-01-01
Full Text Available Nanoradian Surface Profilers (NSPs) are required for state-of-the-art synchrotron radiation optics and high-precision optical measurements. Nanoradian accuracy must be maintained over the large-angle test range. However, the beams' notable lateral motions during tests of most operating profilers, combined with the insufficiencies of their optical components, generate significant errors of ~1 μrad rms in the measurements. The solution to nanoradian accuracy for the new generation of surface profilers in this range is to apply a scanning optical head combined with a non-tilted reference beam. I describe here my comparison of different scan modes and discuss some test results.
No tillage (NT) and N fertilization can increase surface soil organic C (SOC) stocks, but the effects deeper in the soil profile are uncertain. Subsequent tillage could counter SOC stabilized through NT practices by disrupting soil aggregation and promoting decomposition. We followed a long-term ti...
Long-Term Impact of Sediment Deposition and Erosion on Water Surface Profiles in the Ner River
Directory of Open Access Journals (Sweden)
Tomasz Dysarz
2017-02-01
Full Text Available The purpose of the paper is to test forecasting of the sediment transport process, taking into account two main uncertainties involved in sediment transport modeling: the lack of knowledge regarding future flows, and the uncertainty as to which sediment transport formula should be chosen for simulations. The river reach chosen for study is the outlet part of the Ner River, located in the central part of Poland. The main characteristic of the river is an intensive morphodynamic process that increases flooding frequency. The approach proposed here is based on simulations with a sediment-routing model and assessment of the changes in hydraulic conditions on the basis of hydrodynamic calculations for chosen characteristic flows. The data used include Digital Terrain Models (DTMs), cross-section measurements, and hydrological observations from the Dabie gauge station. The sediment and hydrodynamic calculations are performed using the program HEC-RAS 5.0. Twenty inflow scenarios, each of 10-year duration, are composed on the basis of historical data. The Meyer-Peter and Müller and the Engelund-Hansen formulae are applied for the calculation of sediment transport intensity. The methodology presented here seems to be a good tool for the prediction of long-term impacts on water surface profiles caused by sediment deposition and erosion.
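The Meyer-Peter and Müller formula mentioned above has a simple closed form; the sketch below is a minimal illustration of the classic dimensionless version with a critical Shields stress of 0.047. All parameter values are illustrative, and the paper's HEC-RAS runs may use calibrated coefficients.

```python
import math

def mpm_bedload(tau_star, d50, s=2.65, g=9.81, tau_star_c=0.047):
    """Meyer-Peter & Mueller bedload transport rate per unit width.

    tau_star   : dimensionless Shields stress acting on the grains
    d50        : median grain diameter [m]
    s          : sediment specific gravity (quartz ~2.65)
    tau_star_c : critical Shields stress (classic value 0.047)
    Returns the volumetric transport rate qb [m^2/s]; zero below threshold.
    """
    excess = max(tau_star - tau_star_c, 0.0)
    qb_star = 8.0 * excess ** 1.5  # dimensionless (Einstein) transport number
    # convert back to dimensional transport per unit channel width
    return qb_star * math.sqrt((s - 1.0) * g * d50 ** 3)

# example: 0.5 mm sand bed under a moderate excess shear stress
qb = mpm_bedload(tau_star=0.12, d50=0.5e-3)
```

The threshold behavior is the formula's key feature: below the critical Shields stress the predicted transport is exactly zero, which is why the choice of formula matters so much in long-term morphodynamic forecasts.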
Dutkiewicz, Ewelina P; Chiu, Hsien-Yi; Urban, Pawel L
2015-11-01
Micropatch-arrayed pads (MAPAs) are presented as a facile and sensitive sampling method for spatial profiling of topical agents adsorbed on the surface of skin. MAPAs are 28 × 28 mm pieces of polytetrafluoroethylene containing a plurality of cavities filled with agarose hydrogel. They are affixed onto skin for 10 min to collect topically applied drugs. Polar compounds are absorbed by the hydrogel micropatches. The probes are subsequently scanned by an automated nanospray desorption electrospray ionization mass spectrometry system operated in the tapping dual-polarity mode. When the liquid junction comes into contact with each micropatch, polar compounds absorbed in the hydrogel matrix are desorbed and transferred to the ion source. A 3D-printed interface prevents evaporation of the hydrogel micropatches, ensuring good reproducibility and sensitivity. MAPAs have been applied to follow the dispersion of topical drugs applied, in the form of self-adhesive patches, to human skin in vivo and to porcine skin ex vivo. Spatiotemporal characteristics of the drug dispersion process have been revealed using this non-invasive test. Differences between drug dispersion in vivo and ex vivo could be observed. We envision that MAPAs can be used to investigate the spatiotemporal kinetics of various topical agents utilized in medical treatment. Copyright © 2015 John Wiley & Sons, Ltd.
Energy Technology Data Exchange (ETDEWEB)
Yan, X.L.; Coetsee, E. [Department of Physics, University of the Free State, P O Box 339, Bloemfontein, ZA9300 (South Africa); Wang, J.Y., E-mail: wangjy@stu.edu.cn [Department of Physics, Shantou University, 243 Daxue Road, Shantou, 515063, Guangdong (China); Swart, H.C., E-mail: swartHC@ufs.ac.za [Department of Physics, University of the Free State, P O Box 339, Bloemfontein, ZA9300 (South Africa); Terblans, J.J., E-mail: terblansjj@ufs.ac.za [Department of Physics, University of the Free State, P O Box 339, Bloemfontein, ZA9300 (South Africa)
2017-07-31
Highlights: • Linear Least Squares (LLS) method used to separate Ni and Cu Auger spectra. • The depth-dependent, ion-sputtering-induced roughness was quantitatively evaluated. • The depth resolution is better when profiling with a dual-ion beam than with a single-ion beam. • AES depth profiling with a lower ion energy results in a better depth resolution. - Abstract: Polycrystalline Ni/Cu multilayer thin films consisting of 8 alternating layers of Ni and Cu were deposited on a SiO{sub 2} substrate by means of electron beam evaporation in a high vacuum. Concentration-depth profiles of the as-deposited multilayered Ni/Cu thin films were determined with Auger electron spectroscopy (AES) in combination with Ar{sup +} ion sputtering, under various bombardment conditions, with the samples stationary as well as, in some cases, rotating. The Mixing-Roughness-Information depth (MRI) model used to fit the concentration-depth profiles accounts for the interface broadening of the experimental depth profiling. The interface broadening incorporates the effects of atomic mixing, surface roughness and the information depth of the Auger electrons. The roughness values extracted from the MRI model fits of the depth profiling data agree well with those measured by atomic force microscopy (AFM). The ion-sputtering-induced surface roughness during depth profiling was accordingly evaluated quantitatively from the fitted MRI parameters under sample rotation and stationary conditions. The depth resolutions of the AES depth profiles were derived directly from the values determined by the fitting parameters in the MRI model.
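The roughness term of an MRI-type model acts as a Gaussian broadening of each interface. A minimal sketch of that one term, assuming the conventional error-function profile across an abrupt interface and the 16%-84% depth-resolution definition (the full MRI model also convolves atomic-mixing and information-depth exponentials, which are omitted here):

```python
import math

def broadened_step(z, z_interface, sigma):
    """Concentration of layer A across an abrupt A/B interface after
    Gaussian roughness broadening of rms width sigma (all in nm).
    A step convolved with a Gaussian gives the complementary error function."""
    return 0.5 * math.erfc((z - z_interface) / (math.sqrt(2.0) * sigma))

def depth_resolution(sigma):
    # conventional 16%-84% definition: Delta z = 2 * sigma for a Gaussian
    return 2.0 * sigma

c_mid = broadened_step(z=50.0, z_interface=50.0, sigma=3.0)  # 0.5 at the interface
dz = depth_resolution(3.0)
```

A larger fitted sigma therefore directly translates into a worse (larger) depth resolution, which is how the quoted dual-beam versus single-beam comparison is quantified.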
de Lima, O. A.; Pereira, P. D.
2007-05-01
During the last three years we have been developing hydrobiogeological research to quantitatively describe the underground contamination of a 4.0 km2 area, including two landfill deposits and a tannery of Alagoinhas city, Bahia state, Brazil. We used electrical geophysics and geological, geochemical and biological analyses to gain a general understanding of the complex interactions between organic and inorganic pollutants and their environmental impacts. Geological reconnaissance work and a geoelectrical survey using vertical electrical soundings were carried out around the area to detect and delineate the extent of the underground contamination plume. The results pointed out the presence of a strong conductive anomaly within the aquifer, resulting from invasive fluids both from the landfills and from the surface disposal lagoons of the tannery. Water samples collected at available wells and along the Sauipe river have shown drastic changes in total dissolved solids, total chromium, inorganic macro-components, biochemical oxygen demand, chemical oxygen demand, nutrients and bacterial content. As complementary work, apparent resistivity and chargeability data were measured as a function of depth along three new multi-electrode wells, and as a function of electrode spacing along five double semi-Schlumberger subsurface profiles. A multi-electrode well is a special monitoring well on which we externally install copper electrodes as thin metallic rings spaced by 0.50 m along its entire filter and casing length. These electrodes are connected through insulated cables to the ground surface and may be combined into different arrays. Two-side semi-Schlumberger soundings, expanded up to 200 m AB/2 spacing and with centers spaced by 50 m along special transverses centered at the plume, were inverted using 1D and 2D models. Both techniques were used to detail the groundwater contamination around the Alagoinhas landfills. The electrical measurements performed at the earth
Probabilities for Solar Siblings
Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.
2015-02-01
We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to the parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the Sun's birth cluster, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1%) of the members of the Sun's birth cluster could still be found within 100 pc of the Sun today.
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
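The Black-Scholes formula mentioned in the blurb has a standard closed form for a European call: under the risk-neutral measure, the discounted expectation of the payoff reduces to two normal-CDF terms. A minimal sketch (parameter names are illustrative):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # standard normal CDF expressed via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.
    S: spot price, K: strike, T: time to expiry in years,
    r: risk-free rate, sigma: volatility.
    This is the closed form of E_Q[e^{-rT} max(S_T - K, 0)]
    under the risk-neutral measure Q."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)  # ~10.45
```

The measure-theoretic point the book makes is that the "formula" is just a change-of-measure computation: the expectation is taken under an equivalent martingale measure whose existence the Radon-Nikodym theorem guarantees.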
Directory of Open Access Journals (Sweden)
J. M. MUNUERA
1964-06-01
Full Text Available The material included in two former papers (SB and EF), which sums 3307 shocks over 2360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to grade VII of the international seismic intensity scale, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters previously described is the probable frequency expected for the 50-year period. For each active small square we have made the corresponding computation and drawn Map No. 1, in percentages. The epicenters with intensity from X to XI are plotted in Map No. 2 to provide complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them against others computed from the first to the last shock, a list gives the probable approximate return periods estimated for the area. The solution we suggest is an appropriate way to express the contingent seismic phenomenon, and it improves on the conventional maps showing the equal-intensity curves corresponding to the maximal values of a given site.
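The epoch-weighting scheme described above (weight = 50 divided by the epoch length in years, summed over epochs) can be sketched as follows; the example shock counts and epoch lengths are hypothetical, not the paper's catalogue data.

```python
def expected_50yr_frequency(epochs):
    """epochs: list of (n_shocks, epoch_length_years) pairs.
    Each epoch's count is scaled by the weighting factor 50 / epoch_length,
    and the weighted counts are summed, giving the probable frequency
    expected in a common 50-year period."""
    return sum(n * (50.0 / years) for n, years in epochs)

def return_period(freq_50yr):
    # mean number of years between events of the given intensity class
    return 50.0 / freq_50yr if freq_50yr > 0 else float("inf")

# hypothetical epochs: (shocks observed, years covered)
f50 = expected_50yr_frequency([(12, 300), (5, 100), (3, 50)])
# 12*50/300 + 5*50/100 + 3*50/50 = 2.0 + 2.5 + 3.0 = 7.5 events per 50 years
```

The weighting simply compensates for catalogues of unequal length, so that a long, sparse historical epoch and a short, complete modern one contribute on a common 50-year footing.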
Canestri, Franco
2002-12-01
This paper describes five cases of macroscopic irregular CO(2) laser-beam ablation patterns that can generate below-surface complications during surgery. These five cases involve curved reflected beams, the generation of curved craters with abnormal superficial thermal damage, and craters that show irregular wall contours. Although these alterations were observed during irradiation of PMMA (polymethylmethacrylate) samples, it is possible that similar unpredictable changes also happen in low-water-content, hard and uniform biological tissues such as compact bone, enamel, and dentin. This predicts severe impacts on the quality of the final surgical outcome, especially where precision surgery techniques are required. A qualitative description of the possible causes of these effects, and of how to avoid them during surgery, is also given. In the past decades, daily surgery and research studies have provided useful information about the interaction between medical CO(2) laser beams and animal, human, and other biological tissues. Several mathematical models describe with acceptable accuracy all the ablative properties of the 10.6 μm laser beam. Very few studies describe the presence, or address the consequences, of the ablative aberrations that can frequently and randomly occur during laser surgery. The probability that these changes happen in below-surface, and therefore invisible, parts of the biological media under treatment makes the whole matter crucial, even in cases of traditional surgery. Where gross mass removals are considered, the presence of unpredictable and sudden deviations from the expected traditional cone-shaped patterns raises several questions about safety. The continuous need for properly engineered medical laser-beam devices, online laser-beam monitoring, and real-time control becomes mandatory in modern surgery. The equipment used in this study was provided by the National Cancer Institute of Milan, Milan, Italy, and by
Noone, D.; Risi, C.; Bailey, A.; Berkelhammer, M.; Brown, D. P.; Buenning, N.; Gregory, S.; Nusbaumer, J.; Schneider, D.; Sykes, J.; Vanderwende, B.; Wong, J.; Meillier, Y.; Wolfe, D.
2013-02-01
The D/H isotope ratio is used to attribute boundary layer humidity changes to the set of contributing fluxes for a case following a snowstorm in which a snow pack of about 10 cm vanished. Profiles of H2O and CO2 mixing ratio, D/H isotope ratio, and several thermodynamic properties were measured from the surface to 300 m every 15 min during four winter days near Boulder, Colorado. Coeval analysis of the D/H ratios and CO2 concentrations finds these two variables to be complementary, with the former being sensitive to daytime surface fluxes and the latter particularly indicative of nocturnal surface sources. Together they capture evidence for strong vertical mixing during the day, weaker mixing by turbulent bursts and low level jets within the nocturnal stable boundary layer during the night, and frost formation in the morning. The profiles are generally not well described with a gradient mixing line analysis because the D/H ratios of the end members (i.e., surface fluxes and the free troposphere) evolve throughout the day, which leads to large uncertainties in the estimate of the D/H ratio of the surface water flux. A mass balance model is constructed for the snow pack and constrained with observations to provide an optimal estimate of the partitioning of the surface water flux into contributions from sublimation, evaporation of melt water in the snow, and evaporation from ponds. Results show that while vapor measurements are important in constraining surface fluxes, measurements of the source reservoirs (soil water, snow pack and standing liquid) offer stronger constraint on the surface water balance. Measurements of surface water are therefore essential in developing observational programs that seek to use isotopic data for flux attribution.
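A gradient mixing-line (Keeling-style) analysis of the kind discussed above regresses the isotope ratio against 1/q; under two-end-member mixing the intercept estimates the isotopic signature of the added surface flux. A sketch with synthetic data (all values and the linear two-end-member assumption are illustrative):

```python
import numpy as np

def mixing_line_source_delta(q, delta):
    """Keeling-style estimate of the isotopic signature of a surface flux.
    q     : humidity (e.g. mmol/mol) sampled at several heights/times
    delta : corresponding deltaD values (per mil)
    Two-end-member mass balance implies delta is linear in 1/q, and the
    intercept of the regression is the deltaD of the added vapor."""
    x = 1.0 / np.asarray(q, dtype=float)
    slope, intercept = np.polyfit(x, np.asarray(delta, dtype=float), 1)
    return intercept

# synthetic profile: background vapor (q=4, dD=-250) mixed with a
# surface flux of dD = -100 per mil
q = np.array([4.0, 5.0, 6.0, 8.0])
dD = -100.0 + (4.0 / q) * (-250.0 + 100.0)  # exact two-member mixing
src = mixing_line_source_delta(q, dD)       # recovers ~ -100 per mil
```

The abstract's caveat maps directly onto this sketch: if either end member drifts during the day, the points no longer fall on one line and the intercept becomes poorly constrained.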
McDonald, Michael; Courteau, Stéphane; Tully, R. Brent; Roediger, Joel
2011-07-01
We present g-, r-, i-, z- and H-band surface brightness profiles and bulge-disc decompositions for a morphologically broad sample of 286 Virgo Cluster Catalogue (VCC) galaxies. The H-band data come from a variety of sources including our survey of 171 VCC galaxies at the University of Hawaii (UH) 2.2-m telescope, Canada-France-Hawaii Telescope (CFHT) and United Kingdom Infrared Telescope (UKIRT), and another 115 galaxies from the Two Micron All Sky Survey (2MASS) and GOLDMine archives. The optical data for all 286 VCC galaxies were extracted from the Sloan Digital Sky Survey (SDSS) images. The H-band and the SDSS griz data were analysed in a homogeneous manner using our own software, yielding a consistent set of deep, multiband surface brightness profiles for each galaxy. Average surface brightness profiles per morphological bin were created in order to characterize the variety of galaxy light profiles across the Hubble sequence. The 1D bulge-disc decomposition parameters, as well as non-parametric galaxy measures, such as effective radius, effective surface brightness and light concentration, are presented for all 286 VCC galaxies in each of the five optical/near-infrared wavebands. The profile decompositions account for bulge and disc components, spiral arms, nucleus and atmospheric blurring. The Virgo spiral galaxy bulges typically have a Sérsic index n ≈ 1, while elliptical galaxies prefer n ≈ 2. No galaxy spheroid requires n > 3. The light profiles for 70 per cent of the Virgo elliptical galaxies reveal the presence of both a spheroid and a disc component. A more in-depth discussion of the structural parameter trends can be found in McDonald, Courteau & Tully. The data provided here should serve as a base for studies of galaxy structure and stellar populations in the cluster environment. The galaxy light profiles and bulge-disc decomposition results are available at the Centre de Données astronomiques de Strasbourg (CDS; ) and the author's own website ().
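The Sérsic profiles used in these decompositions have a standard closed form in surface-brightness (magnitude) units; a minimal sketch, using the common Capaccioli approximation b(n) ≈ 1.9992n - 0.3271 (values below are illustrative):

```python
import math

def b_n(n):
    # widely used approximation to the Sersic constant,
    # adequate for roughly 0.5 < n < 10
    return 1.9992 * n - 0.3271

def sersic_mu(R, mu_e, R_e, n):
    """Surface brightness [mag arcsec^-2] of a Sersic profile at radius R.
    mu_e : surface brightness at the effective radius R_e
    n    : Sersic index (n = 1 exponential/disc-like, n = 4 de Vaucouleurs)."""
    return mu_e + (2.5 * b_n(n) / math.log(10.0)) * ((R / R_e) ** (1.0 / n) - 1.0)

# an n = 1 (disc-like) profile: brighter than mu_e inside R_e, fainter outside
mu_in = sersic_mu(0.5, mu_e=20.0, R_e=1.0, n=1.0)
mu_out = sersic_mu(2.0, mu_e=20.0, R_e=1.0, n=1.0)
```

The paper's result that Virgo bulges cluster near n ≈ 1-2 while no spheroid needs n > 3 is a statement about how centrally concentrated these curves are: larger n steepens the center and fattens the outer wings.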
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
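The puzzle's competing answers follow directly from which conditioning scenario one adopts, which can be checked by enumerating the four equally likely (older, younger) families:

```python
from itertools import product

# the four equally likely families, ordered as (older, younger)
sample_space = list(product("BG", repeat=2))

def p(event, given):
    """Conditional probability by direct enumeration of the sample space."""
    cond = [fam for fam in sample_space if given(fam)]
    return sum(1 for fam in cond if event(fam)) / len(cond)

# Scenario 1: we learn only that "at least one child is a girl"
p_both_girls_1 = p(lambda f: f == ("G", "G"), lambda f: "G" in f)      # 1/3

# Scenario 2: we learn that "the older child is a girl"
p_both_girls_2 = p(lambda f: f == ("G", "G"), lambda f: f[0] == "G")   # 1/2
```

The enumeration makes the article's point concrete: both answers are correct for their respective conditioning events, and the debate is really about which event the problem statement describes.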
Haghighi Erfan; Or Dani
2015-01-01
The Monin–Obukhov similarity theory (MOST) provides the theoretical basis for many "atmospheric-based" methods (such as the eddy covariance and flux-profile methods) that are widely used for quantifying surface–atmosphere exchange processes. The turbulence-driven and highly nonlinear profiles of momentum, air temperature and vapor density require complex resistance expressions applied to simple gradients deduced from measurements at a single height or a few heights. Notwithstanding the success of these atmos...
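In the neutral-stability limit, the flux-profile method the abstract refers to reduces to inverting the logarithmic wind profile u(z) = (u*/κ) ln(z/z0). A minimal two-height sketch (values illustrative; the full MOST relations add stability-correction functions omitted here):

```python
import math

KAPPA = 0.4  # von Karman constant

def u_star_from_profile(u1, z1, u2, z2):
    """Friction velocity u* from wind speeds at two heights, assuming the
    neutral-stability logarithmic profile u(z) = (u*/kappa) * ln(z/z0).
    Subtracting the profile at the two heights eliminates z0."""
    return KAPPA * (u2 - u1) / math.log(z2 / z1)

def momentum_flux(u_star, rho=1.2):
    # surface momentum flux (stress): tau = rho * u*^2  [N/m^2]
    return rho * u_star ** 2

# illustrative tower data: 3 m/s at 2 m, 4 m/s at 10 m
u_star = u_star_from_profile(u1=3.0, z1=2.0, u2=4.0, z2=10.0)
tau = momentum_flux(u_star)
```

This is the "simple gradient from a few height measurements" the abstract mentions; the complexity it alludes to enters when stability departs from neutral and the log law acquires Monin-Obukhov correction terms.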
Balch, W. M.; Drapeau, D.; Bowler, B.; Lyczkowski, E.; Lubelczyk, L.
2016-02-01
We have participated in a number of cruises throughout the world ocean, observing the vertical and horizontal distributions of coccolithophores, their particulate inorganic carbon (PIC) and associated optical properties. We will provide a synthesis of our observations in support of the NASA ocean color algorithm for PIC, highlighting how the integrated concentration of these plants can be interpreted from surface satellite measurements. Our work has shown consistencies in the vertical distributions of coccolithophores that allow us to extrapolate surface PIC observations (from the top optical depth observed by satellite) to the integrated euphotic zone on depth scales of 100m. Such results are a function of the degree of eutrophy and are critical for understanding the global consequences of this phytoplankton functional group, their associated biogeochemistry and implications to the alkalinity pump. We will end by showing whether the vertical distributions of PIC differ from those of diatom biogenic silica.
Probability workshop to be better in probability topic
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level: a higher level of statistics anxiety does not entail a lower score in probability topic performance. The study also revealed that motivated students gained from the probability workshop, their performance in the probability topic showing a positive improvement compared with before the workshop. In addition, there exists a significant difference in students' performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Hughes, Adam P; Thiele, Uwe; Archer, Andrew J
2015-02-21
The contribution to the free energy of a film of liquid of thickness h on a solid surface, due to the interactions between the solid-liquid and liquid-gas interfaces, is given by the binding potential, g(h). The precise form of g(h) determines whether or not the liquid wets the surface. Note that differentiating g(h) gives the Derjaguin or disjoining pressure. We develop a microscopic density functional theory (DFT) based method for calculating g(h), allowing us to relate the form of g(h) to the nature of the molecular interactions in the system. We present results based on a simple lattice gas model to demonstrate the procedure. In order to describe the static and dynamic behaviour of non-uniform liquid films and drops on surfaces, a mesoscopic free energy based on g(h) is often used. We calculate such equilibrium film height profiles and also directly calculate, using DFT, the corresponding density profiles for liquid drops on surfaces. Comparing quantities such as the contact angle and the shape of the drops, we find good agreement between the two methods. We also study in detail the effect on g(h) of truncating the range of the dispersion forces, both those between the fluid molecules and those between the fluid and the wall. We find that truncation can have a significant effect on g(h) and the associated wetting behaviour of the fluid.
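The link between g(h) and wetting can be illustrated with a model binding potential; the two-term form and its coefficients below are illustrative stand-ins, not the paper's DFT result. The film height minimizing g gives the equilibrium (precursor) film, and the Frumkin-Derjaguin relation cos θ = 1 + g(h*)/γ then gives the macroscopic contact angle:

```python
import numpy as np

gamma = 0.072        # liquid-gas surface tension [N/m] (water-like, assumed)
A, B = 1e-21, 1e-76  # illustrative coefficients of the model potential

def g(h):
    # model binding potential [J/m^2]: short-range repulsion (B term)
    # plus long-range attraction (A term); h in metres
    return B / h**8 - A / h**2

def disjoining_pressure(h):
    # Derjaguin (disjoining) pressure: Pi(h) = -dg/dh
    return 8.0 * B / h**9 - 2.0 * A / h**3

h = np.linspace(2e-10, 5e-9, 20000)
h_star = h[np.argmin(g(h))]          # equilibrium film height (g minimum)
cos_theta = 1.0 + g(h_star) / gamma  # Frumkin-Derjaguin relation
theta = np.degrees(np.arccos(cos_theta))
```

With a negative minimum in g(h), as here, the fluid is partially wetting and θ > 0; a g(h) that is everywhere non-negative would give complete wetting, which is the dichotomy the paper probes microscopically.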
Ghan, Stephen J.; Rissman, Tracey A.; Ellman, Robert; Ferrare, Richard A.; Turner, David; Flynn, Connor; Wang, Jian; Ogren, John; Hudson, James; Jonsson, Haflidi H.;
2006-01-01
If the aerosol composition and size distribution below cloud are uniform, the vertical profile of cloud condensation nuclei (CCN) concentration can be retrieved entirely from surface measurements of CCN concentration and particle humidification function and surface-based retrievals of relative humidity and aerosol extinction or backscatter. This provides the potential for long-term measurements of CCN concentrations near cloud base. We have used a combination of aircraft, surface in situ, and surface remote sensing measurements to test various aspects of the retrieval scheme. Our analysis leads us to the following conclusions. The retrieval works better for supersaturations of 0.1% than for 1% because CCN concentrations at 0.1% are controlled by the same particles that control extinction and backscatter. If in situ measurements of extinction are used, the retrieval explains a majority of the CCN variance at high supersaturation for at least two and perhaps five of the eight flights examined. The retrieval of the vertical profile of the humidification factor is not the major limitation of the CCN retrieval scheme. Vertical structure in the aerosol size distribution and composition is the dominant source of error in the CCN retrieval, but this vertical structure is difficult to measure from remote sensing at visible wavelengths.
The profile of potential organ and tissue donors
Directory of Open Access Journals (Sweden)
Edvaldo Leal de Moraes
2009-10-01
Full Text Available This study aimed to characterize donors according to gender, age group and cause of brain death; to quantify donors with hypernatremia, hyperpotassemia and hypopotassemia; and to determine which organs were most used in transplantations. This quantitative, descriptive, exploratory and retrospective study was performed at the Organ Procurement Organization of the University of São Paulo Medical School Hospital das Clínicas. Data from the medical records of 187 potential donors were analyzed. Cerebrovascular accidents represented 53.48% of all causes of brain death, sodium and potassium disorders occurred in 82.36% of cases, and 45.46% of the potential donors were between 41 and 60 years old. The results evidenced that natural causes of death exceeded traumatic deaths, and that most donors presented sodium and potassium alterations, likely associated with inappropriate maintenance.
Dunning, Peter David
Implementation of this technique requires that the colloidal droplet be separated from the active electrode by a dielectric layer to prevent electrolysis. A variety of polymer layers have been used in EWOD devices for a variety of applications. In applications that involve desiccation of colloidal suspensions, the material for this layer should be chosen carefully, as it can play an important role in the resulting deposition pattern. An experimental method to monitor the transient evolution of the shape of an evaporating colloidal droplet and optically quantify the resultant deposition pattern is presented. Unactuated colloidal suspensions were desiccated on a variety of substrates commonly used in EWOD applications. Transient image profiles and particle deposition patterns are examined for droplets containing fluorescent micro-particles. Qualitative and quantitative comparisons of these results are used to compare multiple cases in an effort to provide insight into the effects of polymer selection on the drying dynamics and resultant deposition patterns of desiccated colloidal materials. It was found that the equilibrium and receding contact angles between the surface and the droplet play a key role in the evaporation dynamics and the resulting deposition patterns left by a desiccated colloidal suspension. The equilibrium contact angle controls the initial contact diameter for a droplet of a given volume. As a droplet on a surface evaporates, the evolution of the interface shape and the contact diameter can generally be described by three different regimes. The Constant Contact Radius (CCR) regime occurs when the contact line is pinned while the contact angle decreases. The Constant Contact Angle (CCA) regime occurs when the contact line recedes while the contact angle remains constant. The Mixed regime occurs when the contact radius and angle both decrease over time.
The presence of the CCA regime allows the contact line to recede, creating a more uniform
Liquefaction Probability Curves for Surficial Geologic Units
Holzer, T. L.; Noce, T. E.; Bennett, M. J.
2009-12-01
Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computation of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities reach 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We have also used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, and 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographically specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
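A 3-parameter logistic curve of the general kind described can be sketched as below; the functional form, the parameter roles, and all numeric values are illustrative assumptions, not the paper's fitted coefficients.

```python
import math

def liquefaction_probability(pga, a, b, c):
    """Hypothetical 3-parameter logistic curve of the sort used to fit
    P(surface liquefaction) against peak ground acceleration (PGA, in g)
    for a fixed magnitude and water-table depth.
    a : upper asymptote (<= 1), b : PGA at which P = a/2,
    c : steepness scale. All three would be fitted per geologic unit."""
    return a / (1.0 + math.exp(-(pga - b) / c))

# illustrative parameters for a hypothetical susceptible deposit
params = dict(a=0.85, b=0.25, c=0.08)
P = [liquefaction_probability(x, **params) for x in (0.10, 0.25, 0.50)]
```

The appeal of a closed-form fit is exactly the application the abstract names: a hazard-map pipeline can evaluate P for any scenario PGA without interpolating in tabulated LPI distributions.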
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Gittens, Rolando A.; Olivares-Navarrete, Rene; Hyzy, Sharon L.; Sandhage, Kenneth H.; Schwartz, Zvi; Boyan, Barbara D.
2014-01-01
Recent studies of new surface modifications that superimpose well-defined nanostructures on microrough implants, thereby mimicking the hierarchical complexity of native bone, report synergistically enhanced osteoblast maturation and local factor production at the protein level compared to growth on surfaces that are smooth, nanorough, or microrough. Whether the complex micro/nanorough surfaces enhance the osteogenic response by triggering similar patterns of integrin receptors and their associated signaling pathways as with well-established microrough surfaces, is not well understood. Human osteoblasts (hOBs) were cultured until confluent for gene expression studies on tissue culture polystyrene (TCPS) or on titanium alloy (Ti6Al4V) disks with different surface topographies: smooth, nanorough, microrough, and micro/nanorough surfaces. mRNA expression of osteogenesis-related markers such as osteocalcin (BGLAP) and bone sialoprotein (BSP), bone morphogenetic protein 2 (BMP2), BMP4, noggin (NOG) and gremlin 1 (GREM1) were all higher on microrough and micro/nanorough surfaces, with few differences between them, compared to smooth and nanorough groups. Interestingly, expression of integrins α1 and β2, which interact primarily with collagens and laminin and have been commonly associated with osteoblast differentiation on microrough Ti and Ti6Al4V, were expressed at lower levels on micro/nanorough surfaces compared to microrough ones. Conversely, the av subunit, which binds ligands such as vitronectin, osteopontin, and bone sialoprotein among others, had higher expression on micro/nanorough surfaces concomitantly with regulation of the β3 mRNA levels on nanomodified surfaces. These results suggest that the maturation of osteoblasts on micro/nanorough surfaces may be occurring through different integrin engagement than those established for microrough-only surfaces. PMID:25158204
IceBridge Merged Photon Counting Lidar/Profiler L4 Surface Slope and Elevations V001
National Aeronautics and Space Administration — This data set contains geolocated surface elevation measurements captured over Antarctica using the Sigma Space Mapping Photon Counting Lidar and Riegl Laser...
Probability and statistics: selected problems
Machado, J.A. Tenreiro; Pinto, Carla M. A.
2014-01-01
Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, followed by a review of probability. Discrete and continuous distributions are presented, and sampling and estimation with hypothesis testing are covered in the last two chapters. Solutions to the proposed exercises are provided for the reader's reference.
Assessing the project efficiency of irrigation systems and water use efficiency of crops over large irrigated areas requires daily or seasonal evapotranspiration (ET) maps. Mapping ET or latent heat flux (LE) can be achieved spatially for land surfaces using remote sensing inputs such as surface ref...
Failure probability of regional flood defences
Directory of Open Access Journals (Sweden)
Lendering Kasper
2016-01-01
Polders in the Netherlands are protected from flooding by primary and regional flood defence systems. During the last decade, scientific research in flood risk focused on the development of a probabilistic approach to quantify the probability of flooding of the primary flood defence system. This paper proposes a methodology to quantify the probability of flooding of regional flood defence systems, which requires several additions to the methodology used for the primary flood defence system. These additions focus on a method to account for regulation of regional water levels, the possibility of reduced intrusion resistance due to maintenance dredging in regional waters, the probability of traffic loads, and the influence of dependence between regional water levels and the phreatic surface of a regional flood defence. In addition, reliability updating is used to demonstrate the potential for updating the probability of failure of regional flood defences with performance observations. The results demonstrate that the proposed methodology can be used to determine the probability of flooding of a regional flood defence system. In doing so, the methodology contributes to improving flood risk management in these systems.
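The reliability-updating idea, conditioning the failure probability on the observed survival of a historic load, can be illustrated with a minimal Monte Carlo sketch. The lognormal resistance model, the Gumbel load model, and all parameter values below are assumptions for illustration, not the paper's calibrated models:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Hypothetical models, in metres of water level: lognormal defence
# resistance and Gumbel-distributed annual peak load (parameters assumed)
resistance = rng.lognormal(mean=1.0, sigma=0.25, size=n)
load = rng.gumbel(loc=1.8, scale=0.25, size=n)

p_prior = np.mean(resistance < load)

# Performance observation: the defence survived a historic level of 2.4 m,
# so realisations with resistance below 2.4 m are ruled out
keep = resistance > 2.4
p_updated = np.mean(resistance[keep] < load[keep])

print(f"prior P(failure)   = {p_prior:.4f}")
print(f"updated P(failure) = {p_updated:.4f}")
```

Conditioning on the survived load truncates the low tail of the resistance distribution, so the updated failure probability is lower than the prior one, which is the mechanism behind updating with performance observations.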
Directory of Open Access Journals (Sweden)
Hemant K Daima
Antimicrobial action of nanomaterials is typically assigned to the nanomaterial composition, size and/or shape, whereas the influence of the complex corona stabilizing the nanoparticle surface is often neglected. We demonstrate sequential surface functionalization of tyrosine-reduced gold nanoparticles (AuNPs(Tyr)) with polyoxometalates (POMs) and lysine to explore controlled chemical functionality-driven antimicrobial activity. Our investigations reveal that highly biocompatible gold nanoparticles can be tuned into a strong antibacterial agent by fine-tuning their surface properties in a controllable manner. The observations from the antimicrobial studies on the Gram-negative bacterium Escherichia coli were further validated by investigating the anticancer properties of these step-wise surface-controlled materials against A549 human lung carcinoma cells, which showed a similar toxicity pattern. These studies highlight that nanomaterial toxicity and biological applicability are strongly governed by the surface corona.
Training Teachers to Teach Probability
Batanero, Carmen; Godino, Juan D.; Roa, Rafael
2004-01-01
In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
DEFF Research Database (Denmark)
Goutam, Shovon; Timmermans, Jean-Marc; Omar, Noshin
2014-01-01
This experimental work attempts to determine the surface temperature evolution of large (20 Ah-rated capacity) commercial Lithium-Ion pouch cells for the application of rechargeable energy storage of plug in hybrid electric vehicles and electric vehicles. The cathode of the cells is nickel...
Directory of Open Access Journals (Sweden)
Hauchecorne Alain
2016-01-01
A concept for an innovative rotational Raman lidar with daylight measurement capability is proposed to measure the vertical profile of temperature from the ground to the middle stratosphere. The optical filtering is performed using a Fabry-Pérot interferometer with line spacing equal to the line spacing of the Raman spectrum. The detection is made using a linear PMT array operated in photon-counting mode. We plan to build a prototype and test it at the Haute-Provence Observatory lidar facility, aiming to achieve a time resolution permitting the observation of small-scale atmospheric processes that play a role in troposphere-stratosphere interaction, such as gravity waves. If successful, this project will open the possibility of a Raman space lidar for the global observation of atmospheric temperature profiles.
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from several vessels in a world wide distribution from February 24,...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from several vessels in a world wide distribution from March 1, 1996 to...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected from surface sensors and CTD casts from NOAA Ship DISCOVERER and other platforms from 22 October 1981 to 13 October 1982....
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected from surface sensors, bottle casts, and CTD casts from the R/V ALPHA HELIX from 27 July 1986 to 11 November 1987. Data were...
National Oceanic and Atmospheric Administration, Department of Commerce — The data in the attached files are near-surface temperature profiles collected by an Arctic Wave Glider (AWG) from July 29-Sept 23, 2011. Temperatures were collected...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from several vessels in a world wide distribution from September 19,...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from multiple vessels in a world wide distribution from June 29, 1994 to...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected from surface sensors, bottle casts, and CTD casts from NOAA Ship MILLER FREEMAN from 18 September 1986 to 14 May 1987. Data...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature and salinity profiles were collected from surface sensors and CTD casts in the Gulf of Alaska from the SURVEYOR. Data were collected by the Pacific...
Lin, J.-T.; Martin, R. V.; Boersma, K. F.; Sneep, M.; Stammes, P.; Spurr, R.; Wang, P.; Van Roozendael, M.; Clémer, K.; Irie, H.
2014-02-01
Retrievals of tropospheric nitrogen dioxide (NO2) from the Ozone Monitoring Instrument (OMI) are subject to errors in the treatments of aerosols, surface reflectance anisotropy, and the vertical profile of NO2. Here we quantify these influences over China via an improved retrieval process. We explicitly account for aerosol optical effects (simulated by nested GEOS-Chem at 0.667° long. × 0.5° lat. and constrained by aerosol measurements), surface reflectance anisotropy, and high-resolution vertical profiles of NO2 (simulated by GEOS-Chem). Prior to the NO2 retrieval, we derive the cloud information using consistent ancillary assumptions. We compare our retrieval to the widely used DOMINO v2 product, using MAX-DOAS measurements at three urban/suburban sites in East China as reference and focusing the analysis on the 127 OMI pixels (in 30 days) closest to the MAX-DOAS sites. We find that our retrieval reduces the interference of aerosols on the retrieved cloud properties, thus enhancing the number of valid OMI pixels by about 25%. Compared to DOMINO v2, our retrieval better captures the day-to-day variability in MAX-DOAS NO2 data (R2 = 0.96 versus 0.72), due to pixel-specific radiative transfer calculations rather than the use of a look-up table, explicit inclusion of aerosols, and consideration of surface reflectance anisotropy. Our retrieved NO2 columns are 54% of the MAX-DOAS data on average, reflecting the inevitable spatial inconsistency between the two types of measurement, errors in MAX-DOAS data, and uncertainties in our OMI retrieval related to aerosols and the vertical profile of NO2. Sensitivity tests show that excluding aerosol optical effects can either increase or decrease the retrieved NO2 for individual OMI pixels, with an average increase of 14%. Excluding aerosols also has complex effects on the retrieved cloud fraction and, particularly, cloud pressure. Employing various surface albedo data sets affects the retrieved NO2 only slightly on average (within 10%). The
Probability machines: consistent probability estimation using nonparametric learning machines.
Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A
2012-01-01
Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
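The abstract's "probability machine" idea can be mirrored outside R: a regression forest fitted to a 0/1 response averages observed outcomes within each leaf, so its predictions estimate P(y=1|x) directly. The sketch below uses scikit-learn, and the synthetic risk function is an assumption for illustration, not the paper's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
x = rng.uniform(-2, 2, size=(n, 2))
p_true = 1.0 / (1.0 + np.exp(-(1.5 * x[:, 0] - x[:, 1])))  # assumed risk function
y = rng.binomial(1, p_true)                                # binary outcome

x_tr, x_te, y_tr, y_te, p_tr, p_te = train_test_split(
    x, y, p_true, test_size=0.25, random_state=0)

# Regression (not classification) forest on the 0/1 response: each prediction
# is an average of observed outcomes, i.e. a probability estimate
rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=20, random_state=0)
rf.fit(x_tr, y_tr)
p_hat = rf.predict(x_te)

mae = np.mean(np.abs(p_hat - p_te))
print(f"mean absolute error vs. true probabilities: {mae:.3f}")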
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
Chasoglou, D.; Hryha, E.; Norell, M.; Nyborg, L.
2013-03-01
Characterization of oxide products on the surface of water-atomized steel powder is essential in order to determine the reducing conditions required for their removal during the sintering stage, which in turn results in improved mechanical properties. Pre-alloyed powder with 3 wt% Cr and 0.5 wt% Mo was chosen as the model material. Investigation of the powder surface characteristics with regard to composition, morphology, size and distribution of surface oxides was performed using X-ray photoelectron spectroscopy, Auger electron spectroscopy and high-resolution scanning electron microscopy combined with X-ray microanalysis. The analysis revealed that ~94% of the powder surface is covered by a homogeneous (~6 nm thick) Fe-oxide layer, whereas the rest is covered by fine particulate features with sizes below 500 nm. These particulates were further analyzed and divided into three main categories: (i) Cr-based oxides with simultaneous presence of nitrogen, (ii) Si-based oxides of "hemispherical" shape, and (iii) agglomerates of the aforementioned oxides.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
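The central claim, that a plug-in threshold calibrated from a finite sample yields a higher expected failure frequency than the nominal level, can be checked with a small simulation for the log-normal case. Working on the log scale is equivalent, since P(X > e^t) = P(ln X > t); all parameter values are assumed for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, sigma = 0.0, 0.5      # true (unknown) parameters of the log-scale normal
p_nominal = 0.01          # required failure probability
n, trials = 30, 20000     # sample size available to the decision-maker
z = norm.ppf(1 - p_nominal)

freq = np.empty(trials)
for i in range(trials):
    data = rng.normal(mu, sigma, n)            # observed log-losses
    mu_hat, s_hat = data.mean(), data.std(ddof=1)
    threshold = mu_hat + z * s_hat             # plug-in control threshold
    freq[i] = norm.sf(threshold, mu, sigma)    # true exceedance probability

print(f"nominal: {p_nominal:.4f}   realised mean failure prob: {freq.mean():.4f}")
```

The realised mean exceeds the nominal level, which is the effect the article quantifies and then corrects for by adjusting either the nominal level or the fitted distribution.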
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Statistical analysis of wind profile in the surface layer at the Alcântara launching center
Directory of Open Access Journals (Sweden)
Carlos Alberto Ferreira Gisler
2011-05-01
Statistical analysis of the wind profile was based on wind data (direction and speed) collected between 1995 and 1999 at an anemometric tower installed at the Alcântara Launching Center, at six levels: 6.0, 10.0, 16.3, 28.5, 43.0, and 70.0 m. The analysis covered a typical rainy month (March) and a typical dry month (September) in the Alcântara Launching Center area. The data set comprised 76,882 wind profiles (at ten-minute intervals) for the wet season (March) and 109,809 profiles for the dry season (September). The mean wind speed, standard deviation, median, mode, and prevailing wind direction were computed. The predominant direction was from the NE, with 33 and 40% of occurrences in the wet and dry seasons, respectively. The average wind speed increased with height, and the highest levels of the anemometric tower recorded the strongest winds in the dry period (8.2 m s⁻¹). The average wind speeds were 6.4 m s⁻¹ in the dry season and 4.1 m s⁻¹ in the wet season. Normal and Weibull distributions were fitted to the observed data. Results show that the wind speed fits both the normal and Weibull distributions at the 95% level (α=0.05). Over the entire period, the Weibull distribution fitted well for values between 3.0 and 9.0 m s⁻¹, and the normal distribution showed a good fit for values between 4.0 and 9.0 m s⁻¹.
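A Weibull fit of the kind reported can be sketched with SciPy. The synthetic wind speeds below stand in for the tower data, and the shape and scale values are assumptions chosen only to resemble plausible 10-minute means:

```python
import numpy as np
from scipy.stats import weibull_min, kstest

rng = np.random.default_rng(7)
# Synthetic 10-minute mean wind speeds (m/s), a stand-in for the tower data
speeds = weibull_min.rvs(c=2.2, scale=7.0, size=5000, random_state=rng)

# Fit with the location fixed at zero, as is usual for wind speeds
shape, loc, scale = weibull_min.fit(speeds, floc=0)
stat, p_value = kstest(speeds, 'weibull_min', args=(shape, loc, scale))

print(f"shape k = {shape:.2f}, scale A = {scale:.2f} m/s, KS p-value = {p_value:.3f}")
```

The Kolmogorov-Smirnov statistic here is only indicative, since testing against a distribution with parameters estimated from the same data inflates the p-value; a Lilliefors-style correction would be needed for a strict test at α=0.05.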
DEFF Research Database (Denmark)
Bottoli, Federico; Christiansen, Thomas Lundin; Winther, Grethe
2016-01-01
The present work deals with the evaluation of the residual stress profiles in expanded austenite by applying grazing incidence X-ray diffraction (GI-XRD) combined with successive sublayer removal. Annealed and deformed (εeq=0.5) samples of stable stainless steel EN 1.4369 were nitrided or nitrocarburized. The residual stress profiles resulting from the thermochemical low-temperature surface treatment were measured. The results indicate high residual compressive stresses of several GPa in the nitrided region, while lower compressive stresses are produced in the carburized case. Plastic deformation in the steel prior to thermochemical treatment has a hardly measurable influence on the nitrogen-rich zone, while it has a measurable effect on the stresses and depth of the carbon-rich zone.
Criminal Psychological Profiling
1993-10-18
landmark report became known to the general population. Dr. Langer's profile broke new ground. While the practice of psychoanalysis was not new, this marked... school or college dropout. Suspect is probably suffering from one or more forms of paranoid psychosis. Perpetrator: Based on this profile, the police
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Lin, J.-T.; Martin, R. V.; Boersma, K. F.; Sneep, M.; Stammes, P.; Spurr, R.; Wang, P.; Van Roozendael, M.; Clémer, K.; Irie, H.
2013-08-01
Retrievals of tropospheric nitrogen dioxide (NO2) from the Ozone Monitoring Instrument (OMI) are subject to errors in the treatments of aerosols, surface reflectance anisotropy, and the vertical profile of NO2. Here we quantify these influences over China via an improved retrieval process. We explicitly account for aerosol optical effects (simulated by nested GEOS-Chem at 0.667° lon × 0.5° lat and constrained by aerosol measurements), surface reflectance anisotropy, and high-resolution vertical profiles of NO2 (simulated by GEOS-Chem). Prior to the NO2 retrieval, we derive the cloud information using consistent ancillary assumptions. We compare our retrieval to the widely used DOMINO v2 product, using as reference MAX-DOAS measurements at three urban/suburban sites in East China and focusing the analysis on the 127 OMI pixels (in 30 days) closest to the MAX-DOAS sites. We find that our retrieval reduces the interference of aerosols on the retrieved cloud properties, thus enhancing the number of valid OMI pixels by about 25%. Compared to DOMINO v2, our retrieval improves the correlation with the MAX-DOAS data in the day-to-day variability of NO2 (R2 = 0.96 vs. 0.72). Our retrieved NO2 columns are about 50% of the MAX-DOAS data on average. This reflects the inevitable spatial inconsistency between the two types of measurement, uncertainties in MAX-DOAS data, and residual uncertainties in our OMI retrievals related to aerosols and the vertical profile of NO2. Through a series of tests, we find that excluding aerosol scattering/absorption can either increase or decrease the retrieved NO2, with a mean absolute difference of about 20%. Concentrating aerosols at the boundary layer top enhances the retrieved NO2 by 8% on average, with a mean absolute difference of 23%. The aerosol perturbations also nonlinearly affect the retrieved cloud fraction and, particularly, cloud pressure. Employing various surface albedo datasets alters the retrieved NO2 by 0-7% on average. The retrieved NO
Fukushima, Toshio
2016-03-01
We developed a numerical method to compute the gravitational field of an infinitely thin axisymmetric disc with an arbitrary surface mass density profile. We evaluate the gravitational potential by a split quadrature using the double exponential rule and obtain the acceleration vector by numerically differentiating the potential by Ridder's algorithm. The new method is of around 12 digit accuracy and sufficiently fast because requiring only one-dimensional integration. By using the new method, we show the rotation curves of some non-trivial discs: (i) truncated power-law discs, (ii) discs with a non-negligible centre hole, (iii) truncated Mestel discs with edge softening, (iv) double power-law discs, (v) exponentially damped power-law discs, and (vi) an exponential disc with a sinusoidal modulation of the density profile. Also, we present a couple of model fittings to the observed rotation curve of M33: (i) the standard deconvolution by assuming a spherical distribution of the dark matter and (ii) a direct fit of infinitely thin disc mass with a double power-law distribution of the surface mass density. Although the number of free parameters is a little larger, the latter model provides a significantly better fit. The FORTRAN 90 programs of the new method are electronically available.
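A simplified version of the method, reducing the in-plane potential of a thin disc to a one-dimensional integral over exact ring potentials and differentiating it numerically, can be sketched as follows. It uses ordinary adaptive quadrature (split at the singular radius) and a central difference in place of the paper's double exponential rule and Ridder's algorithm, and the exponential surface density is an assumed example profile, not one of the paper's models:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import ellipk

G = 1.0  # natural units: G = Sigma0 = Rd = 1

def sigma(a):
    """Assumed example profile: exponential disc Sigma(a) = exp(-a/Rd)."""
    return np.exp(-a)

def potential(R, a_max=30.0):
    """In-plane potential of an infinitely thin axisymmetric disc, built by
    integrating exact ring potentials; ellipk(m) is the complete elliptic
    integral of the first kind with parameter m = k^2."""
    def integrand(a):
        m = 4.0 * R * a / (R + a) ** 2
        return -4.0 * G * sigma(a) * a * ellipk(m) / (R + a)
    # split the quadrature at a = R, where the integrand has a log singularity
    v1, _ = quad(integrand, 0.0, R, limit=200)
    v2, _ = quad(integrand, R, a_max, limit=200)
    return v1 + v2

def vcirc(R, h=1e-4):
    """Circular speed from v^2 = R dPhi/dR (central difference standing in
    for Ridder's differentiation)."""
    dphi = (potential(R + h) - potential(R - h)) / (2.0 * h)
    return np.sqrt(R * dphi)

print(f"v_c(R = 2 Rd) = {vcirc(2.0):.3f}")
```

For this exponential profile the result can be cross-checked against Freeman's closed-form rotation curve in terms of modified Bessel functions, which is a useful sanity test before trying the non-trivial profiles listed in the abstract.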
Considerations on a posteriori probability
Directory of Open Access Journals (Sweden)
Corrado Gini
2015-06-01
In this first paper of 1911 relating to the sex ratio at birth, Gini recast Laplace's succession rule in a Bayesian version. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and introducing the "method of results (direct and indirect)" for determining prior probabilities from the statistical frequencies obtained from statistical data.
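The Beta-prior version of Laplace's succession rule has a one-line posterior mean; a minimal sketch (the informative prior in the second call is a hypothetical illustration, not Gini's actual values):

```python
from fractions import Fraction

def posterior_prob_next_success(successes, trials, alpha=1, beta=1):
    """Posterior mean of a Bernoulli parameter under a Beta(alpha, beta) prior.
    alpha = beta = 1 (uniform prior) recovers Laplace's rule of succession:
    P(next success) = (s + 1) / (n + 2)."""
    return Fraction(successes + alpha, trials + alpha + beta)

print(posterior_prob_next_success(9, 10))                       # uniform prior
print(posterior_prob_next_success(9, 10, alpha=514, beta=486))  # informative prior
```

With the uniform prior, 9 successes in 10 trials give (9+1)/(10+2) = 5/6; a strong informative prior keeps the estimate near the prior mean, which is exactly the role frequency data play in fixing the prior in the "method of results".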
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by DNV (2011), and model performance is evaluated. In addition, the effects of weather forecast uncertainty on the output Probabilities of Failure are analysed and reported.
Transition Probabilities of Gd I
Bilty, Katherine; Lawler, J. E.; Den Hartog, E. A.
2011-01-01
Rare earth transition probabilities are needed within the astrophysics community to determine rare earth abundances in stellar photospheres. The current work is part of an ongoing study of rare earth element neutrals. Transition probabilities are determined by combining radiative lifetimes, measured using time-resolved laser-induced fluorescence on a slow atom beam, with branching fractions measured from high-resolution Fourier transform spectra. Neutral rare earth transition probabilities will be helpful in improving abundances in cool stars in which a significant fraction of rare earths are neutral. Transition probabilities are also needed for research and development in the lighting industry. Rare earths have rich spectra containing hundreds to thousands of transitions throughout the visible and near UV. This makes rare earths valuable additives in Metal Halide - High Intensity Discharge (MH-HID) lamps, giving them a pleasing white light with good color rendering. This poster presents the work done on neutral gadolinium. We will report radiative lifetimes for 135 levels and transition probabilities for upwards of 1500 lines of Gd I. The lifetimes are reported to ±5%, and the transition probability uncertainties range from 5% for strong lines to 25% for weak lines. This work is supported by the National Science Foundation under grant CTS 0613277 and the National Science Foundation's REU program through NSF Award AST-1004881.
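The lifetime/branching-fraction combination amounts to A_ul = BF_ul / τ_u for each line from an upper level u. A minimal sketch with made-up numbers (the lifetime and branching fractions below are hypothetical, not Gd I values):

```python
# Transition probability (Einstein A coefficient) from a measured radiative
# lifetime and branching fractions: A_ul = BF_ul / tau_u
tau_u = 6.2e-9            # hypothetical upper-level lifetime, seconds
branching = {'line_a': 0.62, 'line_b': 0.30, 'line_c': 0.08}  # must sum to 1

a_values = {line: bf / tau_u for line, bf in branching.items()}
for line, a in a_values.items():
    print(f"{line}: A = {a:.3e} s^-1")
```

Because the branching fractions of a level sum to one, the A values of all its lines sum to 1/τ_u, which is how the two measurements pin down absolute transition probabilities from relative line intensities.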
Non-destructive depth profiling analysis of {beta}-FeSi{sub 2} surface by means of SR-XPS
Energy Technology Data Exchange (ETDEWEB)
Saito, Takeru; Yamamoto, Hiroyuki; Yamaguchi, Kenji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]; and others
2003-01-01
The formation process of {beta}-FeSi{sub 2} on a Si(111) substrate by the solid-phase epitaxy (SPE) method and the oxidation process of {beta}-FeSi{sub 2} were investigated by means of X-ray photoelectron spectroscopy using synchrotron radiation (SR-XPS). Comparison of the experimental results with simulations using the electron inelastic mean free path (IMFP) revealed that an Fe layer was formed after Fe deposition on the Si(111) surface at room temperature and that Si diffused gradually into the Fe during subsequent annealing. The core-level and valence-band photoemission spectra revealed the formation of {beta}-FeSi{sub 2} on annealing above 723 K. After annealing at 973 K, a thin Si layer was formed on top of the {beta}-FeSi{sub 2} surface. After exposure to the atmosphere at room temperature, the outermost surface of the {beta}-FeSi{sub 2} phase and the exposed region of the Si substrate were almost completely covered by a thin uniform SiO{sub 2} layer. From this result, it is speculated that the thin SiO{sub 2} layer acts as a protective layer against oxidation of the {beta}-FeSi{sub 2} surface. (author)
Scharffenberg, Martin G.; Biri, Stavroula; Stammer, Detlef
2013-09-01
Geostrophic velocity probability density functions (PDFs), skewness (S) and kurtosis (K) are shown for both velocity components (u, v) estimated from the 3-year Jason-1 - TOPEX/Poseidon (JTP) Tandem Mission, which allowed both velocity components to be inferred directly from the altimeter observations. To be comparable to previous results for velocity (w) and SSH PDFs, we include the 18.5-year time series of SSH from the TOPEX/Poseidon, Jason-1 and Jason-2 (TPJJ) missions. The differences between the PDFs of the two velocity components are evident, with a wider shape for the zonal velocity component due to the larger variability in the zonal direction. The results confirm that the exponential shape of the global velocity PDF is a consequence of the spatially inhomogeneous EKE distribution over the global ocean. Only regions with a small variance in EKE have Gaussian-shaped PDFs; however, normalizing each time series with its STD results in Gaussian PDFs everywhere.
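The closing claim, that pooling regions of very different EKE yields a heavy-tailed global PDF while per-series STD normalization restores Gaussianity, can be illustrated with a toy two-region mixture (synthetic data, not altimetry):

```python
import math
import random

random.seed(0)

def kurtosis(x):
    """Non-excess sample kurtosis (3.0 for a Gaussian)."""
    n = len(x)
    m = sum(x) / n
    s = math.sqrt(sum((v - m) ** 2 for v in x) / n)
    return sum(((v - m) / s) ** 4 for v in x) / n

# Two "regions" with very different eddy kinetic energy (variance):
quiet = [random.gauss(0, 1) for _ in range(20000)]
energetic = [random.gauss(0, 5) for _ in range(20000)]

pooled = quiet + energetic                       # inhomogeneous EKE: heavy tails
normalized = quiet + [v / 5 for v in energetic]  # each series / its own STD
```

The pooled sample has kurtosis well above 3 (exponential-like tails), while the per-region normalization brings it back to the Gaussian value, mirroring the mechanism the abstract describes.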
Probably Not: Future Prediction Using Probability and Statistical Inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Foundations of quantization for probability distributions
Graf, Siegfried
2000-01-01
Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
Su, Jiunn-Yih; Andersson, Patiyan; Holt, Deborah C.
2017-01-01
Background The microbiome of built environment surfaces is impacted by the presence of humans. In this study, we tested the hypothesis that analysis of surface swabs from clinic toilets/bathrooms yields results correlated with sexually transmitted infection (STI) notifications from the corresponding human populations. We extended a previously reported study in which surfaces in toilet/bathroom facilities in primary health clinics in the Australian Northern Territory (NT) were swabbed and then tested for nucleic acid from the STI agents Chlamydia trachomatis, Neisseria gonorrhoeae and Trichomonas vaginalis. This was in the context of assessing the potential for such nucleic acid to contaminate specimens collected in such facilities. STIs are notifiable in the NT, thus allowing comparison of swab and notification data. Methods An assumption in the design was that while absolute built-environment loads of STI nucleic acids will be a function of patient traffic density and facility cleaning protocols, the relative loads of STI nucleic acids from different species will be largely unaffected by these processes. Another assumption was that the proportion of swabs testing positive for STIs provides a measure of surface contamination. Accordingly, “STI profiles” were calculated. These were the proportions that each of the three STIs of interest contributed to the summed STI-positive swabs or notifications. Three comparisons were performed, using swab data from clinics in remote Indigenous communities, clinics in small-medium towns, and a single urban sexual health clinic. These data were compared with time- and place-matched STI notifications. Results There were significant correlations between swab and notification data for both the remote Indigenous and regional data. For the remote Indigenous clinics the p values ranged from 0.041 to 0.0089, depending on data transformation and p value inference method. Further, the swab data appeared to strongly indicate known higher
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: they do if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular non-graphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
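Why the intervals must be calibrated *simultaneously* rather than pointwise can be seen in a small Monte Carlo sketch; the sample size and repetition counts below are arbitrary choices for illustration, not the paper's method:

```python
import random

random.seed(2)
n, reps = 20, 4000

# Pointwise 95% intervals for each order statistic of a standard-normal sample:
sims = [sorted(random.gauss(0, 1) for _ in range(n)) for _ in range(reps)]
lo = [sorted(s[i] for s in sims)[int(0.025 * reps)] for i in range(n)]
hi = [sorted(s[i] for s in sims)[int(0.975 * reps)] for i in range(n)]

# The probability that ALL n plotted points fall inside these pointwise
# intervals at once is below 95%, which is why simultaneous intervals
# are widened until the joint coverage equals 1 - alpha.
fresh = [sorted(random.gauss(0, 1) for _ in range(n)) for _ in range(2000)]
joint = sum(all(lo[i] <= s[i] <= hi[i] for i in range(n)) for s in fresh) / 2000
```

The estimated joint coverage comes out noticeably short of the nominal 95%, so a band built from separate pointwise intervals rejects normal samples too often.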
Directory of Open Access Journals (Sweden)
Matyunina Lilya V
2009-12-01
Full Text Available Abstract Background Accumulating evidence suggests that somatic stem cells undergo mutagenic transformation into cancer-initiating cells. The serous subtype of ovarian adenocarcinoma in humans has been hypothesized to arise from at least two possible classes of progenitor cells: the ovarian surface epithelia (OSE) and/or an as yet undefined class of progenitor cells residing in the distal end of the fallopian tube. Methods Comparative gene expression profiling analyses were carried out on OSE removed from the surface of normal human ovaries and on ovarian cancer epithelial cells (CEPI) isolated by laser capture micro-dissection (LCM) from human serous papillary ovarian adenocarcinomas. The results of the gene expression analyses were randomly confirmed in paraffin-embedded tissues from ovarian adenocarcinoma of serous subtype and non-neoplastic ovarian tissues using immunohistochemistry. Differentially expressed genes were analyzed using gene ontology, molecular pathway, and gene set enrichment analysis algorithms. Results Consistent with multipotent capacity, genes in pathways previously associated with adult stem cell maintenance are highly expressed in ovarian surface epithelia and are not expressed, or are expressed at very low levels, in serous ovarian adenocarcinoma. Among the over 2000 genes that are significantly differentially expressed, a number of pathways and novel pathway interactions are identified that may contribute to ovarian adenocarcinoma development. Conclusions Our results are consistent with the hypothesis that human ovarian surface epithelia are multipotent and capable of serving as the origin of ovarian adenocarcinoma. While our findings do not rule out the possibility that ovarian cancers may also arise from other sources, they are inconsistent with claims that ovarian surface epithelia cannot serve as the origin of ovarian cancer-initiating cells.
Energy Technology Data Exchange (ETDEWEB)
Cefalas, A.C., E-mail: ccefalas@eie.gr [National Hellenic Research Foundation, Theoretical and Physical Chemistry Institute, 48 Vassileos Constantinou Avenue, Athens 11635 (Greece); Kollia, Z.; Spyropoulos-Antonakakis, N.; Gavriil, V. [National Hellenic Research Foundation, Theoretical and Physical Chemistry Institute, 48 Vassileos Constantinou Avenue, Athens 11635 (Greece); Christofilos, D.; Kourouklis, G. [Physics Division, School of Technology, Aristotle University of Thessaloniki, Thessaloniki 54124 (Greece); Semashko, V.V.; Pavlov, V. [Kazan Federal University, Institute of Physics, 18 Kremljovskaja str., Kazan 420008 (Russian Federation); Sarantopoulou, E. [National Hellenic Research Foundation, Theoretical and Physical Chemistry Institute, 48 Vassileos Constantinou Avenue, Athens 11635 (Greece); Kazan Federal University, Institute of Physics, 18 Kremljovskaja str., Kazan 420008 (Russian Federation)
2017-02-28
Highlights: • The work links the surface morphology of amorphous semiconductors with both their electric-thermal properties and current stability at the nanoscale (<1 μm). • A high correlation is measured between the surface morphological spatial gradient and the conductive electron energy spatial gradient or thermal gradient. • Unidirectional current stability is associated with asymmetric nanodomains along nanosize conductive paths. • Bidirectional current stability is inherent with either long conductive paths or nanosize conductive paths along symmetric nanodomains. • Conclusion: surface design improves current stability across nanoelectronic junctions. - Abstract: A link between the morphological characteristics and the electric properties of amorphous layers is established by means of atomic, conductive, electrostatic force and thermal scanning microscopy. Using an amorphous Ta{sub 2}O{sub 5} (a-Ta{sub 2}O{sub 5}) semiconductive layer, it is found that surface profile gradients (morphological gradients) are highly correlated with both the energy gradient of electrons trapped in interactive Coulombic sites and the thermal gradient along conductive paths; thus thermal and electric properties are correlated with surface morphology at the nanoscale. Furthermore, morphological and electron energy gradients along opposite conductive paths of electrons intrinsically impose a current stability anisotropy. For either long conductive paths (L > 1 μm) or paths along symmetric nanodomains, current stability is demonstrated for both positive and negative currents i. On the contrary, for short conductive paths along non-symmetric nanodomains, the set of independent variables (L, i) is spanned by two current stability/instability loci: one locus specifies a stable state for negative currents, while the other describes a stable state for positive currents.
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, the Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain interest and joy in learning the subject.
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibration include the natural city background and internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated, and optimal system parameters - damping and natural frequency - are determined such that the probability of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
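Under the Gaussian assumption used in the article, the chance that a zero-mean displacement stays within a criterion reduces to a single error-function evaluation. The numbers below are illustrative placeholders, not the article's calibrated values:

```python
import math

def prob_within(rms, criterion):
    """P(|x| < criterion) for a zero-mean Gaussian response with
    standard deviation `rms`: P = erf(criterion / (rms * sqrt(2)))."""
    return math.erf(criterion / (rms * math.sqrt(2)))

# Illustrative values: response RMS of 1.0 (arbitrary units) against a
# criterion of 3.12 in the same units.
p_exceed = 1.0 - prob_within(1.0, 3.12)  # comfortably below 0.04
```

Sweeping damping and natural frequency changes the response RMS, so the optimization in the abstract amounts to keeping this exceedance probability below the target over the (damping, frequency) plane.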
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Flood hazard probability mapping method
Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart
2015-04-01
In Sweden, spatially explicit approaches have been applied in various disciplines, such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling at a small scale, whereas only a few have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects at a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding at a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying the flood probability for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features of the landscape. By using PCD data, realistic representations of high-probability flood regions were made, regardless of the magnitude of the rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
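The per-cell index described, a weighted combination of physical catchment descriptors, reduces to a weighted sum once each factor is scaled to a common range. The factor names and weights below are invented placeholders, not the study's calibrated values:

```python
# Hypothetical PCD weights (summing to 1); each factor pre-scaled to [0, 1].
weights = {
    "topographic_wetness": 0.40,
    "land_use_runoff": 0.25,
    "soil_permeability": 0.20,
    "drainage_density": 0.15,
}

def flood_index(cell):
    """Weighted flood-probability index for one grid cell."""
    return sum(weights[k] * cell[k] for k in weights)

cell = {"topographic_wetness": 0.8, "land_use_runoff": 0.5,
        "soil_permeability": 0.9, "drainage_density": 0.3}
index = flood_index(cell)
```

In the study the weights would come from the statistical screening of PCDs; the sketch only shows how a multi-factor index improves on thresholding a single variable.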
Incompatible Stochastic Processes and Complex Probabilities
Zak, Michail
1997-01-01
The definition of conditional probabilities is based upon the existence of a joint probability. However, a reconstruction of the joint probability from given conditional probabilities imposes certain constraints upon the latter, so that if several conditional probabilities are chosen arbitrarily, the corresponding joint probability may not exist.
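For two binary variables the constraint is easy to exhibit: any joint p(x, y) forces a cross-ratio identity on the two conditional tables, so arbitrarily chosen conditionals generally admit no joint at all. A minimal sketch with hypothetical numbers:

```python
# Hypothetical conditional tables for binary X and Y.
# Key convention: p_x_given_y[(x, y)] = P(X=x | Y=y),
#                 p_y_given_x[(y, x)] = P(Y=y | X=x).
p_x_given_y = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}
p_y_given_x = {(0, 0): 0.5, (1, 0): 0.5, (0, 1): 0.5, (1, 1): 0.5}

# If a joint p(x, y) existed, then p(x,y) = P(x|y) p(y) = P(y|x) p(x),
# and the cross-ratio product below would telescope to exactly 1:
#   [p11/p01] * [p10/p11] * [p00/p10] * [p01/p00] = 1
lhs = ((p_x_given_y[(1, 1)] / p_x_given_y[(0, 1)])
       * (p_y_given_x[(0, 1)] / p_y_given_x[(1, 1)])
       * (p_x_given_y[(0, 0)] / p_x_given_y[(1, 0)])
       * (p_y_given_x[(1, 0)] / p_y_given_x[(0, 0)]))
compatible = abs(lhs - 1.0) < 1e-9  # False: no joint distribution exists
```

Here the product evaluates to 36 rather than 1, so these two conditional tables cannot arise from any common joint, which is exactly the kind of constraint the abstract refers to.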
Directory of Open Access Journals (Sweden)
Gomez-Mancilla Baltazar
2006-04-01
Full Text Available Abstract Cerebrospinal fluid (CSF) potentially carries an archive of peptides and small proteins relevant to pathological processes in the central nervous system (CNS) and surrounding brain tissue. Proteomics is especially well suited for the discovery of biomarkers of diagnostic potential in CSF for early diagnosis and discrimination of several neurodegenerative diseases. ProteinChip surface-enhanced laser-desorption/ionization time-of-flight mass spectrometry (SELDI-TOF-MS) is one such approach, which offers a unique platform for high-throughput profiling of peptides and small proteins in CSF. In this study, we evaluated methodologies for the retention of CSF proteins; within the measured m/z range we found a high degree of overlap between the tested array surfaces. The combination of CM10 and IMAC30 arrays was sufficient to represent between 80% and 90% of all assigned peaks when using either sinapinic acid or α-cyano-4-hydroxycinnamic acid as the energy-absorbing matrix. Moreover, arrays processed with SPA consistently showed better peak resolution and a higher peak number across all surfaces within the measured mass range. We intend to use CM10 and IMAC30 arrays prepared in sinapinic acid as a fast and cost-effective approach to drive decisions on sample selection prior to more in-depth discovery of diagnostic biomarkers in CSF using alternative but complementary proteomic strategies.
Energy Technology Data Exchange (ETDEWEB)
Malak, M.; Marty, F.; Bourouina, T. [Universite Paris-Est, Laboratoire ESYCOM, ESIEE Paris, Cite Descartes, 2 Boulevard Blaise Pascal, 93162 Noisy-le-Grand Cedex (France); Nouira, H.; Vailleau, G. [Laboratoire National de Metrologie et d'Essais, 1 rue Gaston Boissier, 75724 Paris Cedex 15 (France)
2013-04-08
A miniature Michelson interferometer is analyzed theoretically and experimentally. The fabricated micro-interferometer is incorporated at the tip of a monolithic silicon probe to achieve contactless distance measurements and surface profilometry. For infrared operation, two approaches are studied, based on the use of monochromatic light and wavelength sweep, respectively. A theoretical model is devised to depict the system characteristics taking into account Gaussian beam divergence and light spot size. Furthermore, preliminary results using visible light demonstrate operation of the probe as a visible light spectrometer, despite silicon absorbance, thanks to the micrometer thickness involved in the beam splitter.
Probability measures on metric spaces
Parthasarathy, K R
2005-01-01
In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, their aims and scope, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Directory of Open Access Journals (Sweden)
Suma George Mulamattathil
2014-01-01
Full Text Available The aim of this study was to isolate and identify environmental bacteria from various raw water sources as well as the drinking water distribution system in Mafikeng, South Africa, and to determine their antibiotic resistance profiles. Water samples from five different sites (raw and drinking water) were analysed for the presence of faecal indicator bacteria as well as Aeromonas and Pseudomonas species. Faecal and total coliforms were detected in summer in the treated water samples from the Modimola dam and in the mixed water samples, with Pseudomonas spp. being the most prevalent organism. The most prevalent multiple antibiotic resistance phenotype observed was KF-AP-C-E-OT-K-TM-A. All organisms tested were resistant to erythromycin, trimethoprim, and amoxicillin. All isolates were susceptible to ciprofloxacin, and faecal coliforms and Pseudomonas spp. were susceptible to neomycin and streptomycin. Cluster analysis based on inhibition zone diameter data suggests that the isolates had similar chemical exposure histories. Isolates were identified using gyrB, toxA, ecfX, aerA, and hylH gene fragments, and the gyrB, ecfX, and hylH fragments were amplified. These results demonstrate that (i) the drinking water from Mafikeng contains various bacterial species and at times faecal and total coliforms, and (ii) the various bacteria are resistant to various classes of antibiotics.
Directory of Open Access Journals (Sweden)
Philip M. Giffard
2017-06-01
Full Text Available Background The microbiome of built environment surfaces is impacted by the presence of humans. In this study, we tested the hypothesis that analysis of surface swabs from clinic toilets/bathrooms yields results correlated with sexually transmitted infection (STI) notifications from the corresponding human populations. We extended a previously reported study in which surfaces in toilet/bathroom facilities in primary health clinics in the Australian Northern Territory (NT) were swabbed and then tested for nucleic acid from the STI agents Chlamydia trachomatis, Neisseria gonorrhoeae and Trichomonas vaginalis. This was in the context of assessing the potential for such nucleic acid to contaminate specimens collected in such facilities. STIs are notifiable in the NT, thus allowing comparison of swab and notification data. Methods An assumption in the design was that while absolute built-environment loads of STI nucleic acids will be a function of patient traffic density and facility cleaning protocols, the relative loads of STI nucleic acids from different species will be largely unaffected by these processes. Another assumption was that the proportion of swabs testing positive for STIs provides a measure of surface contamination. Accordingly, “STI profiles” were calculated. These were the proportions that each of the three STIs of interest contributed to the summed STI-positive swabs or notifications. Three comparisons were performed, using swab data from clinics in remote Indigenous communities, clinics in small-medium towns, and a single urban sexual health clinic. These data were compared with time- and place-matched STI notifications. Results There were significant correlations between swab and notification data for both the remote Indigenous and regional data. For the remote Indigenous clinics the p values ranged from 0.041 to 0.0089, depending on data transformation and p value inference method. Further, the swab data appeared to strongly indicate
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
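The guessing strategy, trying words in decreasing order of model probability, has a directly computable expected guess count for small alphabets. The three-letter alphabet below is a toy stand-in for the first-order language approximations discussed in the abstract:

```python
from itertools import product

# Toy first-order model: independent letters with fixed probabilities.
letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}
word_len = 3

# Probability of every possible word under the independence model:
word_probs = []
for word in product(letter_probs, repeat=word_len):
    p = 1.0
    for ch in word:
        p *= letter_probs[ch]
    word_probs.append(p)

# Guess words in decreasing order of probability; the expected number of
# guesses is the sum over ranks i of i * P(word ranked i).
word_probs.sort(reverse=True)
avg_guesses = sum((i + 1) * p for i, p in enumerate(word_probs))
```

Because the distribution is skewed, the expected guess count falls well below the (27 + 1)/2 = 14 that a uniform distribution over the 27 words would give; the approximations in the paper exist precisely because this exact enumeration blows up for realistic alphabet and word sizes.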
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This fact stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf's law, Benford's law, part...
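The claim that the empty box is the most probable occupancy follows from counting configurations under the stated postulate that every configuration of indistinguishable balls is equally likely. A small enumeration shows it directly (P and L are arbitrary illustrative values):

```python
from math import comb

P, L = 100, 5  # dense system: many more balls than boxes

def prob_box_holds(k):
    """P(a given box holds exactly k balls) when all C(P+L-1, L-1)
    configurations of P indistinguishable balls in L distinguishable
    boxes are equally likely: the remaining P-k balls can fill the
    other L-1 boxes in C(P-k+L-2, L-2) ways."""
    return comb(P - k + L - 2, L - 2) / comb(P + L - 1, L - 1)

occupancy = [prob_box_holds(k) for k in range(P + 1)]
# k = 0 (the empty box) is the mode, not k = P / L (the average),
# and the occupancy distribution has the "long tail" described above.
```

The occupancy probability is strictly decreasing in k for this dense regime, so the empty box beats the average-occupancy box, matching the counterintuitive result in the abstract.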
Probability and statistics: A reminder
Clément, Benoit
2013-07-01
The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...
Directory of Open Access Journals (Sweden)
С. А. Игнатьев
2016-08-01
The paper addresses the issue of creating an environment favourable for life in megacities by planting vegetation on rooftops. It also provides information about rooftop greening practices adopted in other countries. The issues of 'green roof' building in the climatic conditions of Saint Petersburg and the impact of roof vegetation on the urban ecosystem are examined. A vegetation composition, in terms of both quality and quantity, has been proposed for the roof under research, and a 3D model of this roof reflecting its geometric properties has been developed. The structure of the roof covering and the qualitative composition of the substrate are presented. The effect of rooftop geometry on the substrate temperature is explored. The annual substrate temperature and moisture content in different parts of the roof have been analyzed. Results of thermal imaging monitoring and insolation modelling for different parts of the green roof surface are presented.
ASNOM mapping of SiC epi-layer doping profile and of surface phonon polariton waveguiding
Kazantsev, Dmitry
2013-01-01
Apertureless SNOM (ASNOM) mapping of a slightly doped 4H-SiC epitaxial layer grown on a heavily doped 4H-SiC substrate was performed in a cleaved-edge geometry. ASNOM images taken at the light frequencies of a ${}^{13}C^{16}O_{2}$ laser show a clear contrast between the substrate and the epitaxial layer. The contrast vanishes at the laser frequency of $884\,cm^{-1}$ and becomes clearer at higher frequencies ($923\,cm^{-1}$). This can be explained by changes in the local polarizability of SiC caused by the carrier concentration, which are more pronounced at higher frequencies. As the light frequency is tuned up further ($935\,cm^{-1}$), a transversal mode structure appears in the ASNOM map, indicating a waveguide-like confinement of a surface phonon polariton wave inside the strip of the epi-layer outcrop.
Fabini, Edoardo; Fiori, Giovana Maria Lanchoti; Tedesco, Daniele; Lopes, Norberto Peporine; Bertucci, Carlo
2016-04-15
Cucurbitacins are a group of tetracyclic triterpenoids, known for centuries for their anti-cancer and anti-inflammatory properties, which have been actively investigated over the past decades in order to elucidate their mechanism of action. With a view to their use as therapeutic molecules, a pharmacokinetic characterization is crucial to assess the affinity toward blood carrier proteins and to extrapolate distribution volumes. Usually, pharmacokinetic data are first collected on animal models and later translated to humans; therefore, an early characterization of the interaction with carrier proteins from different species is highly desirable. In the present study, the interactions of cucurbitacins E and I with human and rat serum albumins (HSA and RSA) were investigated by means of surface plasmon resonance (SPR)-based optical biosensing and circular dichroism (CD) spectroscopy. Active HSA and RSA sensor chip surfaces were prepared through an amine coupling reaction protocol, and the equilibrium dissociation constants (Kd) for the different cucurbitacin-serum albumin complexes were then determined by SPR analysis. Further information on the binding of cucurbitacins to serum albumins was obtained by CD competition experiments with biliverdin, a specific marker binding to subdomain IB of HSA. SPR data unveiled a previously unreported binding event between CucI and HSA; the determined binding affinities of both compounds were slightly higher for RSA than for HSA, even though all the compounds can be ranked as high-affinity binders for both carriers. CD analysis showed that the two cucurbitacins modify the binding of biliverdin to serum albumins through opposite allosteric modulation (positive for HSA, negative for RSA), confirming the need for caution in the translation of pharmacokinetic data across species. Copyright © 2016 Elsevier B.V. All rights reserved.
Femtosecond laser-induced surface wettability modification of polystyrene surface
Wang, Bing; Wang, XinCai; Zheng, HongYu; Lam, YeeCheong
2016-12-01
In this paper, we demonstrate a simple method to create either a hydrophilic or a hydrophobic surface. With femtosecond laser irradiation at different laser parameters, the water contact angle (WCA) on a polystyrene surface can be modified to either 12.7° or 156.2° from its original WCA of 88.2°. With properly spaced micro-pits created, the surface became hydrophilic, probably because the water droplets spread into the micro-pits; with properly spaced micro-grooves created, the surface became rough and more hydrophobic. We investigated the effect of laser parameters on WCAs and analyzed the laser-treated surface roughness, profiles and chemical bonds by surface profilometry, scanning electron microscopy (SEM) and X-ray photoelectron spectroscopy (XPS). For the laser-treated surface with low roughness, the polar (such as C—O, C=O, and O—C=O bonds) and non-polar (such as C—C or C—H bonds) groups were found to be responsible for the wettability changes, whereas for a rough surface, the surface roughness or surface topography structure played a more significant role in the changes of the surface WCA. The mechanisms involved in the laser surface wettability modification process are discussed.
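As background for the roughness-wettability link discussed above, the classical Wenzel relation cos θ* = r cos θ (with roughness ratio r ≥ 1) is the standard textbook model for how texture amplifies a surface's intrinsic tendency. This sketch is illustrative context only, not the paper's analysis; strongly hydrophobic grooved surfaces like the 156.2° case are commonly attributed instead to Cassie-Baxter air trapping.

```python
import math

def wenzel_angle(young_angle_deg, roughness):
    """Apparent contact angle from the Wenzel model, cos(theta*) =
    r*cos(theta). Roughness r >= 1 pushes angles below 90 deg lower
    (more hydrophilic) and angles above 90 deg higher (more hydrophobic)."""
    c = roughness * math.cos(math.radians(young_angle_deg))
    c = max(-1.0, min(1.0, c))  # roughness can saturate the cosine
    return math.degrees(math.acos(c))
```

Polystyrene's intrinsic 88.2° sits just below 90°, which is why, in this simple model, texture alone would make it more hydrophilic; the observed hydrophobic state requires the composite (air-trapping) wetting regime.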
Chandran, Parwathy; Riviere, Jim E; Monteiro-Riviere, Nancy A
2017-05-01
This study investigated the role of nanoparticle size and surface chemistry on biocorona composition and its effect on uptake, toxicity and cellular responses in human umbilical vein endothelial cells (HUVEC), employing 40 and 80 nm gold nanoparticles (AuNP) with branched polyethyleneimine (BPEI), lipoic acid (LA) and polyethylene glycol (PEG) coatings. Proteomic analysis identified 59 hard corona proteins among the various AuNP, revealing largely surface chemistry-dependent signature adsorbomes exhibiting human serum albumin (HSA) abundance. Size distribution analysis revealed the relative instability and aggregation-inducing potential of bare and corona-bound BPEI-AuNP compared with LA- and PEG-AuNP. Circular dichroism analysis showed surface chemistry-dependent conformational changes of proteins binding to AuNP. Time-dependent uptake of bare, plasma corona (PC) and HSA corona-bound AuNP (HSA-AuNP) showed significant reduction in uptake with PC formation. Cell viability studies demonstrated dose-dependent toxicity of BPEI-AuNP. Transcriptional profiling studies revealed 126 genes, from 13 biological pathways, to be differentially regulated by 40 nm bare and PC-bound BPEI-AuNP (PC-BPEI-AuNP). Furthermore, PC formation relieved the toxicity of cationic BPEI-AuNP by modulating expression of genes involved in DNA damage and repair, heat shock response, mitochondrial energy metabolism, oxidative stress and antioxidant response, and ER stress and unfolded protein response cascades, which were aberrantly expressed in bare BPEI-AuNP-treated cells. NP surface chemistry is shown to play the dominant role over size in determining the biocorona composition, which in turn modulates cell uptake and biological responses, consequently defining the potential safety and efficacy of nanoformulations.
Probability maps as a measure of reliability for intervisibility analysis
Directory of Open Access Journals (Sweden)
Joksić Dušan
2005-01-01
Digital terrain models (DTMs) represent segments of spatial databases related to the presentation of terrain features and landforms. Square-grid elevation models (DEMs) have emerged as the most widely used structure during the past decade because of their simplicity and simple computer implementation. They have become an important segment of Topographic Information Systems (TIS), storing the natural and artificial landscape in the form of digital models. This kind of data structure is especially suitable for morphometric terrain evaluation and analysis, which is very important in environmental and urban planning and in Earth surface modeling applications. One of the most often used functionalities of geographical information system software packages is intervisibility, or viewshed, analysis of terrain. Intervisibility determination from analog topographic maps may be very exhausting, because of the large number of profiles that have to be extracted and compared. Terrain representation in the form of DEM databases facilitates this task. This paper describes a simple algorithm for terrain viewshed analysis using DEM database structures, taking into consideration the influence of the uncertainties of such data on the results obtained. The concept of probability maps is introduced as a means of evaluating the results, and is presented as a thematic display.
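The profile-based visibility test on a grid DEM, combined with a Monte Carlo treatment of elevation uncertainty, can be sketched roughly as follows. This is an illustrative reconstruction under assumed details (Gaussian elevation noise, nearest-cell profile sampling), not the authors' algorithm.

```python
import random

def line_of_sight(dem, viewer, target, viewer_h=1.8):
    """True if `target` is visible from `viewer` on a square-grid DEM,
    sampling terrain along the straight line between the two cells."""
    (r0, c0), (r1, c1) = viewer, target
    z0 = dem[r0][c0] + viewer_h           # eye elevation above terrain
    z1 = dem[r1][c1]
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for i in range(1, steps):
        t = i / steps
        r, c = round(r0 + t * (r1 - r0)), round(c0 + t * (c1 - c0))
        # Terrain blocks the view if it rises above the sight line here:
        if dem[r][c] > z0 + t * (z1 - z0):
            return False
    return True

def visibility_probability(dem, viewer, target, sigma=1.0, trials=500):
    """One probability-map entry: perturb every DEM cell with N(0, sigma)
    noise (the assumed data uncertainty) and count visible outcomes."""
    hits = 0
    for _ in range(trials):
        noisy = [[z + random.gauss(0.0, sigma) for z in row] for row in dem]
        hits += line_of_sight(noisy, viewer, target)
    return hits / trials
```

Repeating `visibility_probability` for every target cell yields the probability map the paper proposes as a thematic reliability display.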
Surface morphology and depth profile study of Cd₁₋ₓZnₓTe alloy nanostructures
Energy Technology Data Exchange (ETDEWEB)
Yilmaz, Ercan, E-mail: yilmaz@ibu.edu.tr [Physics Department, Abant Izzet Baysal University, 14280 Bolu (Turkey); Tugay, Evrin [Physics Department, Middle East Technical University, 06531 Ankara (Turkey); Center for Solar Energy Research and Applications (GUeNAM), Middle East Technical University, 06531 Ankara (Turkey); Aktag, Aliekber [Physics Department, Abant Izzet Baysal University, 14280 Bolu (Turkey); Yildiz, Ilker [Physics Department, Middle East Technical University, 06531 Ankara (Turkey); Parlak, Mehmet; Turan, Rasit [Physics Department, Middle East Technical University, 06531 Ankara (Turkey); Center for Solar Energy Research and Applications (GUeNAM), Middle East Technical University, 06531 Ankara (Turkey)
2012-12-25
Highlights: • Cd₁₋ₓZnₓTe (CZT) films were grown on heated glass at 400 °C from a single target. • CZT films were annealed at 300 and 450 °C for 1 h under N₂ gas at atmospheric pressure. • The structural and optical properties of the CZT films were studied. • Better structural stability and reproducibility of the CZT films were achieved. • Uniform and stoichiometric CZT films with the required compositions were fabricated. - Abstract: Cd₁₋ₓZnₓTe thin films with a thickness of 200 nm were deposited on glass substrates from a single sputtering target. During the deposition process, the substrates were heated at 400 °C, and the deposited films were subjected to an annealing process at 300 and 450 °C for an hour under flowing N₂ gas at atmospheric pressure. The influence of in situ heating and post-deposition annealing treatments on the structural and optical evolution of the Cd₁₋ₓZnₓTe nanostructures was investigated by diagnostic techniques such as X-ray diffraction (XRD), energy dispersive spectroscopy (EDS), scanning electron microscopy (SEM), atomic force microscopy (AFM), X-ray photoelectron spectroscopy (XPS), and UV-transmission spectroscopy. The transmission spectra in the region of the optical absorption band edge were measured for as-deposited and heat-treated CdZnTe samples. The band gap of the deposited films was found to be in the range of 1.59-1.66 eV. The XRD studies revealed that the heated Cd₁₋ₓZnₓTe films have a cubic-oriented (1 1 1), (2 2 0) and (3 1 1) polycrystalline structure whereas unheated films are mostly amorphous. The effects of annealing temperature on the composition of the thin films are discussed. XPS measurements were performed in the depth-profiling mode in order to understand the variation in the chemical composition of the films
Moghadas, Davood
2013-01-01
We theoretically investigated the effect of vapor flow on the drying front that develops in soils when water evaporates from the soil surface, and its consequences for GPR data. The results suggest integrating the full-wave GPR model with a coupled water, vapor, and heat flow model to accurately estimate the soil hydraulic properties. We investigated the effects of a drying front that emerges below an evaporating soil surface on far-field ground-penetrating radar (GPR) data. First, we performed an analysis of the width of the drying front in soils with 12 different textures by using an analytical model. Then, we numerically simulated the vertical soil moisture profiles that develop during evaporation for these soil textures. We performed the simulations using a Richards flow model that considers only liquid water flow and a model that considers coupled water, vapor, and heat flows. The GPR signals were then generated from the simulated soil water content profiles, taking into account the frequency dependence of the apparent electrical conductivity and dielectric permittivity. The analytical approach indicated that the width of the drying front at the end of Stage I of evaporation was larger in silty soils than in other soil textures and smaller in sandy soils. We also demonstrated that the analytical estimate of the width of the drying front can be considered a proxy for the impact that a drying front could have on far-field GPR data. The numerical simulations led to the conclusion that vapor transport in soil results in S-shaped soil moisture profiles, which clearly influence the GPR data. As a result, vapor flow needs to be considered when GPR data are interpreted in a coupled inversion approach. Moreover, the impact of vapor flow on the GPR data was larger for silty than for sandy soils. These effects on the GPR data provide promising perspectives regarding the use of radars for evaporation monitoring. © Soil Science Society of America 5585 Guilford Rd., Madison, WI
Gurumurthy, Srividya; Iyer, Geetha; Srinivasan, Bhaskar; Agarwal, Shweta; Angayarkanni, Narayanasamy
2018-02-01
To study the tear cytokine and the conjunctival and oral mucosal marker profile in chronic ocular Stevens-Johnson syndrome (SJS) and their alteration following mucous membrane grafting (MMG) for lid margin keratinisation (LMK). In a 1-year prospective study, SJS cases (n=25) and age-matched/sex-matched healthy controls (n=25) were recruited. Tear specimens (Schirmer's strips) and conjunctival and oral mucosal imprints were collected from controls and from SJS cases pre-MMG and post-MMG (at first follow-up, n=17). Tear cytokines were profiled using a 27-plex array. Transforming growth factor-beta (TGF-β)-mediated extracellular matrix changes in conjunctival and oral mucosal cells were analysed by gene expression studies. RESULTS: Tear cytokine profiling of chronic SJS cases at the pre-MMG stage revealed significant upregulation of the cytokines granulocyte-macrophage colony-stimulating factor (GM-CSF), interleukin (IL)-8, IL-1β, monocyte chemoattractant protein-1, IL-15, IL-2, IL-17A and basic fibroblast growth factor (bFGF) with downregulation of IP-10 (interferon gamma-induced protein 10), tumour necrosis factor-α, interferon-γ, IL-10, vascular endothelial growth factor, regulated upon activation normal T-cell expressed and secreted (RANTES), IL-7, IL-12p70 and IL-13, with maximal increase in GM-CSF and maximal downregulation of IP-10, respectively. Of these, IL-2, IL-15, bFGF and IL-17A showed significant correlation with disease severity, pre-MMG. Conjunctival cells pre-MMG showed increases in TGF-β1, TGF-βRII, connective tissue growth factor and collagen-III gene expression of 10, 67, 173 and 184 folds, respectively, which dropped to 1.3, 11, 13.5 and 19 folds correspondingly, post-MMG. However, their expression in oral mucosa was negligible. A proinflammatory, profibrotic, antiapoptotic ocular surface milieu characterises chronic ocular SJS. IP-10, an antifibrotic cytokine, was noted to be maximally downregulated, unlike in other forms of chronic dry eye disease. The
Fujita, Mikiko; Sato, Tomonori
2017-07-06
Extremely heavy precipitation affects human society and the natural environment, and its behaviour under a warming climate needs to be elucidated. Recent studies have demonstrated that observed extreme precipitation increases with surface air temperature (SAT) at approximately the Clausius-Clapeyron (CC) rate, suggesting that atmospheric water vapour content can explain the relationship between extreme precipitation and SAT. However, the relationship between atmospheric water vapour content and SAT is poorly understood due to the lack of reliable observations with sufficient spatial and temporal coverage for statistical analyses. Here, we analyse the relationship between atmospheric water vapour content and SAT using precipitable water vapour (PWV) derived from global positioning system satellites. A super-CC rate appears in hourly PWV when the SAT is below 16 °C, whereas the rate decreases at high SAT, which is different from the precipitation-SAT relationship. The effects of upper air temperature and water vapour can consistently explain the super-CC rate of PWV relative to SAT. The difference between the moist and dry adiabatic lapse rates increases with SAT, as a consequence of the free atmosphere's greater capacity to hold water vapour under higher SAT conditions; therefore, attainable PWV increases more rapidly than the CC rate as SAT increases.
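The CC rate referred to above can be reproduced from the standard Magnus approximation to saturation vapour pressure. This is a textbook formula offered as context, not a formula taken from the paper:

```python
import math

def saturation_vapor_pressure(t_c):
    """Magnus approximation for saturation vapour pressure (hPa) over
    liquid water at temperature t_c in degrees Celsius."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def cc_rate(t_c, dt=0.5):
    """Fractional increase of e_s per kelvin, d ln(e_s)/dT, by a
    centred finite difference around t_c."""
    lo = saturation_vapor_pressure(t_c - dt)
    hi = saturation_vapor_pressure(t_c + dt)
    return (math.log(hi) - math.log(lo)) / (2 * dt)

# Near surface temperatures the rate is roughly 6-7 % per kelvin --
# the Clausius-Clapeyron scaling against which PWV is compared.
rate_15 = cc_rate(15.0)
```

Note the rate itself falls slowly with temperature, so "super-CC" behaviour of PWV at low SAT is an excess over this baseline, not an artefact of it.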
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Probability, Information and Statistical Physics
Kuzemsky, A. L.
2016-03-01
In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between the theories. The basic aim is tutorial, i.e. to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to the basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the purpose of making these ideas easier to understand and to apply.
Probability on compact Lie groups
Applebaum, David
2014-01-01
Probability theory on compact Lie groups deals with the interaction between "chance" and "symmetry," a beautiful area of mathematics of great interest for its own sake, but one which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field; it is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness. Copyright © 2013 Cognitive Science Society, Inc.
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
The Probabilities of Unique Events
2012-08-30
compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...
Probability, Statistics, and Computational Science
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...
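As a minimal illustration of the Markov-chain machinery such a chapter introduces, the stationary distribution of a toy two-state chain can be computed by power iteration. This is our example, not code from the chapter:

```python
def stationary_distribution(P, iters=200):
    """Stationary distribution of a row-stochastic transition matrix,
    found by repeatedly applying pi <- pi P from a uniform start."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two hidden states, e.g. a toy background / CpG-island genome model:
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary_distribution(P)  # -> approximately [5/6, 1/6]
```

The same matrix would serve as the transition component of a two-state hidden Markov model, with emission probabilities added per state.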
Directory of Open Access Journals (Sweden)
Sebastian Temme
2017-12-01
Epicardium-derived cells (EPDC) and atrial stromal cells (ASC) display cardio-regenerative potential, but the molecular details are still unexplored. The signals which induce activation, migration and differentiation of these cells are largely unknown. Here we have isolated rat ventricular EPDC and rat/human ASC and performed genetic and proteomic profiling. EPDC and ASC expressed epicardial/mesenchymal markers (WT-1, Tbx18, CD73, CD90, CD44, CD105) and cardiac markers (Gata4, Tbx5, troponin T) and also contained phosphocreatine. We used cell surface biotinylation to isolate plasma membrane proteins of rEPDC and hASC. Nano-liquid chromatography with subsequent mass spectrometry and bioinformatics analysis identified 396 rat and 239 human plasma membrane proteins, with 149 overlapping proteins. Functional GO-term analysis revealed several significantly enriched categories related to extracellular matrix (ECM), cell migration/differentiation, immunology or angiogenesis. We identified receptors for ephrins, growth factors (IGF, PDGF, EGF) and anthrax toxin known to be involved in cardiac repair and regeneration. Functional category enrichment identified clusters around integrins, PI3K/Akt signaling and various cardiomyopathies. Our study indicates that EPDC and ASC have a similar molecular phenotype related to cardiac healing/regeneration. The cell surface proteome repository will help to further unravel the molecular details of their cardio-regenerative potential and their role in cardiac diseases.
Entropy in probability and statistics
Energy Technology Data Exchange (ETDEWEB)
Rolke, W.A.
1992-01-01
The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.
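In one dimension the definition can be checked numerically: for a Bernoulli measure, the Legendre-Fenchel transform of the logarithmic moment generating function recovers Kullback's relative entropy. This is an illustrative sketch on a grid of θ values; the thesis works in Banach-space generality.

```python
import math

def legendre_fenchel(x, log_mgf, thetas):
    """Entropy of a point x as sup over theta of [theta*x - log M(theta)],
    evaluated on a finite grid of theta values."""
    return max(t * x - log_mgf(t) for t in thetas)

# Bernoulli(p): M(theta) = 1 - p + p*exp(theta).
p = 0.3
log_mgf = lambda t: math.log(1 - p + p * math.exp(t))
thetas = [i / 100.0 for i in range(-800, 801)]

def kl_to_p(x):
    """Relative entropy KL(Bernoulli(x) || Bernoulli(p)) for 0 < x < 1."""
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))
```

The transform vanishes at x = p and grows away from it, exactly the behaviour a large-deviation rate function must have.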
Wang, Yangzhong; Chen, Zhuhai; Liu, Yang; Li, Jinghong
2013-07-01
A simple and sensitive carbohydrate biosensor has been suggested as a potential tool for accurate analysis of cell surface carbohydrate expression as well as for carbohydrate-based therapeutics for a variety of diseases and infections. In this work, a sensitive biosensor for carbohydrate-lectin profiling and in situ cell surface carbohydrate expression was designed by taking advantage of a functional glycoprotein, glucose oxidase, acting as both a multivalent recognition unit and a signal amplification probe. Combining gold nanoparticle-catalyzed luminol electrogenerated chemiluminescence with a nanocarrier for active biomolecules, the number of cell surface carbohydrate groups could be conveniently read out. The apparent dissociation constant between GOx@Au probes and Con A was determined to be 1.64 nM, approximately 5 orders of magnitude smaller than that of mannose and Con A, which would arise from the multivalent effect between the probe and Con A. Both glycoproteins and gold nanoparticles contribute to the high affinity between carbohydrates and lectin. The as-proposed biosensor exhibits excellent analytical performance towards the cytosensing of K562 cells, with a detection limit of 18 cells, and the mannose moieties on a single K562 cell were determined to be 1.8 × 10^10. The biosensor can also act as a useful tool for antibacterial drug screening and mechanism investigation. This strategy integrates the excellent biocompatibility and multivalent recognition of glycoproteins as well as the significant enzymatic catalysis and gold nanoparticle signal amplification, and avoids cell pretreatment and labelling processes. This would contribute to glycomic analysis and to the understanding of complex native glycan-related biological processes.
Li, Wei; Saraiya, Ashesh A.; Wang, Ching C.
2012-01-01
In the current investigation, we analyzed all the known small nucleolar RNAs (snoRNAs) in the deeply branching protozoan parasite Giardia lamblia for potential microRNAs (miRNAs) that might be derived from them. Two putative miRNAs have since been identified by Northern blot, primer extension, 3′-RACE and co-immunoprecipitation with Giardia Argonaute (GlAgo), and designated miR6 and miR10. Giardia Dicer (GlDcr) is capable of processing the snoRNAs into the corresponding miRNAs in vitro. Potential miR6 and miR10 binding sites in the Giardia genome were predicted bioinformatically. A miR6 binding site was found in the 3′-untranslated regions (UTR) of 44 variant surface protein (vsp) genes, whereas a miR10 binding site was identified at the 3′-end of 159 vsp open-reading frames. Thirty-three of these vsp genes turned out to contain binding sites for both miR6 and miR10. A reporter mRNA tagged with the 3′ end of vsp1267, which contains the target sites for both miRNAs, was translationally repressed by both miRNAs in Giardia. Episomal expression of an N-terminal c-myc tagged VSP1267 was found significantly repressed by introducing either miR6 or miR10 into the cells, and the repressive effects were additive. When the 2′-O-methyl antisense oligos (ASOs) of either miR6 or miR10 were introduced, however, there was an enhancement of tagged VSP1267 expression, suggesting an inhibition of the repressive effects of endogenous miR6 or miR10 by the ASOs. Of the total 220 vsp genes in Giardia, we have now found 178 carrying putative binding sites for all the miRNAs that have been currently identified, suggesting that miRNAs are likely the regulators of VSP expression in Giardia. PMID:22568619
López-Fernández, Jorge; Gallardo, Leonor; Fernández-Luna, Álvaro; Villacañas, Victor; García-Unanue, Jorge; Sánchez-Sánchez, Javier
2017-06-22
The aim of this research was to evaluate the influence of game surface and pitch size on the movement profile of female soccer players during small-sided games (SSGs) of 4 v 4. Sixteen women played three different 4-a-side games (400 m, 600 m and 800 m) on three surfaces (ground [GR], artificial turf [AT] and natural grass [NG]). Time-motion variables were assessed through GPS devices (SPI Pro X, GPSports, Australia). GR had the worst outputs on most variables. NG achieved higher results than AT in terms of total distance [SSG 400 (+37.000 m; p=0.006); SSG 600 (+59.989 m; p<0.001); SSG 800 (+42.284 m; p=0.001)]. On the other hand, the smallest SSG (400) had the lowest values on most variables. However, while the middle SSG (600) presented higher output than the bigger one (800) for Body Load [NG (+7.745 a.u.; p<0.001); AT (+8.207 a.u.; p<0.001); GR (+5.879 a.u.; p<0.001)], it had lower results for High Intensity Distance [NG (-13.15 m; p=0.025); AT (-13.59 m; p=0.026)]. Although the women's performance was higher on AT than on GR, the NG surface still showed the highest outcomes in the most intense SSG. Moreover, although performance increases on bigger pitches, if the size is too large the outputs could be reduced.
Frequentist probability and frequentist statistics
Energy Technology Data Exchange (ETDEWEB)
Neyman, J.
1977-01-01
A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism whose repeated operation produced the observed frequencies (this is a problem of frequentist probability theory); and using the hypothetical chance mechanism of the phenomenon studied to deduce rules for adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, the hypothesis tested, the importance of the power of the test used, and practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: an example of an isolated problem, an example of connected problems treated routinely, empirical Bayes theory, and point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, and construction of confidence intervals from regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)
Energy Technology Data Exchange (ETDEWEB)
Chen, She-Jun [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Feng, An-Hong; He, Ming-Jing; Chen, Man-Ying [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Graduate University of Chinese Academy of Sciences, Beijing 100049 (China); Luo, Xiao-Jun [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Mai, Bi-Xian, E-mail: nancymai@gig.ac.cn [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China)
2013-02-01
Polybrominated diphenyl ethers (PBDEs) and alternative flame retardants were measured in surface sediments collected during 2009–2010 from the Pearl River Delta, southern China (a large manufacturing base for electronics/electrical products), to evaluate the influence of China's RoHS directive (adopted in 2006) on their environmental occurrence. The concentrations in sediments from different water systems ranged from 3.67 to 2520 ng/g (average of 17.1–588 ng/g) for PBDEs and from 0.22 to 5270 ng/g (average of 11.3–454 ng/g) for the alternative retardants. Although the PBDE levels have decreased significantly compared with those in sediments collected in 2002 in this region, the levels of alternative decabromodiphenyl ethane (DBDPE) have exceeded those of BDE209 (two predominant halogenated flame retardants (HFRs) in China) in the majority of sediments. This finding suggests a different contaminant pattern of HFRs in current sediments due to the replacement of the deca-BDE mixture with DBDPE in this region. In addition, sediment concentrations of discontinued PBDEs in the rural area are clearly elevated due to e-waste dismantling. The congener profiles of PBDEs in the current sediments (with more abundant lower-brominated congeners) differed substantially from those in 2002 and from the technical products, suggesting that biological or photolytic debromination of PBDEs may have occurred in the environment. Highlights: PBDE levels in sediments have decreased substantially since China's RoHS directive. Contamination of novel DBDPE has exceeded that of deca-BDE in the PRD sediments. The congener profiles of PBDEs in the sediments have changed significantly. Significant biological or photolytic degradation of PBDEs may occur in the environment.
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...... to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma...
Lectures on probability and statistics
Energy Technology Data Exchange (ETDEWEB)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
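The forward/inverse contrast described in these lectures can be made concrete with a short simulation (an illustrative sketch, not taken from the lecture notes): the a priori probability of each face of a fair die is known to be 1/6, while the inverse, statistical problem estimates that probability from observed rolls via the relative frequency, which is the maximum-likelihood estimate.

```python
import random
from collections import Counter

random.seed(0)

# Forward problem: a priori probability of any face of a fair die.
p_face = 1 / 6

# Simulate many rolls of a fair die.
rolls = [random.randint(1, 6) for _ in range(60_000)]

# Inverse problem: infer per-face probabilities from the data alone.
# The maximum-likelihood estimate is simply the relative frequency.
counts = Counter(rolls)
p_hat = {face: counts[face] / len(rolls) for face in range(1, 7)}

for face in range(1, 7):
    # Observed frequencies approach the a priori value 1/6.
    assert abs(p_hat[face] - p_face) < 0.02
```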
Kaneene, John B.; Miller, RoseAnn; Sayah, Raida; Johnson, Yvette J.; Gilliland, Dennis; Gardiner, Joseph C.
2007-01-01
The goals of this study were to (i) identify issues that affect the ability of discriminant function analysis (DA) of antimicrobial resistance profiles to differentiate sources of fecal contamination, (ii) test the accuracy of DA from a known-source library of fecal Escherichia coli isolates with isolates from environmental samples, and (iii) apply this DA to classify E. coli from surface water. A repeated cross-sectional study was used to collect fecal and environmental samples from Michigan livestock, wild geese, and surface water for bacterial isolation, identification, and antimicrobial susceptibility testing using disk diffusion for 12 agents chosen for their importance in treating E. coli infections or for their use as animal feed additives. Nonparametric DA was used to classify E. coli by source species individually and by groups according to antimicrobial exposure. A modified backwards model-building approach was applied to create the best decision rules for isolate differentiation with the smallest number of antimicrobial agents. Decision rules were generated from fecal isolates and applied to environmental isolates to determine the effectiveness of DA for identifying sources of contamination. Principal component analysis was applied to describe differences in resistance patterns between species groups. The average rate of correct classification by DA was improved by reducing the numbers of species classifications and antimicrobial agents. DA was able to correctly classify environmental isolates when fewer than four classifications were used. Water sample isolates were classified by livestock type. An evaluation of the performance of DA must take into consideration relative contributions of random chance and the true discriminatory power of the decision rules. PMID:17337537
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Energy Technology Data Exchange (ETDEWEB)
Ari Palczewski, Rongli Geng, Grigory Eremeev
2011-07-01
We designed and built two high-resolution (0.6-0.55 mm spatial resolution [1.1-1.2 mm separation]) thermometry array prototypes out of Allen Bradley 90-120 ohm 1/8 watt resistors to measure surface temperature profiles on SRF cavities. One array was designed to be physically flexible and conform to any location on an SRF cavity; the other was modeled after the common G-10/Stycast 2850 thermometer and designed to fit on the equator of an ILC (TESLA 1.3 GHz) SRF cavity. We will discuss the advantages and disadvantages of each array and their construction. In addition, we will present a case study of the arrays' performance on a real SRF cavity, TB9NR001. TB9NR001 presented a unique opportunity to test the performance of each array, as it contained a dual (4 mm separation) cat-eye defect that conventional methods such as OSTs (Oscillating Superleak second-sound Transducers) and full-coverage thermometry mapping were unable to distinguish between. We will discuss the new arrays' ability to distinguish between the two defects and their preheating performance.
Karpagam, Rathinasamy; Raj, Kalimuthu Jawahar; Ashokkumar, Balasubramaniem; Varalakshmi, Perumal
2015-01-01
Two freshwater microalgae, Coelastrella sp. M-60 and Micractinium sp. M-13, were investigated in this study for their potential for biodiesel production. For increasing biomass and lipid production, these microalgae were subjected to nutrient starvation (nitrogen, phosphorous, iron), salinity stress and nutrient supplementation with sugarcane industry effluent, citric acid, glucose and vitamin B12. The lipid productivity obtained from the isolates Coelastrella sp. M-60 (13.9 ± 0.4 mg/L/day) and Micractinium sp. M-13 (11.1 ± 0.2 mg/L/day) was highest under salinity stress. The media supplemented with all four nutrients yielded higher lipid productivity than the control. Response surface methodology (RSM) was employed to evaluate the effect of sugarcane industry effluent and citric acid on growth and lipid yield. The fatty acid profiles of Coelastrella sp. M-60 and Micractinium sp. M-13 were composed of C-14, C-16:0, C-18:0, C-18:1 and C-18:2, and their fuel properties were in accordance with international standards. Copyright © 2015 Elsevier Ltd. All rights reserved.
Domine, F.; Arnaud, L.; Bock, J.; Carmagnola, C.; Champollion, N.; Gallet, J.; Lesaffre, B.; Morin, S.; Picard, G.
2011-12-01
We have measured vertical profiles of specific surface area (SSA), thermal conductivity (TC) and density in snow from 12 different climatic regions featuring seasonal snowpacks of maritime, Alpine, taiga and tundra types, on Arctic sea ice, and from ice caps in Greenland and Antarctica. We attempt to relate snow physical properties to climatic variables including precipitation, temperature and its yearly variation, wind speed and its short-scale temporal variations. As expected, temperature is a key variable that determines snow properties, mostly by determining the metamorphic regime (temperature gradient or equi-temperature) in conjunction with precipitation. However, wind speed and its distribution seem to play at least as important a role. For example, high wind speeds lead to the formation of wind packs with high SSA and high TC instead of depth hoar, which has lower values of both variables. The distribution of wind speed also strongly affects snow properties: for example, frequent moderate winds result in frequent snow remobilization, producing snow with higher SSA and lower TC than in regions with the same average wind speed but less frequent, more intense wind episodes. These strong effects of climate on snow properties imply that climate change will greatly modify snow properties, which in turn will affect climate; for example, changes in snow SSA modify albedo, and changes in TC affect permafrost and the release of greenhouse gases from thawing permafrost. Some of these climate-snow feedbacks will be discussed.
Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen
2002-01-01
This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.
Drop-out probabilities of IrisPlex SNP alleles
DEFF Research Database (Denmark)
Andersen, Jeppe Dyrberg; Tvedebrink, Torben; Mogensen, Helle Smidt
2013-01-01
In certain crime cases, information about a perpetrator's phenotype, including eye colour, may be a valuable tool if no DNA profile of any suspect or individual in the DNA database matches the DNA profile found at the crime scene. Often, the available DNA material is sparse and allelic drop......-out when the amount of DNA was greater than 125 pg for 29 cycles of PCR and greater than 62 pg for 30 cycles of PCR. With the use of a logistic regression model, we estimated the allele specific probability of drop-out in heterozygote systems based on the signal strength of the observed allele...
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Digital signaling decouples activation probability and population heterogeneity
DEFF Research Database (Denmark)
Kellogg, Ryan A; Tian, Chengzhe; Lipniacki, Tomasz
2015-01-01
Digital signaling enhances robustness of cellular decisions in noisy environments, but it is unclear how digital systems transmit temporal information about a stimulus. To understand how temporal input information is encoded and decoded by the NF-κB system, we studied transcription factor dynamic...... and uniform dynamics. These results show that digital NF-κB signaling enables multidimensional control of cellular phenotype via input profile, allowing parallel and independent control of single-cell activation probability and population heterogeneity....
Front Probability, NOAA GOES Imager, 0.05 degrees, Western Hemisphere, EXPERIMENTAL
National Oceanic and Atmospheric Administration, Department of Commerce — The data indicates the probability of oceanic sea surface temperature fronts off the California coast. They were created using remote sensing sea surface temperature...
A two-locus forensic match probability for subdivided populations.
Ayres, K L
2000-01-01
A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
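The product-rule baseline that this paper extends multiplies single-locus match probabilities across loci; a common single-locus form is the Balding–Nicholls formula with coancestry coefficient θ for population subdivision. The sketch below is an illustrative baseline under that assumption (example allele frequencies and θ are invented), not the paper's two-locus formula:

```python
# Balding-Nicholls single-locus match probabilities (NRC II style),
# the standard building blocks that the product rule multiplies across loci.
# theta is the coancestry coefficient accounting for population subdivision.

def match_prob_homozygote(p, theta):
    """P(suspect is A_i A_i | offender is A_i A_i), allele frequency p."""
    return ((2 * theta + (1 - theta) * p)
            * (3 * theta + (1 - theta) * p)) / ((1 + theta) * (1 + 2 * theta))

def match_prob_heterozygote(p_i, p_j, theta):
    """P(suspect is A_i A_j | offender is A_i A_j)."""
    return (2 * (theta + (1 - theta) * p_i)
            * (theta + (1 - theta) * p_j)) / ((1 + theta) * (1 + 2 * theta))

# Product rule across two loci assumes between-locus independence,
# which is exactly the assumption the paper's two-locus formula relaxes.
two_locus = (match_prob_homozygote(0.1, 0.03)
             * match_prob_heterozygote(0.2, 0.15, 0.03))
```

At θ = 0 these reduce to the familiar Hardy-Weinberg values p² and 2·p_i·p_j; positive θ inflates both, making the match probability more conservative.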
California Department of Resources — Beaches are commonly characterized by cross-shore surveys. The resulting profiles represent the elevation of the beach surface and nearshore seabed from the back of...
Analytic Neutrino Oscillation Probabilities in Matter: Revisited
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT
2018-01-02
We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...
Probability output modeling for support vector machines
Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian
2007-11-01
In this paper we propose an approach to model the posterior probability output of multi-class SVMs. The sigmoid function is used to estimate the posterior probability output in binary classification. Modeling the posterior probability output of multi-class SVMs is achieved by directly solving the equations based on combining the probability outputs of the binary classifiers using Bayes's rule. The differences and different weights among these two-class SVM classifiers, based on the posterior probability, are considered when combining their probability outputs. Comparative experimental results show that our method achieves better classification precision and a better posterior probability distribution than the pairwise coupling method and Hastie's optimization method.
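The binary building block the abstract relies on, fitting a sigmoid to SVM decision values (often called Platt scaling), can be sketched as follows. This is an illustrative reconstruction, not the paper's multi-class combination step: the decision values are synthetic, and the sigmoid parameters are fit by plain gradient descent rather than the Newton-type optimiser Platt proposed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic SVM decision values: positives centred at +1, negatives at -1.
f = np.concatenate([rng.normal(1.0, 0.8, 500), rng.normal(-1.0, 0.8, 500)])
y = np.concatenate([np.ones(500), np.zeros(500)])

# Platt-style sigmoid: P(y=1 | f) = 1 / (1 + exp(A*f + B)).
# Fit A, B by minimising cross-entropy with plain gradient descent.
A, B = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(A * f + B))
    grad_A = np.mean((p - y) * -f)    # d(cross-entropy)/dA
    grad_B = np.mean(p - y) * -1.0    # d(cross-entropy)/dB
    A -= 0.1 * grad_A
    B -= 0.1 * grad_B

# Calibrated posterior probabilities for the training decision values.
p = 1.0 / (1.0 + np.exp(A * f + B))
```

A multi-class scheme like the paper's would then combine several such binary posteriors via Bayes's rule rather than use them directly.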
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan eCort
2013-10-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...
Probability of flooding: An uncertainty analysis
Slijkhuis, K.A.H.; Frijters, M.P.C.; Cooke, R.M.; Vrouwenvelder, A.C.W.M.
1998-01-01
In the Netherlands a new safety approach concerning the flood defences will probably be implemented in the near future. Therefore, an uncertainty analysis is currently being carried out to determine the uncertainty in the probability of flooding. The uncertainty of the probability of flooding could
Lévy processes in free probability
Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen
2002-01-01
This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability.
The trajectory of the target probability effect.
Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B
2013-05-01
The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.
47 CFR 1.1623 - Probability calculation.
2010-10-01
47 CFR § 1.1623 (2010-10-01), Mass Media Services, General Procedures: Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
PROFILER: 1D galaxy light profile decomposition
Ciambur, Bogdan C.
2017-05-01
Written in Python, PROFILER analyzes the radial surface brightness profiles of galaxies. It accurately models a wide range of galaxies and galaxy components, such as elliptical galaxies, the bulges of spiral and lenticular galaxies, nuclear sources, discs, bars, rings, and spiral arms with a variety of parametric functions routinely employed in the field (Sérsic, core-Sérsic, exponential, Gaussian, Moffat and Ferrers). In addition, Profiler can employ the broken exponential model (relevant for disc truncations or antitruncations) and two special cases of the edge-on disc model: namely along the major axis (in the disc plane) and along the minor axis (perpendicular to the disc plane).
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
Adolescents' misinterpretation of health risk probability expressions.
Cohn, L D; Schydlower, M; Foley, J; Copeland, R L
1995-05-01
To determine if differences exist between adolescents and physicians in their numerical translation of 13 commonly used probability expressions (eg, possibly, might). Cross-sectional. Adolescent medicine and pediatric orthopedic outpatient units. 150 adolescents and 51 pediatricians, pediatric orthopedic surgeons, and nurses. Numerical ratings of the degree of certainty implied by 13 probability expressions (eg, possibly, probably). Adolescents were significantly more likely than physicians to display comprehension errors, reversing or equating the meaning of terms such as probably/possibly and likely/possibly. Numerical expressions of uncertainty (eg, 30% chance) elicited less variability in ratings than lexical expressions of uncertainty (eg, possibly). Physicians should avoid using probability expressions such as probably, possibly, and likely when communicating health risks to children and adolescents. Numerical expressions of uncertainty may be more effective for conveying the likelihood of an illness than lexical expressions of uncertainty (eg, probably).
A practical overview on probability distributions.
Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca
2015-03-01
The aim of this paper is a general definition of probability, of its main mathematical features, and of the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we wish to predict; this link is called a probability distribution. Depending on the characteristics of the phenomena (which we may also call variables), different probability distributions apply. For categorical (or discrete) variables, the probability can in most cases be described by a binomial or Poisson distribution. For continuous variables, the probability can be described by the most important distribution in statistics, the normal distribution. The probability distributions are briefly described, together with some examples of their possible applications.
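The three distributions the abstract singles out can be evaluated directly; a standard-library Python sketch with illustrative parameter values:

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam): k events at mean rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# Discrete examples: 3 successes in 10 trials; 3 events at rate 2.5
print(binomial_pmf(3, 10, 0.3))   # ≈ 0.2668
print(poisson_pmf(3, 2.5))        # ≈ 0.2138
# Continuous example: standard normal density at its mean
print(normal_pdf(0.0, 0.0, 1.0))  # ≈ 0.3989
```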
Integrated statistical modelling of spatial landslide probability
Mergili, M.; Chu, H.-J.
2015-09-01
Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by Typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
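Step (v), the per-pixel combination rule, is simple to state in code; a toy sketch with hypothetical pixel values (not the study's data):

```python
def integrated_probability(p_release, p_impact, p_zonal_release):
    """Integrated spatial landslide probability for one pixel, per
    step (v): the maximum of the release probability and the product
    of the impact probability and the zonal release probability."""
    return max(p_release, p_impact * p_zonal_release)

# Hypothetical pixel: low release probability, but a high chance of
# being impacted by a landslide released somewhere within its zone.
print(integrated_probability(0.05, 0.6, 0.4))  # ≈ 0.24
```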
Durante, Ignacio M; La Spina, Pablo E; Carmona, Santiago J; Agüero, Fernán; Buscaglia, Carlos A
2017-09-01
The Trypanosoma cruzi genome bears a huge family of genes and pseudogenes coding for Mucin-Associated Surface Proteins (MASPs). MASP molecules display a 'mosaic' structure, with highly conserved flanking regions and a strikingly variable central and mature domain made up of different combinations of a large repertoire of short sequence motifs. MASP molecules are highly expressed in mammal-dwelling stages of T. cruzi and may be involved in parasite-host interactions and/or in diverting the immune response. High-density microarrays composed of fully overlapped 15mer peptides spanning the entire sequences of 232 non-redundant MASPs (~25% of the total MASP content) were screened with chronic Chagasic sera. This strategy led to the identification of 86 antigenic motifs, each one likely representing a single linear B-cell epitope, which were mapped to 69 different MASPs. These motifs could be further grouped into 31 clusters of structurally- and likely antigenically-related sequences, and fully characterized. In contrast to previous reports, we show that MASP antigenic motifs are restricted to the central and mature region of MASP polypeptides, consistent with their intracellular processing. The antigenicity of these motifs displayed significant positive correlation with their genome dosage and their relative position within the MASP polypeptide. In addition, we verified the biased genetic co-occurrence of certain antigenic motifs within MASP polypeptides, compatible with proposed intra-family recombination events underlying the evolution of their coding genes. Sequences spanning 7 MASP antigenic motifs were further evaluated using distinct synthesis/display approaches and a large panel of serum samples. Overall, the serological recognition of MASP antigenic motifs exhibited a remarkable non normal distribution among the T. cruzi seropositive population, thus reducing their applicability in conventional serodiagnosis. As previously observed in in vitro and animal
Probability concepts in quality risk management.
Claycamp, H Gregg
2012-01-01
Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.
Experience Matters: Information Acquisition Optimizes Probability Gain
Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.
2010-01-01
Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915
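The probability-gain criterion (error minimization) can be sketched for a binary query; the priors and likelihoods below are invented, and the formulation follows the usual expected-maximum-posterior definition:

```python
def probability_gain(prior, likelihoods):
    """Expected improvement in the probability of a correct guess
    after asking one binary query.  prior[i] = P(category i);
    likelihoods[i] = P(answer = yes | category i).  Probability gain
    is the expected posterior maximum minus the prior maximum."""
    p_yes = sum(p * l for p, l in zip(prior, likelihoods))
    p_no = 1.0 - p_yes
    post_yes = [p * l / p_yes for p, l in zip(prior, likelihoods)]
    post_no = [p * (1.0 - l) / p_no for p, l in zip(prior, likelihoods)]
    expected_post_max = p_yes * max(post_yes) + p_no * max(post_no)
    return expected_post_max - max(prior)

# A diagnostic query on two categories with prior 0.7 / 0.3
print(probability_gain([0.7, 0.3], [0.9, 0.2]))  # ≈ 0.17
```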
UT Biomedical Informatics Lab (BMIL) Probability Wheel.
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K
2016-01-01
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
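The geometry of a two-slice probability wheel is just a proportional split of 360 degrees; a trivial sketch (not the BMIL app's actual code):

```python
def wheel_slices(p):
    """Angles (in degrees) of the two colored slices of a probability
    wheel representing probability p and its complement 1 - p."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be in [0, 1]")
    return p * 360.0, (1.0 - p) * 360.0

print(wheel_slices(0.25))  # → (90.0, 270.0)
```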
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
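The maximum entropy assignment under an average-value constraint can be illustrated with the familiar discrete case, where the constrained maxent solution is a Boltzmann-type exponential family; a sketch that solves for the Lagrange multiplier by bisection (illustrative only; the paper's construction is more general):

```python
import math

def maxent_boltzmann(energies, mean_target, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over discrete energy levels subject
    to a fixed average energy.  The solution has the exponential form
    p_i ∝ exp(-beta * E_i); beta is found by bisection, using the fact
    that the mean energy decreases monotonically as beta increases."""
    def mean_energy(beta):
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return sum(e * wi for e, wi in zip(energies, w)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_energy(mid) > mean_target:
            lo = mid  # mean too high -> need a larger beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

# Three levels with a constrained mean energy of 0.5: probabilities
# decrease exponentially with energy (Boltzmann-like).
p = maxent_boltzmann([0.0, 1.0, 2.0], mean_target=0.5)
print([round(x, 4) for x in p])
```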
Fracture probability along a fatigue crack path
Energy Technology Data Exchange (ETDEWEB)
Makris, P. [Technical Univ., Athens (Greece)
1995-03-01
Long experience has shown that the strength of materials under fatigue load has a stochastic behavior, which can be expressed through the fracture probability. This paper deals with a new analytically derived law for the distribution of the fracture probability along a fatigue crack path. The knowledge of the distribution of the fatigue fracture probability along the crack path helps the connection between stress conditions and the expected fatigue life of a structure under stochasticly varying loads. (orig.)
Probability and statistics: models for research
National Research Council Canada - National Science Library
Bailey, Daniel Edgar
1971-01-01
This book is an interpretative presentation of the mathematical and logical basis of probability and statistics, indulging in some mathematics, but concentrating on the logical and scientific meaning...
The probability and the management of human error
Energy Technology Data Exchange (ETDEWEB)
Dufey, R.B. [Atomic Energy of Canada Limited, Chalk River Laboratories, Chalk River, ON (Canada); Saull, J.W. [International Federation of Airworthiness, Sussex (United Kingdom)
2004-07-01
Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident cause. The consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε) and stochastic occurrences with a finite minimum rate: λ = 5×10^-5 + ((1/ε) - 5×10^-5)·exp(-3ε). The future failure rate is entirely determined by the experience: thus the past defines the future.
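The quoted learning-curve equation is straightforward to evaluate; a sketch using the constants given in the abstract (the experience units are left abstract, as in the source):

```python
import math

def human_error_rate(experience, lam_min=5e-5, k=3.0):
    """Human error (failure) rate as a function of accumulated
    experience eps, per the learning-curve form quoted above:
        lambda(eps) = lam_min + (1/eps - lam_min) * exp(-k * eps)
    lam_min = 5e-5 and k = 3 are the constants from the abstract."""
    eps = experience
    return lam_min + (1.0 / eps - lam_min) * math.exp(-k * eps)

# The rate falls with experience toward the asymptotic minimum
print(human_error_rate(0.1))   # high early-inexperience rate
print(human_error_rate(10.0))  # close to lam_min = 5e-5
```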
Advantages of the probability amplitude over the probability density in quantum mechanics
Kurihara, Yoshimasa; Quach, Nhi My Uyen
2013-01-01
We discuss reasons why a probability amplitude, which becomes a probability density after squaring, is considered as one of the most basic ingredients of quantum mechanics. First, the Heisenberg/Schrodinger equation, an equation of motion in quantum mechanics, describes a time evolution of the probability amplitude rather than of a probability density. There may be reasons why dynamics of a physical system are described by amplitude. In order to investigate one role of the probability amplitu...
Daum, Fred L.; Zalovcik, John A.
1946-01-01
Wing section outboard of flap was tested by wake surveys in Mach range of 0.25 - 0.78 and lift coefficient range 0.06 - 0.69. Results indicated that minimum profile-drag coefficient of 0.0097 was attained for lift coefficients from 0.16 to 0.25 at Mach less than 0.67. Below Mach number at which compressibility shock occurred, variations in Mach of 0.2 had negligible effect on profile drag coefficient. Shock was not evident until critical Mach was exceeded by 0.025.
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Analytical Study of Thermonuclear Reaction Probability Integrals
Chaudhry, M.A.; Haubold, H. J.; Mathai, A. M.
2000-01-01
An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.
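A non-resonant reaction probability integral of the standard form can be checked numerically; the sketch below assumes a constant cross-section factor S(E) = s, which is only one of the slowly varying forms the paper treats analytically:

```python
import math

def reaction_integral(b, t, s=1.0, upper=50.0, n=200000):
    """Midpoint-rule estimate of a thermonuclear reaction probability
    integral of the standard non-resonant form
        I = ∫_0^∞ S(E) · exp(-E/t - b/sqrt(E)) dE
    with a constant cross-section factor S(E) = s (an assumption made
    for this sketch).  `b` models the Coulomb-barrier penetration
    term; `t` is the thermal energy scale."""
    h = upper / n
    total = 0.0
    for i in range(n):
        e = (i + 0.5) * h
        total += s * math.exp(-e / t - b / math.sqrt(e))
    return total * h

# A stronger barrier term (larger b) suppresses the integral
print(reaction_integral(b=2.0, t=1.0))
```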
Examples of Neutrosophic Probability in Physics
Directory of Open Access Journals (Sweden)
Fu Yuhua
2015-01-01
Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and of determining the Neutrosophic Probability of the accelerating expansion of the partial universe.
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...
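As a crude numerical counterpart to the integral-equation approach, the first-passage probability can be estimated by Monte Carlo; a sketch with a Gaussian random walk standing in for the discretized vibration response (illustrative assumptions throughout):

```python
import random

def first_passage_probability(barrier, steps, trials, seed=1):
    """Monte Carlo estimate of the first-passage probability: the
    chance that a zero-start random walk with unit-variance Gaussian
    increments (a stand-in for the random-vibration response) crosses
    the barrier within `steps` time steps."""
    rng = random.Random(seed)
    crossings = 0
    for _ in range(trials):
        x = 0.0
        for _ in range(steps):
            x += rng.gauss(0.0, 1.0)
            if x >= barrier:
                crossings += 1
                break
    return crossings / trials

p = first_passage_probability(barrier=5.0, steps=50, trials=2000)
print(p)  # estimate in (0, 1); rises with more steps or a lower barrier
```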
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
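The life-table calculation behind these probabilities is a short recursion; a sketch with invented one-year death probabilities q_x (curtate expectancy, i.e., counting whole years only):

```python
def life_expectancy(qx):
    """Curtate life expectancy at age 0 from a list of one-year death
    probabilities q_x.  Survivors follow l_{x+1} = l_x * (1 - q_x),
    and the curtate expectancy is the sum of the survivor proportions
    at each subsequent age."""
    l = 1.0
    total = 0.0
    for q in qx:
        l = l * (1.0 - q)
        total += l  # whole years completed beyond age x
    return total

# Hypothetical 5-age table with certain death by the last age
print(life_expectancy([0.01, 0.02, 0.05, 0.20, 1.0]))  # ≈ 3.62 years
```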
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
prep misestimates the probability of replication
Iverson, G.; Lee, M.D.; Wagenmakers, E.-J.
2009-01-01
The probability of "replication," prep, has been proposed as a means of identifying replicable and reliable effects in the psychological sciences. We conduct a basic test of prep that reveals that it misestimates the true probability of replication, especially for small effects. We show how these
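One common normal-theory formulation of prep can be sketched as follows (assuming a one-tailed p value; this is one standard construction, and Iverson et al. analyze where it goes wrong):

```python
from statistics import NormalDist

def p_rep(p_value):
    """Killeen-style probability of replication under the usual
    normal-theory formulation: p_rep = Phi(z / sqrt(2)), where
    z = Phi^{-1}(1 - p) for a one-tailed p value.  (One common
    formulation, reproduced here for illustration only.)"""
    nd = NormalDist()
    z = nd.inv_cdf(1.0 - p_value)
    return nd.cdf(z / 2.0 ** 0.5)

print(p_rep(0.05))  # ≈ 0.8776
```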
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
Recent Developments in Applied Probability and Statistics
Devroye, Luc; Kohler, Michael; Korn, Ralf
2010-01-01
This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.
Projecting Climate Change Impacts on Wildfire Probabilities
Westerling, A. L.; Bryant, B. P.; Preisler, H.
2008-12-01
We present preliminary results of the 2008 Climate Change Impact Assessment for wildfire in California, part of the second biennial science report to the California Climate Action Team organized via the California Climate Change Center by the California Energy Commission's Public Interest Energy Research Program pursuant to Executive Order S-03-05 of Governor Schwarzenegger. In order to support decision making by the State pertaining to mitigation of and adaptation to climate change and its impacts, we model wildfire occurrence monthly from 1950 to 2100 under a range of climate scenarios from the Intergovernmental Panel on Climate Change. We use six climate change models (GFDL CM2.1, NCAR PCM1, CNRM CM3, MPI ECHAM5, MIROC3.2 med, NCAR CCSM3) under two emissions scenarios: A2 (CO2 850 ppm max atmospheric concentration) and B1 (CO2 550 ppm max concentration). Climate model output has been downscaled to a 1/8 degree (~12 km) grid using two alternative methods: a Bias Correction and Spatial Downscaling (BCSD) and a Constructed Analogues (CA) downscaling. Hydrologic variables have been simulated from temperature, precipitation, wind and radiation forcing data using the Variable Infiltration Capacity (VIC) Macroscale Hydrologic Model. We model wildfire as a function of temperature, moisture deficit, and land surface characteristics using nonlinear logistic regression techniques. Previous work on wildfire climatology and seasonal forecasting has demonstrated that these variables account for much of the inter-annual and seasonal variation in wildfire. The results of this study are monthly gridded probabilities of wildfire occurrence by fire size class, and estimates of the number of structures potentially affected by fires. In this presentation we will explore the range of modeled outcomes for wildfire in California, considering the effects of emissions scenarios, climate model sensitivities, downscaling methods, hydrologic simulations, statistical model specifications for
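The logistic regression step can be sketched generically; the coefficients and predictor values below are invented placeholders, not the study's fitted model:

```python
import math

def fire_probability(beta0, beta_temp, beta_deficit, temp, deficit):
    """Monthly wildfire occurrence probability from a logistic
    regression in temperature and moisture deficit:
        p = 1 / (1 + exp(-(b0 + b1*temp + b2*deficit)))
    All coefficient and predictor values are hypothetical."""
    eta = beta0 + beta_temp * temp + beta_deficit * deficit
    return 1.0 / (1.0 + math.exp(-eta))

# A warmer, drier month yields a higher modeled probability
print(fire_probability(-4.0, 0.08, 0.02, temp=30.0, deficit=40.0))
```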
DEFF Research Database (Denmark)
Bottoli, Federico; Christiansen, Thomas Lundin; Winther, Grethe
2016-01-01
The present work deals with the evaluation of the residual stress profiles in expanded austenite by applying grazing incidence X-ray diffraction (GI-XRD) combined with successive sublayer removal. Annealed and deformed (εeq=0.5) samples of stable stainless steel EN 1.4369 were nitrided or nitroca...
DEFF Research Database (Denmark)
Offersgaard, Jesper Falden; Veng, Torben; Skettrup, Torben
1996-01-01
post annealing in order to test the method on both steplike and graded index profiles. The resulting characterizations of the samples are discussed in relation to the inverse WKB method. Finally, the importance of incorporating the effects of material birefringence in the characterization of these kind...
Optimizing Probability of Detection Point Estimate Demonstration
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, must be reliably detected by these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
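The binomial arithmetic behind the 29-flaw point-estimate demonstration can be sketched directly (the pass rule and constants follow the abstract; the function names are mine):

```python
def passes_point_estimate(hits, n=29):
    """Point-estimate demonstration pass rule: all n flaws of
    (nominally) one size must be detected.  Detecting 29 of 29 is the
    classic binomial demonstration of 90% POD at 95% confidence."""
    return hits == n

def prob_passing_demonstration(pod, n=29):
    """Probability of passing the demonstration (PPD): the chance that
    a procedure with true detection probability `pod` detects all n
    flaws, i.e. pod**n under the binomial model."""
    return pod ** n

# A procedure with exactly 90% POD rarely passes (< 5%), which is
# what makes 29/29 a 95%-confidence demonstration of POD >= 0.90.
print(prob_passing_demonstration(0.90))  # ≈ 0.047
print(prob_passing_demonstration(0.98))  # ≈ 0.557
```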
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
Spiga, D
2018-01-01
X-ray mirrors with high focusing performances are commonly used in different sectors of science, such as X-ray astronomy, medical imaging and synchrotron/free-electron laser beamlines. While deformations of the mirror profile may cause degradation of the focus sharpness, a deliberate deformation of the mirror can be made to endow the focus with a desired size and distribution, via piezo actuators. The resulting profile can be characterized with suitable metrology tools and correlated with the expected optical quality via a wavefront propagation code or, sometimes, predicted using geometric optics. In the latter case and for the special class of profile deformations with monotonically increasing derivative, i.e. concave upwards, the point spread function (PSF) can even be predicted analytically. Moreover, under these assumptions, the relation can also be reversed: from the desired PSF the required profile deformation can be computed analytically, avoiding the use of trial-and-error search codes. However, the computation has been so far limited to geometric optics, which entailed some limitations: for example, mirror diffraction effects and the size of the coherent X-ray source were not considered. In this paper, the beam-shaping formalism in the framework of physical optics is reviewed, in the limit of small light wavelengths and in the case of Gaussian intensity wavefronts. Some examples of shaped profiles are also shown, aiming at turning a Gaussian intensity distribution into a top-hat one, and checks of the shaping performances computing the at-wavelength PSF by means of the WISE code are made.
CSIR Research Space (South Africa)
Jafta, CJ
2011-07-01
Full Text Available Previous experimental investigations have only shown, without explanation, that the pre-exponential factor (D0), in the diffusion coefficient of Sb segregating in Cu, is dependent on the surface orientation of a crystal. In this study, the surface...
deJong, RS
The stellar and dust content of spiral galaxies as function of radius has been investigated using near-infrared and optical broadband surface photometry of 86 face-on spiral galaxies. Colors of galaxies correlate with the azimuthally averaged local surface brightness both within and among galaxies,
Young, Stuart A.; Josset, Damien B.; Vaughan, Mark A.
2010-01-01
CALIPSO's (Cloud Aerosol Lidar Infrared Pathfinder Satellite Observations) analysis algorithms generally require the use of tabulated values of the lidar ratio in order to retrieve aerosol extinction and optical depth from measured profiles of attenuated backscatter. However, for any given time or location, the lidar ratio for a given aerosol type can differ from the tabulated value. To gain some insight as to the extent of the variability, we here calculate the lidar ratio for dust aerosols using aerosol optical depth constraints from two sources. Daytime measurements are constrained using Level 2, Collection 5, 550-nm aerosol optical depth measurements made over the ocean by the MODIS (Moderate Resolution Imaging Spectroradiometer) on board the Aqua satellite, which flies in formation with CALIPSO. We also retrieve lidar ratios from night-time profiles constrained by aerosol column optical depths obtained by analysis of CALIPSO and CloudSat backscatter signals from the ocean surface.
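For a constant lidar ratio S and single scattering, the layer-integrated attenuated backscatter γ′ and the two-way transmittance T² = exp(−2τ) obey γ′ = (1 − T²)/(2S), so an optical-depth constraint from MODIS (or the ocean-surface return) yields S directly. This is a standard Fernald-type constrained-retrieval relation; whether CALIPSO's operational algorithm uses exactly this form is an assumption here, and the numbers are illustrative.

```python
import math

def lidar_ratio(tau, gamma_prime, eta=1.0):
    """Constant-lidar-ratio retrieval (illustrative, not CALIPSO's operational code).

    tau         : column aerosol optical depth from the constraint (e.g. MODIS)
    gamma_prime : layer-integrated attenuated backscatter from the lidar (sr^-1)
    eta         : multiple-scattering factor (1.0 = single scattering assumed)
    """
    two_way_transmittance = math.exp(-2.0 * eta * tau)
    return (1.0 - two_way_transmittance) / (2.0 * eta * gamma_prime)

# Hypothetical dust layer: tau = 0.3, gamma' = 0.01 sr^-1  ->  S ~ 22.6 sr
S = lidar_ratio(0.3, 0.01)
```

Holding γ′ fixed, a larger constrained optical depth implies a larger lidar ratio, which is how a departure from the tabulated value for a given aerosol type shows up.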
Energy Technology Data Exchange (ETDEWEB)
Aureau, D., E-mail: damien.aureau@uvsq.fr [Institut Lavoisier de Versailles, (UMR 8180) Université de Versailles-Saint-Quentin-en-Yvelines–CNRS, 45 Av. des États-Unis, 78035 Versailles (France); Ridier, K. [Institut Lavoisier de Versailles, (UMR 8180) Université de Versailles-Saint-Quentin-en-Yvelines–CNRS, 45 Av. des États-Unis, 78035 Versailles (France); Groupe d'Étude de la Matière Condensée (UMR 8635) Université de Versailles Saint-Quentin-en-Yvelines–CNRS, 45 Av. des États-Unis, 78035 Versailles (France); Bérini, B.; Dumont, Y.; Keller, N. [Groupe d'Étude de la Matière Condensée (UMR 8635) Université de Versailles Saint-Quentin-en-Yvelines–CNRS, 45 Av. des États-Unis, 78035 Versailles (France); Vigneron, J.; Bouttemy, M.; Etcheberry, A. [Institut Lavoisier de Versailles, (UMR 8180) Université de Versailles-Saint-Quentin-en-Yvelines–CNRS, 45 Av. des États-Unis, 78035 Versailles (France); Fouchet, A. [Groupe d'Étude de la Matière Condensée (UMR 8635) Université de Versailles Saint-Quentin-en-Yvelines–CNRS, 45 Av. des États-Unis, 78035 Versailles (France)
2016-02-29
This article compares three different ion bombardments during X-ray photoelectron spectroscopy (XPS) studies of single-crystalline SrTiO3 (STO) substrates. Abrasion using a cluster argon ion source is compared with the standard monoatomic Ar source, and the influence of the energy of the monoatomic ions is clearly demonstrated: while the chemically adsorbed species on the STO surface are removed, such bombardment strongly modifies the surface. A reduction of part of the titanium atoms and the appearance of a different chemical environment for surface strontium atoms are observed, and implantation of argon ions is also detected. Cluster ion etching, by contrast, delivers a much lower kinetic energy per atom than monoatomic ions, and in this case only it is demonstrated that surface contaminants can be removed without modification of the XP spectra, ensuring that the stoichiometry of the surface is preserved. This result is crucial for anyone working with oxide surfaces who needs an unmodified XPS analysis. The progressive effect of this powerful tool allows monitoring of the removal of surface contamination during the first steps of the bombardment, which was not achievable with usual guns. - Highlights: • The effects of three argon etchings are studied as a function of time on SrTiO3 oxide. • A method for obtaining non-modified chemical analysis of oxides is presented. • The soft removal of adsorbed species thanks to argon clusters is demonstrated. • The damage induced on the SrTiO3 surface by ion bombardment is shown. • The influence of the kinetic energy of incoming Ar atoms is examined.
Upgrading Probability via Fractions of Events
Directory of Open Access Journals (Sweden)
Frič Roman
2016-08-01
The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
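The "fractions of events" idea can be made concrete on a finite sample space: events become [0, 1]-valued functions, the Łukasiewicz operations are the truncated sum and product, and the upgraded probability of an event is its expectation. The sketch below uses our own names, not the paper's notation.

```python
import numpy as np

# A six-point sample space with the uniform measure (a fair die).
omega = np.arange(6)
measure = np.full(6, 1.0 / 6.0)

a = (omega < 3).astype(float)                  # a classical (Boolean, 0/1) event
b = np.array([0.0, 0.2, 0.5, 0.8, 1.0, 1.0])  # a genuine "fraction" of an event

def luk_or(u, v):  return np.minimum(1.0, u + v)        # Lukasiewicz sum
def luk_and(u, v): return np.maximum(0.0, u + v - 1.0)  # Lukasiewicz product
def negate(u):     return 1.0 - u

def prob(u):
    # Upgraded probability = expectation of the [0, 1]-valued event.
    return float(np.dot(u, measure))
```

On {0, 1}-valued events these operations reduce to union, intersection and complement, so the classical theory embeds as a special case: for the Boolean event a, prob(luk_or(a, negate(a))) is exactly 1.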
Robust Model-Free Multiclass Probability Estimation
Wu, Yichao; Zhang, Hao Helen; Liu, Yufeng
2010-01-01
Classical statistical approaches for multiclass probability estimation are typically based on regression techniques such as multiple logistic regression, or density estimation approaches such as linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). These methods often make certain assumptions on the form of probability functions or on the underlying distributions of subclasses. In this article, we develop a model-free procedure to estimate multiclass probabilities based on large-margin classifiers. In particular, the new estimation scheme is employed by solving a series of weighted large-margin classifiers and then systematically extracting the probability information from these multiple classification rules. A main advantage of the proposed probability estimation technique is that it does not impose any strong parametric assumption on the underlying distribution and can be applied for a wide range of large-margin classification methods. A general computational algorithm is developed for class probability estimation. Furthermore, we establish asymptotic consistency of the probability estimates. Both simulated and real data examples are presented to illustrate competitive performance of the new approach and compare it with several other existing methods. PMID:21113386
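The weighted-classifier idea can be sketched in miniature. For a weight pi, costing positive errors (1 − pi) and negative errors pi makes the optimal rule predict +1 exactly when P(y=+1 | x) > pi, so scanning pi and finding where the prediction flips brackets the class probability. The sketch below substitutes a weighted logistic classifier for the large-margin machines of the paper (an assumption on our part, chosen to stay dependency-free), and all names and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data: two Gaussian clusters with labels -1 / +1.
n = 200
x = np.concatenate([rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n)])
y = np.concatenate([-np.ones(n), np.ones(n)])
X = np.column_stack([np.ones_like(x), x])       # intercept + feature

def weighted_fit(pi, iters=1500, lr=0.1):
    """Fit a pi-weighted logistic classifier by gradient descent.
    With weight (1 - pi) on positives and pi on negatives, the
    population-optimal rule predicts +1 exactly when P(y=+1|x) > pi."""
    w = np.zeros(2)
    cost = np.where(y == 1, 1.0 - pi, pi)       # per-example weights
    t = (y == 1).astype(float)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * X.T @ (cost * (p - t)) / len(y)
    return w

def prob_estimate(x0, grid=np.linspace(0.05, 0.95, 19)):
    """Estimate P(y=+1 | x0) from where the weighted classifiers'
    prediction flips from +1 to -1 along the weight grid."""
    feats = np.array([1.0, x0])
    still_positive = [pi for pi in grid if feats @ weighted_fit(pi) > 0]
    return max(still_positive) if still_positive else float(grid[0])
```

No parametric form is assumed for P(y=+1 | x): the probability is extracted purely from a family of classification rules, which is the model-free character of the scheme.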
Uncertainty about probability: a decision analysis perspective
Energy Technology Data Exchange (ETDEWEB)
Howard, R.A.
1988-03-01
The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
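The coin-tossing point can be made exact with a worked example. If the definitive number (the coin's long-run frequency of heads) has a Beta(a, b) distribution, the probability of heads on one toss is its mean, and after seeing a head the probability of another head is E[p²]/E[p] = (a + 1)/(a + b + 1), which exceeds the mean unless the distribution is a point mass. The Beta choice is ours, for illustration; the argument in the abstract holds for any distribution.

```python
def beta_head_probabilities(a, b):
    """For a Beta(a, b) distribution over the definitive number p:
    P(H)       = E[p]            (mean, as logical consistency requires)
    P(H2 | H1) = E[p^2] / E[p]   (probability of a head after seeing one)."""
    mean = a / (a + b)
    second_moment = a * (a + 1) / ((a + b) * (a + b + 1))
    return mean, second_moment / mean

p1, p2 = beta_head_probabilities(50, 50)   # nearly sure the coin is fair
```

For Beta(50, 50), P(H) = 0.5 while P(H2 | H1) = 51/101 ≈ 0.505: a slight increase, which shrinks toward zero as certainty about the definitive number grows, exactly the "seeing a head must increase the probability of heads" conclusion.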
A Course on Elementary Probability Theory
Lo, Gane Samb
2017-01-01
This book introduces the theory of probability from the beginning. Assuming that the reader possesses the normal mathematical level acquired at the end of secondary school, we aim to equip them with a solid basis in probability theory. The theory is preceded by a general chapter on counting methods. Then the theory of probabilities is presented in a discrete framework. Two objectives are sought. The first is to give the reader the ability to solve a large number of problems related t...
Comparing linear probability model coefficients across groups
DEFF Research Database (Denmark)
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
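The scale-parameter component is easy to demonstrate by simulation. In the hedged Monte Carlo sketch below (ours, not the authors' simulation design), two groups share an identical latent effect, yet their fitted LPM slopes differ simply because the groups' residual scales differ.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
beta = 1.0                                   # identical latent effect in both groups

def lpm_slope(error_scale):
    """OLS slope of the linear probability model y = a + b*x + u,
    where y is generated from a latent-variable model."""
    x = rng.normal(0.0, 1.0, n)
    y_latent = beta * x + rng.normal(0.0, error_scale, n)
    y = (y_latent > 0).astype(float)         # observed binary outcome
    return np.cov(x, y)[0, 1] / np.var(x)

slope_group1 = lpm_slope(1.0)                # smaller residual variance
slope_group2 = lpm_slope(2.0)                # larger residual variance
```

Both groups were generated with beta = 1, yet the LPM slope is close to 0.28 in the first group and 0.18 in the second (analytically, 1/√(2π(σ² + 1))), so the coefficient gap reflects the unobserved scale, not a genuine difference in effects.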
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate … multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).
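The key property behind pre-aggregation of distribution-valued aggregates is that combining pre-aggregated distributions gives the same result as aggregating the base cells directly. A minimal sketch, assuming independent cells and a SUM aggregate (our simplification, not the paper's algorithms): each distribution is a value-to-probability dict and SUM-aggregation is discrete convolution.

```python
from collections import defaultdict

def convolve(dist_a, dist_b):
    """SUM-aggregate two independent discrete probability distributions,
    each represented as a {value: probability} dict."""
    out = defaultdict(float)
    for va, pa in dist_a.items():
        for vb, pb in dist_b.items():
            out[va + vb] += pa * pb
    return dict(out)

# Four base cells with uncertain measures.
cells = [{0: 0.5, 1: 0.5}, {0: 0.2, 2: 0.8}, {1: 1.0}, {0: 0.9, 1: 0.1}]

# Pre-aggregate cells 1+2 and 3+4, then combine the pre-aggregates.
pre12 = convolve(cells[0], cells[1])
pre34 = convolve(cells[2], cells[3])
total = convolve(pre12, pre34)
```

Because convolution is associative, the pre-aggregated route yields the same distribution as folding over all four cells directly, which is what makes reuse of pre-aggregates for further aggregation sound.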
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari
Computation of the Complex Probability Function
Energy Technology Data Exchange (ETDEWEB)
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the n-th degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
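The quadrature idea is compact enough to sketch. For Im z > 0 the complex probability (Faddeeva) function is w(z) = (i/π) ∫ e^{−t²}/(z − t) dt, so the n-point Gauss-Hermite rule with Hermite roots x_k and weights w_k gives w(z) ≈ (i/π) Σ_k w_k/(z − x_k). The function name below is ours; this is the textbook rule the report discusses, not the report's own code.

```python
import numpy as np

def faddeeva_gh(z, n=64):
    """Gauss-Hermite approximation of the complex probability function
        w(z) = (i/pi) * Int e^{-t^2} / (z - t) dt,   Im z > 0,
    using the roots and weights of the n-th degree Hermite polynomial:
        w(z) ~ (i/pi) * sum_k  w_k / (z - x_k)."""
    nodes, weights = np.polynomial.hermite.hermgauss(n)
    return 1j / np.pi * np.sum(weights / (z - nodes))
```

The rule's main shortcoming is visible in its structure: as z approaches the real axis, the integrand's pole approaches the quadrature nodes and accuracy collapses. Away from the axis it does well, e.g. on the imaginary axis w(iy) = e^{y²} erfc(y), so w(i) ≈ 0.4276.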
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory."" - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2014-01-01
We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects…
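Why risk neutrality matters for the Quadratic Scoring Rule is a one-line calculation. Taking the common form score = 1 − (report − outcome)² (an assumption here; the study's exact payoff scaling may differ), the expected score p·(1 − (r − 1)²) + (1 − p)·(1 − r²) is maximized at r = p, so a risk-neutral subject reveals the belief truthfully.

```python
import numpy as np

def expected_qsr(report, belief):
    """Expected Quadratic Scoring Rule payoff, score = 1 - (report - outcome)^2,
    for an event believed to occur with probability `belief`."""
    return belief * (1 - (report - 1) ** 2) + (1 - belief) * (1 - report ** 2)

# A risk-neutral agent's best report coincides with the true belief:
reports = np.linspace(0.0, 1.0, 1001)
belief = 0.7
best = reports[np.argmax(expected_qsr(reports, belief))]
```

The QSR is a proper scoring rule only in expected score; a risk-averse subject would shade the report toward 0.5 to reduce payoff variance, which is exactly the distortion the binary lottery procedure is designed to remove.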
Hladíková, Radka
2010-01-01
Title: Data Profiling Author: Radka Hladíková Department: Department of Software Engineering Supervisor: Ing. Vladimír Kyjonka Supervisor's e-mail address: Abstract: This thesis focuses on problems of data quality and data profiling. This work analyses and summarizes problems of data quality, data defects, the data quality process, data quality assessment, and data profiling. The main topic is data profiling as a process of researching data available in existing...
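In its simplest form, data profiling computes per-column summary statistics (null counts, distinct counts, inferred types) over existing data. The sketch below illustrates the idea only; it is our minimal example, not the thesis's tool.

```python
def profile(rows):
    """Minimal column profiling over a list of record dicts:
    per-column null count, distinct count, and observed value types."""
    columns = {key for row in rows for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]        # missing key -> None
        present = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(present),
            "distinct": len(set(present)),
            "types": sorted({type(v).__name__ for v in present}),
        }
    return report

rows = [
    {"id": 1, "name": "Ada", "age": 36},
    {"id": 2, "name": "Alan", "age": None},
    {"id": 3, "name": "Ada"},
]
stats = profile(rows)
```

Even this toy profile surfaces the kinds of data-quality defects the thesis catalogues: missing values ("age" is null or absent in two of three rows), duplicates, and mixed types.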
National Oceanic and Atmospheric Administration, Department of Commerce — NODC Accession 0115173 includes chemical, discrete sample, meteorological, physical, profile and underway - surface data collected from METEOR in the South Atlantic...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected from surface sensors and CTD casts in the Gulf of Alaska from NOAA Ship MILLER FREEMAN from 17 April 1990 to 11 October 1990....
National Oceanic and Atmospheric Administration, Department of Commerce — NCEI Accession 0157461 includes Surface underway, chemical, discrete sample, meteorological, physical and profile data collected from PELICAN in the Coastal Waters...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile and other data were collected from surface sensors and CTD casts from NOAA Ship MILLER FREEMAN and other platforms from 31 January 1988 to 23...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected from surface sensors, bottle casts, and CTD casts in the Bering Sea from the R/V ALPHA HELIX from 21 April 1988 to 20 May...