WorldWideScience

Sample records for high open probability

  1. Probability in High Dimension

    2014-06-30

precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1–96. Lecture Notes in... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165–294. Springer, Berlin (1996) 50. Ledoux

  2. Some open problems in noncommutative probability

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  3. Sill intrusion in volcanic calderas: implications for vent opening probability

    Giudicepietro, Flora; Macedonio, Giovanni; Martini, Marcello; D'Auria, Luca

    2017-04-01

Calderas show peculiar behaviors, with remarkable dynamic processes that do not often culminate in eruptions. Observations and studies conducted in recent decades have shown that the most common cause of unrest in calderas is magma intrusion, in particular the intrusion of sills at shallow depths. Monogenic cones with large areal dispersion are quite common in calderas, suggesting that susceptibility analysis based on geological features is not strictly suitable for estimating the vent opening probability in calderas. In general, the opening of a new eruptive vent can be regarded as a rock failure process. The stress field in the rocks that surround and top the magmatic reservoirs plays an important role in causing rock failure and creating the path that magma can follow towards the surface. In this conceptual framework, we approach the problem of getting clues about the probability of vent opening in volcanic calderas through the study of the stress field produced by the intrusion of magma, in particular by the intrusion of a sill. We simulate the intrusion of a sill free to expand radially, with shape and dimensions that vary with time. The intrusion process is controlled by the elastic response of the rock plate above the sill, which bends because of the intrusion, and by gravity, which drives the magma towards the zones where the sill is thinner. We calculated the stress field in the rock plate above the sill and found that, at the bottom of the plate, the maximum tensile stress is concentrated at the front of the sill and spreads radially with it over time. For this reason, we think that the front of the spreading sill is prone to the opening of eruptive vents. In the central area of the sill the stress intensity is also relatively high, but at the base of the rock plate the stress is compressive. Under isothermal conditions, the stress soon reaches its maximum value (time interval

  4. High throughput nonparametric probability density estimation.

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data, serving as an alternative to the Bayesian or Akaike information criteria. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
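As a rough illustration of the idea behind such a scoring function, the sketch below transforms a sample through a trial CDF and compares the resulting order statistics with their theoretical Beta distributions under the uniform hypothesis. This is not the authors' universal scoring function, only an assumed stand-in showing how atypical fluctuations of single order statistics can be quantified.

```python
import numpy as np
from scipy import stats

def order_statistic_score(sample, trial_cdf):
    """Score how consistent trial_cdf is with the sample, using single order
    statistics: if trial_cdf were the true CDF, the sorted transformed values
    u_(i) would follow Beta(i, n+1-i).  Lower scores flag misfit."""
    u = np.sort(trial_cdf(np.asarray(sample)))
    n = len(u)
    i = np.arange(1, n + 1)
    # Sum of log-densities of each order statistic (a quasi-log-likelihood).
    return stats.beta.logpdf(np.clip(u, 1e-12, 1 - 1e-12), i, n + 1 - i).sum()

# Example: compare the true CDF with a deliberately wrong one.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=500)
print(order_statistic_score(data, stats.norm(0, 1).cdf))   # good fit -> higher score
print(order_statistic_score(data, stats.norm(1, 2).cdf))   # poor fit -> lower score
```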

  5. Probability

    Shiryaev, A N

    1996-01-01

This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  6. Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-04-01

The assessment of the probable spatial distribution of new eruptions is useful to manage and reduce volcanic risk. It can be achieved in different ways, but it becomes especially hard when dealing with volcanic areas that are less studied, poorly monitored and characterized by low-frequency activity, such as El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea" El Hierro had been the least studied volcanic island of the Canaries, with attention historically devoted more to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of volcano-structural data collected mostly on the island and, secondly, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate, weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. The probability of future eruptive events on El Hierro is mainly concentrated on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, with the maximum reached in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts and in the submarine parts of the rifts. This map represents the first effort to deal with the volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.
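The final step described above, turning weighted probability density functions into a vent-opening susceptibility map through a non-homogeneous Poisson process, can be illustrated with a minimal sketch. The datasets, weights, grid and expected vent count below are hypothetical placeholders rather than the values used by Becerril et al.; the sketch only shows the general mechanics of combining weighted kernel density estimates and converting the resulting intensity into a per-cell probability of at least one vent opening.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical structural datasets: (x, y) locations of vents, dykes, faults (km).
rng = np.random.default_rng(0)
datasets = {
    "vents":  rng.random((2, 40)) * 30,
    "dykes":  rng.random((2, 60)) * 30,
    "faults": rng.random((2, 25)) * 30,
}
weights = {"vents": 0.5, "dykes": 0.3, "faults": 0.2}   # assumed relevance weights

# Regular grid covering the island (placeholder extent, km).
x, y = np.meshgrid(np.linspace(0, 30, 150), np.linspace(0, 30, 150))
grid = np.vstack([x.ravel(), y.ravel()])

# Weighted sum of kernel density estimates -> normalized spatial PDF per cell.
pdf = sum(w * gaussian_kde(datasets[k])(grid) for k, w in weights.items())
pdf /= pdf.sum()

# Non-homogeneous Poisson process: with an expected number of new vents N_future
# over the forecast horizon, the per-cell intensity is lam = N_future * pdf and
# the probability of at least one vent opening in that cell is 1 - exp(-lam).
N_future = 1.0                      # assumed expected number of future vents
p_open = 1.0 - np.exp(-N_future * pdf.reshape(x.shape))
print("max per-cell vent-opening probability:", p_open.max())
```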

  7. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes.

    De Gregorio, Sofia; Camarda, Marco

    2016-07-26

In open conduit volcanoes, volatile-rich magma continuously enters the feeding system; nevertheless, the eruptive activity occurs intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events on its own; rather, surpluses of magma input are required to trigger the eruptive activity. The greater the amount of surplus magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input is generally given the same weight. Conversely, herein we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing time series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years.
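A minimal sketch of the idea of quantifying "surplus" input from a monitored time series is given below. The baseline rule (a long-term median standing in for the regular, steady supply) and the synthetic CO2 flux series are assumptions for illustration only, not the processing actually applied by De Gregorio and Camarda.

```python
import numpy as np

def cumulative_surplus(flux, baseline=None):
    """Cumulative surplus of a monitored parameter (e.g. soil CO2 flux) above
    a baseline taken to represent the regular, steady magma supply."""
    flux = np.asarray(flux, dtype=float)
    if baseline is None:
        baseline = np.median(flux)          # assumed proxy for the steady input
    excess = np.clip(flux - baseline, 0.0, None)
    return np.cumsum(excess)

# Synthetic ten-year daily series: steady background plus occasional input pulses.
rng = np.random.default_rng(1)
days = 3650
flux = 100 + 10 * rng.standard_normal(days)
flux[rng.choice(days, 30, replace=False)] += 300        # sporadic surplus inputs

surplus = cumulative_surplus(flux)
print("total accumulated surplus (arbitrary units):", surplus[-1])
```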

  8. Focus in High School Mathematics: Statistics and Probability

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  9. Domestic wells have high probability of pumping septic tank leachate

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
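A back-of-the-envelope version of the intersection probability can be written down if drainfields are idealized as a spatial Poisson process: the chance that none falls inside the well's source area is exp(-density x area). The densities and areas below are illustrative assumptions, not values from the study, which used detailed groundwater flow and transport modeling.

```python
import math

def overlap_probability(septic_density_per_km2, source_area_km2, drainfield_area_km2=0.0):
    """Probability that a well's source area overlaps at least one drainfield,
    treating drainfields as a homogeneous Poisson point process and inflating
    the source area by the drainfield footprint (a crude buffering)."""
    effective_area = source_area_km2 + drainfield_area_km2
    return 1.0 - math.exp(-septic_density_per_km2 * effective_area)

# Illustrative numbers: 40 systems per km^2 (small lots) vs. 4 per km^2,
# with a 0.02 km^2 source area assumed for a shallow domestic well.
for density in (4, 40):
    print(density, "systems/km^2 ->", round(overlap_probability(density, 0.02, 0.001), 3))
```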

  10. Domestic wells have high probability of pumping septic tank leachate

    J. E. Bremer

    2012-08-01

Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  11. High probability of disease in angina pectoris patients

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner

    2007-01-01

BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG...) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade... on CA. Of the patients who also had an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. Addition of at-rest ECG data results in some improvement, particularly...

  12. Targets of DNA-binding proteins in bacterial promoter regions present enhanced probabilities for spontaneous thermal openings

    Apostolaki, Angeliki; Kalosakas, George

    2011-01-01

We mapped promoter regions of double-stranded DNA with respect to the probabilities of appearance of relatively large bubble openings exclusively due to thermal fluctuations at physiological temperatures. We analyzed five well-studied promoter regions of prokaryotic type and found a spatial correlation between the binding sites of transcription factors and the positions of peaks in the probability pattern of large thermal openings. Other distinct peaks of the calculated patterns correlate with potential binding sites of DNA-binding proteins. These results suggest that a DNA molecule would more frequently expose the bases that participate in contacts with proteins, which would probably enhance the probability that the latter reach their targets. They also support using this method as a means to analyze DNA sequences based on their intrinsic thermal properties

  13. A statistical analysis on failure-to-open/close probability of pneumatic valve in sodium cooling systems

    Kurisaka, Kenichi

    1999-11-01

The objective of this study is to develop fundamental data for examining the efficiency of preventive maintenance and surveillance tests from the standpoint of failure probability. In this study, a pneumatic valve in sodium cooling systems was selected as a major standby component. A statistical analysis was made of the trend of the valve failure-to-open/close (FTOC) probability depending on the number of demands ('n'), time since installation ('t') and standby time since the last open/close action ('T'). The analysis is based on the field data of operating and failure experiences stored in the Component Reliability Database and Statistical Analysis System for LMFBRs (CORDS). In the analysis, the FTOC probability ('P') was expressed as follows: P = 1 - exp{-C - En - F/n - λT - aT(t - T/2) - AT²/2}. The functional parameters 'C', 'E', 'F', 'λ', 'a' and 'A' were estimated with the maximum likelihood estimation method. As a result, the FTOC probability is well approximated by the failure probability derived from the failure rate under the assumption of a Poisson distribution only when the valve cycle (i.e. the open-close-open cycle) exceeds about 100 days. When the valve cycle is shorter than about 100 days, the FTOC probability can be adequately estimated with the parameter model proposed in this study. The results obtained from this study may make it possible to derive an adequate frequency of surveillance tests for a given target of the FTOC probability. (author)
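The fitted model form can be written down directly from the expression above. The sketch below evaluates it alongside the simple constant-rate (Poisson) approximation mentioned in the abstract; all parameter values and the operating history are hypothetical placeholders, not the CORDS estimates.

```python
import numpy as np

def ftoc_probability(n, t, T, C, E, F, lam, a, A):
    """FTOC probability model from the abstract:
    P = 1 - exp{-C - E*n - F/n - lam*T - a*T*(t - T/2) - A*T**2/2}."""
    expo = C + E * n + F / n + lam * T + a * T * (t - T / 2.0) + A * T**2 / 2.0
    return 1.0 - np.exp(-expo)

def poisson_only(lam, T):
    """Approximation using a constant failure rate only: P = 1 - exp(-lam*T)."""
    return 1.0 - np.exp(-lam * T)

# Illustrative (hypothetical) parameters and operating history, magnitudes only.
params = dict(C=1e-4, E=1e-5, F=1e-4, lam=1e-4, a=1e-8, A=1e-8)
t = 3650.0                      # time since installation [days]
for T in (10.0, 100.0, 300.0):  # standby time since last open/close action [days]
    p_full = ftoc_probability(n=200, t=t, T=T, **params)
    print(f"T={T:5.0f} d  full model={p_full:.4f}  Poisson only={poisson_only(params['lam'], T):.4f}")
```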

  14. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    Kurisaka, Kenichi

    1998-08-01

The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. This work consists of a statistical trend analysis of the valve failure probability in a failure-to-open/close mode against time since installation and time since the last open/close action, based on field data of operating and failure experience. In this work, both time-dependent and time-independent terms were considered in the failure probability. The linear aging model was modified and applied to the former. In this model there are two terms, with failure rates proportional to time since installation and to time since the last open/close demand. Motor-operated valves (MOVs) in sodium systems were selected for analysis because they provide a sufficient statistical population in the CORDS database, which contains operating data and failure data of components in fast reactors and sodium test facilities. According to these data, the functional parameters were statistically estimated to quantify the valve failure probability in a failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)

  15. A High Five for ChemistryOpen.

    Peralta, David; Ortúzar, Natalia

    2016-02-01

    Fabulous at five! When ChemistryOpen was launched in 2011, it was the first society-owned general chemistry journal to publish open-access articles exclusively. Five years down the line, it has featured excellent work in all fields of chemistry, leading to an impressive first full impact factor of 3.25. In this Editorial, read about how ChemistryOpen has grown over the past five years and made its mark as a high-quality open-access journal with impact.

  16. Neutron emission probability at high excitation and isospin

    Aggarwal, Mamta

    2005-01-01

One-neutron and two-neutron emission probabilities at different excitations and varying isospin have been studied. Several degrees of freedom, such as deformation, rotation, temperature, isospin fluctuations and shell structure, are incorporated via the statistical theory of hot rotating nuclei

  17. High-resolution urban flood modelling - a joint probability approach

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

(Divoky et al., 2005). Nevertheless, such events occur, and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). The November 2009 fluvial-coastal flooding of Cork City, which brought a €100m loss, was one such incident. This event was used by Olbert et al. (2015) to determine the processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess the combined effect of multiple-source flooding on flood probability and severity in urban areas and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms and the thresholds defining flood severity. The two main physical processes controlling urban flooding, high sea water levels (coastal flooding) and high river flows (fluvial flooding), and the threshold values at which flooding is likely to occur, are considered in this study. The contributions of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows, and ultimately the application of the joint probability method to estimate joint exceedance return periods for combinations of surge, tide and river flow. In the second step, a numerical model of Cork Harbour, MSN_Flood, comprising a cascade of four nested high-resolution models, is used to simulate flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al
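The first step, the joint probability of surge and river flow, can be illustrated with a minimal empirical sketch: given paired annual-maximum surges and flows, the joint exceedance return period of a (surge, flow) pair is the reciprocal of the empirical probability that both are exceeded in the same year. The synthetic, correlated data below are assumptions for illustration; the study itself fits extreme value distributions rather than relying on raw empirical frequencies.

```python
import numpy as np

def joint_exceedance_return_period(surge, flow, s_thr, q_thr):
    """Empirical joint exceedance return period (years) for annual maxima:
    T = 1 / P(surge > s_thr AND flow > q_thr)."""
    surge, flow = np.asarray(surge), np.asarray(flow)
    p_joint = np.mean((surge > s_thr) & (flow > q_thr))
    return np.inf if p_joint == 0 else 1.0 / p_joint

# Synthetic 200-year record of correlated annual maxima (the same storms often
# drive both high surge and high river flow).
rng = np.random.default_rng(2)
common = rng.gumbel(size=200)
surge = 1.0 + 0.3 * common + 0.1 * rng.standard_normal(200)     # metres
flow = 150 + 60 * common + 20 * rng.standard_normal(200)        # m^3/s

print("T_joint(surge>1.5 m, flow>250 m^3/s):",
      joint_exceedance_return_period(surge, flow, 1.5, 250), "years")
```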

  18. Varicocoelectomy in adolescents: Laparoscopic versus open high ...

    Background: Treatment of varicocoele is aimed at eliminating the retrograde reflux of venous blood through the internal spermatic veins. The purpose of this investigation was to compare laparoscopic varicocoelectomy (LV) with open high ligation technique in the adolescent population. Materials and Methods: We ...

  19. High-resolution elastic recoil detection utilizing Bayesian probability theory

    Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.

    2001-01-01

Elastic recoil detection (ERD) analysis is improved in view of depth resolution and the reliability of the measured spectra. Good statistics at even low ion fluences is obtained utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and using a 40 MeV ¹⁹⁷Au beam. In this way the elemental depth profiles are not essentially altered during analysis even if distributions with area densities below 1×10¹⁴ atoms/cm² are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. It allows the measured spectra to be deconvoluted using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated on ¹³C depth profiles measured at ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4×10¹⁵ atoms/cm²

  20. Numerical experiments on the probability of seepage into underground openings in heterogeneous fractured rock

    Birkholzer, J.; Li, G.; Tsang, C.F.; Tsang, Y.

    1998-01-01

An important issue for the performance of underground nuclear waste repositories is the rate of seepage into the waste emplacement drifts. A prediction of this rate is particularly complicated for the potential repository site at Yucca Mountain, Nevada, because it is located in thick, unsaturated, fractured tuff formations. Underground openings in unsaturated media might act as capillary barriers, diverting water around them. In the present work, the authors study the potential rate of seepage into drifts as a function of the percolation flux at Yucca Mountain, based on a stochastic model of the fractured rock mass in the drift vicinity. A variety of flow scenarios are considered, assuming present-day and possible future climate conditions. They show that the heterogeneity in the flow domain is a key factor controlling seepage rates, since it causes channelized flow and local ponding in the unsaturated flow field

  1. Probability based high temperature engineering creep and structural fire resistance

    Razdolsky, Leo

    2017-01-01

This volume on structural fire resistance is for aerospace, structural, and fire prevention engineers; architects; and educators. It bridges the gap between prescriptive- and performance-based methods and simplifies very complex and comprehensive computer analyses to the point that the structural fire resistance and high-temperature creep deformations have a simple, approximate analytical expression that can be used in structural analysis and design. The book emphasizes methods of the theory of engineering creep (stress-strain diagrams) and mathematical operations quite distinct from those of solid mechanics in the absence of high-temperature creep deformations, in particular the classical theory of elasticity and structural engineering. Dr. Razdolsky's previous books focused on methods of computing the ultimate structural design load under different fire scenarios. The current work is devoted to computing the estimated ultimate resistance of the structure taking into account the effect of high temperatur...

  2. How to Recognize and Avoid Potential, Possible, or Probable Predatory Open-Access Publishers, Standalone, and Hijacked Journals.

    Danevska, Lenche; Spiroski, Mirko; Donev, Doncho; Pop-Jordanova, Nada; Polenakovic, Momir

    2016-11-01

The Internet has enabled an easy method to search through the vast majority of publications and has improved the impact of scholarly journals. However, it can also pose threats to the quality of published articles. New publishers and journals have emerged: so-called potential, possible, or probable predatory open-access publishers and journals, and so-called hijacked journals. It was our aim to increase awareness and warn scholars, especially young researchers, how to recognize these journals and how to avoid submitting their papers to them. Review and critical analysis of the relevant published literature, Internet sources, and the personal experience, thoughts, and observations of the authors. The web blog of Jeffrey Beall, University of Colorado, was consulted extensively. Jeffrey Beall is a Denver academic librarian who regularly maintains two lists: the first of potential, possible, or probable predatory publishers, and the second of potential, possible, or probable predatory standalone journals. Aspects related to this topic presented by other authors have been discussed as well. Academics should bear in mind how to differentiate between trustworthy, reliable journals and predatory ones, considering: publication ethics, peer-review process, international academic standards, indexing and abstracting, preservation in digital repositories, metrics, sustainability, etc.

  3. High probability of comorbidities in bronchial asthma in Germany.

    Heck, S; Al-Shobash, S; Rapp, D; Le, D D; Omlor, A; Bekhit, A; Flaig, M; Al-Kadah, B; Herian, W; Bals, R; Wagenpfeil, S; Dinh, Q T

    2017-04-21

Clinical experience has shown that allergic and non-allergic respiratory, metabolic, mental, and cardiovascular disorders sometimes coexist with bronchial asthma. However, no study has been carried out that calculates the chance of manifestation of these disorders with bronchial asthma in Saarland and Rhineland-Palatinate, Germany. Using ICD10 diagnoses from health care institutions, the present study systematically analyzed the co-prevalence and odds ratios of comorbidities in the asthma population in Germany. The odds ratios were adjusted for age and sex for all comorbidities for patients with asthma vs. without asthma. Bronchial asthma was strongly associated with allergic and, to a lesser extent, with non-allergic comorbidities: OR 7.02 (95%CI:6.83-7.22) for allergic rhinitis; OR 4.98 (95%CI:4.67-5.32) allergic conjunctivitis; OR 2.41 (95%CI:2.33-2.52) atopic dermatitis; OR 2.47 (95%CI:2.16-2.82) food allergy, and OR 1.69 (95%CI:1.61-1.78) drug allergy. Interestingly, increased ORs were found for respiratory diseases: 2.06 (95%CI:1.64-2.58) vocal dysfunction; 1.83 (95%CI:1.74-1.92) pneumonia; 1.78 (95%CI:1.73-1.84) sinusitis; 1.71 (95%CI:1.65-1.78) rhinopharyngitis; 2.55 (95%CI:2.03-3.19) obstructive sleep apnea; 1.42 (95%CI:1.25-1.61) pulmonary embolism, and 3.75 (95%CI:1.64-8.53) bronchopulmonary aspergillosis. Asthmatics also suffer from psychiatric, metabolic, cardiac or other comorbidities. Myocardial infarction (OR 0.86, 95%CI:0.79-0.94) did not coexist with asthma. Based on the calculated chances of manifestation, these comorbidities, especially allergic and respiratory, and to a lesser extent also metabolic, cardiovascular, and mental disorders, should be taken into consideration in the diagnostic and treatment strategy of bronchial asthma. PREVALENCE OF CO-EXISTING DISEASES IN GERMANY: Patients in Germany with bronchial asthma are highly likely to suffer from co-existing diseases and their treatments should reflect this. Quoc Thai Dinh at Saarland

  4. Opening the high-energy frontier

    Quigg, C.

    1988-12-01

I review the scientific motivation for an experimental assault on the 1-TeV scale, elaborating the idea of technicolor as one interesting possibility for what may be found there. I then summarize some of the discovery possibilities opened by a high-luminosity, multi-TeV proton-proton collider. After a brief resume of the experimental environment anticipated at the SSC, I report on the status of the SSC R&D effort and discuss the work to be carried out over the course of the next year. 37 refs., 10 figs., 1 tab

  5. Resveratrol enhances airway surface liquid depth in sinonasal epithelium by increasing cystic fibrosis transmembrane conductance regulator open probability.

    Shaoyan Zhang

Chronic rhinosinusitis engenders enormous morbidity in the general population, and is often refractory to medical intervention. Compounds that augment mucociliary clearance in airway epithelia represent a novel treatment strategy for diseases of mucus stasis. A dominant fluid and electrolyte secretory pathway in the nasal airways is governed by the cystic fibrosis transmembrane conductance regulator (CFTR). The objectives of the present study were to test resveratrol, a strong potentiator of CFTR channel open probability, in preparation for a clinical trial of mucociliary activators in human sinus disease. Primary sinonasal epithelial cells, immortalized bronchoepithelial cells (wild type and F508del CFTR), and HEK293 cells expressing exogenous human CFTR were investigated by Ussing chamber as well as patch clamp technique under non-phosphorylating conditions. Effects on airway surface liquid depth were measured using confocal laser scanning microscopy. Impact on CFTR gene expression was measured by quantitative reverse transcriptase polymerase chain reaction. Resveratrol is a robust CFTR channel potentiator in numerous mammalian species. The compound also activated temperature-corrected F508del CFTR and enhanced CFTR-dependent chloride secretion in human sinus epithelium ex vivo to an extent comparable to the recently approved CFTR potentiator, ivacaftor. Using inside-out patches from apical membranes of murine cells, resveratrol stimulated an ~8 picosiemens chloride channel consistent with CFTR. This observation was confirmed in HEK293 cells expressing exogenous CFTR. Treatment of sinonasal epithelium resulted in a significant increase in airway surface liquid depth (in µm: 8.08 ± 1.68 vs. 6.11 ± 0.47, control, p<0.05). There was no increase in CFTR mRNA. Resveratrol is a potent chloride secretagogue from the mucosal surface of sinonasal epithelium, and hydrates airway surface liquid by increasing CFTR channel open probability. The foundation for a

  6. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.

  7. Increasing Classroom Compliance: Using a High-Probability Command Sequence with Noncompliant Students

    Axelrod, Michael I.; Zank, Amber J.

    2012-01-01

    Noncompliance is one of the most problematic behaviors within the school setting. One strategy to increase compliance of noncompliant students is a high-probability command sequence (HPCS; i.e., a set of simple commands in which an individual is likely to comply immediately prior to the delivery of a command that has a lower probability of…

  8. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.

  9. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect

    Gosling, Corentin J.; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808

  10. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and which had uncertainties due to variances in material strengths and analytical uncertainties which were between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5-95 percentile values are 112 psia and 146.7 psia respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also probabilities which are more rigorously traceable from first principles
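A minimal sketch of a weakest-link model with overlapping failure modes is shown below: each mode gets its own Weibull fragility curve, and the containment fails if any mode fails, so the combined conditional failure probability at pressure p is 1 minus the product of the individual survival probabilities. The mode parameters are hypothetical values loosely echoing the clustering described in the abstract, not the Millstone Unit 3 numbers.

```python
import numpy as np

def weibull_cdf(p, scale, shape, loc=0.0):
    """Weibull fragility curve: probability that a single failure mode has
    occurred by pressure p (psia)."""
    z = np.clip((np.asarray(p, dtype=float) - loc) / scale, 0.0, None)
    return 1.0 - np.exp(-z**shape)

def weakest_link_failure(p, modes):
    """Combined conditional failure probability when any one of several
    overlapping modes causes failure: 1 - prod_i (1 - F_i(p))."""
    survival = np.ones_like(np.asarray(p, dtype=float))
    for scale, shape, loc in modes:
        survival *= 1.0 - weibull_cdf(p, scale, shape, loc)
    return 1.0 - survival

# Hypothetical modes with mean failure pressures clustered within a few psi.
modes = [(45.0, 4.0, 95.0), (48.0, 4.5, 93.0), (42.0, 3.5, 97.0)]
pressures = np.array([112.0, 132.0, 147.0])
for p, prob in zip(pressures, weakest_link_failure(pressures, modes)):
    print(f"P(failure | {p:.0f} psia) = {prob:.3f}")
```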

  11. Interpretation of highly visual 'open' advertisements in Dutch magazines

    Ketelaar, P.E.; Gisbergen, M.S.; Beentjes, J.

    2012-01-01

    In recent decades magazine advertisers have used an increasing number of highly visual open ads. Open ads do not guide consumers toward a specific interpretation as traditional ads do. An experiment was carried out to establish the effects of openness on interpretation. As expected, openness was

  12. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high-temperature, high-purity water. The following conclusions were reached: (1) The initiation process of intergranular stress corrosion cracking has been assumed to be approximated by a Poisson stochastic process, based on the CBB test results. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)
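The Poisson-initiation argument can be mirrored in a few lines: if crack initiation events occur at a constant rate, the time to first initiation is exponentially distributed, and a sample of lifetimes can be checked against that model. The failure-time data below are synthetic placeholders, not the CBB or constant-load results.

```python
import numpy as np
from scipy import stats

# Synthetic lifetimes (hours) standing in for measured SCC initiation times.
rng = np.random.default_rng(3)
lifetimes = rng.exponential(scale=500.0, size=40)

# Under a Poisson initiation process the life is exponential; the MLE of the
# rate is 1/mean, and a Kolmogorov-Smirnov test checks the fit.
rate_mle = 1.0 / lifetimes.mean()
ks = stats.kstest(lifetimes, "expon", args=(0, 1.0 / rate_mle))
print(f"estimated initiation rate: {rate_mle:.5f} per hour")
print(f"KS statistic={ks.statistic:.3f}, p-value={ks.pvalue:.3f}")
```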

  13. High explosive driven plasma opening switches

    Greene, A.E.; Bowers, R.L.; Brownell, J.H.; Goforth, J.H.; Oliphant, T.A.; Weiss, D.L.

    1983-01-01

    A joint theoretical and experimental effort is underway to understand and improve upon the performance of high explosive driven plasma opening switches such as those first described by Pavlovskii et al. We have modeled these switches in both planar and cylindrical geometry using a one dimensional Lagrangian MHD code. This one-dimensional analysis is now essentially complete. It has shown that simple, one-dimensional, compression of the current-carrying channel can explain the observed resistance increases during the time of flight of the HE detonation products. Our calculations imply that ionization plays an important role as an energy sink and the performance of these switches might be improved by a judicious choice of gases. We also predict improved performance by lowering the pressure in the plasma channel. The bulk of our experimental effort to date has been with planar switches. We have worked with current densities of 0.25 to 0.4 MA/cm and have observed resistance increases of 40 to 60 mΩ. Significant resistance increases are observed later than the time of flight of the HE detonation products. We suggest that these resistance increases are due to mixing between the hot plasma and the relatively cooler detonation products. Such mixing is not included in the 1-D, Lagrangian code. We are presently beginning a computational effort with a 2-D Eulerian code. The status of this effort is discussed. Experimentally we have designed an apparatus that will permit us to test the role of different gases and pressures. This system is also in a planar geometry, but the plasma channel is doughnut shaped, permitting us to avoid edge effects associated with the planar rectangular geometry. The first experiments with this design are quite encouraging and the status of this effort is also discussed

  14. Innovative and high quality education through Open Education and OER

    Stracke, Christian M.

    2017-01-01

    Online presentation and webinar by Stracke, C. M. (2017, 18 December) on "Innovative and high quality education through Open Education and OER" for the Belt and Road Open Education Learning Week by the Beijing Normal University, China.

  15. Decomposition of conditional probability for high-order symbolic Markov chains

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
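For a binary alphabet and weak correlations, the additive-Markov-chain picture mentioned above can be sketched as follows: the conditional probability is written as the mean plus a memory-function sum over past symbols, and the memory function is obtained from the two-point correlation function by solving a Yule-Walker-like linear system. This is a simplified stand-in for the general decomposition of Melnik and Usatenko, with a synthetic sequence and an assumed memory depth.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def memory_function(seq, depth):
    """Estimate the memory function F(r), r=1..depth, of a binary additive
    Markov chain from its two-point correlation function by solving the
    Yule-Walker-like system  C(r) = sum_r' F(r') C(r - r')  (weak correlations)."""
    s = np.asarray(seq, dtype=float) - np.mean(seq)
    n = len(s)
    corr = np.array([np.dot(s[:n - r], s[r:]) / (n - r) for r in range(depth + 1)])
    return solve_toeplitz(corr[:depth], corr[1:depth + 1])

def conditional_probability(history, mean, F):
    """P(next symbol = 1 | history) = mean + sum_r F(r) * (a_{t-r} - mean)."""
    past = np.asarray(history, dtype=float)[::-1][:len(F)]   # most recent first
    return mean + np.dot(F, past - mean)

# Synthetic weakly correlated binary sequence (hypothetical stand-in for data).
rng = np.random.default_rng(4)
x = (rng.random(100_000) < 0.5).astype(int)
F = memory_function(x, depth=5)
print("memory function:", np.round(F, 4))
print("P(1 | last five symbols 1,0,1,1,0):",
      conditional_probability([1, 0, 1, 1, 0], x.mean(), F))
```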

  16. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applying the population demography of each country. Worldwide, 21 million men and 137 million women had a fracture probability at or above the threshold in 2010. The greatest numbers of men and women at high risk were in Asia (55 %). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with a high probability of osteoporotic fracture comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.

  17. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    ESSENTIALS: When high probability of pulmonary embolism (PE), sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely
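The pooled VTE incidences and confidence intervals quoted above can be reproduced with a short calculation. The sketch below uses the normal-approximation (Wald) interval, which matches the reported figures (e.g. 1.5-9.0% for 7 of 134 patients), though an exact binomial interval would differ slightly.

```python
import math

def wald_ci(events, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion."""
    p = events / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(p - half, 0.0), p + half

for events, n in [(7, 134), (5, 48), (2, 86)]:
    p, lo, hi = wald_ci(events, n)
    print(f"{events}/{n}: {100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f})")
```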

  18. Mining of high utility-probability sequential patterns from uncertain databases.

    Binbin Zhang

High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments on both real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds.

  19. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Christer Björkman

    2011-01-01

It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations. Here we present data on a willow leaf beetle species, showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and based on regional climate change scenarios, we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature. If the climate becomes warmer and there is latent variation among individuals in their temperature response, the probability of outbreaks may increase. However, the likelihood that microevolution plays a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affects the probability of extinctions. Our results highlight the need for caution when making predictions about the probabilities of extreme population events.

  20. Probability of fracture and life extension estimate of the high-flux isotope reactor vessel

    Chang, S.J.

    1998-01-01

The state of vessel steel embrittlement as a result of neutron irradiation can be measured by its increase in ductile-brittle transition temperature (DBTT) for fracture, often denoted by RT_NDT for carbon steel. This transition temperature can be calibrated by the drop-weight test and, sometimes, by the Charpy impact test. The life extension for the high-flux isotope reactor (HFIR) vessel is calculated by using the method of fracture mechanics incorporating the effect of the DBTT change. The failure probability of the HFIR vessel is limited, as is the life of the vessel, by the reactor core melt probability of 10⁻⁴. The operating safety of the reactor is ensured by a periodic hydrostatic pressure test (hydrotest). The hydrotest is performed in order to determine a safe vessel static pressure. The fracture probability as a result of the hydrostatic pressure test is calculated and is used to determine the life of the vessel. Failure to perform the hydrotest imposes a limit on the life of the vessel. Conventional methods of fracture probability calculation, such as those used by the NRC-sponsored PRAISE code and the FAVOR code developed in this laboratory, are based on Monte Carlo simulation. Heavy computations are required. An alternative method of fracture probability calculation by direct probability integration is developed in this paper. The present approach offers simple and expedient ways to obtain numerical results without losing any generality. In this paper, numerical results on (1) the probability of vessel fracture, (2) the hydrotest time interval, and (3) the hydrotest pressure as a result of the DBTT increase are obtained
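As a generic illustration of direct probability integration (not the authors' specific HFIR model), the sketch below computes a fracture probability by integrating the product of the applied stress-intensity density and the probability that the fracture toughness lies below that load, instead of sampling by Monte Carlo. The distributions and parameters are hypothetical.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Hypothetical distributions (units arbitrary): applied stress intensity during
# the hydrotest, and fracture toughness K_Ic degraded by the DBTT shift.
k_applied = stats.norm(loc=60.0, scale=8.0)
k_toughness = stats.lognorm(s=0.25, scale=90.0)

# Direct probability integration (stress-strength interference):
# P_f = integral of f_applied(k) * P(K_Ic < k) dk, evaluated on a grid.
k = np.linspace(0.0, 200.0, 20_001)
p_fracture = trapezoid(k_applied.pdf(k) * k_toughness.cdf(k), k)
print(f"fracture probability (direct integration): {p_fracture:.3e}")

# Cross-check with a crude Monte Carlo estimate.
rng = np.random.default_rng(5)
mc = np.mean(k_toughness.rvs(1_000_000, random_state=rng)
             < k_applied.rvs(1_000_000, random_state=rng))
print(f"fracture probability (Monte Carlo):        {mc:.3e}")
```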

  1. Prognostic value of stress echocardiography in women with high (⩾80%) probability of coronary artery disease

    Davar, J; Roberts, E; Coghlan, J; Evans, T; Lipkin, D

    2001-01-01

    OBJECTIVE—To assess the prognostic significance of stress echocardiography in women with a high probability of coronary artery disease (CAD).
SETTING—Secondary and tertiary cardiology unit at a university teaching hospital.
PARTICIPANTS—A total of 135 women (mean (SD) age 63 (9) years) with pre-test probability of CAD ⩾80% were selected from a database of patients investigated by treadmill or dobutamine stress echocardiography between 1995 and 1998.
MAIN OUTCOME MEASURES—Patients were followe...

  2. High frequency response of open quantum dots

    Brunner, R.; Meisels, R.; Kuchar, F.; Ferry, D.; Elhassan, M.; Ishibashi, K.

    2002-01-01

We investigate the response of the transport through open quantum dots to millimeter-wave radiation (up to 55 GHz). In the low-field region ( 11 cm⁻² and a mobility of 1.2×10⁶ cm²/Vs. By applying a sufficiently negative voltage to the gates, the 2DES is split into two regions connected only by a dot-like region (about 350 nm diameter) between them. The DC data exhibit backscattering peaks at fields of a few tenths of a tesla. Shubnikov-de Haas (SdH) oscillations appear above 0.5 T. While the SdH oscillations show the usual temperature dependence, the backscattering peaks are temperature independent up to 2.5 K. The backscattering peak shows a reduction of 10 percent due to the millimeter-wave irradiation. However, due to the temperature independence of this peak, this reduction cannot simply be attributed to electron heating. This conclusion is supported by the observation of a strong frequency dependence of the reduction of the peak height. (author)

  3. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) level in Mexico. The system was first implemented as a desktop application and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  4. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

High-frequency pulses improve the machining efficiency of micro electrical discharge machining (micro EDM), but they also bring some changes to the micro EDM process. This paper focuses on the influence of the skin effect under high-frequency pulses on energy distribution and transmission in micro EDM, and on this basis the rules governing the discharge probability on the electrode end face are also analysed. Based on the electrical discharge process under high-frequency pulse conditions in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model in the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, electrical currents, and permeability conditions are studied in order to obtain the distribution pattern of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is regarded as the parameter governing the discharge probability on the electrode end face. Finally, MATLAB is used to fit the curves and obtain the distribution of discharge probability over the electrode end face.

  5. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most probability modeling studies of high flow extremes have focused on basins in the humid and semi-humid south and east of China, whereas for the inland river basins that occupy about 35% of the country's area such studies remain scarce, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peaks-over-threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the assumptions inherent in POT series are elaborated in detail. For comparison, other widely used probability distributions including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the mean excess plot, the stability of model parameters, the return level plot and the independence assumption of POT series, an optimum threshold of 340 m³/s is determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, Pettitt test and autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
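
    A minimal sketch of the peaks-over-threshold workflow described in this record, assuming synthetic daily flows and scipy's generalized Pareto fit; the flow series, the 31-year record length and the 340 m³/s threshold are stand-ins for the Yingluoxia data, which are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
daily_flow = rng.gamma(shape=2.0, scale=60.0, size=31 * 365)  # synthetic daily flows (m^3/s)

threshold = 340.0                                   # threshold suggested by mean-excess diagnostics
exceedances = daily_flow[daily_flow > threshold] - threshold

# Maximum likelihood fit of the Generalized Pareto Distribution to the excesses
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Return level for a T-year event: GPD quantile scaled by the annual exceedance rate
lam = len(exceedances) / 31.0                       # exceedances per year
T = 100.0
return_level = threshold + stats.genpareto.ppf(1.0 - 1.0 / (lam * T), shape, loc=0.0, scale=scale)
print(f"GPD shape={shape:.3f}, scale={scale:.1f}, {T:.0f}-yr return level = {return_level:.0f} m^3/s")
```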

  6. High pressure photoinduced ring opening of benzene

    Ciabini, Lucia; Santoro, Mario; Bini, Roberto; Schettino, Vincenzo

    2002-01-01

The chemical transformation of crystalline benzene into an amorphous solid (a-C:H) was induced at high pressure by employing laser light of suitable wavelengths. The reaction was forced to occur at 16 GPa, well below the pressure value (23 GPa) at which the reaction normally occurs. Different laser sources were used to tune the pumping wavelength into the red wing of the absorption edge of the first excited singlet state S₁ (¹B₂ᵤ). Here the benzene ring is distorted, presenting a greater flexibility which makes the molecule unstable at high pressure. The selective pumping of the S₁ level, in addition to structural considerations, was of paramount importance in clarifying the mechanism of the reaction

  7. Dynamic Open Inquiry Performances of High-School Biology Students

    Zion, Michal; Sadeh, Irit

    2010-01-01

    In examining open inquiry projects among high-school biology students, we found dynamic inquiry performances expressed in two criteria: "changes occurring during inquiry" and "procedural understanding". Characterizing performances in a dynamic open inquiry project can shed light on both the procedural and epistemological…

  8. A prototype method for diagnosing high ice water content probability using satellite imager data

    Yost, Christopher R.; Bedka, Kristopher M.; Minnis, Patrick; Nguyen, Louis; Strapp, J. Walter; Palikonda, Rabindra; Khlopenkov, Konstantin; Spangenberg, Douglas; Smith, William L., Jr.; Protat, Alain; Delanoe, Julien

    2018-03-01

    Recent studies have found that ingestion of high mass concentrations of ice particles in regions of deep convective storms, with radar reflectivity considered safe for aircraft penetration, can adversely impact aircraft engine performance. Previous aviation industry studies have used the term high ice water content (HIWC) to define such conditions. Three airborne field campaigns were conducted in 2014 and 2015 to better understand how HIWC is distributed in deep convection, both as a function of altitude and proximity to convective updraft regions, and to facilitate development of new methods for detecting HIWC conditions, in addition to many other research and regulatory goals. This paper describes a prototype method for detecting HIWC conditions using geostationary (GEO) satellite imager data coupled with in situ total water content (TWC) observations collected during the flight campaigns. Three satellite-derived parameters were determined to be most useful for determining HIWC probability: (1) the horizontal proximity of the aircraft to the nearest overshooting convective updraft or textured anvil cloud, (2) tropopause-relative infrared brightness temperature, and (3) daytime-only cloud optical depth. Statistical fits between collocated TWC and GEO satellite parameters were used to determine the membership functions for the fuzzy logic derivation of HIWC probability. The products were demonstrated using data from several campaign flights and validated using a subset of the satellite-aircraft collocation database. The daytime HIWC probability was found to agree quite well with TWC time trends and identified extreme TWC events with high probability. Discrimination of HIWC was more challenging at night with IR-only information. The products show the greatest capability for discriminating TWC ≥ 0.5 g m-3. Product validation remains challenging due to vertical TWC uncertainties and the typically coarse spatio-temporal resolution of the GEO data.
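
    To make the fuzzy-logic combination concrete, here is an illustrative sketch in Python; the membership-function shapes, breakpoints and weights are invented for exposition and are not the values the study derives from the collocated TWC/GEO statistics.

```python
import numpy as np

def ramp(x, lo, hi):
    """Linear membership rising from 0 at `lo` to 1 at `hi` (clipped)."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def hiwc_probability(dist_to_updraft_km, trop_rel_bt_k, cloud_optical_depth):
    m_prox = 1.0 - ramp(dist_to_updraft_km, 10.0, 100.0)  # closer to updraft -> higher membership
    m_bt   = 1.0 - ramp(trop_rel_bt_k, -5.0, 15.0)        # colder than tropopause -> higher
    m_tau  = ramp(cloud_optical_depth, 20.0, 80.0)        # optically thick -> higher (daytime only)
    weights = np.array([0.4, 0.35, 0.25])                 # illustrative weights
    return float(np.dot(weights, [m_prox, m_bt, m_tau]))

print(hiwc_probability(dist_to_updraft_km=15.0, trop_rel_bt_k=-8.0, cloud_optical_depth=60.0))
```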

  9. Open cell conducting foams for high synchrotron radiation accelerators

    S. Petracca

    2014-08-01

    Full Text Available The possible use of open cell conductive foams in high synchrotron radiation particle accelerators is considered. Available materials and modeling tools are reviewed, potential pros and cons are discussed, and preliminary conclusions are drawn.

  10. High Quality Education and Learning for All through Open Education

    Stracke, Christian M.

    2016-01-01

    Keynote at the International Lensky Education Forum 2016, Yakutsk, Republic of Sakha, Russian Federation, by Stracke, C. M. (2016, 16 August): "High Quality Education and Learning for All through Open Education"

  11. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  12. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Krupnick, A.J.; Markandya, A.; Nickell, E.

    1994-01-01

This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report
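
    As a hedged illustration of the bookkeeping behind the expert expected damage (EED) approach described above (the notation here is ours, not the report's), expected damages are simply probability-weighted monetized consequences,

    \[
    \mathrm{EED} \;=\; \sum_i p_i^{\mathrm{expert}} \, C_i ,
    \]

    whereas the ex ante perspective articulated in this paper would replace the risk-neutral expectation over expert probabilities with a willingness-to-pay measure that reflects lay risk perceptions and risk aversion.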

  13. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Krupnick, A J; Markandya, A; Nickell, E

    1994-07-01

This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.

  14. A 'new' Cromer-related high frequency antigen probably antithetical to WES.

    Daniels, G L; Green, C A; Darr, F W; Anderson, H; Sistonen, P

    1987-01-01

    An antibody to a high frequency antigen, made in a WES+ Black antenatal patient (Wash.), failed to react with the red cells of a presumed WES+ homozygote and is, therefore, probably antithetical to anti-WES. Like anti-WES, it reacted with papain, ficin, trypsin or neuraminidase treated cells but not with alpha-chymotrypsin or pronase treated cells and was specifically inhibited by concentrated serum. It also reacted more strongly in titration with WES- cells than with WES+ cells. The antibody is Cromer-related as it failed to react with Inab phenotype (IFC-) cells and reacted only weakly with Dr(a-) cells. Wash. cells and those of the other possible WES+ homozygote are Cr(a+) Tc(a+b-c-) Dr(a+) IFC+ but reacted only very weakly with anti-Esa.

  15. Long-term survival in laparoscopic vs open resection for colorectal liver metastases: inverse probability of treatment weighting using propensity scores.

    Lewin, Joel W; O'Rourke, Nicholas A; Chiow, Adrian K H; Bryant, Richard; Martin, Ian; Nathanson, Leslie K; Cavallucci, David J

    2016-02-01

This study compares long-term outcomes between intention-to-treat laparoscopic and open approaches to colorectal liver metastases (CLM), using inverse probability of treatment weighting (IPTW) based on propensity scores to control for selection bias. Patients undergoing liver resection for CLM by 5 surgeons at 3 institutions from 2000 to early 2014 were analysed. IPTW based on propensity scores was generated and used to assess the marginal treatment effect of the laparoscopic approach via a weighted Cox proportional hazards model. A total of 298 operations were performed in 256 patients. Seven patients with planned two-stage resections were excluded, leaving 284 operations in 249 patients for analysis. After IPTW, the population was well balanced. With a median follow-up of 36 months, 5-year overall survival (OS) and recurrence-free survival (RFS) for the cohort were 59% and 38%. A total of 146 laparoscopic procedures were performed in 140 patients, with weighted 5-year OS and RFS of 54% and 36%, respectively. In the open group, 138 procedures were performed in 122 patients, with weighted 5-year OS and RFS of 63% and 38%, respectively. There was no significant difference between the two groups in terms of OS or RFS. In the Brisbane experience, after accounting for bias in treatment assignment, long-term survival after laparoscopic liver resection for CLM is equivalent to outcomes in open surgery. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
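
    A minimal sketch of the IPTW-plus-weighted-Cox workflow this record describes, assuming scikit-learn for the propensity model and lifelines for the weighted Cox fit; the file name, covariates and column names are hypothetical and do not reproduce the authors' data or code.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("clm_resections.csv")          # hypothetical file, one row per operation

covariates = ["age", "tumour_count", "tumour_size", "synchronous", "asa_score"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["laparoscopic"])
ps = ps_model.predict_proba(df[covariates])[:, 1]

# Inverse probability of treatment weights (unstabilised)
df["iptw"] = df["laparoscopic"] / ps + (1 - df["laparoscopic"]) / (1 - ps)

# Weighted Cox proportional hazards model for overall survival
cph = CoxPHFitter()
cph.fit(df[["os_months", "death", "laparoscopic", "iptw"]],
        duration_col="os_months", event_col="death",
        weights_col="iptw", robust=True)
cph.print_summary()
```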

  16. High-severity fire: evaluating its key drivers and mapping its probability across western US forests

    Parks, Sean A.; Holsinger, Lisa M.; Panunto, Matthew H.; Jolly, W. Matt; Dobrowski, Solomon Z.; Dillon, Gregory K.

    2018-04-01

    Wildland fire is a critical process in forests of the western United States (US). Variation in fire behavior, which is heavily influenced by fuel loading, terrain, weather, and vegetation type, leads to heterogeneity in fire severity across landscapes. The relative influence of these factors in driving fire severity, however, is poorly understood. Here, we explore the drivers of high-severity fire for forested ecoregions in the western US over the period 2002–2015. Fire severity was quantified using a satellite-inferred index of severity, the relativized burn ratio. For each ecoregion, we used boosted regression trees to model high-severity fire as a function of live fuel, topography, climate, and fire weather. We found that live fuel, on average, was the most important factor driving high-severity fire among ecoregions (average relative influence = 53.1%) and was the most important factor in 14 of 19 ecoregions. Fire weather was the second most important factor among ecoregions (average relative influence = 22.9%) and was the most important factor in five ecoregions. Climate (13.7%) and topography (10.3%) were less influential. We also predicted the probability of high-severity fire, were a fire to occur, using recent (2016) satellite imagery to characterize live fuel for a subset of ecoregions in which the model skill was deemed acceptable (n = 13). These ‘wall-to-wall’ gridded ecoregional maps provide relevant and up-to-date information for scientists and managers who are tasked with managing fuel and wildland fire. Lastly, we provide an example of the predicted likelihood of high-severity fire under moderate and extreme fire weather before and after fuel reduction treatments, thereby demonstrating how our framework and model predictions can potentially serve as a performance metric for land management agencies tasked with reducing hazardous fuel across large landscapes.
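
    A sketch of the kind of model used here, with scikit-learn's GradientBoostingClassifier standing in for the authors' boosted regression trees; the predictor names, input file and hyperparameters are illustrative assumptions rather than the paper's configuration.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("fire_severity_samples.csv")   # hypothetical per-pixel training samples
predictors = ["live_fuel_index", "slope", "aspect", "annual_precip", "erc_percentile", "vpd"]
X_train, X_test, y_train, y_test = train_test_split(
    df[predictors], df["high_severity"], test_size=0.3, random_state=42)

model = GradientBoostingClassifier(n_estimators=500, learning_rate=0.01, max_depth=3)
model.fit(X_train, y_train)

# Relative influence of each predictor, analogous to the paper's breakdown
for name, importance in zip(predictors, model.feature_importances_):
    print(f"{name:>18s}: {importance:.3f}")
print("holdout accuracy:", model.score(X_test, y_test))
```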

  17. Approximation of rejective sampling inclusion probabilities and application to high order correlations

    Boistard, H.; Lopuhää, H.P.; Ruiz-Gazen, A.

    2012-01-01

    This paper is devoted to rejective sampling. We provide an expansion of joint inclusion probabilities of any order in terms of the inclusion probabilities of order one, extending previous results by Hájek (1964) and Hájek (1981) and making the remainder term more precise. Following Hájek (1981), the

  18. Measurements of atomic transition probabilities in highly ionized atoms by fast ion beams

    Martinson, I.; Curtis, L.J.; Lindgaerd, A.

    1977-01-01

    A summary is given of the beam-foil method by which level lifetimes and transition probabilities can be determined in atoms and ions. Results are presented for systems of particular interest for fusion research, such as the Li, Be, Na, Mg, Cu and Zn isoelectronic sequences. The available experimental material is compared to theoretical transition probabilities. (author)

  19. Maladaptively high and low openness: the case for experiential permeability.

    Piedmont, Ralph L; Sherman, Martin F; Sherman, Nancy C

    2012-12-01

    The domain of Openness within the Five-Factor Model (FFM) has received inconsistent support as a source for maladaptive personality functioning, at least when the latter is confined to the disorders of personality included within the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR; APA, ). However, an advantage of the FFM relative to the DSM-IV-TR is that the former was developed to provide a reasonably comprehensive description of general personality structure. Rather than suggest that the FFM is inadequate because the DSM-IV-TR lacks much representation of Openness, it might be just as reasonable to suggest that the DSM-IV-TR is inadequate because it lacks an adequate representation of maladaptive variants of both high and low Openness. This article discusses the development and validation of a measure of these maladaptive variants, the Experiential Permeability Inventory. © 2012 The Authors. Journal of Personality © 2012, Wiley Periodicals, Inc.

  20. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P^(d,N)_{m,n} that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. Also, it has been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, where we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not
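
    A brute-force Monte Carlo sketch of the Cox neighbourhood probabilities discussed above for the random point problem in the unit hypercube; the parameters are illustrative, and the known 2D mutual-nearest-neighbour value (roughly 0.62) is used only as a sanity check.

```python
import numpy as np

def cox_probability(d=2, N=200, m=1, n=1, trials=2000, rng=np.random.default_rng(1)):
    hits = 0
    for _ in range(trials):
        pts = rng.random((N, d))                      # uniform points in the unit hypercube
        dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        np.fill_diagonal(dist, np.inf)
        order = np.argsort(dist, axis=1)              # order[i, k] = index of the (k+1)-th NN of i
        i = 0                                         # test an arbitrary point
        j = order[i, n - 1]                           # its n-th nearest neighbour
        hits += (order[j, m - 1] == i)                # is i the m-th NN of j?
    return hits / trials

print(cox_probability(d=2, m=1, n=1))   # roughly 0.62 in 2D for the mutual nearest-neighbour case
```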

  1. Skin damage probabilities using fixation materials in high-energy photon beams

    Carl, J.; Vestergaard, A.

    2000-01-01

Patient fixation devices, such as thermoplastic masks, carbon-fibre support plates and polystyrene bead vacuum cradles, are used to reproduce patient positioning in radiotherapy. Consequently, low-density materials may be introduced into high-energy photon beams. The aim of this study was to measure the increase in skin dose when low-density materials are present and to calculate the radiobiological consequences in terms of the probabilities of early and late skin damage. An experimental thin-windowed plane-parallel ion chamber was used. Skin doses were measured using various overlying low-density fixation materials. A fixed geometry of a 10 x 10 cm field, an SSD of 100 cm and photon energies of 4, 6 and 10 MV on Varian Clinac 2100C accelerators was used for all measurements. The radiobiological consequences of introducing these materials into high-energy photon beams were evaluated in terms of early and late skin damage, based on the measured surface doses and the LQ model. The experimental ion chamber gave results consistent with other studies. A relationship between skin dose and material thickness in mg/cm² was established and used to calculate skin doses in scenarios assuming radiotherapy treatment with opposed fields. Conventional radiotherapy may apply mid-point doses of up to 60-66 Gy in daily 2-Gy fractions with opposed fields. Using thermoplastic fixation and photon energies as low as 4 MV does increase the dose to the skin considerably. However, with thermoplastic materials of thickness less than 100 mg/cm², skin doses are comparable with those produced by variations in source-to-skin distance, field size or blocking trays within clinical treatment set-ups. The use of polystyrene cradles and carbon-fibre materials with thickness less than 100 mg/cm² should be avoided at 4 MV at doses above 54-60 Gy. (author)
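
    For reference, the radiobiological evaluation mentioned above relies on the standard linear-quadratic (LQ) model; its generic expressions are given below, with the understanding that the paper's specific α/β choices for early and late skin reactions are not reproduced here. For n fractions of dose d per fraction, the surviving fraction and the biologically effective dose are

    \[
    S = \exp\!\big[-n\,(\alpha d + \beta d^{2})\big], \qquad
    \mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right).
    \]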

  2. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-01

As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'.
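
    For reference, the generic textbook form of the Gorini-Kossakowski-Sudarshan-Lindblad equation invoked above is

    \[
    \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
    + \sum_k \gamma_k\Big(L_k\,\rho\,L_k^{\dagger} - \tfrac{1}{2}\big\{L_k^{\dagger}L_k,\,\rho\big\}\Big),
    \]

    where ρ is the belief state, H generates its internal dynamics, and the operators L_k encode the coupling to the information environment E; the specific identification of these operators with geological measurements and model selections is the authors' and is not spelled out here.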

  3. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems.

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-28

As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).

  4. Jihadist Foreign Fighter Phenomenon in Western Europe: A Low-Probability, High-Impact Threat

    Edwin Bakker

    2015-11-01

Full Text Available The phenomenon of foreign fighters in Syria and Iraq is making headlines. Their involvement in the atrocities committed by terrorist groups such as the so-called “Islamic State” and Jabhat al-Nusra has caused grave concern and public outcry in the foreign fighters' European countries of origin. While much has been written about these foreign fighters and the possible threat they pose, the impact of this phenomenon on Western European societies has yet to be documented. This Research Paper explores four particular areas where this impact is most visible: (a) violent incidents associated with (returned) foreign fighters, (b) official and political responses linked to these incidents, (c) public opinion, and (d) anti-Islam reactions linked to these incidents. The authors conclude that the phenomenon of jihadist foreign fighters in European societies should be primarily regarded as a social and political threat, not a physical one. They consider the phenomenon of European jihadist foreign fighters a “low-probability, high-impact” threat.

  5. Probable high prevalence of limb-girdle muscular dystrophy type 2D in Taiwan.

    Liang, Wen-Chen; Chou, Po-Ching; Hung, Chia-Cheng; Su, Yi-Ning; Kan, Tsu-Min; Chen, Wan-Zi; Hayashi, Yukiko K; Nishino, Ichizo; Jong, Yuh-Jyh

    2016-03-15

Limb-girdle muscular dystrophy type 2D (LGMD2D), an autosomal-recessive inherited LGMD, is caused by mutations in SGCA. SGCA encodes alpha-sarcoglycan (SG), which forms a heterotetramer with other SGs in the sarcolemma and comprises part of the dystrophin-glycoprotein complex. The frequency of LGMD2D varies among different ethnic backgrounds, and so far only a few patients have been reported in Asia. We identified five patients with a novel homozygous mutation of c.101G>T (p.Arg34Leu) in SGCA from a large aboriginal family ethnically consisting of two tribes in Taiwan. Patient 3 is the maternal uncle of patients 1 and 2. All their parents, heterozygous for c.101G>T, denied consanguineous marriages although they were from the same tribe. The heterozygous parents of patients 4 and 5 were from two different tribes, originally residing in different geographic regions in Taiwan. Haplotype analysis showed that all five patients shared the same mutation-associated haplotype, indicating a probable founder effect and consanguinity. The results suggest that the carrier rate of c.101G>T in SGCA may be high in Taiwan, especially in the aboriginal population regardless of tribe. It is important to investigate the prevalence of LGMD2D in Taiwan for early diagnosis and treatment. Copyright © 2016. Published by Elsevier B.V.

  6. Conditional probability of intense rainfall producing high ground concentrations from radioactive plumes

    Wayland, J.R.

    1977-03-01

    The overlap of the expanding plume of radioactive material from a hypothetical nuclear accident with rainstorms over dense population areas is considered. The conditional probability of the occurrence of hot spots from intense cellular rainfall is presented

  7. A high open-circuit voltage gallium nitride betavoltaic microbattery

    Cheng, Zaijun; Chen, Xuyuan; San, Haisheng; Feng, Zhihong; Liu, Bo

    2012-01-01

A high open-circuit voltage betavoltaic microbattery based on a gallium nitride (GaN) p–i–n homojunction is demonstrated. As the beta-absorbing layer, a low electron concentration in the n-type GaN layer is achieved by Fe compensation doping. Under irradiation by a planar solid ⁶³Ni source with an activity of 0.5 mCi, the open-circuit voltage of the fabricated microbattery with a 2 × 2 mm² area reaches 1.64 V, the highest value reported for betavoltaic batteries with a ⁶³Ni source; the short-circuit current was measured as 568 pA and a conversion efficiency of 0.98% was obtained. The experimental results suggest that GaN is a highly promising candidate for developing betavoltaic microbatteries. (paper)

  8. Learning difficulties of senior high school students based on probability understanding levels

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

Identifying students' difficulties in learning the concept of probability is important for teachers to prepare appropriate learning processes and to overcome obstacles that may arise in subsequent learning. This study revealed students' levels of understanding of the concept of probability and identified their difficulties as part of identifying the epistemological obstacles associated with the concept of probability. The study employed a qualitative, descriptive approach involving 55 students of class XII. Diagnostic tests of probability-concept learning difficulties, observation, and interviews were used to collect the data needed. The data were used to determine the levels of understanding and the learning difficulties experienced by the students. From the students' test results and learning observations, it was found that the mean cognitive level was at level 2. The findings indicated that students had appropriate quantitative information about the probability concept but that it might be incomplete or incorrectly used. The difficulties found concern constructing sample spaces, events, and mathematical models related to probability problems. In addition, students had difficulties in understanding the principles of events and the prerequisite concepts.

  9. Open tube guideway for high speed air cushioned vehicles

    Goering, R. S. (Inventor)

    1974-01-01

This invention is a tubular-shaped guideway for high-speed air-cushion-supported vehicles. The tubular guideway is split and separated such that the sides of the guideway are open. The upper portion of the tubular guideway is supported above the lower portion by truss-like structural members. The lower portion of the tubular guideway may be supported by the terrain over which the vehicle travels, on pedestals, or on some similar structure.

  10. The opening of a high care hostel for problem drinkers.

    Bretherton, H

    1992-12-01

    This paper gives a personal and practice based account by one of the Team Leaders of the opening of a high-care hostel for problem drinkers in North London. The hostel, Rugby House, was set up to provide detoxification and assessment facilities for thirteen residents. It was part of the Rugby House Project, an alcohol agency in the voluntary sector. The paper explores the processes involved in setting up a new project; how the new paid employees turn a committee's vision into practice; how a group of individuals become a team; the importance of clarity about boundaries and underlying values and assumptions; the need for openness about negative as well as positive feelings; and the recognition that some of the experiences of staff will resonate with those of the residents for whom giving up drinking is a major life change.

  11. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  12. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
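
    A sketch of the two-step construction described above (per-dataset Gaussian kernel density maps followed by a weighted linear combination), using scipy's gaussian_kde; the vent coordinates, grid and weights below are invented and do not reproduce the Somma-Vesuvio data sets or the elicited weights.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical vent coordinates (km) for two data sets, e.g. older vs. younger vents
vents_a = rng.normal(loc=[0.0, 0.0], scale=1.5, size=(40, 2))
vents_b = rng.normal(loc=[1.0, -0.5], scale=0.8, size=(25, 2))

xx, yy = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))
grid = np.vstack([xx.ravel(), yy.ravel()])

dens_a = gaussian_kde(vents_a.T)(grid)          # spatial density map for data set A
dens_b = gaussian_kde(vents_b.T)(grid)          # spatial density map for data set B

weights = [0.6, 0.4]                            # illustrative (in the study: elicited) weights
combined = weights[0] * dens_a + weights[1] * dens_b
prob_map = (combined / combined.sum()).reshape(xx.shape)   # normalise to a per-cell probability map
print("peak cell probability:", prob_map.max())
```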

The Post-Embargo Open Access Citation Advantage: It Exists (Probably), It's Modest (Usually), and the Rich Get Richer (of Course).

    Ottaviani, Jim

    2016-01-01

Many studies show that open access (OA) articles, i.e. articles from scholarly journals made freely available to readers without requiring subscription fees, are downloaded, and presumably read, more often than closed access/subscription-only articles. Assertions that OA articles are also cited more often generate more controversy. Confounding factors (authors may self-select only the best articles to make OA; the absence of an appropriate control group of non-OA articles with which to compare citation figures; conflation of pre-publication vs. published/publisher versions of articles, etc.) make demonstrating a real citation difference difficult. This study addresses those factors and shows that an open access citation advantage as high as 19% exists, even when articles are embargoed during some or all of their prime citation years. Not surprisingly, better (defined as above-median) articles gain more when made OA.

  14. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    Genton, Marc G.

    2017-09-07

We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low-rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well approximated by low-rank factorizations and are treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.
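
    For context, here is the plain Monte Carlo estimator whose O(n²) per-sample cost the hierarchical decomposition reduces; this is a baseline sketch on a small synthetic covariance, not the authors' hierarchical algorithm.

```python
import numpy as np

def mvn_probability(cov, lower, upper, samples=100_000, rng=np.random.default_rng(0)):
    L = np.linalg.cholesky(cov)                 # dense factorization, done once
    z = rng.standard_normal((samples, cov.shape[0]))
    x = z @ L.T                                 # O(n^2) work per sample with a dense factor
    inside = np.all((x >= lower) & (x <= upper), axis=1)
    return inside.mean()

n = 5
A = np.random.default_rng(1).standard_normal((n, n))
cov = A @ A.T + n * np.eye(n)                   # synthetic positive-definite covariance
print(mvn_probability(cov, lower=-np.ones(n), upper=np.ones(n)))
```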

  15. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    Genton, Marc G.; Keyes, David E.; Turkiyyah, George

    2017-01-01

We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low-rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well approximated by low-rank factorizations and are treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.

  16. Highly enhanced avalanche probability using sinusoidally-gated silicon avalanche photodiode

    Suzuki, Shingo; Namekata, Naoto, E-mail: nnao@phys.cst.nihon-u.ac.jp; Inoue, Shuichiro [Institute of Quantum Science, Nihon University, 1-8-14 Kanda-Surugadai, Chiyoda-ku, Tokyo 101-8308 (Japan); Tsujino, Kenji [Tokyo Women' s Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666 (Japan)

    2014-01-27

    We report on visible light single photon detection using a sinusoidally-gated silicon avalanche photodiode. Detection efficiency of 70.6% was achieved at a wavelength of 520 nm when an electrically cooled silicon avalanche photodiode with a quantum efficiency of 72.4% was used, which implies that a photo-excited single charge carrier in a silicon avalanche photodiode can trigger a detectable avalanche (charge) signal with a probability of 97.6%.
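
    The quoted avalanche probability follows directly if the detection efficiency is assumed to factorize into the quantum efficiency and the avalanche probability:

    \[
    P_{\mathrm{avalanche}} \approx \frac{\eta_{\mathrm{det}}}{\eta_{\mathrm{QE}}} = \frac{0.706}{0.724} \approx 0.975,
    \]

    consistent, up to rounding, with the reported 97.6%.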

  17. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

This is developmental research on probabilistic thinking-oriented learning tools for probability material for ninth-grade students. The study aims to produce good probabilistic thinking-oriented learning tools. The subjects were IX-A students of MTs Model Bangkalan. The stages of this development research followed the 4-D development model, modified into define, design and develop. The teaching and learning tools consist of a lesson plan, students' worksheets, learning media and a students' achievement test. The research instruments used were a learning-tools validation sheet, a teachers' activity sheet, a students' activity sheet, a students' response questionnaire and a students' achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of teaching and learning tools oriented to probabilistic thinking about probability for ninth-grade students that has been validated. After revision based on the validation and a classroom experiment, the teachers' ability to manage the class was effective, students' activities were good, students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.

  18. Highly reversible open framework nanoscale electrodes for divalent ion batteries.

    Wang, Richard Y; Wessells, Colin D; Huggins, Robert A; Cui, Yi

    2013-01-01

    The reversible insertion of monovalent ions such as lithium into electrode materials has enabled the development of rechargeable batteries with high energy density. Reversible insertion of divalent ions such as magnesium would allow the creation of new battery chemistries that are potentially safer and cheaper than lithium-based batteries. Here we report that nanomaterials in the Prussian Blue family of open framework materials, such as nickel hexacyanoferrate, allow for the reversible insertion of aqueous alkaline earth divalent ions, including Mg(2+), Ca(2+), Sr(2+), and Ba(2+). We show unprecedented long cycle life and high rate performance for divalent ion insertion. Our results represent a step forward and pave the way for future development in divalent batteries.

  19. Ethoscopes: An open platform for high-throughput ethomics.

    Quentin Geissmann

    2017-10-01

    Full Text Available Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  20. Development of risk assessment simulation tool for optimal control of a low probability-high consequence disaster

    Yotsumoto, Hiroki; Yoshida, Kikuo; Genchi, Hiroshi

    2011-01-01

In order to control low probability-high consequence disasters, which cause huge social and economic damage, it is necessary to develop a simultaneous risk assessment simulation tool based on a scheme of disaster risk that includes the diverse effects of the primary disaster and secondary damages. We propose the scheme of this risk simulation tool. (author)

  1. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Katherine E Baird

    2016-09-01

Full Text Available Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals' health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and, alternatively, 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of the out-of-pocket financial burden and the most recent upward trends in it underscore the need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act's success in reducing Americans' exposure to large medical bills.
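
    A minimal sketch of the kind of logistic regression described here, using the statsmodels formula API; the survey file, variable names and grouping are placeholders rather than the study's actual data or specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("household_survey.csv")          # hypothetical survey extract
df["oop_gt_5pct"] = (df["oop_spending"] > 0.05 * df["income"]).astype(int)

# Probability that out-of-pocket spending exceeds 5% of income, by group and year
model = smf.logit("oop_gt_5pct ~ C(health_status) + C(income_group) + elderly + C(year)",
                  data=df).fit()
print(model.summary())

# Predicted probability for a specific profile, e.g. non-elderly, low income, poor health, 2013
profile = pd.DataFrame({"health_status": ["poor"], "income_group": ["low"],
                        "elderly": [0], "year": [2013]})
print(model.predict(profile))
```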

  2. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.
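
    For readers unfamiliar with cumulative prospect theory, its standard value and probability-weighting functions (Tversky and Kahneman's generic forms, not necessarily the exact specification adopted in this paper) are

    \[
    v(x) =
    \begin{cases}
    x^{\alpha}, & x \ge 0,\\[2pt]
    -\lambda\,(-x)^{\beta}, & x < 0,
    \end{cases}
    \qquad
    w(p) = \frac{p^{\gamma}}{\big(p^{\gamma} + (1-p)^{\gamma}\big)^{1/\gamma}},
    \]

    with λ > 1 capturing loss aversion; attitudes toward low-probability, high-consequence losses enter through both the curvature of v and the overweighting of small probabilities by w.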

  3. OPEN AIR DEMOLITION OF FACILITIES HIGHLY CONTAMINATED WITH PLUTONIUM

    LLOYD, E.R.

    2007-01-01

The demolition of highly contaminated plutonium buildings usually is a long and expensive process that involves decontaminating the building to near free-release standards and then using conventional methods to remove the structure. It doesn't, however, have to be that way. Fluor has torn down buildings highly contaminated with plutonium without excessive decontamination. By removing the select source term and fixing the remaining contamination on the walls, ceilings, floors, and equipment surfaces, open-air demolition is not only feasible, but it can be done cheaper, better (safer), and faster. Open-air demolition techniques were used to demolish two highly contaminated buildings to slab-on-grade. These facilities on the Department of Energy's Hanford Site were located in, or very near, compounds of operating nuclear facilities that housed hundreds of people working on a daily basis. To keep the facilities operating and the personnel safe, the projects had to be creative in demolishing the structures. Several key techniques were used to control contamination and keep it within the confines of the demolition area: spraying fixatives before demolition; applying fixative and misting with a fine spray of water as the buildings were being taken down; and demolishing the buildings in a controlled and methodical manner. In addition, detailed air-dispersion modeling was done to establish necessary building and meteorological conditions and to confirm the adequacy of the proposed methods. Both demolition projects were accomplished without any spread of contamination outside the modest buffer areas established for contamination control. Furthermore, personnel exposure to radiological and physical hazards was significantly reduced by using heavy equipment rather than "hands on" techniques.

  4. Open Access Publishing in High-Energy Physics

    Mele, S

    2007-01-01

    The goal of Open Access (OA) is to grant anyone, anywhere and anytime free access to the results of scientific research. The High- Energy Physics (HEP) community has pioneered OA with its "pre-print culture": the mass mailing, first, and the online posting, later, of preliminary versions of its articles. After almost half a century of widespread dissemination of pre-prints, the time is ripe for the HEP community to explore OA publishing. Among other possible models, a sponsoring consortium appears as the most viable option for a transition of HEP peer-reviewed literature to OA. A Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is proposed as a central body which would remunerate publishers for the peer-review service, effectively replacing the "reader-pays" model of traditional subscriptions with an "author-side" funding. Funding to SCOAP3 would come from HEP funding agencies and library consortia through a re-direction of subscriptions. This model is discussed in details togethe...

  5. Comparison of the diagnostic ability of Moorfield’s regression analysis and glaucoma probability score using Heidelberg retinal tomograph III in eyes with primary open angle glaucoma

    Jindal, Shveta; Dada, Tanuj; Sreenivas, V; Gupta, Viney; Sihota, Ramanjit; Panda, Anita

    2010-01-01

    Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfield’s regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained by using HRT version 3.0. Results: The agreement coefficient (weighted k) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119 – 0.315). The sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific criteria (borderline results included as test positives). The MRA sensitivity and specificity were 30.61 and 98% (most specific) and 57.14 and 98% (least specific). The GPS sensitivity and specificity were 81.63 and 73.47% (most specific) and 95.92 and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a higher negative likelihood ratio (0.25 vs. 0.44).The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was a poor agreement between the overall MRA and GPS classifications. GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. The disc size should be taken into consideration when interpreting the results of HRT, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs. PMID:20952832
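
    The reported likelihood ratios follow from the usual definitions; for example, with the least-specific MRA figures and the most-specific GPS figures quoted above,

    \[
    LR^{+} = \frac{\text{sensitivity}}{1-\text{specificity}} = \frac{0.5714}{1-0.98} \approx 28.57,
    \qquad
    LR^{-} = \frac{1-\text{sensitivity}}{\text{specificity}} = \frac{1-0.8163}{0.7347} \approx 0.25 .
    \]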

  6. Comparison of the diagnostic ability of Moorfield′s regression analysis and glaucoma probability score using Heidelberg retinal tomograph III in eyes with primary open angle glaucoma

    Jindal Shveta

    2010-01-01

Full Text Available Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfield's regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained by using HRT version 3.0. Results: The agreement coefficient (weighted k) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119 - 0.315). The sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific criteria (borderline results included as test positives). The MRA sensitivity and specificity were 30.61 and 98% (most specific) and 57.14 and 98% (least specific). The GPS sensitivity and specificity were 81.63 and 73.47% (most specific) and 95.92 and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a higher negative likelihood ratio (0.25 vs. 0.44). The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was a poor agreement between the overall MRA and GPS classifications. GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. The disc size should be taken into consideration when interpreting the results of HRT, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs.

  7. Probability of detection - Comparative study of computed and film radiography for high-energy applications

    Venkatachalam, R.; Venugopal, M.; Prasad, T.

    2007-01-01

Full text of publication follows: The suitability of computed radiography (CR) with Ir-192, Co-60 and up to 9 MeV X-rays for weld inspections is of importance to many heavy engineering and aerospace industries. CR is preferred because of shorter exposure and processing times compared with film-based radiography, and digital images offer other advantages such as image enhancement, quantitative measurements and easier archival. This paper describes systematic experimental approaches and image quality metrics used to compare the imaging performance of CR with film-based radiography. Experiments were designed using six-sigma methodology to validate the performance of CR for steel thicknesses up to 160 mm with Ir-192, Co-60 and X-ray energies varying from 100 kV up to 9 MeV. Weld specimens with defects such as lack of fusion, penetration, cracks, concavity, and porosities were studied to evaluate radiographic sensitivity and the imaging performance of the system. Attempts were also made to quantify the probability of detection using specimens with artificial and natural defects under various experimental conditions, and the results were compared with film-based systems. (authors)

  8. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  9. Experimental studies on a new highly porous hydroxyapatite matrix for obliterating open mastoid cavities.

    Punke, Christoph; Zehlicke, Thorsten; Boltze, Carsten; Pau, Hans Wilhelm

    2008-09-01

    In an initial preliminary study, the applicability of a new high-porosity hydroxyapatite (HA) ceramic for obliterating large open mastoid cavities was tested and proven in an animal model (bulla of the guinea pig). Experimental study. NanoBone, a highly porous matrix consisting of 76% hydroxyapatite and 24% silicon dioxide fabricated by a sol-gel technique, was administered unilaterally into the opened bullae of 30 guinea pigs. In each animal, the opposite bulla was filled with Bio-Oss, a bone substitute consisting of a portion of mineral bovine bone. Histologic evaluations were performed 1, 2, 3, 4, 5, and 12 weeks after implantation. After an initial phase in which the ceramic granules were surrounded by inflammatory cells (1-2 wk), there were increasing signs of vascularization. Osteoneogenesis and, at the same time, resorption of the HA ceramic were observed after the third week. No major difference from the bovine bone material could be found. Our results confirm the favorable qualities of the new ceramic reported in the current maxillofacial literature. Conventional HA granules used for mastoid obliteration to date have often shown problems with prolonged inflammatory reactions and, finally, extrusions. In contrast to those ceramics, the new material seems to induce more osteoneogenesis and undergoes early resorption, probably owing to its high porosity. Overall, it is similar to the bovine bone substance tested in the opposite ear of each animal. Further clinical studies may reveal whether NanoBone is an adequate material for obliterating open mastoid cavities in patients.

  10. Simple heuristic derivation of some charge-transfer probabilities at asymptotically high incident velocities

    Spruch, L.; Shakeshaft, R.

    1984-01-01

    For asymptotically high incident velocities we provide simple, heuristic, almost classical derivations of the cross section for forward charge transfer, and of the ratio of the cross section for capture to the elastic-scattering cross section for the projectile scattered through an angle close to π/3.

  11. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

    We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10 nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences in specific pattern characteristics. We expect that the analysis methodology can be applied for defect probability modeling as well as general process qualification in the future.
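    As a rough illustration of how a defect probability can be read off the tail of a CD distribution (this is not the authors' analysis; the simulated data, the normal fit and the 22 nm failure threshold below are invented for the example):

```python
# Illustrative sketch only: fit a normal distribution to simulated local CD
# measurements and take the tail mass below a hypothetical failure threshold
# as a per-feature defect probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
cd_nm = rng.normal(loc=30.0, scale=1.5, size=100_000)  # stand-in for measured local CDs (nm)

mu, sigma = stats.norm.fit(cd_nm)
threshold_nm = 22.0                                    # hypothetical "failing via" threshold
p_defect = stats.norm.cdf(threshold_nm, mu, sigma)
print(f"Estimated defect probability per via: {p_defect:.2e}")
```

    A heavier-tailed model could equally be fitted to the excess; the point is only that the defect probability is the tail integral beyond a common threshold.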

  12. Ruin probabilities

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  13. Generalized Probability-Probability Plots

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.

  14. Probability-1

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  15. Ignition Probability

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  16. Conclusion: probable and possible futures. MRI with ultra high magnetic field

    Le Bihan, D.

    2009-01-01

    MR neuroimaging does not interfere with brain function. Because it is safe, it can be used to study the brains of both patients and healthy volunteers. The tasks performed by neurons depend largely on their precise location, and high-field magnets have the potential to provide a 5- to 10-fold increase in spatio-temporal resolution. This should allow brain function to be studied on a scale of only a few thousand neurons, possibly at the intermediate scale of the 'neural code'. NeuroSpin, a new CEA research center, is dedicated to neuro-MRI at high magnetic field strengths. As a forum for dialogue between those developing and those using these instruments, it brings together researchers and engineers, technicians and medical doctors. NeuroSpin is one of the few institutions in Europe, if not the world, where these experts can come together in one place to design, construct and use machines equipped with ultra-strong magnets. The strongest 'routine' MR device currently operates at 3 Tesla (60 000 times the earth's magnetic field), whereas a first French system operating at 7 Tesla (140 000 times the earth's field) is now available for human studies, and another system operating at 11.7 Tesla (world record) should be delivered in 2011. Preclinical studies are also being conducted with magnets operating at 7 Tesla and, soon, 17.6 Tesla. (author)

  17. Open ISEmeter: An open hardware high-impedance interface for potentiometric detection

    Salvador, C.; Carbajo, J.; Mozo, J. D., E-mail: jdaniel.mozo@diq.uhu.es [Applied Electrochemistry Laboratory, Faculty of Experimental Sciences, University of Huelva, Av. 3 de Marzo s/n., 21007 Huelva (Spain); Mesa, M. S.; Durán, E. [Department of Electronics Engineering, Computers and Automatic, ETSI, University of Huelva, Campus de La Rabida, 21810 Huelva (Spain); Alvarez, J. L. [Department of Information Technologies, ETSI, University of Huelva, Campus de La Rabida, 21810 Huelva (Spain)

    2016-05-15

    In this work, a new open hardware interface based on Arduino to read electromotive force (emf) from potentiometric detectors is presented. The interface has been fully designed with the open code philosophy and all documentation will be accessible on the web. The paper describes a comprehensive project including the electronic design, the firmware loaded on the Arduino, and the Java-coded graphical user interface used to load data into a computer (PC or Mac) for processing. The prototype was tested by measuring the calibration curve of a detector. As the detection element, an active poly(vinyl chloride)-based membrane was used, doped with cetyltrimethylammonium dodecylsulphate (CTA⁺-DS⁻). The experimental measures of emf indicate Nernstian behaviour with the CTA⁺ content of the test solutions, as described in the literature, proving the validity of the developed prototype. A comparative analysis of performance was made by using the same chemical detector but changing the measurement instrumentation.
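    For orientation, Nernstian behaviour of such a detector is usually checked by fitting the measured emf against the logarithm of the ion concentration; the sketch below uses made-up numbers, not the paper's calibration data:

```python
# Minimal sketch with invented data: linear fit of emf vs. log10(concentration);
# a slope near 59 mV/decade is the textbook Nernstian expectation for a
# monovalent ion at 25 degrees C.
import numpy as np

conc = np.array([1e-5, 1e-4, 1e-3, 1e-2])        # test solution concentrations (mol/L)
emf_mv = np.array([112.0, 171.0, 229.0, 288.0])  # hypothetical measured emf (mV)

slope, intercept = np.polyfit(np.log10(conc), emf_mv, 1)
print(f"Calibration slope: {slope:.1f} mV/decade (Nernstian ~59 mV/decade)")
```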

  18. Open ISEmeter: An open hardware high-impedance interface for potentiometric detection

    Salvador, C.; Carbajo, J.; Mozo, J. D.; Mesa, M. S.; Durán, E.; Alvarez, J. L.

    2016-01-01

    In this work, a new open hardware interface based on Arduino to read electromotive force (emf) from potentiometric detectors is presented. The interface has been fully designed with the open code philosophy and all documentation will be accessible on the web. The paper describes a comprehensive project including the electronic design, the firmware loaded on the Arduino, and the Java-coded graphical user interface used to load data into a computer (PC or Mac) for processing. The prototype was tested by measuring the calibration curve of a detector. As the detection element, an active poly(vinyl chloride)-based membrane was used, doped with cetyltrimethylammonium dodecylsulphate (CTA⁺-DS⁻). The experimental measures of emf indicate Nernstian behaviour with the CTA⁺ content of the test solutions, as described in the literature, proving the validity of the developed prototype. A comparative analysis of performance was made by using the same chemical detector but changing the measurement instrumentation.

  19. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems, which would otherwise be impractical because of the potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors’ knowledge). The estimation accuracy and precision of RESTART depend strongly on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are originally proposed here to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
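    The splitting idea behind RESTART can be illustrated on a toy problem (this is a generic two-level splitting sketch, not the paper's algorithm or its Importance Functions): trajectories that reach an intermediate threshold are saved and restarted, so the small probability is estimated as a product of two much larger conditional probabilities.

```python
# Toy two-level splitting estimate of P(random walk exceeds a high level within N steps).
import numpy as np

rng = np.random.default_rng(5)
N, L_MID, L_HIGH, SIGMA = 200, 3.0, 6.0, 0.3

def walk_until(start, level, max_steps):
    """Return (state, step) at the first crossing of `level`, or (None, None)."""
    x = start
    if x >= level:
        return x, 0
    for k in range(1, max_steps + 1):
        x += rng.normal(0.0, SIGMA)
        if x >= level:
            return x, k
    return None, None

# Stage 1: fraction of walks reaching the intermediate level; save crossing states/times.
stage1 = [walk_until(0.0, L_MID, N) for _ in range(5000)]
hits = [(x, k) for x, k in stage1 if x is not None]
p1 = len(hits) / len(stage1)

# Stage 2: restart from saved crossing points and try to reach the high level
# within the remaining step budget.
trials, successes = 5000, 0
for i in rng.integers(len(hits), size=trials):
    x0, k0 = hits[i]
    if walk_until(x0, L_HIGH, N - k0)[0] is not None:
        successes += 1
p2 = successes / trials

print(f"Splitting estimate of the rare-event probability: {p1 * p2:.2e}")
```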

  20. ATLAS OpenData and OpenKey: using low tech computational tools for students training in High Energy Physics

    Sanchez Pineda, Arturos; The ATLAS collaboration

    2018-01-01

    One of the big challenges in High Energy Physics development is the fact that many potential, and very valuable, students and young researchers live in countries where internet access and computational infrastructure are poor compared to those of institutions already participating. In order to accelerate the process, the ATLAS Open Data project releases useful and meaningful data and tools using standard and easy-to-deploy computational means, such as custom and light Linux Virtual Machines, open source technologies, web and desktop applications. The ATLAS Open Key, a simple USB pen, allows all those resources to be transported around the globe. As simple as it sounds, this approach is helping to train students who are now PhD candidates and to integrate HEP educational programs at Master's level in universities where they did not exist before. The software tools and resources used will be presented, as well as results and stories, ideas and next steps of the ATLAS Open Data project.

  1. Serial follow up V/P scanning in assessment of treatment response in high probability scans for pulmonary embolism

    Moustafa, H; Elhaddad, SH; Wagih, SH; Ziada, G; Samy, A; Saber, R [Department of nuclear medicine and radiology, faculty of medicine, Cairo university, Cairo, (Egypt)

    1995-10-01

    138 patients were proved by V/P scanning to have different probabilities of a pulmonary embolic event. Serial follow-up scanning was performed after 3 days, 2 weeks, 1 month and 3 months, with anticoagulant therapy. Of the remaining 10 patients, 6 died with pulmonary embolism documented by post-mortem study and loss to follow-up was recorded in 4 patients. Complete response, with disappearance of all perfusion defects after 2 weeks, was detected in 37 patients (49.3%); partial improvement of lesions after 3 months was elicited in 32%. The overall incidence of response was 81.3%; the response was complete in the low probability group (100%), in 84.2% of the intermediate group and in 79.3% of the high probability group, with partial response in 45.3%. New lesions were evident in 18.7% of this series. We conclude that serial follow-up V/P scanning is mandatory for evaluating the response to anticoagulant therapy, especially in the first 3 months. 2 figs., 3 tabs.

  2. Balancing forest-regeneration probabilities and maintenance costs in dry grasslands of high conservation priority

    Bolliger, Janine; Edwards, Thomas C.; Eggenberg, Stefan; Ismail, Sascha; Seidl, Irmi; Kienast, Felix

    2011-01-01

    Abandonment of agricultural land has resulted in forest regeneration in species-rich dry grasslands across European mountain regions and threatens conservation efforts in this vegetation type. To support national conservation strategies, we used a site-selection algorithm (MARXAN) to find optimum sets of floristic regions (reporting units) that contain grasslands of high conservation priority. We sought optimum sets that would accommodate 136 important dry-grassland species and that would minimize forest regeneration and costs of management needed to forestall predicted forest regeneration. We did not consider other conservation elements of dry grasslands, such as animal species richness, cultural heritage, and changes due to climate change. Optimal sets that included 95–100% of the dry grassland species encompassed an average of 56–59 floristic regions (standard deviation, SD 5). This is about 15% of approximately 400 floristic regions that contain dry-grassland sites and translates to 4800–5300 ha of dry grassland out of a total of approximately 23,000 ha for the entire study area. Projected costs to manage the grasslands in these optimum sets ranged from CHF (Swiss francs) 5.2 to 6.0 million/year. This is only 15–20% of the current total estimated cost of approximately CHF30–45 million/year required if all dry grasslands were to be protected. The grasslands of the optimal sets may be viewed as core sites in a national conservation strategy.

  3. Fragility estimation for seismically isolated nuclear structures by high confidence low probability of failure values and bi-linear regression

    Carausu, A.

    1996-01-01

    A method for the fragility estimation of seismically isolated nuclear power plant structures is proposed. The relationship between the ground motion intensity parameter (e.g. peak ground velocity or peak ground acceleration) and the response of isolated structures is expressed in terms of a bi-linear regression line, whose coefficients are estimated by the least-squares method from available data on seismic input and structural response. The notion of the high confidence low probability of failure (HCLPF) value is also used for deriving compound fragility curves for coupled subsystems. (orig.)

  4. An Upper Bound on High Speed Satellite Collision Probability When Only One Object has Position Uncertainty Information

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on high speed satellite collision probability, Pc, have been investigated. Previous methods assume that an individual position error covariance matrix is available for each object, the two matrices being combined into a single relative position error covariance matrix. Components of the combined error covariance are then varied to obtain a maximum Pc. If error covariance information is available for only one of the two objects, either some default shape has to be used or nothing can be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but potentially useful Pc upper bound.
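    To make the setting concrete, the sketch below (my own illustration with invented numbers, not Frisbee's method) computes an encounter-plane collision probability by Monte Carlo for a given combined covariance, and then sweeps an assumed isotropic covariance for the object whose uncertainty is unknown to search for the largest resulting Pc:

```python
# Encounter-plane Pc by Monte Carlo, swept over an assumed covariance for the
# object with no uncertainty data. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
R = 20.0                                    # combined hard-body radius (m)
miss = np.array([150.0, 80.0])              # nominal miss vector in the encounter plane (m)
cov_known = np.diag([200.0**2, 100.0**2])   # known object's position covariance (m^2)

def pc(cov_combined, n=200_000):
    pts = rng.multivariate_normal(miss, cov_combined, size=n)
    return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < R)

sigmas = np.linspace(10.0, 500.0, 50)       # assumed 1-sigma size of the missing covariance
pcs = [pc(cov_known + np.diag([s**2, s**2])) for s in sigmas]
print(f"Largest Pc over the swept sigmas: {max(pcs):.2e}")
```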

  5. Factors associated with high probability of target blood pressure non-achievement in hypertensive patients

    S. P. Zhemanyuk

    2017-12-01

    Full Text Available One of the topical issues in modern cardiology is clarifying the factors behind non-achievement of target blood pressure levels, so as to better understand how cardiovascular complications can be reduced. The aim of the study is to determine the factors of poor blood pressure control using ambulatory blood pressure monitoring parameters and adenosine 5'-diphosphate-induced platelet aggregation parameters in patients with arterial hypertension. Material and methods. The study involved 153 patients with essential hypertension (EH) of stage II, degree II. Ambulatory blood pressure monitoring (ABPM) was performed with a bifunctional ABPM device (Incart, S.-P., R.F.) while patients were taking at least two first-line antihypertensive drugs in optimal daily doses. Platelet aggregation was measured by light transmittance aggregometry on an optical analyzer (Solar, R.B.) with adenosine 5'-diphosphate (Sigma-Aldrich) at a final concentration of 10.0 × 10⁻⁶ mol/L. The first group comprised inadequately controlled essential hypertensive individuals with high systolic and/or diastolic BP levels according to the ABPM results, and the second group comprised patients with adequately controlled EH. The groups were comparable in age (60.39 ± 10.74 years vs. 62.80 ± 9.63; p = 0.181). In the group of EH patients who reached the target blood pressure level, women predominated (60% vs. 39.81%; p = 0.021). We used binary logistic regression analysis to determine the predictors of poor achievement of target blood pressure levels using ABPM and platelet aggregation parameters. Results. According to the univariate logistic regression analysis, the factors influencing poor achievement of target blood pressure are the average diurnal diastolic blood pressure (DBP) (OR = 44.8); diurnal variability of systolic blood pressure (SBP) (OR = 4.4); square index of hypertension for diurnal periods of SBP (OR = 318.9); square index of hypertension for diurnal
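    For readers unfamiliar with the statistics, a univariate binary logistic regression of this kind can be sketched as follows (synthetic data, not the study's dataset; the underlying numeric relationship is assumed purely for illustration):

```python
# Schematic univariate logistic regression: odds ratio for "target BP not achieved"
# per 1 mmHg increase in average diurnal DBP, on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
dbp = rng.normal(85.0, 10.0, 153)                     # average diurnal DBP (mmHg), synthetic
p_true = 1.0 / (1.0 + np.exp(-(-17.0 + 0.2 * dbp)))  # assumed underlying relationship
not_at_target = rng.binomial(1, p_true)

X = sm.add_constant(dbp)
model = sm.Logit(not_at_target, X).fit(disp=False)
odds_ratio = np.exp(model.params[1])
print(f"Odds ratio per 1 mmHg of diurnal DBP: {odds_ratio:.2f}")
```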

  6. Quantum Probabilities as Behavioral Probabilities

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  7. Risk Probabilities

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  8. Open principle for large high-resolution solar telescopes

    Hammerschlag, R.H.; Bettonvil, F.C.M.; Jägers, A.P.L.; Sliepen, G.

    2009-01-01

    Vacuum solar telescopes solve the problem of image deterioration inside the telescope due to refractive index fluctuations of the air heated by the solar light. However, such telescopes have a practical diameter limit somewhat over 1 m. The Dutch Open Telescope (DOT) was the pioneering demonstrator

  9. Early diagnosis and research of high myopia with primary open angle glaucoma

    Yan Guo

    2014-04-01

    Full Text Available People with high myopia are a high-risk population for primary open angle glaucoma. Clinically, we have found that primary open angle glaucoma and high myopia are closely related. Understanding the clinical features of high myopia with primary open angle glaucoma and the importance of early diagnosis, so as to avoid missed diagnosis and lower the misdiagnosis rate, can help improve clinicians' vigilance and their ability to make an early diagnosis. In this paper, the clinical features of high myopia with primary open angle glaucoma and the research progress on the key points of early diagnosis are reviewed.

  10. Probability tales

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  11. Probability theory

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  12. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Xiao Liu

    2016-01-01

    Full Text Available Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve the impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law that considers the influences of random disturbances, the detection distance constraint, and the target acquisition probability evaluated by Monte Carlo simulation. Detailed analyses of the impact points on the ground and of the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a new-formulation biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law shows satisfactory performance and can meet both the midcourse terminal angular constraint and the LOS angle rate requirement.
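    The abstract does not give the guidance law itself; as a generic orientation only, classical proportional navigation commands lateral acceleration proportional to the LOS rate, and a biased variant adds a term that steers the terminal LOS toward a desired angle. A minimal sketch with assumed values (not the paper's BPN formulation):

```python
# Toy biased-PN command: a_cmd = N * Vc * LOS_rate + bias, where the bias term
# nudges the line-of-sight angle toward a desired terminal value.
def bpn_accel(nav_gain, closing_speed, los_rate, bias):
    return nav_gain * closing_speed * los_rate + bias

a_cmd = bpn_accel(nav_gain=4.0, closing_speed=900.0, los_rate=0.01, bias=-2.0)
print(f"Commanded lateral acceleration: {a_cmd:.1f} m/s^2")
```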

  13. High-altitude cosmic ray neutrons: probable source for the high-energy protons of the earth's radiation belts

    Hajnal, F.; Wilson, J.

    1992-01-01

    'Full Text:' Several high-altitude cosmic-ray neutron measurements were performed by the NASA Ames Laboratory in the mid- to late 1970s using airplanes flying at about 13 km altitude along constant geomagnetic latitudes of 20, 44 and 51 degrees north. Bonner spheres and manganese, gold and aluminium foils were used in the measurements. In addition, large moderated BF-3 counters served as normalizing instruments. Data analyses performed at that time did not provide complete and unambiguous spectral information and field intensities. Recently, using our new unfolding methods and codes, and Bonner-sphere response function extensions for higher energies, 'new' neutron spectral intensities were obtained, which show progressive hardening of the neutron spectra as a function of increasing geomagnetic latitude, with substantial increases in the energy region from 10 MeV to 10 GeV. For example, we found that the total neutron fluences at 20 and 51 degrees magnetic north are in the ratio of 1 to 5.2 and the 10 MeV to 10 GeV fluence ratio is 1 to 18. The magnitude of these ratios is quite remarkable. From the new results, the derived absolute neutron energy distribution is of the correct strength and shape for the albedo neutrons to be the main source of the high-energy protons trapped in the Earth's inner radiation belt. In addition, the results, depending on the extrapolation scheme used, indicate that the neutron dose equivalent rate may be as high as 0.1 mSv/h near the geomagnetic north pole and thus a significant contributor to the radiation exposures of pilots, flight attendants and the general public. (author)

  14. High Throughput PBTK: Open-Source Data and Tools for ...

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  15. E-cigarette openness, curiosity, harm perceptions and advertising exposure among U.S. middle and high school students.

    Margolis, Katherine A; Donaldson, Elisabeth A; Portnoy, David B; Robinson, Joelle; Neff, Linda J; Jamal, Ahmed

    2018-07-01

    Understanding the factors associated with youth e-cigarette openness and curiosity is important for assessing the probability of future use. We examined how e-cigarette harm perceptions and advertising exposure are associated with openness and curiosity among tobacco-naive youth. Findings from the 2015 National Youth Tobacco Survey (NYTS) were analyzed. The 2015 NYTS is a nationally representative survey of 17,711 U.S. middle and high school students. We calculated weighted prevalence estimates of never users of tobacco products (cigarettes, cigars/cigarillos/little cigars, waterpipe/hookah, smokeless tobacco, bidis, pipes, dissolvables, e-cigarettes) who were open to or curious about e-cigarette use, by demographics. Weighted regression models examined how e-cigarette harm perceptions and advertising exposure were associated with openness to using e-cigarettes and curiosity about trying e-cigarettes. Among respondents who had never used tobacco products, 23.8% were open to using e-cigarettes and 25.4% were curious. Respondents who perceived that e-cigarettes cause a lot of harm had lower odds of both openness (OR = 0.10, 95% CI = 0.07, 0.15) and curiosity about e-cigarettes (OR = 0.10, 95% CI = 0.07, 0.13) compared to those with lower harm perceptions. Respondents who reported high exposure to e-cigarette advertising in stores had greater odds of being open to e-cigarette use (OR = 1.22, 95% CI = 1.03, 1.44) and of being highly curious (OR = 1.25, 95% CI = 1.01, 1.53) compared to those not highly exposed. These findings demonstrate that youth exposed to e-cigarette advertising are open to and curious about e-cigarette use. These findings could help public health practitioners better understand the interplay of advertising exposure and harm perceptions with curiosity and openness to e-cigarette use in a rapidly changing marketplace. Published by Elsevier Inc.

  16. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Jiangpeng Chen

    Full Text Available Influenza is a contagious disease of high transmissibility that spreads around the world with considerable morbidity and mortality, and it presents an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat tail" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape parameter and scale parameter by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one years, respectively. Despite its simplicity, the present study offers a sound modeling strategy and a methodological avenue for forecasting an epidemic in the midst of its course.
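    A peaks-over-threshold analysis of this kind can be sketched in a few lines; the code below uses synthetic monthly counts and an assumed 80th-percentile threshold, not the Zhejiang data, and only illustrates the Poisson-frequency/generalized-Pareto-excess construction and the resulting return levels:

```python
# Sketch: Poisson exceedance frequency + generalized Pareto excesses fitted by
# maximum likelihood, then converted into return levels. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
monthly_cases = rng.lognormal(mean=5.0, sigma=1.0, size=56)   # stand-in monthly incidence

threshold = np.quantile(monthly_cases, 0.8)
excesses = monthly_cases[monthly_cases > threshold] - threshold
rate = excesses.size / monthly_cases.size                     # exceedances per month (Poisson rate)

shape, _, scale = stats.genpareto.fit(excesses, floc=0.0)     # MLE of GPD shape and scale

def return_level(months):
    """Incidence expected to be exceeded about once every `months` months."""
    return threshold + stats.genpareto.ppf(1.0 - 1.0 / (rate * months), shape, loc=0.0, scale=scale)

print(f"Estimated 1-year return level: {return_level(12):.0f} cases/month")
```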

  17. Changes in the high-mountain vegetation of the Central Iberian Peninsula as a probable sign of global warming.

    Sanz-Elorza, Mario; Dana, Elías D; González, Alberto; Sobrino, Eduardo

    2003-08-01

    Aerial images of the high summits of the Spanish Central Range reveal significant changes in vegetation over the period 1957 to 1991. These changes include the replacement of high-mountain grassland communities dominated by Festuca aragonensis, typical of the Cryoro-Mediterranean belt, by shrub patches of Juniperus communis ssp. alpina and Cytisus oromediterraneus from lower altitudes (Oro-Mediterranean belt). Climatic data indicate a shift towards warmer conditions in this mountainous region since the 1940s, with the shift being particularly marked from 1960. Changes include significantly higher minimum and maximum temperatures, fewer days with snow cover and a redistribution of monthly rainfall. Total yearly precipitation showed no significant variation. There were no marked changes in land use during the time frame considered, although there were minor changes in grazing species in the 19th century. It is hypothesized that the advance of woody species into higher altitudes is probably related to climate change, which could have acted in conjunction with discrete variations in landscape management. The pronounced changes observed in the plant communities of the area reflect the susceptibility of high-mountain Mediterranean species to environmental change.

  18. The prevalence of probable delayed-sleep-phase syndrome in students from junior high school to university in Tottori, Japan.

    Hazama, Gen-i; Inoue, Yuichi; Kojima, Kazushige; Ueta, Toshiyuki; Nakagome, Kazuyuki

    2008-09-01

    Delayed sleep phase syndrome (DSPS) is a circadian rhythm sleep disorder with a typical onset in the second decade of life. DSPS is characterized by sleep-onset insomnia and difficulty in waking at the desired time in the morning. Although DSPS is associated with an inability to attend school, its prevalence has been controversial. To elucidate changes in the prevalence of DSPS among the young population, an epidemiological survey was conducted on Japanese students. A total of 4,971 students of junior high school, senior high school, and university were enrolled in this cross-sectional study in Tottori Prefecture. They answered an anonymous screening questionnaire regarding school schedule, sleep hygiene and symptomatic items of sleep disorders. The prevalence of probable DSPS was estimated at 0.48% among the students overall, with no gender difference. In university, the prevalence among last-year students was the highest (1.66%), while that among first-year students was the lowest (0.09%) of all school years from junior high school to university. The prevalence increased with advancing university school years. Thus, a considerable number of Japanese students are affected by DSPS. Senior university students are more vulnerable to the disorder than younger students. An appropriate school schedule may decrease the mismatch between the individual's sleep-wake cycle and the school schedule. Promotion of regular sleep habits is necessary to prevent DSPS in this population.

  19. Impact of high-flux haemodialysis on the probability of target attainment for oral amoxicillin/clavulanic acid combination therapy.

    Hui, Katrina; Patel, Kashyap; Kong, David C M; Kirkpatrick, Carl M J

    2017-07-01

    Clearance of small molecules such as amoxicillin and clavulanic acid is expected to increase during high-flux haemodialysis, which may result in lower concentrations and thus reduced efficacy. To date, clearance of amoxicillin/clavulanic acid (AMC) during high-flux haemodialysis remains largely unexplored. Using published pharmacokinetic parameters, a two-compartment model with first-order input was simulated to investigate the impact of high-flux haemodialysis on the probability of target attainment (PTA) of orally administered AMC combination therapy. The following pharmacokinetic/pharmacodynamic targets were used to calculate the PTA. For amoxicillin, the target was a free concentration above the minimum inhibitory concentration (MIC) for ≥50% of the dosing period (≥50% ƒT>MIC). For clavulanic acid, the target was a free concentration >0.1 mg/L for ≥45% of the dosing period (≥45% ƒT>0.1 mg/L). Dialysis clearance reported for low-flux haemodialysis for both compounds was doubled to represent the likely clearance during high-flux haemodialysis. Monte Carlo simulations were performed to produce concentration-time profiles over 10 days in 1000 virtual patients. Seven different regimens commonly seen in clinical practice were explored. When AMC was dosed twice daily, the PTA was mostly ≥90% for both compounds regardless of when haemodialysis commenced. When administered once daily, the PTA was 20-30% for clavulanic acid and ≥90% for amoxicillin. The simulations suggest that once-daily orally administered AMC in patients receiving high-flux haemodialysis may result in insufficient concentrations of clavulanic acid to effectively treat infections, especially on days when haemodialysis occurs. Copyright © 2017 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
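    As a simplified illustration of the PTA calculation itself (the paper uses a two-compartment model and published parameters; the one-compartment model, parameter values and between-subject variabilities below are assumptions chosen only for the sketch):

```python
# Monte Carlo PTA sketch: fraction of virtual patients whose free concentration
# stays above the MIC for at least 50% of the dosing interval. One-compartment
# model with first-order absorption; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_pat, dose_mg, tau_h, mic = 1000, 500.0, 12.0, 8.0
ka = 1.5                                            # absorption rate constant (1/h)
cl = rng.lognormal(np.log(10.0), 0.4, n_pat)        # clearance (L/h) with variability
v = rng.lognormal(np.log(15.0), 0.3, n_pat)         # volume of distribution (L)
ke = cl / v                                         # elimination rate constant (1/h)

t = np.linspace(0.0, tau_h, 241)
conc = (dose_mg * ka / (v[:, None] * (ka - ke[:, None]))) * (
    np.exp(-ke[:, None] * t) - np.exp(-ka * t)
)                                                   # single-dose profile, protein binding ignored

ft_above_mic = np.mean(conc > mic, axis=1)          # fraction of the interval above MIC
pta = np.mean(ft_above_mic >= 0.5)
print(f"PTA for >=50% fT>MIC at MIC {mic} mg/L: {pta:.2f}")
```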

  20. Fast-opening vacuum switches for high-power inductive energy storage

    Cooperstein, G.

    1988-01-01

    The subject of fast-opening vacuum switches for high-power inductive energy storage is emerging as an exciting new area of plasma science research. This opening switch technology, which generally involves the use of plasmas as the switching medium, is key to the development of inductive energy storage techniques for pulsed power, which have a number of advantages over conventional capacitive techniques with regard to cost and size. This paper reviews the state of the art in this area with emphasis on applications to inductive storage pulsed power generators. Discussion focuses on fast-opening vacuum switches capable of operating at high power (≥10¹² W). These include plasma erosion opening switches, ion beam opening switches, plasma filled diodes, reflex diodes, plasma flow switches, and other novel vacuum opening switches.

  1. Analysis of high-quality modes in open chaotic microcavities

    Fang, W.; Yamilov, A.; Cao, H.

    2005-01-01

    We present a numerical study of the high-quality modes in two-dimensional dielectric stadium microcavities. Although the classical ray mechanics is fully chaotic in a stadium billiard, all of the high-quality modes show a 'strong scar' around unstable periodic orbits. When the deformation (ratio of the length of the straight segments over the diameter of the half circles) is small, the high-quality modes correspond to whispering-gallery-type trajectories and their quality factors decrease monotonically with increasing deformation. At large deformation, each high-quality mode is associated with multiple unstable periodic orbits. Its quality factor changes nonmonotonically with the deformation, and there exists an optimal deformation for each mode at which its quality factor reaches a local maximum. This unusual behavior is attributed to the interference of waves propagating along different constituent orbits that could minimize light leakage out of the cavity

  2. Jane Austen in the High School Classroom (Open to Suggestion).

    Fritzer, Penelope

    1996-01-01

    Argues that Jane Austen's novels lend themselves to the high school curriculum, and that students will discover a leisurely, rural world in which the concerns of the young people are often similar to theirs. (SR)

  3. "High magnetic fields in the USA"

    Campbell, L.J.; Parkin, D.M.; Crow, J.E.; Schneider-Muntau, H.J.; Sullivan, N.S.

    1994-01-01

    During the past thirty years research using high magnetic fields has technically evolved in the manner, but not the magnitude, of the so-called big science areas of particle physics, plasma physics, neutron scattering, synchrotron light scattering, and astronomy. Starting from the laboratories of individual researchers it moved to a few larger universities, then to centralized national facilities with research and maintenance staffs, and, finally, to joint international ventures to build unique facilities, as illustrated by the subject of this conference. To better understand the nature of this type of research and its societal justification it is helpful to compare it, in general terms, with the aforementioned big-science fields. High magnetic field research differs from particle physics, plasma physics, and astronomy in three respects: (1) It is generic research that cuts across a wide range of scientific disciplines in physics, chemistry, biology, medicine, and engineering; (2) It studies materials and processes that are relevant for a variety of technological applications and it gives insight into biological processes; (3) It has produced, at least, comparably significant results with incomparably smaller resources. Unlike neutron and synchrotron light scattering, which probe matter, high magnetic fields change the thermodynamic state of matter. This change of state is fundamental and independent of other state variables, such as pressure and temperature. After the magnetic field is applied, various techniques are then used to study the new state

  4. Application of plasma erosion opening switches to high power accelerators for pulse compression and power multiplication

    Meyer, R.A.; Boller, J.R.; Commisso, R.J.

    1983-01-01

    A new vacuum opening switch called a plasma erosion opening switch is described. A model of its operation is presented and the energy efficiency of such a switch is discussed. Recent high power experiments on the Gamble II accelerator are described and compared to previous experiments.

  5. Cerebral gray matter volume losses in essential tremor: A case-control study using high resolution tissue probability maps.

    Cameron, Eric; Dyke, Jonathan P; Hernandez, Nora; Louis, Elan D; Dydak, Ulrike

    2018-03-10

    Essential tremor (ET) is increasingly recognized as a multi-dimensional disorder with both motor and non-motor features. For this reason, imaging studies are more broadly examining regions outside the cerebellar motor loop. Reliable detection of cerebral gray matter (GM) atrophy requires optimized processing, adapted to high-resolution magnetic resonance imaging (MRI). We investigated cerebral GM volume loss in ET cases using automated segmentation of MRI T1-weighted images. MRI was acquired on 47 ET cases and 36 controls. Automated segmentation and voxel-wise comparisons of volume were performed using Statistical Parametric Mapping (SPM) software. To improve upon standard protocols, the high-resolution International Consortium for Brain Mapping (ICBM) 2009a atlas and tissue probability maps were used to process each subject image. Group comparisons were performed: all ET vs. controls, ET with head tremor (ETH) vs. controls, and severe ET vs. controls. An analysis of variance (ANOVA) was performed between ET with and without head tremor and controls. Age, sex, and Montreal Cognitive Assessment (MoCA) score were regressed out from each comparison. We were able to consistently identify regions of cerebral GM volume loss in ET and in ET subgroups in the posterior insula, superior temporal gyri, cingulate cortex, inferior frontal gyri and other occipital and parietal regions. There were no significant increases in GM volume in ET in any comparisons with controls. This study, which uses improved methodologies, provides evidence that GM volume loss in ET is present beyond the cerebellum, and in fact, is widespread throughout the cerebrum as well. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Deriving animal behaviour from high-frequency GPS: tracking cows in open and forested habitat

    de Weerd, N.; van Langevelde, F.; van Oeveren, H.; Nolet, Bart A.; Kölzsch, Andrea; Prins, H.H.T.; De Boer, W.F.

    2015-01-01

    The increasing spatiotemporal accuracy of Global Navigation Satellite Systems (GNSS) tracking systems opens the possibility to infer animal behaviour from tracking data. We studied the relationship between high-frequency GNSS data and behaviour, aimed at developing an easily interpretable

  7. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit.

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-04-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. GROMACS is an open source and free software available from http://www.gromacs.org. Supplementary data are available at Bioinformatics online.

  8. News from the Library: Publishing Open Access articles beyond High Energy Physics

    CERN Library

    2012-01-01

    CERN has supported Open Access Publishing for many years, and the Scientific Information Service is working to implement this vision. We have just launched the flagship project SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics) aimed at converting high-quality journals in High Energy Physics to Open Access for articles published as of 2014. More details here.   In parallel, several win-win arrangements allow experimental and theoretical high-energy physics results from CERN to be published in Open Access in a variety of high-impact journals. More information can be found here. Open Access publishing at CERN goes far beyond High Energy Physics. Indeed, CERN is a key supporter of Open Access in accelerator science, through sponsorship of the APS journal PRSTAB and participation in the JACoW collaboration. Now CERN authors publishing in the field of engineering will also have th...

  9. Open Access Publishing in High-Energy Physics: the SCOAP(3) Initiative

    Mele, Salvatore

    2010-01-01

    Scholarly communication in High-Energy Physics (HEP) shows traits very similar to Astronomy and Astrophysics: pervasiveness of Open Access to preprints through community-based services; a culture of openness and sharing among its researchers; a compact number of yearly articles published by a relatively small number of journals which are dear to the community. These aspects have led HEP to spearhead an innovative model for the transition of its scholarly publishing to Open Access. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP(3)) aims to be a central body to finance peer-review service rather than the purchase of access to information as in the traditional subscription model, with all articles in the discipline eventually available in Open Access. Sustainable funding to SCOAP(3) would come from libraries, library consortia and HEP funding agencies, through a re-direction of funds currently spent for subscriptions to HEP journals. This paper presents the cultural and bibliomet...

  10. High School Teachers' Openness to Adopting New Practices: The Role of Personal Resources and Organizational Climate.

    Johnson, Stacy R; Pas, Elise T; Loh, Deanna; Debnam, Katrina J; Bradshaw, Catherine P

    2017-03-01

    Although evidence-based practices for students' social, emotional, and behavioral health are readily available, their adoption and quality implementation in schools are of increasing concern. Teachers are vital to implementation; yet, there is limited research on teachers' openness to adopting new practices, which may be essential to successful program adoption and implementation. The current study explored how perceptions of principal support, teacher affiliation, teacher efficacy, and burnout relate to teachers' openness to new practices. Data came from 2,133 teachers across 51 high schools. Structural equation modeling assessed how organizational climate (i.e., principal support and teacher affiliation) related to teachers' openness directly and indirectly via teacher resources (i.e., efficacy and burnout). Teachers with more favorable perceptions of both principal support and teacher affiliation reported greater efficacy, and, in turn, more openness; however, burnout was not significantly associated with openness. Post hoc analyses indicated that among teachers with high levels of burnout, only principal support related to greater efficacy, and in turn, higher openness. Implications for promoting teachers' openness to new program adoption are discussed.

  11. Open high-level data formats and software for gamma-ray astronomy

    Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio

    2017-01-01

    In gamma-ray astronomy, a variety of data formats and proprietary software have been traditionally used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a Github organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.

  12. Hydralazine-induced vasodilation involves opening of high conductance Ca2+-activated K+ channels

    Bang, Lone; Nielsen-Kudsk, J E; Gruhn, N

    1998-01-01

    The purpose of this study was to investigate whether high conductance Ca2+-activated K+ channels (BK(Ca)) are mediating the vasodilator action of hydralazine. In isolated porcine coronary arteries, hydralazine (1-300 microM), like the K+ channel opener levcromakalim, preferentially relaxed......M) suppressed this response by 82% (P opening of BK(Ca) takes part in the mechanism whereby...

  13. Which Type of Inquiry Project Do High School Biology Students Prefer: Open or Guided?

    Sadeh, Irit; Zion, Michal

    2012-10-01

    In teaching inquiry to high school students, educators differ on which method of teaching inquiry is more effective: Guided or open inquiry? This paper examines the influence of these two different inquiry learning approaches on the attitudes of Israeli high school biology students toward their inquiry project. The results showed significant differences between the two groups: Open inquiry students were more satisfied and felt they gained benefits from implementing the project to a greater extent than guided inquiry students. On the other hand, regarding documentation throughout the project, guided inquiry students believed that they conducted more documentation, as compared to their open inquiry peers. No significant differences were found regarding `the investment of time', but significant differences were found in the time invested and difficulties which arose concerning the different stages of the inquiry process: Open inquiry students believed they spent more time in the first stages of the project, while guided inquiry students believed they spent more time in writing the final paper. In addition, other differences were found: Open inquiry students felt more involved in their project, and felt a greater sense of cooperation with others, in comparison to guided inquiry students. These findings may help teachers who hesitate to teach open inquiry to implement this method of inquiry; or at least provide their students with the opportunity to be more involved in inquiry projects, and ultimately provide their students with more autonomy, high-order thinking, and a deeper understanding in performing science.

  14. Repetitive plasma opening switch for powerful high-voltage pulse generators

    Dolgachev, G.I.; Zakatov, L.P.; Nitishinskii, M.S.; Ushakov, A.G.

    1998-01-01

    Results are presented of experimental studies of plasma opening switches that serve to sharpen the pulses of inductive microsecond high-voltage pulse generators. It is demonstrated that repetitive plasma opening switches can be used to create super-powerful generators operating in a quasi-continuous regime. An erosion switching mechanism and the problem of magnetic insulation in repetitive switches are considered. Achieving super-high peak power in plasma switches makes it possible to develop new types of high-power generators of electron beams and X radiation. Possible implementations and the efficiency of these generators are discussed

  15. Open-field behavior of house mice selectively bred for high voluntary wheel-running.

    Bronikowski, A M; Carter, P A; Swallow, J G; Girard, I A; Rhodes, J S; Garland, T

    2001-05-01

    Open-field behavioral assays are commonly used to test both locomotor activity and emotionality in rodents. We performed open-field tests on house mice (Mus domesticus) from four replicate lines genetically selected for high voluntary wheel-running for 22 generations and from four replicate random-bred control lines. Individual mice were recorded by video camera for 3 min in a 1-m2 open-field arena on 2 consecutive days. Mice from selected lines showed no statistical differences from control mice with respect to distance traveled, defecation, time spent in the interior, or average distance from the center of the arena during the trial. Thus, we found little evidence that open-field behavior, as traditionally defined, is genetically correlated with wheel-running behavior. This result is a useful converse test of classical studies that report no increased wheel-running in mice selected for increased open-field activity. However, mice from selected lines turned less in their travel paths than did control-line mice, and females from selected lines had slower travel times (longer latencies) to reach the wall. We discuss these results in the context of the historical open-field test and newly defined measures of open-field activity.

  16. Thermal performance of an open thermosyphon using nanofluid for evacuated tubular high temperature air solar collector

    Liu, Zhen-Hua; Hu, Ren-Lin; Lu, Lin; Zhao, Feng; Xiao, Hong-shen

    2013-01-01

    Highlights: • A novel solar air collector with simplified CPC and open thermosyphon is designed and tested. • Simplified CPC has a much lower cost at the expense of slight efficiency loss. • Nanofluid effectively improves thermal performance of the above solar air collector. • Solar air collector with open thermosyphon is better than that with concentric tube. - Abstract: A novel evacuated tubular solar air collector integrated with simplified CPC (compound parabolic concentrator) and special open thermosyphon using water based CuO nanofluid as the working fluid is designed to provide air with high and moderate temperature. The experimental system has two linked panels and each panel includes an evacuated tube, a simplified CPC and an open thermosyphon. Outdoor experimental study has been carried out to investigate the actual solar collecting performance of the designed system. Experimental results show that air outlet temperature and system collecting efficiency of the solar air collector using nanofluid as the open thermosyphon’s working fluid are both higher than those using water. Its maximum air outlet temperature exceeds 170 °C at the air volume rate of 7.6 m³/h in winter, even though the experimental system consists of only two collecting panels. The solar collecting performance of the solar collector integrated with open thermosyphon is also compared with that integrated with common concentric tube. Experimental results show that the solar collector integrated with open thermosyphon has a much better collecting performance.

  17. Which Type of Inquiry Project Do High School Biology Students Prefer: Open or Guided?

    Sadeh, Irit; Zion, Michal

    2012-01-01

    In teaching inquiry to high school students, educators differ on which method of teaching inquiry is more effective: Guided or open inquiry? This paper examines the influence of these two different inquiry learning approaches on the attitudes of Israeli high school biology students toward their inquiry project. The results showed significant…

  18. OpenMM 4: A Reusable, Extensible, Hardware Independent Library for High Performance Molecular Simulation.

    Eastman, Peter; Friedrichs, Mark S; Chodera, John D; Radmer, Randall J; Bruns, Christopher M; Ku, Joy P; Beauchamp, Kyle A; Lane, Thomas J; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R; Pande, Vijay S

    2013-01-08

    OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added.
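
    To illustrate the layered design described above, the minimal sketch below drives a short simulation through OpenMM's Python application layer, which hides the hardware-specific platform (OpenCL, CUDA, or CPU) from the user. It assumes the present-day `openmm` package naming (the OpenMM 4 release described here shipped under the older `simtk` namespace) and a hypothetical input structure `input.pdb`.

        # Minimal OpenMM sketch: the application layer hides the platform choice behind a common API.
        from openmm import app, unit, LangevinIntegrator

        pdb = app.PDBFile('input.pdb')                      # hypothetical input structure
        forcefield = app.ForceField('amber14-all.xml', 'amber14/tip3p.xml')
        system = forcefield.createSystem(pdb.topology,
                                         nonbondedMethod=app.PME,
                                         constraints=app.HBonds)
        integrator = LangevinIntegrator(300*unit.kelvin, 1/unit.picosecond,
                                        0.002*unit.picoseconds)
        simulation = app.Simulation(pdb.topology, system, integrator)  # platform chosen automatically
        simulation.context.setPositions(pdb.positions)
        simulation.minimizeEnergy()
        simulation.step(1000)                               # 2 ps of dynamics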

  19. A Hall-current model of electron loss after POS opening into high-impedance loads

    Greenly, J.B.

    1989-01-01

    The author discusses how a self-consistent relativistic model of laminar Hall (E x B) electron flow across a POS plasma allows a loss mechanism after opening even in a strongly magnetically-insulated line, downstream of the remaining POS plasma. Opening is assumed to occur at the cathode, either by erosion or push-back. The loss results only when a large voltage appears after opening into a high impedance load. Then the difference in potential between the plasma, which is near anode potential, and the cathode results in an axial component of E at the load end of the plasma, which supports an E x B drift of electrons across the gap. The analytic model predicts that this loss should increase with higher voltage after opening, and could be eliminated only by removing the plasma from the gap, or eliminating cathode electron emission (both difficult), or by confining this downstream electron flow with an applied magnetic field

  20. Is high myopia a risk factor for visual field progression or disk hemorrhage in primary open-angle glaucoma?

    Nitta, Koji; Sugiyama, Kazuhisa; Wajima, Ryotaro; Tachibana, Gaku

    2017-01-01

    The purpose of this study was to clarify differences between highly myopic and non-myopic primary open-angle glaucoma (POAG) patients, including normal-tension glaucoma patients. A total of 269 POAG patients were divided into two groups: patients with ≥26.5 mm of axial length (highly myopic group) and patients with <26.5 mm of axial length (non-myopic group). The cumulative probability of visual field (VF) loss was significantly greater in the highly myopic group (10-year survival rate, 73.7%±6.8%) than in the non-myopic group (10-year survival rate, 46.3%±5.8%; log-rank test, P=0.0142). The occurrence of disk hemorrhage (DH) in the non-myopic group (1.60±3.04) was significantly greater than that in the highly myopic group (0.93±2.13, P=0.0311). The cumulative probability of DH was significantly lower in the highly myopic group (10-year survival rate, 26.4%±5.4%) than in the non-myopic group (10-year survival rate, 47.2%±6.6%, P=0.0413). Highly myopic POAG is considered a combination of myopic optic neuropathy and glaucomatous optic neuropathy (GON). If GON is predominant, it has frequent DH and more progressive VF loss. However, when the myopic optic neuropathy is predominant, it has less DH and less progressive VF loss.

  1. High voltage, high power operation of the plasma erosion opening switch

    Neri, J.M.; Boller, J.R.; Ottinger, P.F.; Weber, B.V.; Young, F.C.

    1987-01-01

    A Plasma Erosion Opening Switch (PEOS) is used as the opening switch for a vacuum inductive storage system driven by a 1.8-MV, 1.6-TW pulsed power generator. A 135-nH vacuum inductor is current charged to ∼750 kA in 50 ns through the closed PEOS which then opens in <10 ns into an inverse ion diode load. Electrical diagnostics and nuclear activations from ions accelerated in the diode yield a peak load voltage (4.25 MV) and peak load power (2.8 TW) that are 2.4 and 1.8 times greater than ideal matched load values for the same generator pulse
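
    For orientation, the figures quoted above are consistent with a simple lumped inductive-store estimate; the back-of-the-envelope check below is not from the paper, just the standard ½LI² stored-energy and L·dI/dt voltage scales applied to the stated inductance, current, and charging time.

        # Rough inductive-storage estimates from the quoted numbers (not from the paper).
        L = 135e-9          # vacuum inductor, H
        I = 750e3           # charge current, A
        t_charge = 50e-9    # charging time, s

        energy = 0.5 * L * I**2            # ~38 kJ stored magnetically
        v_scale = L * I / t_charge         # ~2 MV charging-voltage scale
        print(f"stored energy ~ {energy/1e3:.0f} kJ, charging-voltage scale ~ {v_scale/1e6:.1f} MV")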

  2. Probability Aggregates in Probability Answer Set Programming

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  3. The development of an open architecture control system for CBN high speed grinding

    Silva, E. Jannone da; Biffi, M.; Oliveira, J. F. G. de

    2004-01-01

    The aim of this project is the development of an open architecture control (OAC) system to be applied in the high speed grinding process using CBN tools. Besides other features, the system will allow a new monitoring and controlling strategy, by the adoption of open architecture CNC combined with multi-sensors, a PC and third-party software. The OAC system will be implemented in a high speed CBN grinding machine, which is being developed in a partnership between the University of São Paul...

  4. The effect of lower anterior high pull headgear on treatment of moderate open bite in adults

    Rahman Showkatbakhsh

    2012-01-01

    Full Text Available Background and Aims: Various methods are used for treatment of open bite. The objective of this study was to investigate the effects of the Lower Anterior High Pull Headgear (LAHPH) appliance in Class I subjects with moderate open bite and high lower lip line. Materials and Methods: The study group was composed of 10 subjects with a mean age of 15.8±2.5 years and 3.05±0.07 mm moderate open bite. All the patients rejected orthognathic surgery. The treatment included extraction of upper and lower second premolars followed by leveling, banding, bonding, posterior space closure, and anterior retraction. After these procedures, the open bite was reduced to 2.04±1.17 mm. Afterwards, LAHPH was applied for 18 hours per day for 8±2 months. The LAHPH appliance was composed of a High Pull Headgear and two hooks mounted on its inner bow. Two elastics (1.8, light, Dentaurum) connected the upper hooks on the inner bow to the lower hooks on the mandibular canines vertically. The forces produced by the prescribed elastics were 10 and 60 g during mouth closing and opening, respectively. A paired t-test was used to evaluate pre- and post-treatment outcomes. Results: The pre- and post-treatment cephalometric evaluations showed that the LAHPH effectively reduced the open bite of the patients to 0.15±1.7 mm (P<0.001). Conclusion: This appliance can be used as an acceptable method for closing the open bite in Class I subjects.

  5. Introduction to imprecise probabilities

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  6. Ammonia losses and nitrogen partitioning at a southern High Plains open lot dairy

    Todd, Richard W.; Cole, N. Andy; Hagevoort, G. Robert; Casey, Kenneth D.; Auvermann, Brent W.

    2015-06-01

    Animal agriculture is a significant source of ammonia (NH3). Cattle excrete most ingested nitrogen (N); most urinary N is converted to NH3, volatilized and lost to the atmosphere. Open lot dairies on the southern High Plains are a growing industry and face environmental challenges as well as reporting requirements for NH3 emissions. We quantified NH3 emissions from the open lot and wastewater lagoons of a commercial New Mexico dairy during a nine-day summer campaign. The 3500-cow dairy consisted of open lot, manure-surfaced corrals (22.5 ha area). Lactating cows comprised 80% of the herd. A flush system using recycled wastewater intermittently removed manure from feeding alleys to three lagoons (1.8 ha area). Open path lasers measured atmospheric NH3 concentration, sonic anemometers characterized turbulence, and inverse dispersion analysis was used to quantify emissions. Ammonia fluxes (15-min) averaged 56 and 37 μg m⁻² s⁻¹ at the open lot and lagoons, respectively. Ammonia emission rate averaged 1061 kg d⁻¹ at the open lot and 59 kg d⁻¹ at the lagoons; 95% of NH3 was emitted from the open lot. The per capita emission rate of NH3 was 304 g cow⁻¹ d⁻¹ from the open lot (41% of N intake) and 17 g cow⁻¹ d⁻¹ from lagoons (2% of N intake). Daily N input at the dairy was 2139 kg d⁻¹, with 43, 36, 19 and 2% of the N partitioned to NH3 emission, manure/lagoons, milk, and cows, respectively.
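
    The per-capita figures follow directly from the herd size and the lot-scale emission rates; the short check below reproduces them to within rounding (a simple consistency calculation, not part of the original analysis).

        # Consistency check of the per-capita NH3 emission rates quoted above.
        herd = 3500                      # cows
        open_lot_kg_per_day = 1061       # kg NH3 per day from the open lot
        lagoon_kg_per_day = 59           # kg NH3 per day from the lagoons

        per_cow_lot = open_lot_kg_per_day * 1000 / herd      # ~303 g per cow per day
        per_cow_lagoon = lagoon_kg_per_day * 1000 / herd     # ~17 g per cow per day
        print(per_cow_lot, per_cow_lagoon)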

  7. Prioritizing forest fuels treatments based on the probability of high-severity fire restores adaptive capacity in Sierran forests.

    Krofcheck, Daniel J; Hurteau, Matthew D; Scheller, Robert M; Loudermilk, E Louise

    2018-02-01

    In frequent fire forests of the western United States, a legacy of fire suppression coupled with increases in fire weather severity have altered fire regimes and vegetation dynamics. When coupled with projected climate change, these conditions have the potential to lead to vegetation type change and altered carbon (C) dynamics. In the Sierra Nevada, fuels reduction approaches that include mechanical thinning followed by regular prescribed fire are one approach to restore the ability of the ecosystem to tolerate episodic fire and still sequester C. Yet, the spatial extent of the area requiring treatment makes widespread treatment implementation unlikely. We sought to determine if a priori knowledge of where uncharacteristic wildfire is most probable could be used to optimize the placement of fuels treatments in a Sierra Nevada watershed. We developed two treatment placement strategies: the naive strategy, based on treating all operationally available area and the optimized strategy, which only treated areas where crown-killing fires were most probable. We ran forecast simulations using projected climate data through 2,100 to determine how the treatments differed in terms of C sequestration, fire severity, and C emissions relative to a no-management scenario. We found that in both the short (20 years) and long (100 years) term, both management scenarios increased C stability, reduced burn severity, and consequently emitted less C as a result of wildfires than no-management. Across all metrics, both scenarios performed the same, but the optimized treatment required significantly less C removal (naive=0.42 Tg C, optimized=0.25 Tg C) to achieve the same treatment efficacy. Given the extent of western forests in need of fire restoration, efficiently allocating treatments is a critical task if we are going to restore adaptive capacity in frequent-fire forests. © 2017 John Wiley & Sons Ltd.
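
    The "optimized" strategy amounts to ranking the operationally available area by the modeled probability of crown-killing fire and treating only the highest-risk cells up to an area budget. The sketch below illustrates that prioritization logic with hypothetical inputs; it is not the authors' landscape model, only a toy version of the placement rule.

        import numpy as np

        # Hypothetical inputs: per-cell probability of crown-killing fire and a flag
        # for operational availability (e.g., slope, road access, ownership).
        rng = np.random.default_rng(0)
        p_crown_kill = rng.random(10_000)        # modeled probability per cell
        available = rng.random(10_000) < 0.6     # operationally available cells

        def optimized_placement(p, available, area_fraction=0.25):
            """Treat only the highest-risk available cells, up to an area budget."""
            idx = np.where(available)[0]
            order = idx[np.argsort(p[idx])[::-1]]          # highest risk first
            n_treat = int(area_fraction * available.sum())
            treat = np.zeros_like(available)
            treat[order[:n_treat]] = True
            return treat

        naive = available.copy()                           # naive: treat everything available
        optimized = optimized_placement(p_crown_kill, available)
        print(naive.sum(), optimized.sum())                # optimized treats far fewer cells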

  8. High temperature high velocity direct power extraction using an open-cycle oxy-combustion system

    Love, Norman [Univ. of Texas, El Paso, TX (United States)

    2017-09-29

    The implementation of oxy-fuel technology in fossil-fuel power plants may contribute to increased system efficiencies and a reduction of pollutant emissions. One technology that has potential to utilize the temperature of undiluted oxy-combustion flames is open-cycle magnetohydrodynamic (MHD) power generators. These systems can be configured as a topping cycle and provide high enthalpy, electrically conductive flows for direct conversion to electricity. This report presents the design and modeling strategies of an MHD combustor operating at temperatures exceeding 3000 K. Throughout the study, computational fluid dynamics (CFD) models were extensively used as a design and optimization tool. A lab-scale 60 kWth model was designed, manufactured and tested as part of this project. A fully-coupled numerical method was developed in ANSYS FLUENT to characterize the heat transfer in the system. This study revealed that nozzle heat transfer may be predicted through a 40% reduction of the semi-empirical Bartz correlation. Experimental results showed good agreement with the numerical evaluation, with the combustor exhibiting a favorable performance when tested during extended time periods. A transient numerical method was employed to analyze fuel injector geometries for the 60-kW combustor. The ANSYS FLUENT study revealed that counter-swirl inlets achieve a uniform pressure and velocity ratio when the injector port length-to-diameter ratio (L/D) is 4. An angle of 115 degrees was found to increase distribution efficiency. The findings show that this oxy-combustion concept is capable of providing a high-enthalpy environment for seeding, in order to render the flow conductive. Based on previous findings, temperatures in the range of 2800-3000 K may enable magnetohydrodynamic power extraction. The heat loss fraction in this oxy-combustion system, based on CFD and analytical calculations, at optimal operating conditions, was estimated to be less than 10 percent.
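
    The nozzle heat-transfer result refers to scaling the semi-empirical Bartz correlation by 0.6 (a 40% reduction). The sketch below shows one common textbook form of that correlation with the reduction applied; the correlation form, the placement of the 0.6 factor, and all property values are our assumptions for illustration, not the report's actual model or data.

        # Hedged sketch: Bartz-type nozzle heat-transfer coefficient, scaled by 0.6
        # as suggested by the study's 40% reduction. All inputs are illustrative.
        def bartz_h(D_t, mu, cp, Pr, p_c, c_star, r_curv, area_ratio, sigma=1.0, factor=0.6):
            """One common form of the Bartz correlation (SI units), times a reduction factor."""
            h = ((0.026 / D_t**0.2) * (mu**0.2 * cp / Pr**0.6)
                 * (p_c / c_star)**0.8 * (D_t / r_curv)**0.1
                 * area_ratio**0.9 * sigma)
            return factor * h

        # Illustrative throat conditions (not from the report):
        h = bartz_h(D_t=0.02, mu=9e-5, cp=2000.0, Pr=0.7,
                    p_c=5e5, c_star=1800.0, r_curv=0.01, area_ratio=1.0)
        print(f"h ~ {h:.0f} W/(m2 K)")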

  9. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more and more established, no clear guidance yet exists on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with a spline). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
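
    The double cross-validation idea above can be summarized as an outer loop that measures how well survival probabilities are predicted while an inner loop, run only on the outer-training data, picks the lasso-type penalty. The skeleton below sketches that structure using lifelines' penalized Cox model as a stand-in for the authors' implementation; it assumes a recent lifelines version in which CoxPHFitter accepts penalizer and l1_ratio, and it omits the biomarker-by-treatment interaction terms and confidence intervals.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.utils import concordance_index
        from sklearn.model_selection import KFold

        def double_cv_survival(df, duration_col, event_col, penalties, t_star,
                               n_outer=5, n_inner=5):
            """Outer folds estimate predictive accuracy; inner folds tune the penalty.
            A lifelines penalized Cox model stands in for the authors' lasso Cox model."""
            outer = KFold(n_splits=n_outer, shuffle=True, random_state=1)
            surv_pred = pd.Series(np.nan, index=df.index)
            for train_idx, test_idx in outer.split(df):
                train, test = df.iloc[train_idx], df.iloc[test_idx]
                best_pen, best_score = None, -np.inf
                for pen in penalties:          # inner loop: tune on outer-training data only
                    scores = []
                    inner = KFold(n_splits=n_inner, shuffle=True, random_state=2)
                    for tr, val in inner.split(train):
                        model = CoxPHFitter(penalizer=pen, l1_ratio=1.0)   # lasso-type penalty
                        model.fit(train.iloc[tr], duration_col, event_col)
                        risk = model.predict_partial_hazard(train.iloc[val])
                        scores.append(concordance_index(train.iloc[val][duration_col],
                                                        -risk,
                                                        train.iloc[val][event_col]))
                    if np.mean(scores) > best_score:
                        best_pen, best_score = pen, np.mean(scores)
                # Refit on the full outer-training set; predict S(t*) for held-out patients.
                model = CoxPHFitter(penalizer=best_pen, l1_ratio=1.0)
                model.fit(train, duration_col, event_col)
                sf = model.predict_survival_function(test, times=[t_star])
                surv_pred.iloc[test_idx] = sf.loc[t_star].values
            return surv_pred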

  10. ON THE ORIGIN OF HIGH-ALTITUDE OPEN CLUSTERS IN THE MILKY WAY

    Martinez-Medina, L. A.; Pichardo, B.; Moreno, E.; Peimbert, A. [Instituto de Astronomía, Universidad Nacional Autónoma de México, A.P. 70-264, 04510, México, D.F., México (Mexico); Velazquez, H., E-mail: lamartinez@astro.unam.mx [Instituto de Astronomía, Universidad Nacional Autónoma de México, Apartado Postal 877, 22860 Ensenada, B.C., México (Mexico)

    2016-01-20

    We present a dynamical study of the effect of the bar and spiral arms on the simulated orbits of open clusters in the Galaxy. Specifically, this work is devoted to the puzzling presence of high-altitude open clusters in the Galaxy. For this purpose we employ a very detailed observationally motivated potential model for the Milky Way and a careful set of initial conditions representing the newly born open clusters in the thin disk. We find that the spiral arms are able to raise an important percentage of open clusters (about one-sixth of the total employed in our simulations, depending on the structural parameters of the arms) above the Galactic plane to heights beyond 200 pc, producing a bulge-shaped structure toward the center of the Galaxy. Contrary to what was expected, the spiral arms produce a much greater vertical effect on the clusters than the bar, both in quantity and height; this is due to the sharper concentration of the mass on the spiral arms, when compared to the bar. When a bar and spiral arms are included, spiral arms are still capable of raising an important percentage of the simulated open clusters through chaotic diffusion (as tested from classification analysis of the resultant high-z orbits), but the bar seems to restrain them, diminishing the elevation above the plane by a factor of about two.

  11. Fabrication of nickel hydroxide electrodes with open-ended hexagonal nanotube arrays for high capacitance supercapacitors.

    Wu, Mao-Sung; Huang, Kuo-Chih

    2011-11-28

    A nickel hydroxide electrode with open-ended hexagonal nanotube arrays, prepared by hydrolysis of nickel chloride in the presence of hexagonal ZnO nanorods, shows a very high capacitance of 1328 F g(-1) at a discharge current density of 1 A g(-1) due to the significantly improved ion transport.
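
    For context, a specific capacitance like the one quoted is typically computed from a galvanostatic discharge as C = I·Δt/(m·ΔV). The snippet below shows that arithmetic; the discharge time and voltage window are illustrative values chosen only to reproduce the quoted magnitude, not data from the paper.

        # Specific capacitance from a galvanostatic discharge: C = I*dt / (m*dV).
        i_specific = 1.0      # discharge current density, A per gram of active material
        dt = 664.0            # discharge time, s (assumed for illustration)
        dV = 0.5              # discharge voltage window, V (assumed for illustration)

        C_specific = i_specific * dt / dV   # = 1328 F/g for these assumed numbers
        print(C_specific)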

  12. Deriving animal behaviour from high-frequency GPS: tracking cows in open and forested habitat

    Weerd, de N.; Langevelde, van F.; Oeveren, van H.; Nolet, B.A.; Kölzsch, A.; Prins, H.H.T.; Boer, de W.F.

    2015-01-01

    The increasing spatiotemporal accuracy of Global Navigation Satellite Systems (GNSS) tracking systems opens the possibility to infer animal behaviour from tracking data. We studied the relationship between high-frequency GNSS data and behaviour, aimed at developing an easily interpretable

  13. Scaling Qualitative Probability

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  14. Early age stress-crack opening relationships for high performance concrete

    Østergaard, Lennart; Lange, David A.; Stang, Henrik

    2004-01-01

    Stress–crack opening relationships for concrete in early age have been determined for two high performance concrete mixes with water to cementitious materials ratios of 0.307 and 0.48. The wedge splitting test setup was used experimentally and the cracked nonlinear hinge model based on the fictitious crack model was applied for the interpretation of the results. A newly developed inverse analysis algorithm was utilized for the extraction of the stress–crack opening relationships. Experiments were conducted at 8, 10, 13, 17, 22, 28, 48, 168 h (7 days) and 672 h (28 days). At the same ages...

  15. Effects of high frequency fluctuations on DNS of turbulent open-channel flow with high Pr passive scalar transport

    Yamamoto, Yoshinobu; Kunugi, Tomoaki; Serizawa, Akimi

    2002-01-01

    In this study, the effects of high-frequency fluctuations on DNS of turbulent open-channel flows with high-Pr passive scalar transport were investigated. The results show that, although significant differences in the energy-spectrum behavior of the temperature field arise in the high-wave-number region, where the contribution of the velocity components is insignificant, no large differences are caused in the mean and statistical behavior of the temperature field. However, if buoyancy were considered, these high-frequency temperature fluctuations could greatly change the mean and statistical behavior, owing to differences in accuracy and resolution in the high-wave-number region. (author)

  16. Wavelet Entropy-Based Traction Inverter Open Switch Fault Diagnosis in High-Speed Railways

    Keting Hu

    2016-03-01

    Full Text Available In this paper, a diagnosis plan is proposed to settle the detection and isolation problem of open switch faults in the traction inverters of high-speed railway traction systems. Five entropy forms are discussed and compared with the traditional fault detection methods, namely, discrete wavelet transform and discrete wavelet packet transform. The traditional fault detection methods cannot efficiently detect the open switch faults in traction inverters because of the low resolution or the sudden change of the current. The performances of Wavelet Packet Energy Shannon Entropy (WPESE), Wavelet Packet Energy Tsallis Entropy (WPETE) with different non-extensive parameters, Wavelet Packet Energy Shannon Entropy with a specific sub-band (WPESE3,6), Empirical Mode Decomposition Shannon Entropy (EMDESE), and Empirical Mode Decomposition Tsallis Entropy (EMDETE) with non-extensive parameters in detecting the open switch fault are evaluated by the evaluation parameter. Comparison experiments are carried out to select the best entropy form for the traction inverter open switch fault detection. In addition, the DC component is adopted to isolate the failed Insulated Gate Bipolar Transistor (IGBT). The simulation experiments show that the proposed plan can diagnose single and simultaneous open switch faults correctly and in a timely manner.
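
    As a rough illustration of the WPESE feature used above, the sketch below decomposes a current signal with a wavelet packet transform, computes the energy of each terminal node, and takes the Shannon entropy of the normalized energies. The wavelet, decomposition level, and signals are our assumptions, and the DC-component fault-isolation step is not shown.

        import numpy as np
        import pywt  # PyWavelets

        def wpese(signal, wavelet='db4', level=3):
            """Wavelet Packet Energy Shannon Entropy of a 1-D signal (illustrative)."""
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode='symmetric',
                                    maxlevel=level)
            energies = np.array([np.sum(np.square(node.data))
                                 for node in wp.get_level(level, order='freq')])
            p = energies / energies.sum()
            p = p[p > 0]                              # avoid log(0)
            return -np.sum(p * np.log(p))

        t = np.linspace(0, 0.1, 2000)
        healthy = np.sin(2 * np.pi * 50 * t)                  # illustrative phase current
        faulty = np.where(healthy > 0, healthy, 0.0)          # crude open-switch distortion
        print(wpese(healthy), wpese(faulty))                  # entropy shifts under the fault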

  17. Open Access Publishing in High-Energy Physics: the SCOAP3 Initiative

    Mele, S.

    2010-10-01

    Scholarly communication in High-Energy Physics (HEP) shows traits very similar to Astronomy and Astrophysics: pervasiveness of Open Access to preprints through community-based services; a culture of openness and sharing among its researchers; a compact number of yearly articles published by a relatively small number of journals which are dear to the community. These aspects have led HEP to spearhead an innovative model for the transition of its scholarly publishing to Open Access. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) aims to be a central body to finance peer-review service rather than the purchase of access to information as in the traditional subscription model, with all articles in the discipline eventually available in Open Access. Sustainable funding to SCOAP3 would come from libraries, library consortia and HEP funding agencies, through a re-direction of funds currently spent for subscriptions to HEP journals. This paper presents the cultural and bibliometric factors at the roots of SCOAP3 and the current status of this worldwide initiative.

  18. Demonstration of a High Open-Circuit Voltage GaN Betavoltaic Microbattery

    Cheng Zai-Jun; San Hai-Sheng; Chen Xu-Yuan; Liu Bo; Feng Zhi-Hong

    2011-01-01

    A high open-circuit voltage betavoltaic microbattery based on a GaN p-i-n diode is demonstrated. Under the irradiation of a 4×4 mm² planar solid ⁶³Ni source with an activity of 2 mCi, the open-circuit voltage Voc of the fabricated single 2×2 mm² cell reaches as high as 1.62 V, and the short-circuit current density Jsc is measured to be 16 nA/cm². The microbattery has a fill factor of 55%, and the energy conversion efficiency of beta radiation into electricity reaches 1.13%. The results suggest that GaN is a highly promising potential candidate for long-life betavoltaic microbatteries used as power supplies for microelectromechanical system devices. (cross-disciplinary physics and related areas of science and technology)
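
    The quoted fill factor and conversion efficiency hang together with the other numbers. The back-of-the-envelope check below reproduces them, assuming a mean 63Ni beta energy of about 17.4 keV and taking the 4×4 mm² source area as the reference area; these assumptions are ours, not the paper's.

        # Consistency check of the betavoltaic figures (assumptions noted in comments).
        q = 1.602e-19                    # elementary charge, C
        V_oc = 1.62                      # V
        J_sc = 16e-9                     # A/cm2
        FF = 0.55

        P_out = FF * V_oc * J_sc         # ~14 nW/cm2 maximum electrical output density

        activity = 2e-3 * 3.7e10         # 2 mCi in decays per second
        E_beta_mean = 17.4e3 * q         # ~17.4 keV mean 63Ni beta energy (assumed)
        area = 0.4 * 0.4                 # 4x4 mm2 source, in cm2
        P_in = activity * E_beta_mean / area   # ~1.3 uW/cm2 incident beta power density

        print(f"output ~ {P_out*1e9:.1f} nW/cm2, efficiency ~ {100*P_out/P_in:.2f} %")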

  19. Poor concordance of spiral CT (SCT) and high probability ventilation-perfusion (V/Q) studies in the diagnosis of pulmonary embolism (PE)

    Roman, M.R.; Angelides, S.; Chen, N.

    2000-01-01

    Full text: Despite its limitations, V/Q scintigraphy remains the favoured non-invasive technique for the diagnosis of pulmonary embolism (PE). PE is present in 85-90% and 30-40% of high and intermediate probability V/Q studies respectively. The value of spiral CT (SCT), a newer imaging modality, has yet to be determined. The aims of this study were to determine the frequency of positive SCT for PE in high and intermediate probability V/Q studies performed within 24 hr of each other. 15 patients (6M, 9F, mean age - 70.2) with a high probability study were included. Six (40%) SCT were reported as positive (four with emboli present in the main pulmonary arteries), seven as negative, one equivocal and one was technically sub-optimal. Pulmonary angiography was not performed in any patient. In all seven negative studies, the SCT was performed before the V/Q study. Of these, two studies were revised to positive once the result of the V/Q study was known, while three others had resolving mismatch V/Q defects on follow-up studies (performed 5-14 days later); two of these three also had a positive duplex scan of the lower limbs. One other was most likely due to chronic thromboembolic disease. Only three patients had a V/Q scan prior to the SCT; all were positive for PE on both imaging modalities. Of 26 patients (11M, 15F, mean age - 68.5) with an intermediate probability V/Q study, SCT was positive in only two (8%). Thus the low detection rate of PE by SCT in this albeit small series raises doubts as to its role in the diagnosis of PE. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  20. Excluding joint probabilities from quantum theory

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after the Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  1. Development of Probability Evaluation Methodology for High Pressure/Temperature Gas Induced RCS Boundary Failure and SG Creep Rupture

    Lee, Byung Chul; Hong, Soon Joon; Lee, Jin Yong; Lee, Kyung Jin; Lee, Kuh Hyung [FNC Tech. Co., Seoul (Korea, Republic of)

    2008-04-15

    The existing MELCOR 1.8.5 model was improved with respect to severe accident natural circulation, a MELCOR 1.8.6 input model was developed, and calculation sheets for the detailed MELCOR 1.8.6 model were produced. The effects of natural circulation modeling were identified by simulating an SBO accident and comparing the existing model with the detailed model. Major phenomena and system operations that affect natural circulation driven by high temperature and high pressure gas were investigated, and representative accident sequences for the creep rupture model of the RCS pipeline and SG tubes were selected.

  2. On Probability Leakage

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
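
    A toy illustration of the definition (ours, not from the paper): if evidence E says a quantity cannot be negative but model M is a normal distribution, the probability M assigns below zero is leakage.

        # Toy probability-leakage example: a normal model for a quantity that
        # evidence E says cannot be negative leaks mass onto impossible values.
        from scipy.stats import norm

        mu, sigma = 2.0, 1.5                           # illustrative fitted model M
        leakage = norm.cdf(0.0, loc=mu, scale=sigma)   # P_M(y < 0), impossible under E
        print(f"leaked probability ~ {leakage:.3f}")   # ~0.091 here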

  3. Gender Expression, Violence, and Bullying Victimization: Findings from Probability Samples of High School Students in 4 US School Districts

    Gordon, Allegra R.; Conron, Kerith J.; Calzo, Jerel P.; White, Matthew T.; Reisner, Sari L.; Austin, S. Bryn

    2018-01-01

    Background: Young people may experience school-based violence and bullying victimization related to their gender expression, independent of sexual orientation identity. However, the associations between gender expression and bullying and violence have not been examined in racially and ethnically diverse population-based samples of high school…

  4. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between highly technical risk management discussion to the public risk aversion debate. We also propose that the framework could be used for stress-testing resilience.

  5. Google Classroom and Open Clusters: An Authentic Science Research Project for High School Students

    Johnson, Chelen H.; Linahan, Marcella; Cuba, Allison Frances; Dickmann, Samantha Rose; Hogan, Eleanor B.; Karos, Demetra N.; Kozikowski, Kendall G.; Kozikowski, Lauren Paige; Nelson, Samantha Brooks; O'Hara, Kevin Thomas; Ropinski, Brandi Lucia; Scarpa, Gabriella; Garmany, Catharine D.

    2016-01-01

    STEM education is about offering unique opportunities to our students. For the past three years, students from two high schools (Breck School in Minneapolis, MN, and Carmel Catholic High School in Mundelein, IL) have collaborated on authentic astronomy research projects. This past year they surveyed archival data of open clusters to determine if a clear turnoff point could be unequivocally determined. Age and distance to each open cluster were calculated. Additionally, students requested time on several telescopes to obtain original data to compare to the archival data. Students from each school worked in collaborative teams, sharing and verifying results through regular online hangouts and chats. Work papers were stored in a shared drive and on a student-designed Google site to facilitate dissemination of documents between the two schools.

  6. Free probability and random matrices

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  7. Analysis of HIV-1 intersubtype recombination breakpoints suggests region with high pairing probability may be a more fundamental factor than sequence similarity affecting HIV-1 recombination.

    Jia, Lei; Li, Lin; Gui, Tao; Liu, Siyang; Li, Hanping; Han, Jingwan; Guo, Wei; Liu, Yongjian; Li, Jingyun

    2016-09-21

    With increasing data on HIV-1, a more relevant molecular model describing mechanism details of HIV-1 genetic recombination usually requires upgrades. Currently an incomplete structural understanding of the copy choice mechanism along with several other issues in the field that lack elucidation led us to perform an analysis of the correlation between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarity to further explore structural mechanisms. Near full length sequences of URFs from Asia, Europe, and Africa (one sequence/patient), and representative sequences of worldwide CRFs were retrieved from the Los Alamos HIV database. Their recombination patterns were analyzed by jpHMM in detail. Then the relationships between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarities were investigated. Pearson correlation test showed that all URF groups and the CRF group exhibit the same breakpoint distribution pattern. Additionally, the Wilcoxon two-sample test indicated a significant and inexplicable limitation of recombination in regions with high pairing probability. These regions have been found to be strongly conserved across distinct biological states (i.e., strong intersubtype similarity), and genetic similarity has been determined to be a very important factor promoting recombination. Thus, the results revealed an unexpected disagreement between intersubtype similarity and breakpoint distribution, which were further confirmed by genetic similarity analysis. Our analysis reveals a critical conflict between results from natural HIV-1 isolates and those from HIV-1-based assay vectors in which genetic similarity has been shown to be a very critical factor promoting recombination. These results indicate the region with high-pairing probabilities may be a more fundamental factor affecting HIV-1 recombination than sequence similarity in natural HIV-1 infections. Our
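
    The statistical comparisons described above, a Pearson correlation between breakpoint distributions and base-pairing probability and a Wilcoxon two-sample (rank-sum) test on pairing probability in breakpoint-rich versus breakpoint-poor regions, can be sketched as below. The arrays are hypothetical stand-ins for the per-window values derived from jpHMM output, not the study's data.

        import numpy as np
        from scipy.stats import pearsonr, ranksums

        # Hypothetical per-window summaries (stand-ins for values derived from jpHMM output).
        rng = np.random.default_rng(0)
        pairing_prob = rng.random(500)                  # probability of base pairing per window
        breakpoint_count = rng.poisson(3, 500)          # recombination breakpoints per window

        r, p_corr = pearsonr(pairing_prob, breakpoint_count)

        # Rank-sum comparison: pairing probability in breakpoint-rich vs breakpoint-poor windows.
        rich = pairing_prob[breakpoint_count > 3]
        poor = pairing_prob[breakpoint_count <= 3]
        stat, p_rank = ranksums(rich, poor)

        print(r, p_corr, stat, p_rank)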

  8. Changes in patellofemoral alignment do not cause clinical impact after open-wedge high tibial osteotomy.

    Lee, Yong Seuk; Lee, Sang Bok; Oh, Won Seok; Kwon, Yong Eok; Lee, Beom Koo

    2016-01-01

    The objectives of this study were (1) to evaluate the clinical and radiologic outcomes of open-wedge high tibial osteotomy focusing on patellofemoral alignment and (2) to search for correlation between variables and patellofemoral malalignment. A total of 46 knees (46 patients) from 32 females and 14 males who underwent open-wedge high tibial osteotomy were included in this retrospective case series. Outcomes were evaluated using clinical scales and radiologic parameters at the last follow-up. Pre-operative and final follow-up values were compared for the outcome analysis. For the focused analysis of the patellofemoral joint, correlation analyses between patellofemoral variables and pre- and post-operative weight-bearing line (WBL), clinical score, posterior slope, Blackburn Peel ratio, lateral patellar tilt, lateral patellar shift, and congruence angle were performed. The minimum follow-up period was 2 years and median follow-up period was 44 months (range 24-88 months). The percentage of the weight-bearing line shifted from 17.2 ± 11.1 to 56.7 ± 12.7%, and the change was statistically significant. Regarding patellofemoral malalignment, the pre-operative weight-bearing line showed an association with the change in lateral patellar tilt and lateral patellar shift (correlation coefficient: 0.3). After open-wedge high tibial osteotomy, clinical results showed improvement, compared to pre-operative values. The patellar tilt and lateral patellar shift were not changed; however, descent of the patella was observed. Therefore, mild patellofemoral problems should not be a contraindication of the open-wedge high tibial osteotomy. Case series, Level IV.

  9. OpenTopography: Enabling Online Access to High-Resolution Lidar Topography Data and Processing Tools

    Crosby, Christopher; Nandigam, Viswanath; Baru, Chaitan; Arrowsmith, J. Ramon

    2013-04-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology are revolutionizing the way we study the Earth's surface and overlying vegetation. These data, collected from airborne, tripod, or mobile-mounted scanners, have emerged as a fundamental tool for research on topics ranging from earthquake hazards to hillslope processes. Lidar data provide a digital representation of the earth's surface at a resolution sufficient to appropriately capture the processes that contribute to landscape evolution. The U.S. National Science Foundation-funded OpenTopography Facility (http://www.opentopography.org) is a web-based system designed to democratize access to earth science-oriented lidar topography data. OpenTopography provides free, online access to lidar data in a number of forms, including the raw point cloud and associated geospatial-processing tools for customized analysis. The point cloud data are co-located with on-demand processing tools to generate digital elevation models, and derived products and visualizations which allow users to quickly access data in a format appropriate for their scientific application. The OpenTopography system is built using a service-oriented architecture (SOA) that leverages cyberinfrastructure resources at the San Diego Supercomputer Center at the University of California San Diego to allow users, regardless of expertise level, to access these massive lidar datasets and derived products for use in research and teaching. OpenTopography hosts over 500 billion lidar returns covering 85,000 km². These data are all in the public domain and are provided by a variety of partners under joint agreements and memoranda of understanding with OpenTopography. Partners include national facilities such as the NSF-funded National Center for Airborne Lidar Mapping (NCALM), as well as non-governmental organizations and local, state, and federal agencies. OpenTopography has become a hub for high-resolution topography

  10. High-uniformity centimeter-wide Si etching method for MEMS devices with large opening elements

    Okamoto, Yuki; Tohyama, Yukiya; Inagaki, Shunsuke; Takiguchi, Mikio; Ono, Tomoki; Lebrasseur, Eric; Mita, Yoshio

    2018-04-01

    We propose a compensated mesh pattern filling method to achieve highly uniform wafer depth etching (over hundreds of microns) with a large-area opening (over centimeter). The mesh opening diameter is gradually changed between the center and the edge of a large etching area. Using such a design, the etching depth distribution depending on sidewall distance (known as the local loading effect) inversely compensates for the over-centimeter-scale etching depth distribution, known as the global or within-die(chip)-scale loading effect. Only a single DRIE with test structure patterns provides a micro-electromechanical systems (MEMS) designer with the etched depth dependence on the mesh opening size as well as on the distance from the chip edge, and the designer only has to set the opening size so as to obtain a uniform etching depth over the entire chip. This method is useful when process optimization cannot be performed, such as in the cases of using standard conditions for a foundry service and of short turn-around-time prototyping. To demonstrate, a large MEMS mirror that needed over 1 cm2 of backside etching was successfully fabricated using as-is-provided DRIE conditions.

  11. Probability 1/e

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
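
    A classic example of the phenomenon (our illustration, not necessarily one of the article's three problems) is the matching problem: the probability that a random permutation of n items has no fixed point tends to 1/e, as the quick simulation below shows.

        import math
        import random

        # Probability that a random permutation has no fixed point (a derangement) -> 1/e.
        def no_fixed_point(n):
            perm = list(range(n))
            random.shuffle(perm)
            return all(perm[i] != i for i in range(n))

        trials = 100_000
        estimate = sum(no_fixed_point(52) for _ in range(trials)) / trials
        print(estimate, 1 / math.e)       # both approximately 0.3679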

  12. Probability an introduction

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  13. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
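
    To give a flavor of the kind of digital-trait extraction described above, the sketch below segments a plant from a single image by Otsu thresholding and reports projected area and height in pixels. This is a generic scikit-image illustration assuming a dark plant on a light RGB background and a hypothetical file name; it is not Image Harvest's actual API or pipeline.

        import numpy as np
        from skimage import io, color, filters

        # Generic digital-trait sketch (not the Image Harvest API): segment the plant
        # and measure projected area and plant height in pixels.
        img = io.imread('plant.png')                 # hypothetical phenotyping image (RGB)
        gray = color.rgb2gray(img)
        mask = gray < filters.threshold_otsu(gray)   # assumes a dark plant on a light background

        projected_area = int(mask.sum())
        rows = np.where(mask.any(axis=1))[0]
        height_px = int(rows.max() - rows.min() + 1) if rows.size else 0
        print(projected_area, height_px)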

  14. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  15. OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging

    Rubel, Oliver; Greiner, Annette; Cholia, Shreyas; Louie, Katherine; Bethel, E. Wes; Northen, Trent R.; Bowen, Benjamin P.

    2013-10-02

    Mass spectrometry imaging (MSI) enables researchers to probe endogenous molecules directly within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements and is a critical enabler of rapid data I/O. The OpenMSI file format has been shown to provide >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely to facilitate high-performance data analysis and enable implementation of Web based data sharing, visualization, and analysis.
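
    The storage-layout point above (chunking, compression, and replicated layouts so that both spectrum retrieval and image retrieval touch few chunks) can be illustrated with plain HDF5. The snippet below is a generic h5py sketch with made-up cube dimensions, not the actual OpenMSI format definition.

        import numpy as np
        import h5py

        nx, ny, n_mz = 100, 120, 10_000          # illustrative MSI cube dimensions

        with h5py.File('msi_example.h5', 'w') as f:
            # Chunked along full spectra: reading one pixel's spectrum touches one chunk.
            spectra = f.create_dataset('msi/spectrum_layout', shape=(nx, ny, n_mz),
                                       dtype='f4', chunks=(1, 1, n_mz), compression='gzip')
            # A second, replicated layout chunked as m/z image planes speeds image retrieval.
            images = f.create_dataset('msi/image_layout', shape=(nx, ny, n_mz),
                                      dtype='f4', chunks=(nx, ny, 1), compression='gzip')
            spectra[0, 0, :] = np.random.rand(n_mz).astype('f4')   # write one example spectrum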

  16. Deriving Animal Behaviour from High-Frequency GPS: Tracking Cows in Open and Forested Habitat.

    de Weerd, Nelleke; van Langevelde, Frank; van Oeveren, Herman; Nolet, Bart A; Kölzsch, Andrea; Prins, Herbert H T; de Boer, W Fred

    2015-01-01

    The increasing spatiotemporal accuracy of Global Navigation Satellite Systems (GNSS) tracking systems opens the possibility to infer animal behaviour from tracking data. We studied the relationship between high-frequency GNSS data and behaviour, aimed at developing an easily interpretable classification method to infer behaviour from location data. Behavioural observations were carried out during tracking of cows (Bos Taurus) fitted with high-frequency GPS (Global Positioning System) receivers. Data were obtained in an open field and forested area, and movement metrics were calculated for 1 min, 12 s and 2 s intervals. We observed four behaviour types (Foraging, Lying, Standing and Walking). We subsequently used Classification and Regression Trees to classify the simultaneously obtained GPS data as these behaviour types, based on distances and turning angles between fixes. GPS data with a 1 min interval from the open field was classified correctly for more than 70% of the samples. Data from the 12 s and 2 s interval could not be classified successfully, emphasizing that the interval should be long enough for the behaviour to be defined by its characteristic movement metrics. Data obtained in the forested area were classified with a lower accuracy (57%) than the data from the open field, due to a larger positional error of GPS locations and differences in behavioural performance influenced by the habitat type. This demonstrates the importance of understanding the relationship between behaviour and movement metrics, derived from GNSS fixes at different frequencies and in different habitats, in order to successfully infer behaviour. When spatially accurate location data can be obtained, behaviour can be inferred from high-frequency GNSS fixes by calculating simple movement metrics and using easily interpretable decision trees. This allows for the combined study of animal behaviour and habitat use based on location data, and might make it possible to detect deviations
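
    The classification step described above, Classification and Regression Trees applied to distances and turning angles between fixes, can be sketched with scikit-learn as below. The movement-metric table and behaviour labels are hypothetical stand-ins for the observed 1-min interval data, so the accuracy printed here is meaningless; the point is the structure of the interpretable tree classifier.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical movement metrics per 1-min GPS interval (stand-ins for the cow data):
        # step length (m) and turning angle (degrees), with observed behaviour labels.
        rng = np.random.default_rng(0)
        n = 2000
        step_length = rng.gamma(shape=2.0, scale=5.0, size=n)
        turning_angle = rng.uniform(0, 180, size=n)
        X = np.column_stack([step_length, turning_angle])
        y = rng.choice(['Foraging', 'Lying', 'Standing', 'Walking'], size=n)  # placeholder labels

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
        tree = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)   # easily interpretable tree
        print(f"held-out accuracy: {tree.score(X_test, y_test):.2f}")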

  17. Deriving Animal Behaviour from High-Frequency GPS: Tracking Cows in Open and Forested Habitat.

    Nelleke de Weerd

    Full Text Available The increasing spatiotemporal accuracy of Global Navigation Satellite Systems (GNSS) tracking systems opens the possibility to infer animal behaviour from tracking data. We studied the relationship between high-frequency GNSS data and behaviour, aimed at developing an easily interpretable classification method to infer behaviour from location data. Behavioural observations were carried out during tracking of cows (Bos taurus) fitted with high-frequency GPS (Global Positioning System) receivers. Data were obtained in an open field and forested area, and movement metrics were calculated for 1 min, 12 s and 2 s intervals. We observed four behaviour types (Foraging, Lying, Standing and Walking). We subsequently used Classification and Regression Trees to classify the simultaneously obtained GPS data as these behaviour types, based on distances and turning angles between fixes. GPS data with a 1 min interval from the open field was classified correctly for more than 70% of the samples. Data from the 12 s and 2 s interval could not be classified successfully, emphasizing that the interval should be long enough for the behaviour to be defined by its characteristic movement metrics. Data obtained in the forested area were classified with a lower accuracy (57%) than the data from the open field, due to a larger positional error of GPS locations and differences in behavioural performance influenced by the habitat type. This demonstrates the importance of understanding the relationship between behaviour and movement metrics, derived from GNSS fixes at different frequencies and in different habitats, in order to successfully infer behaviour. When spatially accurate location data can be obtained, behaviour can be inferred from high-frequency GNSS fixes by calculating simple movement metrics and using easily interpretable decision trees. This allows for the combined study of animal behaviour and habitat use based on location data, and might make it possible to

  18. Quantum probability measures and tomographic probability densities

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  19. Applied probability and stochastic processes

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  20. Scalable High Performance Message Passing over InfiniBand for Open MPI

    Friedley, A; Hoefler, T; Leininger, M L; Lumsdaine, A

    2007-10-24

    InfiniBand (IB) is a popular network technology for modern high-performance computing systems. MPI implementations traditionally support IB using a reliable, connection-oriented (RC) transport. However, per-process resource usage that grows linearly with the number of processes, makes this approach prohibitive for large-scale systems. IB provides an alternative in the form of a connectionless unreliable datagram transport (UD), which allows for near-constant resource usage and initialization overhead as the process count increases. This paper describes a UD-based implementation for IB in Open MPI as a scalable alternative to existing RC-based schemes. We use the software reliability capabilities of Open MPI to provide the guaranteed delivery semantics required by MPI. Results show that UD not only requires fewer resources at scale, but also allows for shorter MPI startup times. A connectionless model also improves performance for applications that tend to send small messages to many different processes.

  1. Exploring Infiniband Hardware Virtualization in OpenNebula towards Efficient High-Performance Computing

    Pais Pitta de Lacerda Ruivo, Tiago [IIT, Chicago; Bernabeu Altayo, Gerard [Fermilab; Garzoglio, Gabriele [Fermilab; Timm, Steven [Fermilab; Kim, Hyun-Woo [Fermilab; Noh, Seo-Young [KISTI, Daejeon; Raicu, Ioan [IIT, Chicago

    2014-11-11

    It has been widely accepted that software virtualization has a big negative impact on high-performance computing (HPC) application performance. This work explores the potential use of Infiniband hardware virtualization in an OpenNebula cloud towards the efficient support of MPI-based workloads. We have implemented, deployed, and tested an Infiniband network on the FermiCloud private Infrastructure-as-a-Service (IaaS) cloud. To avoid software virtualization and thereby minimize the virtualization overhead, we employed a technique called Single Root Input/Output Virtualization (SRIOV). Our solution spanned modifications to the Linux hypervisor as well as the OpenNebula manager. We evaluated the performance of the hardware virtualization on up to 56 virtual machines connected by up to 8 DDR Infiniband network links, with micro-benchmarks (latency and bandwidth) as well as with an MPI-intensive application (the HPL Linpack benchmark).

  2. A narrow open tubular column for high efficiency liquid chromatographic separation.

    Chen, Huang; Yang, Yu; Qiao, Zhenzhen; Xiang, Piliang; Ren, Jiangtao; Meng, Yunzhu; Zhang, Kaiqi; Juan Lu, Joann; Liu, Shaorong

    2018-04-30

    We report a great feature of open tubular liquid chromatography when it is run using an extremely narrow (e.g., 2 μm inner diameter) open tubular column: more than 10 million plates per meter can be achieved in less than 10 min and under an elution pressure of ca. 20 bar. The column is coated with octadecylsilane and both isocratic and gradient separations are performed. We reveal a focusing effect that may be used to interpret the efficiency enhancement. We also demonstrate the feasibility of using this technique for separating complex peptide samples. This high-resolution and fast separation technique is promising and can lead to a powerful tool for trace sample analysis.
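
    For readers unfamiliar with the plate-count figure quoted above: column efficiency is conventionally estimated from a peak's retention time and width. A standard textbook relation (the half-height formula, stated here as background and not taken from this paper) is

    \[ N \;=\; 5.54\,\left(\frac{t_R}{w_{1/2}}\right)^{2}, \qquad \text{plates per meter} \;=\; \frac{N}{L}, \]

    where $t_R$ is the retention time, $w_{1/2}$ is the peak width at half height, and $L$ is the column length in meters.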

  3. CellProfiler and KNIME: open source tools for high content screening.

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

    High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. One of the limitations in the establishment of HCS in academia is flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining that provide software solutions which increase flexibility and keep costs low.

  4. Long-term outcome of high-energy open Lisfranc injuries: a retrospective study.

    Nithyananth, Manasseh; Boopalan, Palapattu R J V C; Titus, Vijay T K; Sundararaj, Gabriel D; Lee, Vernon N

    2011-03-01

    The outcome of open Lisfranc injuries has been reported infrequently. Should these injuries be managed as closed injuries and is their outcome different? We undertook a retrospective study of high-energy, open Lisfranc injuries treated between 1999 and 2005. The types of dislocation, the associated injuries to the same foot, the radiologic and functional outcome, and the complications were studied. There were 22 patients. Five patients died. One had amputation. Of the remaining 16 patients, 13 men were followed up at a mean of 56 months (range, 29-88 months). The average age was 36 years (range, 7-55 years). According to the modified Hardcastle classification, type B2 injury was the commonest. Ten patients had additional forefoot or midfoot injury. All patients were treated with debridement, open reduction, and multiple Kirschner (K) wire fixation. All injuries were Gustilo Anderson type IIIa or IIIb. Nine patients had split skin graft for soft tissue cover. Mean time taken for wound healing was 16 days (range, 10-30 days). Ten patients (77%) had fracture comminution. Eight patients had anatomic reduction, whereas five had nonanatomic reduction. Ten of 13 (77%) patients had at least one spontaneous tarsometatarsal joint fusion. The mean American Orthopaedic Foot and Ankle Society score was 82 (range, 59-100). Nonanatomic reduction, osteomyelitis, deformity of toes, planus foot, and mild discomfort on prolonged walking were the unfavorable outcomes present. In open Lisfranc injuries, multiple K wire fixation should be considered especially in the presence of comminution and soft tissue loss. Although anatomic reduction is not always obtained, the treatment principles should include adequate debridement, maintaining alignment with multiple K wires, and obtaining early soft tissue cover. There is a high incidence of fusion across tarsometatarsal joints. Copyright © 2011 by Lippincott Williams & Wilkins

  5. The results of high tibial open wedge osteotomy in patients with varus deformity

    Mahmood Jabalameli

    2013-07-01

    Full Text Available Background: High tibial open wedge osteotomy is one of the most important modalities for the treatment of varus deformity, aiming to correct the deformity and improve the signs and symptoms of patients with primary degenerative osteoarthritis. The aim of this study was to investigate the results of high tibial open wedge osteotomy in patients with varus deformities. Methods: This retrospective study was conducted on twenty-nine patients (36 knees) who underwent proximal tibial osteotomy at Shafa Yahyaian University Hospital from 2004 to 2010. Inclusion criteria were: age less than 60 years, high physical activity, varus deformity, and involvement of the medial compartment of the knee. Patients with obesity, smoking, patellofemoral pain, lateral compartment lesions, deformity of more than 20 degrees, extension limitation, or range of motion less than 90 degrees were excluded. Clinical and radiologic characteristics were measured before and after the operation. Results: Fourteen patients were female. All were younger than 50 years, with a mean (±SD) of 27.64 (±10.88). The mean (±SD) follow-up time was 4.33 (±1.7). All patients were satisfied with the results of the operation, and tenderness and pain decreased in all of them. Autologous bone grafts were used in all patients; in 15 cases (42.5%) casting was used, and in the rest a T-buttress plate was used for fixation. In both the primary and double varus groups the International Knee Documentation Committee (IKDC) and modified Larson indices improved after the operation, but there was no significant difference between the two groups. Conclusion: High tibial open wedge osteotomy can give satisfying results in the clinical signs and symptoms of patients with primary medial joint degenerative osteoarthritis. This procedure may also correct the deformity and improve the radiologic parameters of the patients.

  6. A sustainable business model for Open-Access journal publishing a proposed plan for High-Energy Physics

    Vigen, Jens

    2007-01-01

    The High Energy Physics community over the last 15 years has achieved so-called full green Open Access through the wide dissemination of preprints via arXiv, a central subject repository managed by Cornell University. However, green Open Access does not alleviate the economic difficulties of libraries as they are still expected to offer access to versions of record of the peer-reviewed literature. For this reason the particle physics community is now addressing the issue of gold Open Access by converting a set of the existing core journals to Open Access. A Working Party has been established to bring together funding agencies, laboratories and libraries into a single consortium, called SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). This consortium will engage with publishers to build a sustainable model for Open Access publishing. In this model, subscription fees from multiple institutions are replaced by contracts with publishers of Open Access journals, where the SCOAP3 conso...

  7. Toward a generalized probability theory: conditional probabilities

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)
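
    As a concrete illustration of the Hilbert-space model mentioned above (our addition, not part of this record): a state given by a density operator $\rho$ assigns to each projection $P$ the probability $\mathrm{tr}(\rho P)$, and the conditional probability of a projection $Q$ given $P$ is usually defined through the Lüders rule,

    \[ \mu_\rho(Q \mid P) \;=\; \frac{\mathrm{tr}\!\left(P\,\rho\,P\,Q\right)}{\mathrm{tr}(\rho P)}, \qquad \mathrm{tr}(\rho P) > 0, \]

    which reduces to the classical ratio formula for conditional probability when all projections commute.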

  8. Finite Element Analysis of Dam-Reservoir Interaction Using High-Order Doubly Asymptotic Open Boundary

    Yichao Gao

    2011-01-01

    Full Text Available The dam-reservoir system is divided into the near field modeled by the finite element method, and the far field modeled by the excellent high-order doubly asymptotic open boundary (DAOB). Direct and partitioned coupled methods are developed for the analysis of the dam-reservoir system. In the direct coupled method, a symmetric monolithic governing equation is formulated by incorporating the DAOB with the finite element equation and solved using the standard time-integration methods. In contrast, the near-field finite element equation and the far-field DAOB condition are separately solved in the partitioned coupled method, and coupling is achieved by applying the interaction force on the truncated boundary. To improve its numerical stability and accuracy, an iteration strategy is employed to obtain the solution of each step. Both coupled methods are implemented in the open-source finite element code OpenSees. Numerical examples are employed to demonstrate the performance of these two proposed methods.

  9. Thermocleavable Materials for Polymer Solar Cells with High Open Circuit Voltage-A Comparative Study

    Tromholt, Thomas; Gevorgyan, Suren; Jørgensen, Mikkel

    2009-01-01

    The search for polymer solar cells giving a high open circuit voltage was conducted through a comparative study of four types of bulk-heterojunction solar cells employing different photoactive layers. As electron donors the thermo-cleavable polymer poly-(3-(2-methylhexyloxycarbonyl)dithiophene) (P3MHOCT) and unsubstituted polythiophene (PT) were used, the latter of which results from thermo-cleaving the former at 310 °C. As reference, P3HT solar cells were built in parallel. As electron acceptors, either PCBM or bis-[60]PCBM were used. In excess of 300 solar cells were produced under as identical conditions as possible, varying only the material combination of the photoactive layer. It was observed that on replacing PCBM with bis[60]PCBM, the open circuit voltage on average increased by 100 mV for P3MHOCT and 200 mV for PT solar cells. Open circuit voltages approaching 1 V were observed for the PT:bis...

  10. Posterior Probability Matching and Human Perceptual Decision Making.

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
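
    The two-pass consistency test described above is easy to reproduce in simulation. The sketch below is our own minimal illustration (not the authors' code): a posterior-probability-matching observer in an equal-prior yes/no detection task responds stochastically according to the posterior on each pass, and the proportion of identical responses across the two passes is then tabulated.

    import numpy as np

    rng = np.random.default_rng(0)

    def posterior_yes(x, d_prime):
        # Posterior probability of "signal present" for an equal-prior Gaussian
        # signal-detection model: p(signal | x) = 1 / (1 + exp(-d'(x - d'/2))).
        return 1.0 / (1.0 + np.exp(-d_prime * (x - d_prime / 2.0)))

    def simulate_two_pass(d_prime=0.5, n_trials=2000):
        signal = rng.integers(0, 2, n_trials)            # 0 = noise trial, 1 = signal trial
        x = rng.normal(loc=signal * d_prime, scale=1.0)  # internal response, fixed across passes
        p_yes = posterior_yes(x, d_prime)
        # Probability-matching observer: respond "yes" with probability p_yes,
        # drawn independently on each of the two passes over the same trials.
        pass1 = rng.random(n_trials) < p_yes
        pass2 = rng.random(n_trials) < p_yes
        accuracy = np.mean(pass1 == (signal == 1))
        consistency = np.mean(pass1 == pass2)            # proportion of repeated responses
        return accuracy, consistency

    acc, cons = simulate_two_pass()
    print(f"accuracy = {acc:.2f}, two-pass consistency = {cons:.2f}")

    At low d' this observer's consistency stays close to chance, which is the pattern the record contrasts with the higher consistency of practised human observers.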

  11. GROMACS 4.5: A high-throughput and highly parallel open source molecular simulation toolkit

    Pronk, Sander [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Pall, Szilard [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Schulz, Roland [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Larsson, Per [Univ. of Virginia, Charlottesville, VA (United States); Bjelkmar, Par [Science for Life Lab., Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden); Apostolov, Rossen [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Shirts, Michael R. [Univ. of Virginia, Charlottesville, VA (United States); Smith, Jeremy C. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kasson, Peter M. [Univ. of Virginia, Charlottesville, VA (United States); van der Spoel, David [Science for Life Lab., Stockholm (Sweden); Uppsala Univ., Uppsala (Sweden); Hess, Berk [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Lindahl, Erik [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden)

    2013-02-13

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.

  12. A Probability Co-Kriging Model to Account for Reporting Bias and Recognize Areas at High Risk for Zebra Mussels and Eurasian Watermilfoil Invasions in Minnesota

    Kaushi S. T. Kanankege

    2018-01-01

    Full Text Available Zebra mussels (ZMs) (Dreissena polymorpha) and Eurasian watermilfoil (EWM) (Myriophyllum spicatum) are aggressive aquatic invasive species posing a conservation burden on Minnesota. Recognizing areas at high risk for invasion is a prerequisite for the implementation of risk-based prevention and mitigation management strategies. The early detection of invasion has been challenging, due in part to the imperfect observation process of invasions including the absence of a surveillance program, reliance on public reporting, and limited resource availability, which results in reporting bias. To predict the areas at high risk for invasions, while accounting for underreporting, we combined network analysis and probability co-kriging to estimate the risk of ZM and EWM invasions. We used network analysis to generate a waterbody-specific variable representing boater traffic, a known high risk activity for human-mediated transportation of invasive species. In addition, co-kriging was used to estimate the probability of species introduction, using waterbody-specific variables. A co-kriging model containing distance to the nearest ZM infested location, boater traffic, and road access was used to recognize the areas at high risk for ZM invasions (AUC = 0.78). The EWM co-kriging model included distance to the nearest EWM infested location, boater traffic, and connectivity to infested waterbodies (AUC = 0.76). Results suggested that, by 2015, nearly 20% of the waterbodies in Minnesota were at high risk of ZM (12.45%) or EWM (12.43%) invasions, whereas only 125/18,411 (0.67%) and 304/18,411 (1.65%) are currently infested, respectively. Prediction methods presented here can support decisions related to solving the problems of imperfect detection, which subsequently improve the early detection of biological invasions.

  13. A Probability Co-Kriging Model to Account for Reporting Bias and Recognize Areas at High Risk for Zebra Mussels and Eurasian Watermilfoil Invasions in Minnesota.

    Kanankege, Kaushi S T; Alkhamis, Moh A; Phelps, Nicholas B D; Perez, Andres M

    2017-01-01

    Zebra mussels (ZMs) (Dreissena polymorpha) and Eurasian watermilfoil (EWM) (Myriophyllum spicatum) are aggressive aquatic invasive species posing a conservation burden on Minnesota. Recognizing areas at high risk for invasion is a prerequisite for the implementation of risk-based prevention and mitigation management strategies. The early detection of invasion has been challenging, due in part to the imperfect observation process of invasions including the absence of a surveillance program, reliance on public reporting, and limited resource availability, which results in reporting bias. To predict the areas at high risk for invasions, while accounting for underreporting, we combined network analysis and probability co-kriging to estimate the risk of ZM and EWM invasions. We used network analysis to generate a waterbody-specific variable representing boater traffic, a known high risk activity for human-mediated transportation of invasive species. In addition, co-kriging was used to estimate the probability of species introduction, using waterbody-specific variables. A co-kriging model containing distance to the nearest ZM infested location, boater traffic, and road access was used to recognize the areas at high risk for ZM invasions (AUC = 0.78). The EWM co-kriging model included distance to the nearest EWM infested location, boater traffic, and connectivity to infested waterbodies (AUC = 0.76). Results suggested that, by 2015, nearly 20% of the waterbodies in Minnesota were at high risk of ZM (12.45%) or EWM (12.43%) invasions, whereas only 125/18,411 (0.67%) and 304/18,411 (1.65%) are currently infested, respectively. Prediction methods presented here can support decisions related to solving the problems of imperfect detection, which subsequently improve the early detection of biological invasions.
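
    For readers unfamiliar with the AUC figures above: they summarize how well the predicted introduction probabilities rank infested versus non-infested waterbodies. The snippet below is a generic, hypothetical sketch of that evaluation step (our illustration, not the authors' pipeline; the random arrays stand in for a table with one row per waterbody).

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n_waterbodies = 18411
    predicted_prob = rng.random(n_waterbodies)   # stand-in for co-kriging introduction probabilities
    # Stand-in for observed infestation status (0/1), sparse as in the study.
    observed = (rng.random(n_waterbodies) < 0.02 * predicted_prob).astype(int)

    auc = roc_auc_score(observed, predicted_prob)                      # area under the ROC curve
    high_risk = predicted_prob >= np.quantile(predicted_prob, 0.80)    # flag top 20% as high risk
    print(f"AUC = {auc:.2f}; waterbodies flagged high-risk: {int(high_risk.sum())}")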

  14. Open charm production at high energies and the quark Reggeization hypothesis

    Kniehl, B.A.; Shipilova, A.V.

    2008-12-01

    We study open charm production at high energies in the framework of the quasi-multi-Regge-kinematics approach applying the quark-Reggeization hypothesis implemented with Reggeon-Reggeon-particle and Reggeon-particle-particle effective vertices. Adopting the Kimber-Martin-Ryskin unintegrated quark and gluon distribution functions of the proton and photon, we thus nicely describe the proton structure function F2,c measured at DESY HERA as well as the transverse-momentum distributions of D mesons created by photoproduction at HERA and by hadroproduction at the Fermilab Tevatron. (orig.)

  15. Facilitating the openEHR approach - organizational structures for defining high-quality archetypes.

    Kohl, Christian Dominik; Garde, Sebastian; Knaup, Petra

    2008-01-01

    Using openEHR archetypes to establish an electronic patient record promises rapid development and system interoperability by using or adopting existing archetypes. However, internationally accepted, high quality archetypes which enable a comprehensive semantic interoperability require adequate development and maintenance processes. Therefore, structures have to be created involving different health professions. In the following we present a model which facilitates and governs distributed but cooperative development and adoption of archetypes by different professionals including peer reviews. Our model consists of a hierarchical structure of professional committees and descriptions of the archetype development process considering these different committees.

  16. Three-phase multilevel inverter configuration for open-winding high power application

    Sanjeevikumar, Padmanaban; Blaabjerg, Frede; Wheeler, Patrick William

    2015-01-01

    This paper exploits a new dual open-winding three-phase multilevel inverter configuration suitable for high-power medium-voltage applications. The modular structure is comprised of a standard three-phase voltage source inverter (VSI) along with one additional bi-directional semiconductor device (MOSFET ...) for implementation purposes. The proposed dual-inverter configuration generates multilevel outputs whose benefits include reduced THD and dv/dt in comparison to other dual-inverter topologies. A complete model of the multilevel AC drive is developed with simple MSCFM modulation in Matlab/PLECs numerical software...

  17. Diagnostic accuracy of the MMSE in detecting probable and possible Alzheimer's disease in ethnically diverse highly educated individuals: an analysis of the NACC database.

    Spering, Cynthia C; Hobson, Valerie; Lucas, John A; Menon, Chloe V; Hall, James R; O'Bryant, Sid E

    2012-08-01

    To validate and extend the findings of a raised cut score of O'Bryant and colleagues (O'Bryant SE, Humphreys JD, Smith GE, et al. Detecting dementia with the mini-mental state examination in highly educated individuals. Arch Neurol. 2008;65(7):963-967.) for the Mini-Mental State Examination in detecting cognitive dysfunction in a bilingual sample of highly educated ethnically diverse individuals. Archival data were reviewed from participants enrolled in the National Alzheimer's Coordinating Center minimum data set. Data on 7,093 individuals with 16 or more years of education were analyzed, including 2,337 cases with probable and possible Alzheimer's disease, 1,418 mild cognitive impairment patients, and 3,088 nondemented controls. Ethnic composition was characterized as follows: 6,296 Caucasians, 581 African Americans, 4 American Indians or Alaska natives, 2 native Hawaiians or Pacific Islanders, 149 Asians, 43 "Other," and 18 of unknown origin. Diagnostic accuracy estimates (sensitivity, specificity, and likelihood ratio) of Mini-Mental State Examination cut scores in detecting probable and possible Alzheimer's disease were examined. A standard Mini-Mental State Examination cut score of 24 (≤23) yielded a sensitivity of 0.58 and a specificity of 0.98 in detecting probable and possible Alzheimer's disease across ethnicities. A cut score of 27 (≤26) resulted in an improved balance of sensitivity and specificity (0.79 and 0.90, respectively). In the cognitively impaired group (mild cognitive impairment and probable and possible Alzheimer's disease), the standard cut score yielded a sensitivity of 0.38 and a specificity of 1.00 while raising the cut score to 27 resulted in an improved balance of 0.59 and 0.96 of sensitivity and specificity, respectively. These findings cross-validate our previous work and extend them to an ethnically diverse cohort. A higher cut score is needed to maximize diagnostic accuracy of the Mini-Mental State Examination in individuals
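
    The sensitivity and specificity figures above follow directly from how many impaired and unimpaired individuals score at or below a given cut. A small illustrative calculation with simulated scores (not NACC data) is sketched below.

    import numpy as np

    def cut_score_accuracy(scores, impaired, cut):
        """Sensitivity/specificity when 'impaired' is called for MMSE <= cut."""
        scores = np.asarray(scores)
        impaired = np.asarray(impaired, dtype=bool)
        flagged = scores <= cut
        sensitivity = np.mean(flagged[impaired])     # impaired cases correctly flagged
        specificity = np.mean(~flagged[~impaired])   # unimpaired controls correctly passed
        return sensitivity, specificity

    # Simulated highly educated sample: controls cluster near the ceiling, cases score lower.
    rng = np.random.default_rng(2)
    controls = np.clip(np.round(rng.normal(28.5, 2.0, 3000)), 0, 30)
    cases = np.clip(np.round(rng.normal(23.0, 4.0, 2300)), 0, 30)
    scores = np.concatenate([controls, cases])
    impaired = np.concatenate([np.zeros(3000, bool), np.ones(2300, bool)])

    for cut in (23, 26):   # the standard (<=23) and raised (<=26) cuts discussed above
        sens, spec = cut_score_accuracy(scores, impaired, cut)
        print(f"cut <= {cut}: sensitivity = {sens:.2f}, specificity = {spec:.2f}")

    Raising the cut trades some specificity for a substantial gain in sensitivity, which is the qualitative pattern the record reports for highly educated individuals.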

  18. Highly Active N,O Zinc Guanidine Catalysts for the Ring-Opening Polymerization of Lactide.

    Schäfer, Pascal M; Fuchs, Martin; Ohligschläger, Andreas; Rittinghaus, Ruth; McKeown, Paul; Akin, Enver; Schmidt, Maximilian; Hoffmann, Alexander; Liauw, Marcel A; Jones, Matthew D; Herres-Pawlis, Sonja

    2017-09-22

    New zinc guanidine complexes with N,O donor functionalities were prepared, characterized by X-ray crystallography, and examined for their catalytic activity in the solvent-free ring-opening polymerization (ROP) of technical-grade rac-lactide at 150 °C. All complexes showed a high activity. The fastest complex [ZnCl2(DMEGasme)] (C1) produced colorless poly(lactide) (PLA) after 90 min with a conversion of 52 % and high molar masses (Mw = 69,100, polydispersity = 1.4). The complexes were tested with different monomer-to-initiator ratios to determine the rate constant kp. Furthermore, a polymerization with the most active complex C1 was monitored by in situ Raman spectroscopy. Overall, conversion of up to 90 % can be obtained. End-group analysis was performed to clarify the mechanism. All four complexes combine robustness against impurities in the lactide with high polymerization rates, and they represent the fastest robust lactide ROP catalysts to date, opening new avenues to a sustainable ROP catalyst family for industrial use. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.

  20. TERRA REF: Advancing phenomics with high resolution, open access sensor and genomics data

    LeBauer, D.; Kooper, R.; Burnette, M.; Willis, C.

    2017-12-01

    Automated plant measurement has the potential to improve understanding of genetic and environmental controls on plant traits (phenotypes). The application of sensors and software in the automation of high throughput phenotyping reflects a fundamental shift from labor intensive hand measurements to drone, tractor, and robot mounted sensing platforms. These tools are expected to speed the rate of crop improvement by enabling plant breeders to more accurately select plants with improved yields, resource use efficiency, and stress tolerance. However, there are many challenges facing high throughput phenomics: sensors and platforms are expensive, currently there are few standard methods of data collection and storage, and the analysis of large data sets requires high performance computers and automated, reproducible computing pipelines. To overcome these obstacles and advance the science of high throughput phenomics, the TERRA Phenotyping Reference Platform (TERRA-REF) team is developing an open-access database of high resolution sensor data. TERRA REF is an integrated field and greenhouse phenotyping system that includes: a reference field scanner with fifteen sensors that can generate terabytes of data each day at mm resolution; UAV, tractor, and fixed field sensing platforms; and an automated controlled-environment scanner. These platforms will enable investigation of diverse sensing modalities, and the investigation of traits under controlled and field environments. It is the goal of TERRA REF to lower the barrier to entry for academic and industry researchers by providing high-resolution data, open source software, and online computing resources. Our project is unique in that all data will be made fully public in November 2018, and is already available to early adopters through the beta-user program. We will describe the datasets and how to use them as well as the databases and computing pipeline and how these can be reused and remixed in other phenomics pipelines

  1. Open wedge high tibial osteotomy using three-dimensional printed models: Experimental analysis using porcine bone.

    Kwun, Jun-Dae; Kim, Hee-June; Park, Jaeyoung; Park, Il-Hyung; Kyung, Hee-Soo

    2017-01-01

    The purpose of this study was to evaluate the usefulness of three-dimensional (3D) printed models for open wedge high tibial osteotomy (HTO) in porcine bone. Computed tomography (CT) images were obtained from 10 porcine knees and 3D imaging was planned using the 3D-Slicer program. The osteotomy line was drawn from three centimeters below the medial tibial plateau to the proximal end of the fibular head. Then the osteotomy gap was opened until the mechanical axis line was 62.5% from the medial border along the width of the tibial plateau, maintaining the posterior tibial slope angle. The wedge-shaped 3D-printed model was designed with the measured angle and osteotomy section and was produced by the 3D printer. The open wedge HTO surgery was reproduced in porcine bone using the 3D-printed model and the osteotomy site was fixed with a plate. Accuracy of osteotomy and posterior tibial slope was evaluated after the osteotomy. The mean mechanical axis line on the tibial plateau was 61.8±1.5% from the medial tibia. There was no statistically significant difference (P=0.160). The planned and post-osteotomy correction wedge angles were 11.5±3.2° and 11.4±3.3°, and the posterior tibial slope angle was 11.2±2.2° pre-osteotomy and 11.4±2.5° post-osteotomy. There were no significant differences (P=0.854 and P=0.429, respectively). This study showed that good results could be obtained in high tibial osteotomy by using 3D printed models of porcine legs. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Uncertainty of soil erosion modelling using open source high resolution and aggregated DEMs

    Arun Mondal

    2017-05-01

    Full Text Available The Digital Elevation Model (DEM) is one of the important parameters for soil erosion assessment, and notable uncertainties are observed in this study while using three high resolution open source DEMs. The Revised Universal Soil Loss Equation (RUSLE) model has been applied to analyse the uncertainty of soil erosion assessment using open source DEMs (SRTM, ASTER and CARTOSAT) and their increasing grid spaces (pixel sizes) relative to the actual resolution. The study area is a part of the Narmada river basin in Madhya Pradesh state, located in the central part of India, and covers 20,558 km2. The actual resolution of the DEMs is 30 m, and increased grid spaces of 90, 150, 210, 270 and 330 m are used in this study. The vertical accuracy of the DEMs has been assessed using actual heights of sample points taken from a planimetric survey based map (toposheet). Elevations of the DEMs are converted to the same vertical datum, from WGS 84 to MSL (Mean Sea Level), before the accuracy assessment and modelling. Results indicate that the accuracy of the SRTM DEM, with RMSEs of 13.31, 14.51, and 18.19 m at 30, 150 and 330 m resolution respectively, is better than that of the ASTER and CARTOSAT DEMs. As the grid space of the DEMs increases, the accuracy of the elevation and of the calculated soil erosion decreases. This study presents the potential uncertainty introduced by open source high resolution DEMs into the accuracy of soil erosion assessment models, and provides an analysis of errors in selecting DEMs at original and increased grid spaces for soil erosion modelling.
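
    For context, RUSLE estimates annual soil loss as the product of five factors, of which the slope length/steepness factor (LS) is the one derived from the DEM, so DEM error propagates into the estimate. The sketch below (our illustration with hypothetical numbers, not the authors' code) shows the factor product and the vertical-accuracy check (RMSE against surveyed heights).

    import numpy as np

    def rusle_soil_loss(R, K, LS, C, P):
        # RUSLE: A = R * K * LS * C * P  (annual soil loss per unit area)
        return R * K * LS * C * P

    def dem_rmse(dem_heights, survey_heights):
        # Vertical accuracy of a DEM against surveyed control points.
        diff = np.asarray(dem_heights, float) - np.asarray(survey_heights, float)
        return float(np.sqrt(np.mean(diff ** 2)))

    # Hypothetical values for one grid cell and three control points.
    A = rusle_soil_loss(R=550.0, K=0.30, LS=1.8, C=0.45, P=1.0)
    print(f"A    = {A:.1f} t/ha/yr")
    print(f"RMSE = {dem_rmse([301.2, 318.9, 287.5], [302.0, 316.4, 290.1]):.2f} m")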

  3. Falcon: a highly flexible open-source software for closed-loop neuroscience

    Ciliberti, Davide; Kloosterman, Fabian

    2017-08-01

    Objective. Closed-loop experiments provide unique insights into brain dynamics and function. To facilitate a wide range of closed-loop experiments, we created an open-source software platform that enables high-performance real-time processing of streaming experimental data. Approach. We wrote Falcon, a C++ multi-threaded software in which the user can load and execute an arbitrary processing graph. Each node of a Falcon graph is mapped to a single thread and nodes communicate with each other through thread-safe buffers. The framework allows for easy implementation of new processing nodes and data types. Falcon was tested both on a 32-core and a 4-core workstation. Streaming data was read from either a commercial acquisition system (Neuralynx) or the open-source Open Ephys hardware, while closed-loop TTL pulses were generated with a USB module for digital output. We characterized the round-trip latency of our Falcon-based closed-loop system, as well as the specific latency contribution of the software architecture, by testing processing graphs with up to 32 parallel pipelines and eight serial stages. We finally deployed Falcon in a task of real-time detection of population bursts recorded live from the hippocampus of a freely moving rat. Main results. On Neuralynx hardware, round-trip latency was well below 1 ms and stable for at least 1 h, while on Open Ephys hardware latencies were below 15 ms. The latency contribution of the software was below 0.5 ms. Round-trip and software latencies were similar on both 32- and 4-core workstations. Falcon was used successfully to detect population bursts online with ~40 ms average latency. Significance. Falcon is a novel open-source software for closed-loop neuroscience. It has sub-millisecond intrinsic latency and gives the experimenter direct control of CPU resources. We envisage Falcon to be a useful tool to the neuroscientific community for implementing a wide variety of closed-loop experiments, including those

  4. Falcon: a highly flexible open-source software for closed-loop neuroscience.

    Ciliberti, Davide; Kloosterman, Fabian

    2017-08-01

    Closed-loop experiments provide unique insights into brain dynamics and function. To facilitate a wide range of closed-loop experiments, we created an open-source software platform that enables high-performance real-time processing of streaming experimental data. We wrote Falcon, a C++ multi-threaded software in which the user can load and execute an arbitrary processing graph. Each node of a Falcon graph is mapped to a single thread and nodes communicate with each other through thread-safe buffers. The framework allows for easy implementation of new processing nodes and data types. Falcon was tested both on a 32-core and a 4-core workstation. Streaming data was read from either a commercial acquisition system (Neuralynx) or the open-source Open Ephys hardware, while closed-loop TTL pulses were generated with a USB module for digital output. We characterized the round-trip latency of our Falcon-based closed-loop system, as well as the specific latency contribution of the software architecture, by testing processing graphs with up to 32 parallel pipelines and eight serial stages. We finally deployed Falcon in a task of real-time detection of population bursts recorded live from the hippocampus of a freely moving rat. On Neuralynx hardware, round-trip latency was well below 1 ms and stable for at least 1 h, while on Open Ephys hardware latencies were below 15 ms. The latency contribution of the software was below 0.5 ms. Round-trip and software latencies were similar on both 32- and 4-core workstations. Falcon was used successfully to detect population bursts online with ~40 ms average latency. Falcon is a novel open-source software for closed-loop neuroscience. It has sub-millisecond intrinsic latency and gives the experimenter direct control of CPU resources. We envisage Falcon to be a useful tool to the neuroscientific community for implementing a wide variety of closed-loop experiments, including those requiring use of complex data structures and real
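
    The node-and-buffer architecture described above, where each processing node runs on its own thread and nodes communicate through thread-safe buffers, can be sketched generically in a few lines. The toy pipeline below is a conceptual illustration in Python, not Falcon's C++ API: a de-meaning stage feeds a thresholding "detector" stage through queues, and a sentinel value shuts the pipeline down.

    import queue
    import threading

    def node(process, inbox, outbox):
        """Generic processing node: read from a thread-safe buffer, process, forward."""
        while True:
            item = inbox.get()
            if item is None:          # sentinel: propagate shutdown downstream
                outbox.put(None)
                return
            outbox.put(process(item))

    def demean(chunk):
        mean = sum(chunk) / len(chunk)
        return [v - mean for v in chunk]

    def detect(chunk, threshold=1.0):
        return [v for v in chunk if abs(v) > threshold]

    source_q, mid_q, sink_q = queue.Queue(), queue.Queue(), queue.Queue()
    stages = [
        threading.Thread(target=node, args=(demean, source_q, mid_q)),
        threading.Thread(target=node, args=(detect, mid_q, sink_q)),
    ]
    for t in stages:
        t.start()

    for chunk in ([0.2, 1.9, -2.3, 0.1], [3.0, 0.0, -0.5, 0.4]):   # fake streaming data chunks
        source_q.put(chunk)
    source_q.put(None)

    while (result := sink_q.get()) is not None:
        print("above-threshold samples:", result)
    for t in stages:
        t.join()

    Falcon realizes the same idea with C++ threads and thread-safe buffers, which is how the sub-millisecond software latency contribution reported above is achieved.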

  5. Exploring Differences between Self-Regulated Learning Strategies of High and Low Achievers in Open Distance Learning

    Geduld, Bernadette

    2016-01-01

    Open distance students differ in their preparedness for higher education studies. Students who are less self-regulated risk failure and drop out in the challenging milieu of open distance learning. In this study, the differences between the application of self-regulated learning strategies by low and high achievers were explored. A multi-method…

  6. The link between organisational citizenship behaviours and open innovation: A case of Malaysian high-tech sector

    M. Muzamil Naqshbandi

    2016-12-01

    Full Text Available We examine the role of organisational citizenship behaviours (OCBs) in two types of open innovation—inbound and outbound. Data were collected using the questionnaire survey technique from middle and top managers working in high-tech industries in Malaysia. Results show that OCBs positively predict both inbound and outbound open innovation. A closer look reveals that OCBs relate positively to outbound open innovation in aggregate and in isolation. However, OCBs relate to inbound open innovation in aggregate only. The implications of these results are discussed and limitations of the study are highlighted.

  7. High School Coaches' Experiences With Openly Lesbian, Gay, and Bisexual Athletes.

    Halbrook, Meghan K; Watson, Jack C; Voelker, Dana K

    2018-01-17

    Despite reports that there has been a positive trend in perception and treatment of lesbian, gay, and bisexual (LGB) individuals in recent years (Griffin, 2012; Loftus, 2001), sport, in general, is still an uncertain, and sometimes even hostile, environment for LGB athletes (Anderson, 2005; Waldron & Krane, 2005). To gain more information on coach understanding and perceptions of the team environment, 10 high school head coaches in the United States were interviewed to explore their experiences coaching openly LGB athletes. Qualitative analyses revealed four primary themes associated with coach experiences: team environment dogmas and observations, fundamental beliefs contributing to perceptions of LGB athletes, types and timing of sexual orientation disclosure, and differential LGB athlete characteristics. Future research should examine these primary themes in more detail through interviews with LGB athletes, as well as high school coaches in more traditionally masculine sports, such as football, men's basketball, and wrestling.

  8. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameter databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization that would be specific to any particular application field, its use could also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations, or for producing statistical spectral model parameters using additional tools. This is an open project that aims at providing an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements. (paper)

  9. Philosophical theories of probability

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  10. Non-Archimedean Probability

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  11. Interpretations of probability

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  12. Causality between trade openness and energy consumption: What causes what in high, middle and low income countries

    Shahbaz, Muhammad; Nasreen, Samia; Ling, Chong Hui; Sbia, Rashid

    2014-01-01

    This paper explores the relationship between trade openness and energy consumption using data from 91 high, middle and low income countries. The study covers the period 1980–2010. We apply panel cointegration to examine the long run relationship between the variables. The direction of the causal relationship between trade openness and energy consumption is investigated by applying homogenous non-causality, homogenous causality and heterogeneous causality tests. Our variables are integrated at I(1), confirmed by time series and panel unit root tests, and cointegration is found between trade openness and energy consumption. The relationship between trade openness and energy consumption is inverted U-shaped in high income countries but U-shaped in middle and low income countries. The homogenous and non-homogenous causality analysis reveals bidirectional causality between trade openness and energy consumption. These heterogeneous causality findings offer new insights for policy makers designing comprehensive economic, trade and energy policies for sustainable long run economic growth. - Highlights: • Trade openness and energy consumption are cointegrated in the long run. • A feedback effect exists between trade openness and energy consumption. • An inverted U-shaped relationship is found between both variables in high income countries

  13. Comparison of Wells and Revised Geneva Rule to Assess Pretest Probability of Pulmonary Embolism in High-Risk Hospitalized Elderly Adults.

    Di Marca, Salvatore; Cilia, Chiara; Campagna, Andrea; D'Arrigo, Graziella; Abd ElHafeez, Samar; Tripepi, Giovanni; Puccia, Giuseppe; Pisano, Marcella; Mastrosimone, Gianluca; Terranova, Valentina; Cardella, Antonella; Buonacera, Agata; Stancanelli, Benedetta; Zoccali, Carmine; Malatino, Lorenzo

    2015-06-01

    To assess and compare the diagnostic power for pulmonary embolism (PE) of Wells and revised Geneva scores in two independent cohorts (training and validation groups) of elderly adults hospitalized in a non-emergency department. Prospective clinical study, January 2011 to January 2013. Unit of Internal Medicine inpatients, University of Catania, Italy. Elderly adults (mean age 76 ± 12), presenting with dyspnea or chest pain and with high clinical probability of PE or D-dimer values greater than 500 ng/mL (N = 203), were enrolled and consecutively assigned to a training (n = 101) or a validation (n = 102) group. The clinical probability of PE was assessed using Wells and revised Geneva scores. Clinical examination, D-dimer test, and multidetector computed angiotomography were performed in all participants. The accuracy of the scores was assessed using receiver operating characteristic analyses. PE was confirmed in 46 participants (23%) (24 training group, 22 validation group). In the training group, the area under the receiver operating characteristic curve was 0.91 (95% confidence interval (CI) = 0.85-0.98) for the Wells score and 0.69 (95% CI = 0.56-0.82) for the revised Geneva score (P < .001). These results were confirmed in the validation group (P < .05). The positive (LR+) and negative likelihood ratios (LR-) (two indices combining sensitivity and specificity) of the Wells score were superior to those of the revised Geneva score in the training (LR+, 7.90 vs 1.34; LR-, 0.23 vs 0.66) and validation (LR+, 13.5 vs 1.46; LR-, 0.47 vs 0.54) groups. In high-risk elderly hospitalized adults, the Wells score is more accurate than the revised Geneva score for diagnosing PE. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.
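
    The likelihood ratios quoted above combine sensitivity and specificity in the standard way: LR+ = sensitivity / (1 - specificity) and LR- = (1 - sensitivity) / specificity. A short sketch of the arithmetic with illustrative values (our example, not the study's raw counts):

    def likelihood_ratios(sensitivity, specificity):
        # LR+ = sens / (1 - spec); LR- = (1 - sens) / spec
        return sensitivity / (1.0 - specificity), (1.0 - sensitivity) / specificity

    # Illustrative sensitivity/specificity pairs only; the paper reports LR+/LR- directly.
    for name, sens, spec in [("rule A", 0.90, 0.80), ("rule B", 0.60, 0.70)]:
        lr_pos, lr_neg = likelihood_ratios(sens, spec)
        print(f"{name}: LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")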

  14. Introduction to probability theory with contemporary applications

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic processes

  15. Probability with applications and R

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c...

  16. MANTA--an open-source, high density electrophysiology recording suite for MATLAB.

    Englitz, B; David, S V; Sorenson, M D; Shamma, S A

    2013-01-01

    The distributed nature of nervous systems makes it necessary to record from a large number of sites in order to decipher the neural code, whether single cell, local field potential (LFP), micro-electrocorticograms (μECoG), electroencephalographic (EEG), magnetoencephalographic (MEG) or in vitro micro-electrode array (MEA) data are considered. High channel-count recordings also optimize the yield of a preparation and the efficiency of time invested by the researcher. Currently, data acquisition (DAQ) systems with high channel counts (>100) can be purchased from a limited number of companies at considerable prices. These systems are typically closed-source and thus prohibit custom extensions or improvements by end users. We have developed MANTA, an open-source MATLAB-based DAQ system, as an alternative to existing options. MANTA combines high channel counts (up to 1440 channels/PC), usage of analog or digital headstages, low per channel cost (<$90/channel), feature-rich display & filtering, a user-friendly interface, and a modular design permitting easy addition of new features. MANTA is licensed under the GPL and free of charge. The system has been tested by daily use in multiple setups for >1 year, recording reliably from 128 channels. It offers a growing list of features, including integrated spike sorting, PSTH and CSD display and fully customizable electrode array geometry (including 3D arrays), some of which are not available in commercial systems. MANTA runs on a typical PC and communicates via TCP/IP and can thus be easily integrated with existing stimulus generation/control systems in a lab at a fraction of the cost of commercial systems. With modern neuroscience developing rapidly, MANTA provides a flexible platform that can be rapidly adapted to the needs of new analyses and questions. Being open-source, the development of MANTA can outpace commercial solutions in functionality, while maintaining a low price-point.

  17. MANTA – An Open-Source, High Density Electrophysiology Recording Suite for MATLAB

    Bernhard eEnglitz

    2013-05-01

    Full Text Available The distributed nature of nervous systems makes it necessary to record from a large number of sites in order to break the neural code, whether single cell, local field potential (LFP), micro-electrocorticograms (μECoG), electroencephalographic (EEG), magnetoencephalographic (MEG) or in vitro micro-electrode array (MEA) data are considered. High channel-count recordings also optimize the yield of a preparation and the efficiency of time invested by the researcher. Currently, data acquisition (DAQ) systems with high channel counts (>100) can be purchased from a limited number of companies at considerable prices. These systems are typically closed-source and thus prohibit custom extensions or improvements by end users. We have developed MANTA, an open-source MATLAB-based DAQ system, as an alternative to existing options. MANTA combines high channel counts (up to 1440 channels/PC), usage of analog or digital headstages, low per channel cost (<$90/channel), feature-rich display & filtering, a user-friendly interface, and a modular design permitting easy addition of new features. MANTA is licensed under the GPL and free of charge. The system has been tested by daily use in multiple setups for >1 year, recording reliably from 128 channels. It offers a growing list of features, including integrated spike sorting, PSTH and CSD display and fully customizable electrode array geometry (including 3D arrays), some of which are not available in commercial systems. MANTA runs on a typical PC and communicates via TCP/IP and can thus be easily integrated with existing stimulus generation/control systems in a lab at a fraction of the cost of commercial systems. With modern neuroscience developing rapidly, MANTA provides a flexible platform that can be rapidly adapted to the needs of new analyses and questions. Being open-source, the development of MANTA can outpace commercial solutions in functionality, while maintaining a low price-point.

  18. Mobile Measurements of Methane Using High-Speed Open-Path Technology

    Burba, G. G.; Anderson, T.; Ediger, K.; von Fischer, J.; Gioli, B.; Ham, J. M.; Hupp, J. R.; Kohnert, K.; Levy, P. E.; Polidori, A.; Pikelnaya, O.; Price, E.; Sachs, T.; Serafimovich, A.; Zondlo, M. A.; Zulueta, R. C.

    2016-12-01

    Methane plays a critical role in the radiation balance, chemistry of the atmosphere, and air quality. The major anthropogenic sources of CH4 include oil and gas development sites, natural gas distribution networks, landfill emissions, and agricultural production. The majority of oil and gas and urban CH4 emission occurs via variable-rate point sources or diffused spots in topographically challenging terrains (e.g., street tunnels, elevated locations at water treatment plants, vents, etc.). Locating and measuring such CH4 emissions is challenging when using traditional micrometeorological techniques, and requires development of novel approaches. Landfill CH4 emissions traditionally assessed at monthly or longer time intervals are subject to large uncertainties because of the snapshot nature of the measurements and the barometric pumping phenomenon. The majority of agricultural and natural CH4 production occurs in areas with little infrastructure or easily available grid power (e.g., rice fields, arctic and boreal wetlands, tropical mangroves, etc.). A lightweight, high-speed, high-resolution, open-path technology was recently developed for eddy covariance measurements of CH4 flux, with power consumption 30-150 times below other available technologies. It was designed to run on solar panels or a small generator and be placed in the middle of the methane-producing ecosystem without a need for grid power. Lately, this instrumentation has been utilized increasingly frequently outside of its traditional use on stationary flux towers. These novel approaches include measurements from various moving platforms, such as cars, aircraft, and ships. Projects included mapping of concentrations and vertical profiles, leak detection and quantification, mobile emission detection from natural gas-powered cars, soil CH4 flux surveys, etc. This presentation will describe key projects utilizing the novel lightweight low-power high-resolution open-path technology, and will highlight

  19. OpenCL-Based Linear Algebra Libraries for High-Performance Computing, Phase I

    National Aeronautics and Space Administration — Despite its promise, OpenCL adoption is slow, owing to a lack of libraries and tools. Vendors have shown few signs of plans to provide OpenCL libraries, and were...

  20. Open Access Publishing in High-Energy Physics the SCOAP$^{3}$ model

    Mele, S

    2009-01-01

    The Open Access (OA) movement is gaining increasing momentum: its goal is to grant anyone, anywhere and anytime free access to the results of publicly funded scientific research. The High-Energy Physics (HEP) community has pioneered OA for decades, through its widespread “pre-print culture”. After almost half a century of worldwide dissemination of pre-prints, in paper first and electronically later, OA journals are becoming the natural evolution of scholarly communication in HEP. Among other OA business models, the one based on a sponsoring consortium appears as the most viable option for a transition of the HEP peer-reviewed literature to OA. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is proposed as a central body to remunerate publishers for their peer-review service, effectively replacing the “reader-pays” model of traditional subscriptions with an “author-side” funding, without any direct financial burden on individual authors and research groups. Su...

  1. Realizing the increased potential of an open-system high-definition digital projector design

    Daniels, Reginald

    1999-05-01

    Modern video projectors are becoming more compact and capable. Various display technologies are very competitive and are delivering higher performance and more compact projectors to market at an ever quickening pace. However, the end users are often left with the daunting task of integrating 'off-the-shelf' projectors into a previously existing system. As the projectors become more digitally enhanced and the digital projector technology matures, there will be a series of designs. The design solutions will be restricted by the state of the art at the time of manufacturing. In order to allow the most growth and performance for a given price, many design decisions will be made and revisited over a period of years or decades. A modular open digital system design concept is indeed a major challenge for future high-definition digital displays for all applications.

  2. Highly Defined Multiblock Copolypeptoids: Pushing the Limits of Living Nucleophilic Ring-Opening Polymerization

    Fetsch, Corinna

    2012-06-05

    Advanced macromolecular engineering requires excellent control over the polymerization reaction. Living polymerization methods are notoriously sensitive to impurities, which makes a practical realization of such control very challenging. Reversible-deactivation radical polymerization methods are typically more robust, but have other limitations. Here, we demonstrate by repeated (≥10 times) chain extension the extraordinary robustness of the living nucleophilic ring-opening polymerization of N-substituted glycine N-carboxyanhydrides, which yields polypeptoids. We observe essentially quantitative end-group fidelity under experimental conditions that are comparatively easily managed. This is employed to synthesize a pentablock quinquiespolymer with high definition. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Crossfostering in mice selectively bred for high and low levels of open-field thigmotaxis.

    Leppänen, Pia K; Ewalds-Kvist, S Béatrice M

    2005-02-01

    The main purpose of this research was to investigate whether the difference in open-field (OF) thigmotaxis between mice selectively bred for high and low levels of wall-seeking behavior originated from genetic or acquired sources. Unfostered, infostered, and crossfostered mice were compared in two experiments in which the effects of strain, sex, and fostering on ambulation, defecation, exploration, grooming, latency to move, radial latency, rearing, thigmotaxis, and urination were studied. These experiments revealed that OF thigmotaxis was unaffected by the foster condition and thus genetically determined. The selected strains of mice also diverged repeatedly with regard to exploration and rearing. The findings are in line with the previously described existence of an inverse relationship between emotionality and exploration.

  4. Highly Defined Multiblock Copolypeptoids: Pushing the Limits of Living Nucleophilic Ring-Opening Polymerization

    Fetsch, Corinna; Luxenhofer, Robert

    2012-01-01

    Advanced macromolecular engineering requires excellent control over the polymerization reaction. Living polymerization methods are notoriously sensitive to impurities, which makes a practical realization of such control very challenging. Reversible-deactivation radical polymerization methods are typically more robust, but have other limitations. Here, we demonstrate by repeated (≥10 times) chain extension the extraordinary robustness of the living nucleophilic ring-opening polymerization of N-substituted glycine N-carboxyanhydrides, which yields polypeptoids. We observe essentially quantitative end-group fidelity under experimental conditions that are comparatively easily managed. This is employed to synthesize a pentablock quinquiespolymer with high definition. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  6. Development of highly open polyhedral networks from vitreous carbon for orthopaedic applications

    Güiza-Argüello, V.; Bayona-Becerra, M.; Cruz-Orellana, S.; Córdoba-Tuta, E.

    2017-01-01

    Highly open polyhedral networks were fabricated using an economical and environmentally friendly template route. Recycled cellulose foams were impregnated with a sucrose resin and then pyrolyzed in order to produce reticulated vitreous carbon foams with morphological features that closely resemble trabecular bone. Also, cell sizes ~1mm were achieved, a trait that will allow the mechanical reinforcement of such scaffolds using a biomaterial coating without compromising the pore size that favors osteoblast cell infiltration and growth (200-500µm). Moreover, initial studies showed that carbonization conditions have an effect on the mechanical properties of the synthesized foams and, therefore, such process parameters could be further evaluated towards the enhancement of the mechanical resistance of the scaffolds. The materials developed here are visualized as the porous component of a synthetic bone graft with features that could help overcome the current limitations associated with the medical treatments used for bone defect repair.

  7. Large-eddy simulation of convective boundary layer generated by highly heated source with open source code, OpenFOAM

    Hattori, Yasuo; Suto, Hitoshi; Eguchi, Yuzuru; Sano, Tadashi; Shirai, Koji; Ishihara, Shuji

    2011-01-01

    Spatial and temporal characteristics of turbulence structures in the close vicinity of a heat source, which is a horizontal upward-facing round plate heated to high temperature, are examined by using well-resolved large-eddy simulations. The verification is carried out through comparison with experiments: the predicted statistics, including the PDF distribution of temperature fluctuations, agree well with measurements, indicating that the present simulations have the capability to appropriately reproduce turbulence structures near the heat source. The reproduced three-dimensional thermal and fluid fields in the close vicinity of the heat source reveal developing processes of coherent structures along the surface: stationary and streaky flow patterns appear near the edge, and such patterns randomly shift to cell-like patterns with incursion into the center region, resulting in thermal-plume meandering. Both patterns have very thin structures, but the depth of the streaky structures is considerably smaller than that of the cell-like patterns; this discrepancy causes the layered structures. The structure is the source of peculiar turbulence characteristics, the prediction of which is quite difficult with RANS-type turbulence models. The understanding of such structures obtained in the present study should be helpful for improving the turbulence models used in nuclear engineering. (author)

  8. Foundations of probability

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  9. Building highly available control system applications with Advanced Telecom Computing Architecture and open standards

    Kazakov, Artem; Furukawa, Kazuro

    2010-01-01

    Requirements for modern and future control systems for large projects like the International Linear Collider demand high availability for control system components. Recently, the telecom industry came up with an open hardware specification, the Advanced Telecom Computing Architecture (ATCA). This specification is aimed at better reliability, availability and serviceability. Since its first market appearance in 2004, the ATCA platform has shown tremendous growth and has proved to be stable and well represented by a number of vendors. ATCA is an industry standard for highly available systems. On the other hand, the Service Availability Forum (SAF), a consortium of leading communications and computing companies, describes the interaction between hardware and software. SAF defines a set of specifications such as the Hardware Platform Interface and the Application Interface Specification. SAF specifications provide an extensive description of highly available systems, services and their interfaces. Originally aimed at telecom applications, these specifications can be used for accelerator controls software as well. This study describes the benefits of using these specifications and their possible adoption for accelerator control systems. It is demonstrated how the EPICS Redundant IOC was extended using the Hardware Platform Interface specification, which made it possible to utilize the benefits of the ATCA platform.

  10. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
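
    As an illustration only (the chapter's actual pipeline relies on dedicated microarray/sequencing packages), the core step of a differential expression analysis, per-feature testing followed by multiple-testing correction, can be sketched in a few lines of Python; the feature matrix and group labels below are hypothetical:

        import numpy as np
        from scipy import stats
        from statsmodels.stats.multitest import multipletests

        def differential_expression(expr, is_case):
            """expr: features x samples matrix of (log-scale) expression values.
            is_case: boolean array marking which samples belong to the case group."""
            case, ctrl = expr[:, is_case], expr[:, ~is_case]
            # Per-feature two-sample t-test (Welch's, no equal-variance assumption)
            _, pvals = stats.ttest_ind(case, ctrl, axis=1, equal_var=False)
            # Benjamini-Hochberg FDR correction across all features
            reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
            log2fc = case.mean(axis=1) - ctrl.mean(axis=1)  # difference of means on log scale
            return log2fc, qvals, reject

        # Hypothetical example: 1000 features, 6 case and 6 control samples
        rng = np.random.default_rng(0)
        expr = rng.normal(size=(1000, 12))
        expr[:50, :6] += 2.0                      # spike in 50 truly differential features
        labels = np.array([True] * 6 + [False] * 6)
        fc, q, hits = differential_expression(expr, labels)
        print("features called differential:", hits.sum())

    Real pipelines add the quality control, normalization, and random/fixed-effect modeling steps mentioned above; this sketch only shows the test-and-correct core.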

  11. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
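
    For orientation only (the paper itself concerns full three-flavor probabilities in matter, which are considerably more involved), the familiar two-flavor vacuum oscillation probability has the form

        P(\nu_\alpha \to \nu_\beta) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right),

    where θ is the mixing angle, Δm² the mass-squared splitting, L the baseline and E the neutrino energy; in the constant-density two-flavor case, matter effects can be absorbed into effective, energy-dependent values of θ and Δm², which is the kind of structure the approximations summarized above exploit.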

  12. Low-Probability High-Consequence (LPHC) Failure Events in Geologic Carbon Sequestration Pipelines and Wells: Framework for LPHC Risk Assessment Incorporating Spatial Variability of Risk

    Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Budnitz, Robert J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-31

    If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject the CO2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of

  13. The quantum probability calculus

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  14. Choice Probability Generating Functions

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications...

  15. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the
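
    As a small, hypothetical illustration of the kind of exploration such an infrastructure supports (this sketch uses scipy.stats rather than the Distributome webapps themselves), one can inspect a distribution's analytical properties, simulate from it, fit it back to data, and probe inter-distribution relations:

        import numpy as np
        from scipy import stats

        # Pick a distribution family and a parameterization to explore
        dist = stats.gamma(a=2.0, scale=1.5)

        # Analytical properties
        print("mean, var:", dist.mean(), dist.var())
        print("95th percentile:", dist.ppf(0.95))

        # Simulation, then model-fitting back to the simulated data
        sample = dist.rvs(size=10_000, random_state=0)
        a_hat, loc_hat, scale_hat = stats.gamma.fit(sample, floc=0)
        print("fitted shape/scale:", a_hat, scale_hat)

        # Inter-distribution relation example: a gamma with shape 1 is an exponential
        print(np.allclose(stats.gamma(a=1.0, scale=2.0).cdf(3.0),
                          stats.expon(scale=2.0).cdf(3.0)))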

  16. Probability of satellite collision

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  17. Choice probability generating functions

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  18. Handbook of probability

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  19. Real analysis and probability

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  20. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    Ekonomou, L; Karampelas, P; Vita, V; Chatzarakis, G E

    2011-01-01

    One of the most popular methods of protecting high voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters in high voltage transmission lines can prevent or even reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is addressed for evaluating the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application in operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reducing operational costs and better continuity of service

  1. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    Ekonomou, L.; Karampelas, P.; Vita, V.; Chatzarakis, G. E.

    2011-04-01

    One of the most popular methods of protecting high voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters in high voltage transmission lines can prevent or even reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is addressed for evaluating the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application in operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reducing operational costs and better continuity of service.
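
    The abstract does not give the model's internals, so purely as a reminder of the learning rule the name refers to, a generic tabular Q-learning update (not the paper's ANN formulation, with hypothetical state/action sets and parameters) looks like this:

        import numpy as np

        n_states, n_actions = 10, 4          # hypothetical sizes
        alpha, gamma = 0.1, 0.9              # learning rate and discount factor
        Q = np.zeros((n_states, n_actions))  # action-value table

        def q_update(s, a, reward, s_next):
            """One Q-learning step: move Q(s, a) toward the bootstrapped target."""
            target = reward + gamma * Q[s_next].max()
            Q[s, a] += alpha * (target - Q[s, a])

        # Hypothetical single transition
        q_update(s=0, a=2, reward=1.0, s_next=3)

    In the ANN variant discussed above, the table is replaced by a trained function approximator, but the same target/update structure drives learning.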

  2. A simplistic analytical unit cell based model for the effective thermal conductivity of high porosity open-cell metal foams

    Yang, X H; Kuang, J J; Lu, T J; Han, F S; Kim, T

    2013-01-01

    We present a simplistic yet accurate analytical model for the effective thermal conductivity of high porosity open-cell metal foams saturated in a low conducting fluid (air). The model is derived analytically based on a realistic representative unit cell (a tetrakaidecahedron) under the assumption of one-dimensional heat conduction along highly tortuous-conducting ligaments at high porosity ranges (ε ⩾ 0.9). Good agreement with existing experimental data suggests that heat conduction along highly conducting and tortuous ligaments predominantly defines the effective thermal conductivity of open-cell metal foams with negligible conduction in parallel through the fluid phase. (paper)
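
    For orientation, a classical one-dimensional estimate in the same spirit (a Lemlich-type relation, not the tetrakaidecahedron-based model derived in the paper) treats randomly oriented ligaments of solid conductivity k_s in parallel with a fluid of conductivity k_f at porosity ε:

        k_{\mathrm{eff}} \approx \tfrac{1}{3}(1-\varepsilon)\,k_s + \varepsilon\,k_f,

    the factor 1/3 reflecting that randomly oriented ligaments conduct effectively along only one of three directions; at ε ⩾ 0.9 with air as the saturating fluid the second term is negligible, consistent with the conclusion above that ligament conduction dominates.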

  3. No Bridge Too High: Infants Decide Whether to Cross Based on the Probability of Falling not the Severity of the Potential Fall

    Kretch, Kari S.; Adolph, Karen E.

    2013-01-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off…

  4. Gap formation processes in a high-density plasma opening switch

    Grossmann, J.M.; Swanekamp, S.B.; Ottinger, P.F.; Commisso, R.J.; Hinshelwood, D.D.; Weber, B.V.

    1995-01-01

    A gap opening process in plasma opening switches (POS) is examined with the aid of numerical simulations. In these simulations, a high density (n_e = 10^14 to 5x10^15 cm^-3) uniform plasma initially bridges a small section of the coaxial transmission line of an inductive energy storage generator. A short section of vacuum transmission line connects the POS to a short circuit load. The results presented here extend previous simulations in the n_e = 10^12 to 10^13 cm^-3 density regime. The simulations show that a two-dimensional (2-D) sheath forms in the plasma near a cathode. This sheath is positively charged, and electrostatic sheath potentials that are large compared to the anode-cathode voltage develop. Initially, the 2-D sheath is located at the generator edge of the plasma. As ions are accelerated out of the sheath, it retains its original 2-D structure, but migrates axially toward the load, creating a magnetically insulated gap in its wake. When the sheath reaches the load edge of the POS, the POS stops conducting current and the load current increases rapidly. At the end of the conduction phase a gap exists in the POS whose size is determined by the radial dimensions of the 2-D sheath. Simulations at various plasma densities and current levels show that the radial size of the gap scales roughly as B/n_e, where B is the magnetic field. The results of this work are discussed in the context of long-conduction-time POS physics, but exhibit the same physical gap formation mechanisms as earlier lower density simulations more relevant to short-conduction-time POS. copyright 1995 American Institute of Physics

  5. Outlook for the use of microsecond plasma opening switches to generate high-power nanosecond current pulses

    Dolgachev, G.I.; Maslennikov, D.D.; Ushakov, A.G.

    2006-01-01

    The paper deals with the phenomenon of current interruption in the conducting plasma volume of plasma opening switches with nanosecond energy-release times and their application in high-power generators. The conditions ensuring megavolt voltages in the erosion mode were determined, making use of an externally applied magnetic field to provide magnetic insulation of the plasma opening switch gap. The peculiar features of operating plasma opening switches at 5-6 MV to produce X-ray and gamma-radiation pulses were studied. [ru]

  6. Correlates of Marijuana Drugged Driving and Openness to Driving While High: Evidence from Colorado and Washington.

    Kevin C Davis

    with lower odds of each of these outcomes (OR = 0.63, P < 0.01; OR = 0.69, P = 0.02, respectively). Post-estimation Wald tests confirmed that the negative associations with marijuana DUI were greater in magnitude for safety perceptions than for knowledge of DUI laws. Increased perception that driving while high is unsafe was associated with significantly lower willingness to drive after using marijuana, while increased knowledge of marijuana DUI laws was not associated with these outcomes. Despite recent interventions targeting public awareness of the legal consequences of marijuana DUI, our results suggest that knowledge of these laws is a weaker predictor of DUI behavior than perceptions that driving high is unsafe. In addition, safety perceptions predict decreased openness to driving high, while knowledge of DUI laws was not associated with openness. These findings suggest that interventions for reducing the incidence of marijuana DUI are likely to be more successful by targeting safety perceptions related to marijuana DUI rather than knowledge of DUI laws. We caution that because these data are limited to an online convenience sample, results may not be generalizable beyond our sample.

  7. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes - neural connectivity maps of the brain - using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems - reads to parallel disk arrays and writes to solid-state storage - to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
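
    The abstract does not spell out the partitioning scheme beyond "partitioning a spatial index", but a common way to partition 3-d image volumes across cluster nodes is a Z-order (Morton) key that interleaves the bits of the voxel or cuboid coordinates, so that spatially adjacent data tend to land together; a minimal, hypothetical sketch:

        def part1by2(n: int) -> int:
            """Spread the bits of n so there are two zero bits between each original bit."""
            n &= 0x1FFFFF                      # keep 21 bits (enough for a 63-bit key)
            n = (n | (n << 32)) & 0x1F00000000FFFF
            n = (n | (n << 16)) & 0x1F0000FF0000FF
            n = (n | (n << 8))  & 0x100F00F00F00F00F
            n = (n | (n << 4))  & 0x10C30C30C30C30C3
            n = (n | (n << 2))  & 0x1249249249249249
            return n

        def morton3(x: int, y: int, z: int) -> int:
            """Z-order key for a 3-d coordinate; nearby voxels get nearby keys."""
            return part1by2(x) | (part1by2(y) << 1) | (part1by2(z) << 2)

        def node_for(x, y, z, n_nodes=16):
            """Hypothetical data placement: map the Morton key onto a cluster node."""
            return morton3(x, y, z) % n_nodes

        print(node_for(1024, 2048, 64))

    The actual Open Connectome partitioning may differ; the sketch only illustrates why a spatial index makes range reads over image cuboids cluster-friendly.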

  8. HiGIS: An Open Framework for High Performance Geographic Information System

    XIONG, W.

    2015-08-01

    The big data era exposes many challenges for geospatial data management, geocomputation and cartography, and the geographic information systems (GIS) community is no exception. Technologies and facilities for high performance computing (HPC) are becoming more and more accessible to researchers, while mobile computing, ubiquitous computing, and cloud computing are emerging. Traditional GIS, however, need to be improved to take advantage of all these evolutions. We propose and implement a GIS married to high performance computing, called HiGIS. The goal of HiGIS is to promote the performance of geocomputation by leveraging the power of HPC, and to build an open framework for geospatial data storing, processing, displaying and sharing. In this paper the architecture, data model and modules of the HiGIS system are introduced. A geocomputation scheduling engine based on communicating sequential processes was designed for spatial analysis and processing. A parallel I/O strategy using file views was proposed to improve the performance of geospatial raster data access. In order to support web-based online mapping, an interactive cartographic script was provided to represent a map. A house-locating demonstration was used to illustrate the characteristics of HiGIS. Parallel and concurrency performance experiments show the feasibility of this system.

  9. Achieving Energy Savings with Highly-Controlled Lighting in an Open-Plan Office

    Rubinstein, Francis; Enscoe, Abby

    2010-04-19

    An installation in a Federal building tested the effectiveness of a highly-controlled, workstation-specific lighting retrofit. The study took place in an open-office area with 86 cubicles and low levels of daylight. Each cubicle was illuminated by a direct/indirect pendant luminaire with three 32 watt lamps, two dimmable DALI ballasts, and an occupancy sensor. A centralized control system programmed all three lamps to turn on and off according to occupancy on a workstation-by-workstation basis. Field measurements taken over the course of several months demonstrated 40% lighting energy savings compared to a baseline without advanced controls that conforms to GSA's current retrofit standard. A photometric analysis found that the installation provided higher desktop light levels than the baseline, while an occupant survey found that occupants in general preferred the lighting system to the baseline. Simple payback is fairly high; projects that can achieve lower installation costs and/or higher energy savings, and those in which greenhouse gas reduction and occupant satisfaction are significant priorities, provide the ideal setting for workstation-specific lighting retrofits.

  10. dc SQUID electronics based on adaptive noise cancellation and a high open-loop gain controller

    Seppae, H.

    1992-01-01

    A low-noise SQUID readout electronics with a high slew rate and an automatic gain control feature has been developed. Flux noise levels of 5x10^-7 Φ_0/√Hz at 1 kHz and 2x10^-6 Φ_0/√Hz at 1 Hz have been measured with this readout scheme. The system tolerates sinusoidal disturbances having amplitudes up to 140 Φ_0 at 1 kHz without losing lock. The electronics utilizes a cooled GaAs FET to control the cancellation of the voltage noise of the room temperature amplifier, a PI 3/2 controller to provide a high open-loop gain at low frequencies, and a square-wave flux and offset voltage modulation to enable automatic control of the noise reduction. The cutoff frequency of the flux-locked loop is 300 kHz and the feedback gain is more than 130 dB at 10 Hz. (orig.)

  11. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes - neural connectivity maps of the brain - using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems - reads to parallel disk arrays and writes to solid-state storage - to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.

  12. Introduction to probability

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  13. Probability, Nondeterminism and Concurrency

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  14. Unraveling the High Open Circuit Voltage and High Performance of Integrated Perovskite/Organic Bulk-Heterojunction Solar Cells.

    Dong, Shiqi; Liu, Yongsheng; Hong, Ziruo; Yao, Enping; Sun, Pengyu; Meng, Lei; Lin, Yuze; Huang, Jinsong; Li, Gang; Yang, Yang

    2017-08-09

    We have demonstrated high-performance integrated perovskite/bulk-heterojunction (BHJ) solar cells owing to the low carrier recombination velocity, high open circuit voltage (V_OC), and increased light absorption in the near-infrared (NIR) region of the integrated devices. In particular, we find that the V_OC of the integrated devices is dominated by (or pinned to) the perovskite cells, not the organic photovoltaic cells. A Quasi-Fermi Level Pinning Model was proposed to understand the working mechanism and the origin of the V_OC of the integrated perovskite/BHJ solar cell, which follows that of the perovskite solar cell and is much higher than that of the low-bandgap-polymer-based organic BHJ solar cell. Evidence for the model was strengthened by examining the charge carrier behavior and photovoltaic behavior of the integrated devices under illumination by monochromatic light-emitting diodes at different characteristic wavelengths. This finding opens an interesting possibility for integrated photovoltaic devices to harvest low-energy photons in the NIR region and further improve the current density without sacrificing V_OC, thus providing new opportunities and significant implications for future industrial applications of this kind of integrated solar cell.

  15. Janus-faced probability

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  16. Does human body odor represent a significant and rewarding social signal to individuals high in social openness?

    Katrin T Lübke

    Across a wide variety of domains, experts differ from novices in their response to stimuli linked to their respective field of expertise. It is currently unknown whether similar patterns can be observed with regard to social expertise. The current study therefore focuses on social openness, a central social skill necessary to initiate social contact. Human body odors were used as social cues, as they inherently signal the presence of another human being. Using functional MRI, hemodynamic brain responses to body odors of women reporting a high (n = 14) or a low (n = 12) level of social openness were compared. Greater activation within the inferior frontal gyrus and the caudate nucleus was observed in highly socially open individuals compared to individuals low in social openness. With the inferior frontal gyrus being a crucial part of the human mirror neuron system, and the caudate nucleus being implicated in social reward, it is discussed whether human body odor might constitute a more significant and rewarding social signal to individuals high in social openness than to individuals low in social openness.

  17. Does human body odor represent a significant and rewarding social signal to individuals high in social openness?

    Lübke, Katrin T; Croy, Ilona; Hoenen, Matthias; Gerber, Johannes; Pause, Bettina M; Hummel, Thomas

    2014-01-01

    Across a wide variety of domains, experts differ from novices in their response to stimuli linked to their respective field of expertise. It is currently unknown whether similar patterns can be observed with regard to social expertise. The current study therefore focuses on social openness, a central social skill necessary to initiate social contact. Human body odors were used as social cues, as they inherently signal the presence of another human being. Using functional MRI, hemodynamic brain responses to body odors of women reporting a high (n = 14) or a low (n = 12) level of social openness were compared. Greater activation within the inferior frontal gyrus and the caudate nucleus was observed in highly socially open individuals compared to individuals low in social openness. With the inferior frontal gyrus being a crucial part of the human mirror neuron system, and the caudate nucleus being implicated in social reward, it is discussed whether human body odor might constitute a more significant and rewarding social signal to individuals high in social openness than to individuals low in social openness.

  18. BL153 Partially Prevents High-Fat Diet Induced Liver Damage Probably via Inhibition of Lipid Accumulation, Inflammation, and Oxidative Stress

    Jian Wang

    2014-01-01

    The present study investigated whether a magnolia extract, named BL153, can prevent obesity-induced liver damage, and sought to identify the possible protective mechanism. To this end, obesity was induced in mice by feeding a high fat diet (HFD, 60% kcal as fat), while age-matched control mice were fed a control diet (10% kcal as fat) for 6 months. Simultaneously, these mice were treated with or without BL153 daily at 3 dose levels (2.5, 5, and 10 mg/kg) by gavage. HFD feeding significantly increased the body weight and the liver weight. Administration of BL153 significantly reduced the liver weight but had no effect on body weight. As a critical step in the development of NAFLD, hepatic fibrosis was induced in the mice fed the HFD, as shown by upregulated expression of connective tissue growth factor and transforming growth factor beta 1, both of which were significantly attenuated by BL153 in a dose-dependent manner. A mechanistic study revealed that BL153 significantly suppressed HFD-induced hepatic lipid accumulation and oxidative stress and slightly prevented liver inflammation. These results suggest that HFD-induced fibrosis in the liver can be partially prevented by BL153, probably through reduction of hepatic lipid accumulation, inflammation and oxidative stress.

  19. Early weight bearing versus delayed weight bearing in medial opening wedge high tibial osteotomy: a randomized controlled trial.

    Lansdaal, Joris Radboud; Mouton, Tanguy; Wascher, Daniel Charles; Demey, Guillaume; Lustig, Sebastien; Neyret, Philippe; Servien, Elvire

    2017-12-01

    The need for a period of non-weight bearing after medial opening wedge high tibial osteotomy remains controversial. It is hypothesized that immediate weight bearing after medial opening wedge high tibial osteotomy would show no difference in functional scores at one year compared to delayed weight bearing. Fifty patients, median age 54 years (range 40-65), with medial compartment osteoarthritis, underwent a medial opening wedge high tibial osteotomy utilizing a locking plate without bone grafting. Patients were randomized into an Immediate or a Delayed (2 months) weight bearing group. All patients were assessed at one-year follow-up and the two groups compared. The primary outcome measure was the IKS score. Secondary outcome measures included the IKDC score, the VAS pain score and the rate of complications. The functional scores significantly improved in both groups. The IKS score increased from 142 ± 31 to 171 ± 26 in the Immediate group. Immediate weight bearing after medial opening wedge high tibial osteotomy had no effect on functional scores at 1 year follow-up and did not significantly increase the complication rate. Immediate weight bearing after medial opening wedge high tibial osteotomy appears to be safe and can allow some patients a quicker return to activities of daily living and a decreased convalescence period. Level of evidence: II.

  20. A sustainable business model for Open-Access journal publishing: a proposed plan for High-Energy Physics

    Jens Vigen

    2008-01-01

    The High Energy Physics community over the last 15 years has achieved so-called full green Open Access through the wide dissemination of preprints via arXiv, a central subject repository managed by Cornell University. However, green Open Access does not alleviate the economic difficulties of libraries as they are still expected to offer access to versions of record of the peer-reviewed literature. For this reason the particle physics community is now addressing the issue of gold Open Access by converting a set of the existing core journals to Open Access. A Working Party has been established to bring together funding agencies, laboratories and libraries into a single consortium, called SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). This consortium will engage with publishers to build a sustainable model for Open Access publishing. In this model, subscription fees from multiple institutions are replaced by contracts with publishers of Open Access journals, where the SCOAP3 consortium is a single financial partner.

  1. Nickel oxide film with open macropores fabricated by surfactant-assisted anodic deposition for high capacitance supercapacitors.

    Wu, Mao-Sung; Wang, Min-Jyle

    2010-10-07

    Nickel oxide film with open macropores prepared by anodic deposition in the presence of surfactant shows a very high capacitance of 1110 F g^-1 at a scan rate of 10 mV s^-1, and the capacitance value reduces to 950 F g^-1 at a high scan rate of 200 mV s^-1.
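
    As a reminder of how such values are typically extracted from cyclic voltammetry (one common convention, not necessarily the exact procedure used in this paper), the specific capacitance is obtained by integrating the current over one potential sweep:

        C_{sp} = \frac{\int I \, dV}{m\,\nu\,\Delta V},

    where m is the active mass, ν the scan rate and ΔV the potential window; the drop from 1110 to 950 F g^-1 at the faster sweep then corresponds to less charge being accessible per unit potential at 200 mV s^-1.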

  2. Open-Wedge High Tibial Osteotomy: RCT 2 Years RSA Follow-Up.

    Lind-Hansen, Thomas Bruno; Lind, Martin Carøe; Nielsen, Poul Torben; Laursen, Mogens Berg

    2016-11-01

    We investigated the influence of three different bone grafting materials on stability and clinical outcome of the healing open-wedge high tibial osteotomy (OW-HTO) with immediate partial weight bearing. A total of 45 (3 × 15) patients were randomized to injectable calcium phosphate cement (Calcibon; Biomet-Merck Biomaterials GmbH, Darmstadt, Germany), local bone autograft, or iliac crest autograft. Stability of the bony healing was evaluated with radiostereometric analysis (RSA) up to 24 months postoperatively. Clinical outcome was evaluated with the knee injury and osteoarthritis outcome score (KOOS). RSA revealed translations and rotations close to zero regardless of bone grafting material, with no statistically significant differences between the groups. Clinically, the Calcibon group had lower quality of life KOOS subscore at 2 years follow-up. We conclude that with a stable implant and 6 weeks of partial weight bearing, local autografting is sufficient to achieve solid bone consolidation following OW-HTO.

  3. Screensaver: an open source lab information management system (LIMS) for high throughput screening facilities

    Nale Jennifer

    2010-05-01

    Background: Shared-usage high throughput screening (HTS) facilities are becoming more common in academe as large-scale small molecule and genome-scale RNAi screening strategies are adopted for basic research purposes. These shared facilities require a unique informatics infrastructure that must not only provide access to and analysis of screening data, but must also manage the administrative and technical challenges associated with conducting numerous, interleaved screening efforts run by multiple independent research groups. Results: We have developed Screensaver, a free, open source, web-based lab information management system (LIMS), to address the informatics needs of our small molecule and RNAi screening facility. Screensaver supports the storage and comparison of screening data sets, as well as the management of information about screens, screeners, libraries, and laboratory work requests. To our knowledge, Screensaver is one of the first applications to support the storage and analysis of data from both genome-scale RNAi screening projects and small molecule screening projects. Conclusions: The informatics and administrative needs of an HTS facility may be best managed by a single, integrated, web-accessible application such as Screensaver. Screensaver has proven useful in meeting the requirements of the ICCB-Longwood/NSRB Screening Facility at Harvard Medical School, and has provided similar benefits to other HTS facilities.

  4. Screensaver: an open source lab information management system (LIMS) for high throughput screening facilities.

    Tolopko, Andrew N; Sullivan, John P; Erickson, Sean D; Wrobel, David; Chiang, Su L; Rudnicki, Katrina; Rudnicki, Stewart; Nale, Jennifer; Selfors, Laura M; Greenhouse, Dara; Muhlich, Jeremy L; Shamu, Caroline E

    2010-05-18

    Shared-usage high throughput screening (HTS) facilities are becoming more common in academe as large-scale small molecule and genome-scale RNAi screening strategies are adopted for basic research purposes. These shared facilities require a unique informatics infrastructure that must not only provide access to and analysis of screening data, but must also manage the administrative and technical challenges associated with conducting numerous, interleaved screening efforts run by multiple independent research groups. We have developed Screensaver, a free, open source, web-based lab information management system (LIMS), to address the informatics needs of our small molecule and RNAi screening facility. Screensaver supports the storage and comparison of screening data sets, as well as the management of information about screens, screeners, libraries, and laboratory work requests. To our knowledge, Screensaver is one of the first applications to support the storage and analysis of data from both genome-scale RNAi screening projects and small molecule screening projects. The informatics and administrative needs of an HTS facility may be best managed by a single, integrated, web-accessible application such as Screensaver. Screensaver has proven useful in meeting the requirements of the ICCB-Longwood/NSRB Screening Facility at Harvard Medical School, and has provided similar benefits to other HTS facilities.

  5. A High Precision $3.50 Open Source 3D Printed Rain Gauge Calibrator

    Lopez Alcala, J. M.; Udell, C.; Selker, J. S.

    2017-12-01

    Currently available rain gauge calibrators tend to be designed for specific rain gauges, are expensive, employ low-precision water reservoirs, and do not offer the flexibility needed to test the ever more popular small-aperture rain gauges. The objective of this project was to develop and validate a freely downloadable, open-source, 3D printed rain gauge calibrator that can be adjusted for a wide range of gauges. The proposed calibrator provides for applying low, medium, and high intensity flow, and allows the user to modify the design to conform to unique system specifications based on parametric design, which may be modified and printed using CAD software. To overcome the fact that different 3D printers yield different print qualities, we devised a simple post-printing step that controlled critical dimensions to assure robust performance. Specifically, the three orifices of the calibrator are drilled to reach the three target flow rates. Laboratory tests showed that flow rates were consistent between prints, and between trials of each part, while the total applied water was precisely controlled by the use of a volumetric flask as the reservoir.

  6. Mechanism of formation of subnanosecond current front in high-voltage pulse open discharge

    Schweigert, I. V.; Alexandrov, A. L.; Zakrevsky, Dm. E.; Bokhan, P. A.

    2014-11-01

    The mechanism of subnanosecond current front rise observed previously in the experiment in high-voltage pulse open discharge in helium is studied in kinetic particle-in-cell simulations. The Boltzmann equations for electrons, ions, and fast atoms are solved self-consistently with the Poisson equations for the electrical potential. The partial contributions to the secondary electron emission from the ions, fast atoms, photons, and electrons, bombarding the electrode, are calculated. In simulations, as in the experiment, the discharge glows between two symmetrical cathodes and the anode grid in the midplane at P =6 Torr and the applied voltage of 20 kV. The electron avalanche development is considered for two experimental situations during the last stage of breakdown: (i) with constant voltage and (ii) with decreasing voltage. For case (i), the subnanosecond current front rise is set by photons from the collisional excitation transfer reactions. For the case (ii), the energetic electrons swamp the cathode during voltage drop and provide the secondary electron emission for the subnanosecond current rise, observed in the experiment.

  7. Design of a high order Campbelling mode measurement system using open source hardware

    Izarra, G. de [CEA, DEN,DER, Experimental Programs Laboratory, Cadarache F-13108 Saint-Paul-lez-Durance (France); Elter, Zs. [Chalmers University of Technology, Department of Physics, Division of Subatomic and Plasma Physics, SE-412 96 Göteborg (Sweden); CEA, DEN,DER, Instrumentation, Sensors and Dosimetry Laboratory, Cadarache F-13108 Saint-Paul-lez-Durance (France); Jammes, C. [CEA, DEN,DER, Instrumentation, Sensors and Dosimetry Laboratory, Cadarache F-13108 Saint-Paul-lez-Durance (France)

    2016-12-11

    This paper presents a new real-time measurement instrument dedicated to online neutron monitoring with fission chambers in nuclear reactors. The instrument implements the higher order Campbelling methods and self-monitoring capabilities on an open source development board. The board includes a CPU/FPGA System on a Chip. The feasibility of the measurement instrument was tested both in the laboratory with a signal generator and in the Minerve reactor. It is shown that the instrument provides reliable and robust count rate estimation over a wide reactor power range based on the third order statistics of the fission chamber signal. In addition, the system is able to identify whether a measured count rate change is due to a malfunction of the detector or to a change in the neutron flux. The applied self-monitoring method is based on the spectral properties of the fission chamber signal. During the experimental verification, the considered malfunction was a change of the polarization voltage. - Highlights: • A new online High Order Campbelling measurement system is proposed. • It includes a fission chamber failure detection system. • The complete architecture of the measurement system is given. • Tests in a reactor show its accuracy over a wide count rate range.
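
    For context, higher order Campbelling rests on the generalized Campbell theorem: for a detector signal built from unit-amplitude pulses of shape f(t) arriving as a Poisson process of rate s_0, the nth-order cumulant of the signal is proportional to the count rate,

        \kappa_n = s_0 \int_{-\infty}^{\infty} f(t)^n \, dt,

    so the third-order statistics used here yield a count-rate estimate s_0 = κ_3 / ∫ f(t)^3 dt in which small-amplitude background pulses (e.g., from gamma rays) are strongly suppressed; the exact calibration implemented in the instrument described above may of course differ.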

  8. Nontrivial transition of transmission in a highly open quantum point contact in the quantum Hall regime

    Hong, Changki; Park, Jinhong; Chung, Yunchul; Choi, Hyungkook; Umansky, Vladimir

    2017-11-01

    Transmission through a quantum point contact (QPC) in the quantum Hall regime usually exhibits multiple resonances as a function of gate voltage and high nonlinearity in bias. Such behavior is unpredictable and changes sample by sample. Here, we report the observation of a sharp transition of the transmission through an open QPC at finite bias, which was observed consistently for all the tested QPCs. It is found that the bias dependence of the transition can be fitted to the Fermi-Dirac distribution function through universal scaling. The fitted temperature matches quite nicely to the electron temperature measured via shot-noise thermometry. While the origin of the transition is unclear, we propose a phenomenological model based on our experimental results that may help to understand such a sharp transition. Similar transitions are observed in the fractional quantum Hall regime, and it is found that the temperature of the system can be measured by rescaling the quasiparticle energy with the effective charge (e* = e/3). We believe that the observed phenomena can be exploited as a tool for measuring the electron temperature of the system and for studying the quasiparticle charges of the fractional quantum Hall states.
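
    For reference, the fitting function referred to above is the Fermi-Dirac distribution; in a generic scaled form (our notation, not necessarily the authors'), the transmission step as a function of energy or rescaled bias E is

        f(E) = \frac{1}{1 + \exp\!\left[(E - \mu)/k_B T\right]},

    so the width of the measured transition directly yields an electron temperature T, which the authors cross-check against shot-noise thermometry.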

  9. Resilience of SAR11 bacteria to rapid acidification in the high-latitude open ocean.

    Hartmann, Manuela; Hill, Polly G; Tynan, Eithne; Achterberg, Eric P; Leakey, Raymond J G; Zubkov, Mikhail V

    2016-02-01

    Ubiquitous SAR11 Alphaproteobacteria numerically dominate marine planktonic communities. Because they are excruciatingly difficult to cultivate, there is comparatively little known about their physiology and metabolic responses to long- and short-term environmental changes. As surface oceans take up anthropogenic, atmospheric CO2, the consequential process of ocean acidification could affect the global biogeochemical significance of SAR11. Shipping accidents or inadvertent release of chemicals from industrial plants can have strong short-term local effects on oceanic SAR11. This study investigated the effect of 2.5-fold acidification of seawater on the metabolism of SAR11 and other heterotrophic bacterioplankton along a natural temperature gradient crossing the North Atlantic Ocean, Norwegian and Greenland Seas. Uptake rates of the amino acid leucine by SAR11 cells as well as other bacterioplankton remained similar to controls despite an instant ∼50% increase in leucine bioavailability upon acidification. This high physiological resilience to acidification even without acclimation suggests that open ocean dominant bacterioplankton are able to cope even with sudden and therefore more likely with long-term acidification effects. © FEMS 2015. All rights reserved.

  10. Differences between opening versus closing high tibial osteotomy on clinical outcomes and gait analysis.

    Deie, Masataka; Hoso, Takayuki; Shimada, Noboru; Iwaki, Daisuke; Nakamae, Atsuo; Adachi, Nobuo; Ochi, Mitsuo

    2014-12-01

    High tibial osteotomy (HTO) for medial knee osteoarthritis (OA) is mainly performed via two procedures: closing wedge HTO (CW) and opening wedge HTO (OW). In this study, differences between these procedures were assessed by serial clinical evaluation and gait analysis before and after surgery. Twenty-one patients underwent HTO for medial knee OA in 2011 and 2012, with 12 patients undergoing CW and nine undergoing OW. The severity of OA was classified according to the Kellgren-Lawrence classification. The Japanese Orthopedic Association score for assessment of knee OA (JOA score), the Numeric Rating Scale (NRS), and the femoral tibial angle (FTA) on X-ray were evaluated. For gait analysis, gait speed, varus moment, varus angle and lateral thrust were calculated. The JOA score and NRS were improved significantly one year postoperatively in both groups. The FTA was maintained in both groups at one year. Varus angle and varus moment were significantly improved in both groups at each postoperative follow-up, when compared preoperatively. Lateral thrust was significantly improved at three months postoperatively in both groups. However, the significant improvement in lateral thrust had disappeared in the CW group six months postoperatively, whereas it was maintained for at least one year in the OW group. This study found that clinical outcomes were well maintained after HTO. OW reduced knee varus moment and lateral thrust, whereas CW had little effect on reducing lateral thrust. Level IV. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Probability and Measure

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  12. The concept of probability

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  13. Treatment of open tibial fracture with bone defect caused by high velocity missiles: A case report

    Golubović Zoran

    2013-01-01

    Full Text Available Introduction. Tibia fractures caused by high velocity missiles are mostly comminuted and followed by bone defects, which make the healing process extremely difficult and prone to numerous complications. Case Outline. A 34-year-old male was wounded at close range by a semi-automatic gun missile. He was wounded in the distal area of the left tibia and suffered a massive defect of the bone and soft tissue. After the primary treatment of the wound, the fracture was stabilized with an external fixator type Mitkovic, with convergent orientation of the pins. The wound in the medial region of the tibia was closed with a secondary suture, whereas the wound in the lateral area was covered with a skin graft after Thiersch. Due to the massive bone defect in the area of the rifle-missile wound, six months after injury a medical team placed a reconstructive external skeletal fixator type Mitkovic and performed corticotomy in the proximal metaphyseal area of the tibia. By the method of bone transport (distraction osteogenesis), the bone defect of the tibia was replaced. After fracture healing, seven months after the second surgery, the fixator was removed and the patient was referred to physical therapy. Conclusion. Surgical treatment of wounds, external fixation, necessary debridement, adequate antibiotic treatment and soft and bone tissue reconstruction are essential in achieving good results in patients with open tibial fractures with bone defects caused by high velocity missiles. Reconstruction of the bone defect can be successfully treated with the reconstructive external fixator Mitkovic. [Project of the Ministry of Science of the Republic of Serbia, no. III 41017 and no. III 41004]

  14. High Rate of Recurrence Following Proximal Medial Opening Wedge Osteotomy for Correction of Moderate Hallux Valgus.

    Iyer, Sravisht; Demetracopoulos, Constantine A; Sofka, Carolyn M; Ellis, Scott J

    2015-07-01

    The proximal medial opening wedge (PMOW) osteotomy has become more popular to treat moderate to severe hallux valgus with the recent development of specifically designed, low-profile modular plates. Despite the promising results previously reported in the literature, we have noted a high incidence of recurrence in patients treated with a PMOW. The purpose of this study was to report the clinical and radiographic outcomes of an initial cohort of patients treated with a PMOW osteotomy for moderate hallux valgus. We retrospectively analyzed prospectively gathered data on a cohort of 17 consecutive patients who were treated by the senior author using a PMOW osteotomy for moderate hallux valgus deformity. Average time to follow-up was 2.4 years (range, 1.0-3.5 years). The intermetatarsal angle (IMA), the hallux valgus angle (HVA), and the distal metatarsal articular angle (DMAA) were assessed on standard weightbearing radiographs of the foot preoperatively and at all follow-up visits. The Foot and Ankle Outcome Score (FAOS) was collected on all patients preoperatively and at final follow-up. Despite demonstrating good correction of their deformity initially, 11 of the 17 patients (64.7%) had evidence of recurrence of their hallux valgus deformity at final follow-up. Patients who recurred had a greater preoperative HVA (P = .023) and DMAA (P = .049) than patients who maintained their correction. Improvement in the quality-of-life subscale of the FAOS was noted at final follow-up for all patients (P = .05). There was no significant improvement in any of the other FAOS subscales. There was a high rate of recurrence of the hallux valgus deformity in this cohort of patients. Recurrence was associated with greater preoperative deformity and an increased preoperative DMAA. The PMOW without a concomitant distal metatarsal osteotomy may be best reserved for patients with mild hallux valgus deformity without an increased DMAA. Level IV, retrospective case series. © The Author

  15. Protective personality traits: High openness and low neuroticism linked to better memory in multiple sclerosis.

    Leavitt, Victoria M; Buyukturkoglu, Korhan; Inglese, Matilde; Sumowski, James F

    2017-11-01

    Memory impairment in multiple sclerosis (MS) is common, although few risk/protective factors are known. To examine relationships of personality to memory/non-memory cognition in MS. 80 patients completed a cognitive battery and a personality scale measuring the "Big 5" traits: openness, neuroticism, agreeableness, extraversion, and conscientiousness. Memory was most related to openness, with higher openness linked to better memory and lower risk for memory impairment, controlling for age, atrophy, education, and intelligence quotient (IQ). Lower neuroticism was also related to better memory, and lower conscientiousness to memory impairment. Non-memory cognition was unrelated to personality. Personality may inform predictive models of memory impairment in MS.

  16. Probability for statisticians

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  17. Concepts of probability theory

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.

  18. Probability and Bayesian statistics

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  19. Probability and Statistical Inference

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  20. Probabilities in physics

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  1. Probability an introduction

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  2. Probability in physics

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  3. Septal deviation and other factors increase the risk of barotitis media in high altitude high opening training

    Yanuar T. Sastranegara

    2008-03-01

    Full Text Available Barotitis media (BM) frequently occurs in High Altitude High Opening (HAHO) training simulation as a result of rapid pressure changes. The aim of this study was to investigate septal deviation and other risk factors that increase the risk of BM. This experimental study was conducted at the Indonesian Center for Aviation Medicine and Health (Lakespra Saryanto) during May – July 2007, involving Indonesian Armed Forces (TNI) HAHO training. Medical examinations were performed before and after training. An otolaryngologist confirmed the diagnosis of BM. Cox regression analysis using the STATA 9.0 program was performed to identify dominant risk factors for BM. A total of 177 subjects participated in this study. We found that 56.5% had BM after training. Septal deviation was found in 28.8% of the subjects and it moderately increased the risk of BM, by 23% compared with a normal septum [adjusted relative risk (RRα) = 1.23; 95% confidence interval (CI) = 0.95 – 1.60; p = 0.123]. Those who had been smoking for 1-3 years had a 70% higher risk of BM than non-smoking subjects (RRα = 1.68; 95% CI = 1.17 – 2.42). Those who had been in the force for 5 years or longer were 50% more at risk of BM than those who had been in the force for less than 5 years. In addition, trainees had a 40% higher risk than subjects with special qualifications for HAHO (RRα = 1.40; 95% CI = 0.99 – 1.97; p = 0.051). Special caution needs to be applied for those who have septal deviation, a longer working period, a 1-3 year smoking habit, and for trainees, to minimize the risk of BM. (Med J Indones 2008; 17: 37-42) Keywords: barotitis media, septal deviation, HAHO training simulation

  4. Probability in quantum mechanics

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  5. Quantum computing and probability.

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  6. Quantum computing and probability

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  7. Managing and understanding risk perception of surface leaks from CCS sites: risk assessment for emerging technologies and low-probability, high-consequence events

    Augustin, C. M.

    2015-12-01

    Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating the potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate the potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various duration and volume. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings - analyzed in Slovic and Weber's 2002 framework - that show a high unknown, high dread risk perception of leaks from a CCS site. Secondary findings are a
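
    To make the flavor of such a Bayesian rare-event estimate concrete, here is a generic Gamma-Poisson sketch (priors, counts, and site-years are hypothetical placeholders, not the paper's models or data):

```python
import numpy as np

# Gamma(a, b) prior on the annual leak rate; with k observed events in t
# site-years of monitoring, the posterior is Gamma(a + k, b + t).
a_prior, b_prior = 0.5, 10.0      # assumed weakly informative prior
k_events, t_years = 0, 25.0       # hypothetical monitoring record

a_post, b_post = a_prior + k_events, b_prior + t_years
rng = np.random.default_rng(1)
rate_samples = rng.gamma(a_post, 1.0 / b_post, 100_000)

# Probability of at least one leak in the next decade, marginalized over the rate.
p_leak_10yr = 1.0 - np.exp(-rate_samples * 10.0)
print("mean P(>=1 leak in 10 yr) ~", p_leak_10yr.mean())
```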

  8. Communicating Low-Probability High-Consequence Risk, Uncertainty and Expert Confidence: Induced Seismicity of Deep Geothermal Energy and Shale Gas.

    Knoblauch, Theresa A K; Stauffacher, Michael; Trutnevyte, Evelina

    2018-04-01

    Subsurface energy activities entail the risk of induced seismicity including low-probability high-consequence (LPHC) events. For designing respective risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand, the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear, less easy to understand and increased concern. Above all, the technology for which risks are communicated and its acceptance mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than in the DGE conditions. They also liked the risk communication overall less. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts in describing LPHC risks with numbers and optionally risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public. © 2017 Society for Risk Analysis.

  9. High-resolution spectroscopic observations of binary stars and yellow stragglers in three open clusters: NGC 2360, NGC 3680, and NGC 5822

    Sales Silva, J. V.; Peña Suárez, V. J.; Katime Santrich, O. J.; Pereira, C. B.; Drake, N. A.; Roig, F., E-mail: joaovictor@on.br, E-mail: jearim@on.br, E-mail: osantrich@on.br, E-mail: claudio@on.br, E-mail: drake@on.br, E-mail: froig@on.br [Observatório Nacional/MCT, Rua Gen. José Cristino, 77, 20921-400 Rio de Janeiro (Brazil)

    2014-11-01

    Binary stars in open clusters are very useful targets in constraining the nucleosynthesis process. The luminosities of the stars are known because the distances of the clusters are also known, so chemical peculiarities can be linked directly to the evolutionary status of a star. In addition, binary stars offer the opportunity to verify a relationship between them and the straggler population in both globular and open clusters. We carried out a detailed spectroscopic analysis to derive the atmospheric parameters for 16 red giants in binary systems and the chemical composition of 11 of them in the open clusters NGC 2360, NGC 3680, and NGC 5822. We obtained abundances of C, N, O, Na, Mg, Al, Ca, Si, Ti, Ni, Cr, Y, Zr, La, Ce, and Nd. The atmospheric parameters of the studied stars and their chemical abundances were determined using high-resolution optical spectroscopy. We employ the local thermodynamic equilibrium model atmospheres of Kurucz and the spectral analysis code MOOG. The abundances of the light elements were derived using the spectral synthesis technique. We found that the stars NGC 2360-92 and 96, NGC 3680-34, and NGC 5822-4 and 312 are yellow straggler stars. We show that the spectra of NGC 5822-4 and 312 present evidence of contamination by an A-type star as a secondary star. For the other yellow stragglers, evidence of contamination is given by the broad wings of the Hα. Detection of yellow straggler stars is important because the observed number can be compared with the number predicted by simulations of binary stellar evolution in open clusters. We also found that the other binary stars are not s-process enriched, which may suggest that in these binaries the secondary star is probably a faint main-sequence object. The lack of any s-process enrichment is very useful in setting constraints for the number of white dwarfs in the open cluster, a subject that is related to the birthrate of these kinds of stars in open clusters and also to the age of a

  10. High-resolution Spectroscopic Observations of Binary Stars and Yellow Stragglers in Three Open Clusters : NGC 2360, NGC 3680, and NGC 5822

    Sales Silva, J. V.; Peña Suárez, V. J.; Katime Santrich, O. J.; Pereira, C. B.; Drake, N. A.; Roig, F.

    2014-11-01

    Binary stars in open clusters are very useful targets in constraining the nucleosynthesis process. The luminosities of the stars are known because the distances of the clusters are also known, so chemical peculiarities can be linked directly to the evolutionary status of a star. In addition, binary stars offer the opportunity to verify a relationship between them and the straggler population in both globular and open clusters. We carried out a detailed spectroscopic analysis to derive the atmospheric parameters for 16 red giants in binary systems and the chemical composition of 11 of them in the open clusters NGC 2360, NGC 3680, and NGC 5822. We obtained abundances of C, N, O, Na, Mg, Al, Ca, Si, Ti, Ni, Cr, Y, Zr, La, Ce, and Nd. The atmospheric parameters of the studied stars and their chemical abundances were determined using high-resolution optical spectroscopy. We employ the local thermodynamic equilibrium model atmospheres of Kurucz and the spectral analysis code MOOG. The abundances of the light elements were derived using the spectral synthesis technique. We found that the stars NGC 2360-92 and 96, NGC 3680-34, and NGC 5822-4 and 312 are yellow straggler stars. We show that the spectra of NGC 5822-4 and 312 present evidence of contamination by an A-type star as a secondary star. For the other yellow stragglers, evidence of contamination is given by the broad wings of the Hα. Detection of yellow straggler stars is important because the observed number can be compared with the number predicted by simulations of binary stellar evolution in open clusters. We also found that the other binary stars are not s-process enriched, which may suggest that in these binaries the secondary star is probably a faint main-sequence object. The lack of any s-process enrichment is very useful in setting constraints for the number of white dwarfs in the open cluster, a subject that is related to the birthrate of these kinds of stars in open clusters and also to the age of a

  11. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains lower than the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are then determined so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
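
    A minimal sketch of the Gaussian exceedance estimate described above, assuming a zero-mean Gaussian instantaneous response with a given RMS value (the paper's analysis additionally sweeps damping and natural frequency; the numbers here are placeholders):

```python
from scipy.stats import norm

def exceedance_probability(sigma_um, criterion_um):
    """Two-sided probability that a zero-mean Gaussian displacement with RMS
    sigma_um exceeds the vibration criterion level criterion_um."""
    return 2.0 * (1.0 - norm.cdf(criterion_um / sigma_um))

print(exceedance_probability(sigma_um=1.5, criterion_um=3.0))  # ~0.046
```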

  12. The perception of probability.

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  13. Conditional Probability Modulates Visual Search Efficiency

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  14. High pressure does not counterbalance the advantages of open techniques over closed techniques during heated intraperitoneal chemotherapy with oxaliplatin.

    Facy, Olivier; Combier, Christophe; Poussier, Matthieu; Magnin, Guy; Ladoire, Sylvain; Ghiringhelli, François; Chauffert, B; Rat, Patrick; Ortega-Deballon, Pablo

    2015-01-01

    Heated intraperitoneal chemotherapy (HIPEC) treats residual microscopic disease after cytoreductive surgery. In experimental models, the open HIPEC technique has shown a higher and more homogeneous concentration of platinum in the peritoneum than achieved using the closed technique. A 25-cm H2O pressure enhances the penetration of oxaliplatin. Because pressure is easier to set up with the closed technique, high pressure may counterbalance the drawbacks of this technique versus open HIPEC, and a higher pressure may induce a higher penetration. Because a higher concentration does not mean deeper penetration, a study of the tissues beneath the peritoneum is required. Finally, achieving a deeper penetration (and a higher concentration) raises the question of the passage of drugs through the surgical glove and the surgeon's safety. Four groups of pigs underwent HIPEC with oxaliplatin (150 mg/L) for 30 minutes: open at isobaric pressure, open at 25 cm H2O, closed at 25 cm H2O, and closed at 40 cm H2O. Systemic absorption and peritoneal mapping of the concentration of platinum were analyzed, as well as the concentrations in the retroperitoneal tissue and the surgical gloves. Blood concentrations were higher in the open groups. In the parietal surfaces, the concentrations were not different between the isobaric and the closed groups (47.08, 56.39, and 48.57 mg/kg, respectively), but were higher in the open high-pressure group (85.93 mg/kg). In the visceral surfaces, they were lower in the closed groups (3.2 and 3.05 mg/kg) than in the open groups (7.03 and 9.56 mg/kg). Platinum concentrations were similar in the deep retroperitoneal tissue when compared between isobaric and high-pressure procedures. No platinum was detected on the internal aspect of the gloves. The use of high pressure during HIPEC does not counterbalance the drawbacks of closed techniques. The tissue concentration of oxaliplatin achieved with the open techniques is higher, even if high pressure is applied during a closed technique

  15. Irreversibility and conditional probability

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  16. The pleasures of probability

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  17. Experimental Probability in Elementary School

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  18. Alternative probability theories for cognitive psychology.

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  19. Improving Ranking Using Quantum Probability

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  20. Choice probability generating functions

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
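
    As one concrete instance of the gradient relation stated above (an illustration only; the paper's general CPGF construction is broader), the multinomial logit model corresponds to the log-sum-exp generating function:

```latex
% Multinomial logit as a CPGF example: the generating function is the
% log-sum-exp of the utilities and its gradient gives the choice probabilities.
\[
  G(v_1,\dots,v_J) = \log \sum_{j=1}^{J} e^{v_j},
  \qquad
  P_i = \frac{\partial G}{\partial v_i} = \frac{e^{v_i}}{\sum_{j=1}^{J} e^{v_j}}.
\]
```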

  1. Probability and stochastic modeling

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  2. Collision Probability Analysis

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  3. Beyond the drip-line: a high-resolution open-air Holocene hunter-gatherer sequence from highland Lesotho

    Mitchell, P

    2011-03-01

    Full Text Available The activities...

  4. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully-featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting in the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in terms of efficiency compared to a fully featured workstation.
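
    A generic illustration (not WavePy's own API) of how such FFT backends are typically compared: time a batch of 2-D FFTs of the size used for a wave-optics phase screen and report the per-transform cost.

```python
import time
import numpy as np

# Hypothetical benchmark: average wall-clock time of a 2048x2048 complex FFT,
# the dominant operation in split-step wave-optics propagation.
N, repeats = 2048, 20
field = (np.random.randn(N, N) + 1j * np.random.randn(N, N)).astype(np.complex64)

start = time.perf_counter()
for _ in range(repeats):
    np.fft.fft2(field)
elapsed = (time.perf_counter() - start) / repeats
print(f"numpy backend: {elapsed * 1e3:.1f} ms per {N}x{N} FFT")
```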

  5. OpenMM 7: Rapid development of high performance algorithms for molecular dynamics.

    Peter Eastman

    2017-07-01

    Full Text Available OpenMM is a molecular dynamics simulation toolkit with a unique focus on extensibility. It allows users to easily add new features, including forces with novel functional forms, new integration algorithms, and new simulation protocols. Those features automatically work on all supported hardware types (including both CPUs and GPUs) and perform well on all of them. In many cases they require minimal coding, just a mathematical description of the desired function. They also require no modification to OpenMM itself and can be distributed independently of OpenMM. This makes it an ideal tool for researchers developing new simulation methods, and also allows those new methods to be immediately available to the larger community.
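
    A minimal sketch of the extensibility described above: a custom bond force defined purely by a mathematical expression string. The import path follows the OpenMM 7 era ("simtk" namespace); masses and parameters are arbitrary illustration values.

```python
from simtk.openmm import System, CustomBondForce

# Two-particle system with a harmonic bond given as a plain energy expression.
system = System()
system.addParticle(14.0)   # mass in amu (arbitrary)
system.addParticle(14.0)

bond = CustomBondForce("0.5*k*(r-r0)^2")   # energy as a mathematical string
bond.addPerBondParameter("k")              # force constant, kJ/mol/nm^2
bond.addPerBondParameter("r0")             # equilibrium length, nm
bond.addBond(0, 1, [1000.0, 0.11])
system.addForce(bond)
```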

  6. High probability of avian influenza virus (H7N7) transmission from poultry to humans active in disease control on infected farms

    M.E.H. Bos (Marian); D.E. te Beest (Dennis); M. van Boven (Michiel); M.R.D.R.B. van Holle; A. Meijer (Adam); A. Bosman (Arnold); Y.M. Mulder (Yonne); M.P.G. Koopmans D.V.M. (Marion); A. Stegeman (Arjan)

    2010-01-01

    An epizootic of avian influenza (H7N7) caused a large number of human infections in The Netherlands in 2003. We used data from this epizootic to estimate infection probabilities for persons involved in disease control on infected farms. Analyses were based on databases containing

  7. Estimating Subjective Probabilities

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  8. Classic Problems of Probability

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  9. Identification of High-Variation Fields based on Open Satellite Imagery

    Jeppesen, Jacob Høxbroe; Jacobsen, Rune Hylsberg; Nyholm Jørgensen, Rasmus

    2017-01-01

    This paper proposes a simple method for categorizing fields on a regional level, with respect to intra-field variations. It aims to identify fields where the potential benefits of applying precision agricultural practices are highest from an economic and environmental perspective. The categorization is based on vegetation indices derived from Sentinel-2 satellite imagery. A case study on 7678 winter wheat fields is presented, which employs open data and open source software to analyze the satellite imagery. Furthermore, the method can be automated to deliver categorizations at every update...
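
    For illustration, a vegetation index such as NDVI can be computed directly from Sentinel-2 reflectance bands (B8 = near-infrared, B4 = red); the variation criterion below is a hypothetical stand-in, not the paper's exact categorization rule.

```python
import numpy as np

def ndvi(b8, b4):
    """Normalized Difference Vegetation Index from Sentinel-2 B8 (NIR) and B4 (red)."""
    return (b8 - b4) / (b8 + b4 + 1e-9)

def is_high_variation(field_ndvi, cv_threshold=0.15):
    """Flag a field whose NDVI coefficient of variation exceeds a threshold."""
    cv = field_ndvi.std() / (field_ndvi.mean() + 1e-9)
    return cv > cv_threshold
```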

  10. An Objective Theory of Probability (Routledge Revivals)

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  11. Five-Phase Five-Level Open-Winding/Star-Winding Inverter Drive for Low-Voltage/High-Current Applications

    Padmanaban, Sanjeevi Kumar; Blaabjerg, Frede; Wheeler, Patrick

    2016-01-01

    This paper proposes a five-phase five-level open-/star-winding multilevel AC converter suitable for low-voltage/high-current applications. The modular converter consists of a classical two-level five-phase voltage source inverter (VSI) with a slight reconfiguration to serve as a multilevel converter for open-/star-winding loads. Specifically, each phase of the VSI is built with one additional bi-directional switch (MOSFET/IGBT), and all five legs link to the neutral through two capacitors. The structure allows multilevel generation up to a five-level output with greater potential for fault tolerability under

  12. The "synergistic" action of mixed irradiation with high-LET and low-LET radiation

    Suzuki, Shozo

    1994-01-01

    The combined modalities of various agents such as radiation, chemicals and physical agents are often used, and exposure to mixtures of agents sometimes occurs in nature. However, it is not clear whether these combined effects are synergistic, partly because the definition of the term "synergism" is confusing, as pointed out by Streffer and Mueller. It is, of course, desirable that the definition should be simple and widely applicable to all agents. Yet the underlying mechanisms of the effects of different agents are probably different, and the mechanisms of combined effects are different and more complicated than those of a single agent. It is therefore important to define synergism taking each underlying mechanism into consideration. From this viewpoint, the definitions of synergism which have been used to date are examined with respect to the effect of a mixture of different types of radiation on cells, and they are shown to be inappropriate and misleading. This is probably attributable to simply treating the resulting phenomena (cell survival in most cases) without adequately taking into consideration the knowledge of underlying biological mechanisms when defining the synergism that may occur with irradiation. This commentary discusses the inappropriateness of current definitions and proposes a new definition in terms of biological mechanisms as a counterproposal. 16 refs., 6 figs

  13. Multimedia Open Educational Resources in Mathematics for High School Students with Learning Disabilities

    Park, Sanghoon; McLeod, Kenneth

    2018-01-01

    Open Educational Resources (OER) can offer educators the necessary flexibility for tailoring educational resources to better fit their educational goals. Although the number of OER repositories is growing fast, few studies have been conducted to empirically test the effectiveness of OER integration in the classroom. Furthermore, very little is…

  14. nDPI: Open-Source High-Speed Deep Packet Inspection

    Deri, Luca; Martinelli, Maurizio; Bujlow, Tomasz

    2014-01-01

    protocols became increasingly challenging, thus creating a motivation for creating tools and libraries for network protocol classification. This paper covers the design and implementation of nDPI, an open-source library for protocol classification using both packet header and payload. nDPI was extensively...

  15. Implementation of highly parallel and large scale GW calculations within the OpenAtom software

    Ismail-Beigi, Sohrab

    The need to describe electronic excitations with better accuracy than provided by band structures produced by Density Functional Theory (DFT) has been a long-term enterprise for the computational condensed matter and materials theory communities. In some cases, appropriate theoretical frameworks have existed for some time but have been difficult to apply widely due to computational cost. For example, the GW approximation incorporates a great deal of important non-local and dynamical electronic interaction effects but has been too computationally expensive for routine use in large materials simulations. OpenAtom is an open source massively parallel ab initio density functional software package based on plane waves and pseudopotentials (http://charm.cs.uiuc.edu/OpenAtom/) that takes advantage of the Charm++ parallel framework. At present, it is developed via a three-way collaboration, funded by an NSF SI2-SSI grant (ACI-1339804), between Yale (Ismail-Beigi), IBM T. J. Watson (Glenn Martyna) and the University of Illinois at Urbana-Champaign (Laxmikant Kale). We will describe the project and our current approach towards implementing large scale GW calculations with OpenAtom. Potential applications of large scale parallel GW software for problems involving electronic excitations in semiconductor and/or metal oxide systems will also be pointed out.

  16. Counterexamples in probability

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  17. Epistemology and Probability

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  18. Transition probabilities for atoms

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  19. Early deprivation increases high-leaning behavior, a novel anxiety-like behavior, in the open field test in rats.

    Kuniishi, Hiroshi; Ichisaka, Satoshi; Yamamoto, Miki; Ikubo, Natsuko; Matsuda, Sae; Futora, Eri; Harada, Riho; Ishihara, Kohei; Hata, Yoshio

    2017-10-01

    The open field test is one of the most popular ethological tests to assess anxiety-like behavior in rodents. In the present study, we examined the effect of early deprivation (ED), a model of early life stress, on anxiety-like behavior in rats. In ED animals, we failed to find significant changes in the time spent in the center or thigmotaxis area of the open field, the common indexes of anxiety-like behavior. However, we found a significant increase in high-leaning behavior in which animals lean against the wall standing on their hindlimbs while touching the wall with their forepaws at a high position. The high-leaning behavior was decreased by treatment with an anxiolytic, diazepam, and it was increased under intense illumination as observed in the center activity. In addition, we compared the high-leaning behavior and center activity under various illumination intensities and found that the high-leaning behavior is more sensitive to illumination intensity than the center activity in the particular illumination range. These results suggest that the high-leaning behavior is a novel anxiety-like behavior in the open field test that can complement the center activity to assess the anxiety state of rats. Copyright © 2017 Elsevier Ireland Ltd and Japan Neuroscience Society. All rights reserved.

  20. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gaseous and particulate matter (PM) emissions in China. Current emission estimates for open biomass burning are generally based on a single source (either statistical data or satellite-derived data) and thus contain large uncertainty due to data limitations. In this study, to quantify the amount of open biomass burning in 2015, we established a new estimation method for open biomass burning activity levels by combining bottom-up statistical data and top-down MODIS observations, considering three sub-category sources that use different activity data. For open crop residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and satellite observations from the MODIS burned area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by the combination of statistical data and the MODIS active fire product MCD14ML. Using fire radiative power (FRP), which is considered a better indicator of active fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated onto a 3 km × 3 km grid to obtain a high-resolution emission inventory. Our results showed that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The high-resolution open biomass burning emission inventory developed in this study could support air quality modeling and policy-making for pollution control.
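
    One common way to combine two activity estimates "weighted by their uncertainties" is inverse-variance weighting; the sketch below assumes that scheme and uses placeholder numbers, not values from the inventory.

```python
import numpy as np

def weighted_estimate(values, uncertainties):
    """Inverse-variance weighted combination of independent estimates."""
    w = 1.0 / np.asarray(uncertainties, dtype=float) ** 2
    return float(np.sum(w * np.asarray(values, dtype=float)) / np.sum(w))

statistical, sigma_stat = 120.0, 30.0   # e.g. crop residue burned (Gg), from yearbooks
satellite, sigma_sat = 90.0, 20.0       # e.g. from the MCD64A1 burned-area product
print(weighted_estimate([statistical, satellite], [sigma_stat, sigma_sat]))
```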

  1. Negative probability in the framework of combined probability

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  2. [Investigation of a new highly porous hydroxyapatite matrix for obliterating open mastoid cavities - application in guinea pigs bulla].

    Punke, C; Zehlicke, T; Boltze, C; Pau, H W

    2009-04-01

    Many different techniques for obliterating open mastoid cavities have been described. The results after the application of alloplastic materials such as hydroxyapatite and tricalcium phosphate were poor due to prolonged resorption, and extrusion of those materials has been reported. We investigated the applicability of a new high-porosity ceramic for obliterating large open mastoid cavities and tested it in an animal model (guinea pig bulla). A highly porous bone-inductive matrix (NanoBone), fabricated by a sol-gel technique, was administered unilaterally into the opened bullae of 30 guinea pigs. In each animal the opposite bulla was filled with Bio-Oss, a bone substitute consisting of mineral bovine bone. Histological evaluations were performed 1, 2, 3, 4, 5 and 12 weeks after implantation. After an initial phase of inflammatory reaction creating loose granulation tissue, we observed the formation of trabecular bone within the fourth week in both groups. From the fifth week on we found osteoclasts on the surfaces of NanoBone and Bio-Oss, with consecutive degradation of both materials. In our animal model study we found beneficial properties of the bone inductors NanoBone and Bio-Oss for obliterating open mastoid cavities.

  3. Microwave Interferometry Based On Open-ended Coaxial Technique for High Sensitivity Liquid Sensing

    H. Bakli

    2017-10-01

    Full Text Available This paper describes a modified open-ended coaxial technique for microwave dielectric characterization in liquid media. A calibration model is developed to relate the measured transmission coefficient to the local properties of the sample under test. As a demonstration, the permittivity of different sodium chloride solutions is experimentally determined. Accuracies of 0.17% and 0.19% are obtained respectively for the real and imaginary parts of dielectric permittivity at 5.9 GHz.

  4. Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery

    N. C. Wright

    2018-04-01

    Full Text Available Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.
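
    For readers unfamiliar with this kind of processing, the sketch below shows schematically how per-pixel classification into the three surface categories yields surface-coverage fractions. It uses a generic random-forest classifier on made-up RGB training pixels; it is only an illustration of the general idea, not the authors' published algorithm.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Hypothetical training pixels (RGB) with labels:
        # 0 = snow/bare ice, 1 = melt pond/submerged ice, 2 = open water
        X_train = np.array([[230, 231, 235], [ 60, 120, 180], [ 10, 15, 25],
                            [220, 225, 228], [ 70, 130, 190], [  5, 10, 20]], dtype=float)
        y_train = np.array([0, 1, 2, 0, 1, 2])

        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        clf.fit(X_train, y_train)

        # Classify a small set of image pixels and report surface-coverage fractions
        pixels = np.array([[225, 228, 232], [65, 125, 185], [8, 12, 22], [228, 230, 233]],
                          dtype=float)
        labels = clf.predict(pixels)
        fractions = np.bincount(labels, minlength=3) / labels.size
        print(dict(zip(["snow/ice", "melt pond", "open water"], fractions)))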

  5. Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery

    Wright, Nicholas C.; Polashenski, Chris M.

    2018-04-01

    Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.

  6. Contributions to quantum probability

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  7. Bayesian Probability Theory

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  8. Contributions to quantum probability

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  9. Waste Package Misload Probability

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step is to categorize each fuel-handling event that occurred at nuclear power plants; the categories are based on whether FAs were damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
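
    The calculation outlined above reduces to counting categorized events and dividing by the total number of FA movements. A minimal sketch with purely hypothetical counts (the real inputs come from the Framatome ANP 2001a report):

        # Hypothetical counts, for illustration only
        misload_events = 5            # FAs placed in the wrong location
        damage_events = 12            # FAs damaged during movement
        total_fa_movements = 250_000  # total FA movements in the reviewed period

        p_misload = misload_events / total_fa_movements
        p_damage = damage_events / total_fa_movements
        print(f"P(misload per FA movement) ~ {p_misload:.1e}")
        print(f"P(damage per FA movement)  ~ {p_damage:.1e}")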

  10. Probability theory and applications

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  11. Paradoxes in probability theory

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, although the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Measurement uncertainty and probability

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  13. Model uncertainty and probability

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  14. Retrocausality and conditional probability

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two counts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  15. Probability via expectation

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  16. Spatial probability aids visual stimulus discrimination

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  17. Experimental demonstration of OpenFlow-enabled media ecosystem architecture for high-end applications over metro and core networks.

    Ntofon, Okung-Dike; Channegowda, Mayur P; Efstathiou, Nikolaos; Rashidi Fard, Mehdi; Nejabati, Reza; Hunter, David K; Simeonidou, Dimitra

    2013-02-25

    In this paper, a novel Software-Defined Networking (SDN) architecture is proposed for high-end Ultra High Definition (UHD) media applications. UHD media applications require huge amounts of bandwidth that can only be met with high-capacity optical networks. In addition, there are requirements for control frameworks capable of delivering effective application performance with efficient network utilization. A novel SDN-based Controller that tightly integrates application-awareness with network control and management is proposed for such applications. An OpenFlow-enabled test-bed demonstrator is reported with performance evaluations of advanced online and offline media- and network-aware schedulers.

  18. Radical covalent organic frameworks: a general strategy to immobilize open-accessible polyradicals for high-performance capacitive energy storage.

    Xu, Fei; Xu, Hong; Chen, Xiong; Wu, Dingcai; Wu, Yang; Liu, Hao; Gu, Cheng; Fu, Ruowen; Jiang, Donglin

    2015-06-01

    Ordered π-columns and open nanochannels found in covalent organic frameworks (COFs) could render them able to store electric energy. However, the synthetic difficulty in achieving redox-active skeletons has thus far restricted their potential for energy storage. A general strategy is presented for converting a conventional COF into an outstanding platform for energy storage through post-synthetic functionalization with organic radicals. The radical frameworks with openly accessible polyradicals immobilized on the pore walls undergo rapid and reversible redox reactions, leading to capacitive energy storage with high capacitance, high-rate kinetics, and robust cycle stability. The results suggest that channel-wall functional engineering with redox-active species will be a facile and versatile strategy to explore COFs for energy storage. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Origin of Reduced Open-Circuit Voltage in Highly Efficient Small-Molecule-Based Solar Cells upon Solvent Vapor Annealing.

    Deng, Wanyuan; Gao, Ke; Yan, Jun; Liang, Quanbin; Xie, Yuan; He, Zhicai; Wu, Hongbin; Peng, Xiaobin; Cao, Yong

    2018-03-07

    In this study, we demonstrate that the remarkably reduced open-circuit voltage in highly efficient organic solar cells (OSCs) from a blend of phenyl-C61-butyric acid methyl ester and a recently developed conjugated small molecule (DPPEZnP-THD) upon solvent vapor annealing (SVA) is due to two independent sources: increased radiative recombination and increased nonradiative recombination. Through measurements of electroluminescence from the emission of the charge-transfer state and of the photovoltaic external quantum efficiency, we quantify that the open-circuit voltage losses in a device with SVA due to radiative recombination and nonradiative recombination are 0.23 and 0.31 V, respectively, which are 0.04 and 0.07 V higher than those of the as-cast device. Despite the reduced open-circuit voltage, the device with SVA exhibited enhanced dissociation of charge-transfer excitons, leading to an improved short-circuit current density and a remarkable power conversion efficiency (PCE) of 9.41%, one of the best for solution-processed OSCs based on small-molecule donor materials. Our study also clearly shows that removing the nonradiative recombination pathways and/or suppressing energetic disorder in the active layer would result in more long-lived charge carriers and enhanced open-circuit voltage, which are prerequisites for further improving the PCE.

  20. Probability mapping of contaminants

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
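
    The post-processing step described here, turning an ensemble of equally likely geostatistical realizations into a probability-of-exceedance map, can be sketched as follows (Python, with synthetic data standing in for the Fernald soil geochemistry):

        import numpy as np

        def exceedance_probability(realizations, threshold):
            """Per-cell probability of exceeding a contamination threshold,
            computed as the fraction of equally likely realizations above it.
            `realizations` has shape (n_realizations, ny, nx)."""
            realizations = np.asarray(realizations, dtype=float)
            return (realizations > threshold).mean(axis=0)

        # Hypothetical example: 500 realizations of a 40 x 40 grid of concentrations
        rng = np.random.default_rng(1)
        ensemble = rng.lognormal(mean=3.0, sigma=0.5, size=(500, 40, 40))
        prob_map = exceedance_probability(ensemble, threshold=30.0)  # e.g. a clean-up level
        print(prob_map.shape, prob_map.min(), prob_map.max())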

  1. Probability mapping of contaminants

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  2. Probability of causation approach

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  3. Generalized Probability Functions

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.
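
    A minimal sketch of such a one-parameter pair, written in the common convention in which the generalized logarithm reduces to the natural logarithm as the parameter tends to zero (the paper's exact parameterization may differ):

        import numpy as np

        def gen_log(x, q):
            """One-parameter generalized logarithm (x**q - 1)/q; ln(x) as q -> 0."""
            x = np.asarray(x, dtype=float)
            return np.log(x) if abs(q) < 1e-12 else (x**q - 1.0) / q

        def gen_exp(y, q):
            """Inverse of gen_log: (1 + q*y)**(1/q); exp(y) as q -> 0."""
            y = np.asarray(y, dtype=float)
            return np.exp(y) if abs(q) < 1e-12 else np.power(1.0 + q * y, 1.0 / q)

        x = 2.5
        for q in (1e-9, 0.3, 1.0):
            # the round trip gen_exp(gen_log(x)) recovers x for every parameter value
            print(q, gen_log(x, q), gen_exp(gen_log(x, q), q))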

  4. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2 m posting elevation model of the Arctic and an 8 m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible due to three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management strategies the team needed in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  5. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform 2- and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities, including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
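
    To give a flavor of the core capability, unperturbed two-body propagation, here is a minimal generic sketch in Python using SciPy. It is not the SCENIC/ODTBX code, and the orbit below is hypothetical.

        import numpy as np
        from scipy.integrate import solve_ivp

        MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

        def two_body(t, state):
            """Point-mass two-body equations of motion (no perturbations)."""
            r, v = state[:3], state[3:]
            a = -MU_EARTH * r / np.linalg.norm(r)**3
            return np.concatenate((v, a))

        # Hypothetical circular low-Earth orbit at roughly 700 km altitude
        r0 = np.array([7078.0, 0.0, 0.0])                      # km
        v0 = np.array([0.0, np.sqrt(MU_EARTH / 7078.0), 0.0])  # km/s
        sol = solve_ivp(two_body, (0.0, 6000.0), np.concatenate((r0, v0)),
                        rtol=1e-9, atol=1e-9)
        print("position after 6000 s [km]:", sol.y[:3, -1])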

  6. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    Shimada, Yoshio

    2000-01-01

    Changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, in turn, the core damage probability; the size of the change is expected to depend on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, that can calculate core damage probability in a short time, while varying individual component failure probabilities between 0 and 1 and using either Japanese or American initiating event frequency data. The analysis showed that: (1) the frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high-pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed with care, since the core damage probability changes substantially when the base failure probability increases; (2) core damage probability is insensitive to surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since it changes little even when their failure probabilities change by about an order of magnitude; (3) when Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by an order of magnitude from the base value, whereas with American failure probability data the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, the core damage probability is insensitive to changes in surveillance test frequency. (author)
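
    The kind of sensitivity being assessed can be illustrated with a toy single-sequence model (hypothetical numbers, not the NRC accident sequence precursor model): if core damage requires the initiating event plus failure of two mitigating systems, scaling one component's failure probability scales the sequence frequency in direct proportion.

        def core_damage_frequency(ie_freq, p_hpi_fail, p_afw_fail):
            """Toy single-sequence model: initiating event AND failure of
            high-pressure injection AND failure of auxiliary feedwater."""
            return ie_freq * p_hpi_fail * p_afw_fail

        base = core_damage_frequency(ie_freq=1e-2, p_hpi_fail=1e-3, p_afw_fail=1e-3)
        for factor in (1, 10, 100):
            cdf = core_damage_frequency(1e-2, 1e-3 * factor, 1e-3)
            print(f"HPI failure probability x{factor}: CDF = {cdf:.1e}/yr ({cdf/base:.0f}x base)")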

  7. High-resolution Spectroscopic Observations of Single Red Giants in Three Open Clusters: NGC 2360, NGC 3680, and NGC 5822

    Peña Suárez, V. J.; Sales Silva, J. V.; Katime Santrich, O. J.; Drake, N. A.; Pereira, C. B.

    2018-02-01

    Single stars in open clusters with known distances are important targets in constraining the nucleosynthesis process since their ages and luminosities are also known. In this work, we analyze a sample of 29 single red giants of the open clusters NGC 2360, NGC 3680, and NGC 5822 using high-resolution spectroscopy. We obtained atmospheric parameters, abundances of the elements C, N, O, Na, Mg, Al, Ca, Si, Ti, Ni, Cr, Y, Zr, La, Ce, and Nd, as well as radial and rotational velocities. We employed the local thermodynamic equilibrium atmospheric models of Kurucz and the spectral analysis code MOOG. Rotational velocities and light-element abundances were derived using spectral synthesis. Based on our analysis of the single red giants in these three open clusters, we could compare, for the first time, their abundance pattern with that of the binary stars of the same clusters previously studied. Our results show that the abundances of both single and binary stars of the open clusters NGC 2360, NGC 3680, and NGC 5822 do not have significant differences. For the elements created by the s-process, we observed that the open clusters NGC 2360, NGC 3680, and NGC 5822 also follow the trend already raised in the literature that young clusters have higher s-process element abundances than older clusters. Finally, we observed that the three clusters of our sample exhibit a trend in the [Y/Mg]-age relation, which may indicate the ability of the [Y/Mg] ratio to be used as a clock for the giants. Based on the observations made with the 2.2 m telescope at the European Southern Observatory (La Silla, Chile) under an agreement with Observatório Nacional and under an agreement between Observatório Nacional and Max-Planck Institute für Astronomie.

  8. Symmetry-Breaking Charge Transfer in a Zinc Chlorodipyrrin Acceptor for High Open Circuit Voltage Organic Photovoltaics

    Bartynski, Andrew N.

    2015-04-29

    © 2015 American Chemical Society. Low open-circuit voltages significantly limit the power conversion efficiency of organic photovoltaic devices. Typical strategies to enhance the open-circuit voltage involve tuning the HOMO and LUMO positions of the donor (D) and acceptor (A), respectively, to increase the interfacial energy gap or to tailor the donor or acceptor structure at the D/A interface. Here, we present an alternative approach to improve the open-circuit voltage through the use of a zinc chlorodipyrrin, ZCl [bis(dodecachloro-5-mesityldipyrrinato)zinc], as an acceptor, which undergoes symmetry-breaking charge transfer (CT) at the donor/acceptor interface. DBP/ZCl cells exhibit open-circuit voltages of 1.33 V compared to 0.88 V for analogous tetraphenyldibenzoperyflanthrene (DBP)/C60-based devices. Charge transfer state energies measured by Fourier-transform photocurrent spectroscopy and electroluminescence show that C60 forms a CT state of 1.45 ± 0.05 eV in a DBP/C60-based organic photovoltaic device, while ZCl as acceptor gives a CT state energy of 1.70 ± 0.05 eV in the corresponding device structure. In the ZCl device this results in an energetic loss between ECT and qVOC of 0.37 eV, substantially less than the 0.6 eV typically observed for organic systems and equal to the recombination losses seen in high-efficiency Si and GaAs devices. The substantial increase in open-circuit voltage and reduction in recombination losses for devices utilizing ZCl demonstrate the great promise of symmetry-breaking charge transfer in organic photovoltaic devices.
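
    Using only the numbers quoted in the abstract, the relation between the charge-transfer state energy, the total recombination loss, and the open-circuit voltage works out as follows (a consistency check on the quoted values, not additional data):

        \begin{align*}
          qV_{\mathrm{OC}} &= E_{\mathrm{CT}} - \Delta E_{\mathrm{loss}} \\
          \text{DBP/ZCl:}  &\quad 1.70\ \mathrm{eV} - 0.37\ \mathrm{eV} = 1.33\ \mathrm{eV}
            \;\Rightarrow\; V_{\mathrm{OC}} = 1.33\ \mathrm{V} \\
          \text{DBP/C}_{60}\text{:} &\quad 1.45\ \mathrm{eV} - 0.57\ \mathrm{eV} = 0.88\ \mathrm{eV}
            \;\Rightarrow\; V_{\mathrm{OC}} = 0.88\ \mathrm{V}
        \end{align*}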

  9. Winter School on Operator Spaces, Noncommutative Probability and Quantum Groups

    2017-01-01

    Providing an introduction to current research topics in functional analysis and its applications to quantum physics, this book presents three lectures surveying recent progress and open problems.  A special focus is given to the role of symmetry in non-commutative probability, in the theory of quantum groups, and in quantum physics. The first lecture presents the close connection between distributional symmetries and independence properties. The second introduces many structures (graphs, C*-algebras, discrete groups) whose quantum symmetries are much richer than their classical symmetry groups, and describes the associated quantum symmetry groups. The last lecture shows how functional analytic and geometric ideas can be used to detect and to quantify entanglement in high dimensions.  The book will allow graduate students and young researchers to gain a better understanding of free probability, the theory of compact quantum groups, and applications of the theory of Banach spaces to quantum information. The l...

  10. Blasting methods for heterogeneous rocks in hillside open-pit mines with high and steep slopes

    Chen, Y. J.; Chang, Z. G.; Chao, X. H.; Zhao, J. F.

    2017-06-01

    In the arid desert areas in Xinjiang, most limestone quarries are hillside open-pit mines (OPMs) where the limestone is hard, heterogeneous, and fractured, and can be easily broken into large blocks by blasting. This study tried to find effective technical methods for blasting heterogeneous rocks in such quarries based on an investigation into existing problems encountered in actual mining at Hongshun Limestone Quarry in Xinjiang. This study provided blasting schemes for hillside OPMs with different heights and slopes. These schemes involve the use of vertical deep holes, oblique shallow holes, and downslope hole-by-hole sublevel or simultaneous detonation techniques. In each bench, the detonations of holes in a detonation unit occur at intervals of 25-50 milliseconds. The research findings can offer technical guidance on how to blast heterogeneous rocks in hillside limestone quarries.

  11. Highly conductive, transparent flexible films based on open rings of multi-walled carbon nanotubes

    Ko, Wen-Yin; Su, Jun-Wei; Guo, Chian-Hua; Fu, Shu-Juan; Hsu, Chuen-Yuan; Lin, Kuan-Jiuh

    2011-01-01

    Open rings of multi-walled carbon nanotubes were stacked to form porous networks on a poly(ethylene terephthalate) substrate to form a flexible conducting film (MWCNT-PET) with good electrical conductivity and transparency by a combination of ultrasonic atomization and spin-coating technique. To enhance the electric flexibility, we spin-coated a cast film of poly(vinyl alcohol) onto the MWCNT-PET substrate, which then underwent a thermo-compression process. Field-emission scanning electron microscopy of the cross-sectional morphology illustrates that the film has a robust network with a thickness of ∼ 175 nm, and it remarkably exhibits a sheet resistance of approximately 370 Ω/sq with ∼ 77% transmittance at 550 nm even after 500 bending cycles. This electrical conductivity is much superior to that of other MWCNT-based transparent flexible films.

  12. Stacking Orientation Mediation of Pentacene and Derivatives for High Open-Circuit Voltage Organic Solar Cells.

    Chou, Chi-Ta; Lin, Chien-Hung; Tai, Yian; Liu, Chin-Hsin J; Chen, Li-Chyong; Chen, Kuei-Hsien

    2012-05-03

    In this Letter, we investigated the effect of the molecular stacking orientation on the open circuit voltage (VOC) of pentacene-based organic solar cells. Two functionalized pentacenes, namely, 6,13-diphenyl-pentacene (DP-penta) and 6,13-dibiphenyl-4-yl-pentacene (DB-penta), were utilized. Different molecular stacking orientations of the pentacene derivatives from the pristine pentacene were identified by angle-dependent near-edge X-ray absorption fine structure measurements. It is concluded that pentacene molecules stand up on the substrate surface, while both functionalized pentacenes lie down. A significant increase of the VOC from 0.28 to 0.83 V can be achieved upon the utilization of functionalized pentacene, owing to the modulation of molecular stacking orientation, which induced a vacuum-level shift.

  13. Striatal activity is modulated by target probability.

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  14. How partnership accelerates Open Science: High Energy Physics and INSPIRE, a case study of a complex repository ecosystem

    AUTHOR|(CDS)2079501; Hecker, Bernard Louis; Holtkamp, Annette; Mele, Salvatore; O'Connell, Heath; Sachs, Kirsten; Simko, Tibor; Schwander, Thorsten

    2013-01-01

    Public calls, agency mandates and scientist demand for Open Science are by now a reality with different nuances across diverse research communities. A complex “ecosystem” of services and tools, mostly community-driven, will underpin this revolution in science. Repositories stand to accelerate this process, as “openness” evolves beyond text, in lockstep with scholarly communication. We present a case study of a global discipline, High-Energy Physics (HEP), where most of these transitions have already taken place in a “social laboratory” of multiple global information services interlinked in a complex, but successful, ecosystem at the service of scientists. We discuss our first-hand experience, at a technical and organizational level, of leveraging partnership across repositories and with the user community in support of Open Science, along threads relevant to the OR2013 community.

  15. Next Generation Space Interconnect Standard (NGSIS): a modular open standards approach for high performance interconnects for space

    Collier, Charles Patrick

    2017-04-01

    The Next Generation Space Interconnect Standard (NGSIS) effort is a Government-Industry collaboration to define a set of standards for interconnects between space system components, with the goal of cost-effectively removing bandwidth as a constraint for future space systems. The NGSIS team has selected the ANSI/VITA 65 OpenVPX™ standard family for the physical baseline. The RapidIO protocol has been selected as the basis for the digital data transport. The NGSIS standards are developed to provide sufficient flexibility to enable users to implement a variety of system configurations, while meeting goals for interoperability and robustness for space. The NGSIS approach and effort represents a radical departure from past approaches to achieve a Modular Open System Architecture (MOSA) for space systems and serves as an exemplar for the civil, commercial, and military Space communities as well as a broader high reliability terrestrial market.

  16. Probable maximum flood control

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility.

  17. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
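
    A minimal sketch of the Monte Carlo step for a single grid cell, using one common form of the infinite-slope factor of safety with relative wetness (the parameter distributions below are illustrative, not the values used in the study):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 5000  # Monte Carlo samples for one grid cell

        # Illustrative parameter distributions
        phi   = np.deg2rad(rng.uniform(30, 40, n))      # internal friction angle [rad]
        C     = rng.uniform(2e3, 8e3, n)                # root + soil cohesion [Pa]
        hs    = rng.uniform(0.5, 1.5, n)                # soil depth [m]
        Rw    = np.clip(rng.normal(0.6, 0.2, n), 0, 1)  # relative wetness [-]
        theta = np.deg2rad(35.0)                        # local slope angle
        rho_s, rho_w, g = 1800.0, 1000.0, 9.81          # densities [kg/m^3], gravity [m/s^2]

        # One common form of the infinite-slope factor of safety
        FS = (C / (hs * rho_s * g * np.sin(theta) * np.cos(theta))
              + (np.tan(phi) / np.tan(theta)) * (1.0 - Rw * rho_w / rho_s))

        p_initiation = np.mean(FS < 1.0)  # probability of landslide initiation for this cell
        print(f"P(FS < 1) = {p_initiation:.2f}")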

  18. A hydroclimatological approach to predicting regional landslide probability using Landlab

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  19. First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful

    Muths, Erin L.; Scherer, R. D.; Amburgey, S. M.; Matthews, T.; Spencer, A. W.; Corn, P.S.

    2016-01-01

    In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data are elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67–0.79 and 5–7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information–theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark–recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.

  20. High-latitude convection on open and closed field lines for large IMF B(y)

    Moses, J. J.; Crooker, N. U.; Gorney, D. J.; Siscoe, G. L.

    1985-01-01

    S3-3 electric field observations for August 23, 1976, show a single convection cell engulfing the northern polar cap. The flow direction is that for a positive IMF B(y) component. The particle data indicate that nearly half the duskside sunward flow occurs on closed field lines whereas the dawnside flow is entirely on open field lines. This is interpreted in terms of an IMF B(y)-induced deformation in the polar cap boundary, where the deformation moves with the convective flow. Thus, convection streamlines cross the deformed polar cap boundary, but no flow crosses the boundary because it is carried by the flow. Since southern hemisphere convection is expected to occur with the opposite sense of rotation, closed field lines are predicted to be forced to tilt azimuthally. On the nightside the tilt produces a y component of the magnetic field in the same direction as the IMF for either sign of IMF B(y). This interpretation is consistent with observations of a greater y component in the plasma sheet than in the tail lobes, which are difficult to understand in terms of the common explanation of IMF penetration. Alternatives to this interpretation are also discussed.

  1. High prevalence of anxiety and depression in patients with primary open-angle glaucoma.

    Mabuchi, Fumihiko; Yoshimura, Kimio; Kashiwagi, Kenji; Shioe, Kunihiko; Yamagata, Zentaro; Kanba, Shigenobu; Iijima, Hiroyuki; Tsukahara, Shigeo

    2008-01-01

    To assess anxiety and depression in patients with primary open-angle glaucoma (POAG). Multicenter prospective case-control study. Two hundred thirty patients with POAG and 230 sex-matched and age-matched reference subjects with no chronic ocular conditions except cataracts. Anxiety and depression were evaluated using Hospital Anxiety and Depression Scale (HADS) questionnaire, which consists of 2 subscales with ranges of 0 to 21, representing anxiety (HADS-A) and depression (HADS-D). The prevalence of POAG patients with anxiety (a score of more than 10 on the HADS-A) or depression (a score of more than 10 on the HADS-D) was compared with that in the reference subjects. The prevalence of patients with depression was compared between the POAG patients with and without current beta-blocker eye drops. The prevalence (13.0%) of POAG patients with anxiety was significantly higher (P=0.030) than in the reference subjects (7.0%). The prevalence (10.9%) of POAG patients with depression was significantly higher (P=0.026) than in the reference subjects (5.2%). Between the POAG patients with and without beta-blocker eye-drops, no significant difference (P=0.93) in the prevalence of depression was noted. POAG was related to anxiety and depression. No significant relationship between the use of beta-blocker eye-drops and depression was noted.

  2. Simulation in CFD of a Pebble Bed: Advanced high temperature reactor core using OpenFOAM

    Dahl, Pamela M.; Su, Jian

    2017-01-01

    Numerical simulations of a Pebble Bed nuclear reactor core are presented using the multi-physics tool-kit OpenFOAM. The HTR-PM is modeled using the porous media approach, accounting both for viscous and inertial effects through the Darcy and Forchheimer model. Initially, cylindrical 2D and 3D simulations are compared, in order to evaluate their differences and decide if the 2D simulations carry enough of the sought information, considering the savings in computational costs. The porous medium is considered to be isotropic, with the whole length of the packed bed occupied homogeneously with the spherical fuel elements. Steady-state simulations for normal equilibrium operation are performed, using a semi sine function of the power density along the vertical axis as the source term for the energy balance equation. Total pressure drop is calculated and compared with that obtained from literature for a similar case. At a second stage, transient simulations are performed, where relevant parameters are calculated and compared to those of the literature. (author)
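
    For reference, the porous-media treatment referred to here adds a momentum sink of the Darcy-Forchheimer form. In one common notation (the coefficients d and f are bed properties that would be calibrated, e.g. against an Ergun-type correlation; the exact values used for the HTR-PM are not reproduced here):

        \[
          S_i \;=\; -\Bigl(\mu\, d \;+\; \tfrac{1}{2}\,\rho\,\lvert \mathbf{u}\rvert\, f \Bigr)\, u_i ,
        \]

    so that for one-dimensional flow through a bed of length L the pressure drop behaves as
    \( \Delta p / L = \mu\, d\, U + \tfrac{1}{2}\,\rho\, f\, U^{2} \), a viscous (Darcy) term plus an inertial (Forchheimer) term.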

  3. Simulation in CFD of a Pebble Bed: Advanced high temperature reactor core using OpenFOAM

    Dahl, Pamela M.; Su, Jian, E-mail: sujian@nuclear.ufrj.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2017-07-01

    Numerical simulations of a Pebble Bed nuclear reactor core are presented using the multi-physics tool-kit OpenFOAM. The HTR-PM is modeled using the porous media approach, accounting both for viscous and inertial effects through the Darcy and Forchheimer model. Initially, cylindrical 2D and 3D simulations are compared, in order to evaluate their differences and decide if the 2D simulations carry enough of the sought information, considering the savings in computational costs. The porous medium is considered to be isotropic, with the whole length of the packed bed occupied homogeneously with the spherical fuel elements. Steady-state simulations for normal equilibrium operation are performed, using a semi sine function of the power density along the vertical axis as the source term for the energy balance equation. Total pressure drop is calculated and compared with that obtained from literature for a similar case. At a second stage, transient simulations are performed, where relevant parameters are calculated and compared to those of the literature. (author)

  4. Probability concepts in quality risk management.

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of frequency-based (data-based) measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  5. Plasma erosion opening switch in the double-pulse operation mode of a high-current electron accelerator

    Isakov, I.F.; Lopatin, V.S.; Remnev, G.E.

    1987-01-01

    This paper reports the results of investigations of the operation of a fast current opening switch, with a 10^13-10^16 plasma density produced either by dielectric surface flashover or by explosive emission of graphite. A series of two pulses was applied to two diodes in parallel. The first pulse produced plasma in the first diode which closed that diode gap by the arrival time of the second pulse. The first, shorted, diode then acted as an erosion switch for the second pulse. A factor of 2.5-3 power multiplication was obtained under optimum conditions. The opening-switch resistance during the magnetic insulation phase, neglecting the electron losses between the switch and the generating diode, exceeded 100 Ω. The duration of the rapid opening phase was less than 5 ns under optimum conditions. This method of plasma production does not require external plasma sources, and permits a wide variation of plasma density, which in turn allows high inductor currents and stored energies.

  6. The eGo grid model: An open source approach towards a model of German high and extra-high voltage power grids

    Mueller, Ulf Philipp; Wienholt, Lukas; Kleinhans, David; Cussmann, Ilka; Bunke, Wolf-Dieter; Pleßmann, Guido; Wendiggensen, Jochen

    2018-02-01

    There are several power grid modelling approaches suitable for simulations in the field of power grid planning. The restrictive policies of grid operators, regulators and research institutes concerning their original data and models lead to an increased interest in open source approaches to grid models based on open data. By including all voltage levels between 60 kV (high voltage) and 380 kV (extra-high voltage), we dissolve the common distinction between transmission and distribution grid in energy system models and utilize a single, integrated model instead. An open data set for primarily Germany, which can be used for non-linear, linear and linear-optimal power flow methods, was developed. This data set consists of an electrically parameterised grid topology as well as allocated generation and demand characteristics for present and future scenarios at high spatial and temporal resolution. The usability of the grid model was demonstrated by the performance of exemplary power flow optimizations. Based on a marginal cost driven power plant dispatch, subject to grid restrictions, congested power lines were identified. Continuous validation of the model is necessary in order to reliably model storage and grid expansion in ongoing research.
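
    The marginal-cost-driven dispatch mentioned above can be pictured with a toy merit-order calculation. The sketch below is an assumption-level illustration, not the eGo grid model's code, and it ignores grid constraints entirely: plants are committed in order of increasing marginal cost until demand is met.

        def merit_order_dispatch(plants, demand_mw):
            """plants: list of (name, capacity_mw, marginal_cost_eur_per_mwh)."""
            dispatch, remaining = {}, demand_mw
            for name, capacity, _cost in sorted(plants, key=lambda p: p[2]):
                take = min(capacity, remaining)
                if take > 0:
                    dispatch[name] = take
                    remaining -= take
            return dispatch

        # Illustrative fleet and demand only.
        plants = [("wind", 800, 0.0), ("lignite", 1000, 30.0), ("gas", 600, 60.0)]
        print(merit_order_dispatch(plants, demand_mw=1500))   # {'wind': 800, 'lignite': 700}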

  7. Opening Lab Doors to High School Students: Keys to a Successful Engagement

    Slayton, Rebecca M.; Nelson, Keith A.

    2005-01-01

    A project to invite high school students into research laboratories to plan and carry out an investigation over several weeks, using the sophisticated equipment available there, can help to break down social barriers and enhance outreach activities.

  8. Towards Open Access Publishing in High Energy Physics Report of the SCOAP3 Working Party

    Bianco, S; Ferreira, P; Friend, F; Gargiulo, P; Hanania, R; Henrot-Versillé, S; Holtkamp, A; Igo-Kemenes, P; Jarroux-Declais, D; Jordão, M; Kämper, B-C; Krause, J; Lagrange, T; Le Diberder, F R; Lemasurier, A; Lengenfelder, A; Lindqvist, C M; Mele, S; Plaszczynski, S; Schimmer, R; Vigen, Jens; Voss, R; Wilbers, M; Yeomans, J; Zioutas, K

    2007-01-01

    This Report concerns the implementation of a process today supported by leading actors from the particle physics community, and worked through in detail by members of an international Working Party. The initiative offers an opportunity for the cost-effective dissemination of high-quality research articles in particle physics, enabling use of the new technologies of e-Science across the literature of High Energy physics.

  9. Probability and rational choice

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  10. YCl3-Catalyzed Highly Selective Ring Opening of Epoxides by Amines at Room Temperature and under Solvent-Free Conditions

    Wuttichai Natongchai

    2017-11-01

    Full Text Available A simple, efficient, and environmentally benign approach for the synthesis of β-amino alcohols is herein described. YCl3 efficiently carried out the ring opening of epoxides by amines to produce β-amino alcohols under solvent-free conditions at room temperature. This catalytic approach is very effective, with several aromatic and aliphatic oxiranes and amines. A mere 1 mol % concentration of YCl3 is enough to deliver β-amino alcohols in good to excellent yields with high regioselectivity.

  11. Highly open bowl-like PtAuAg nanocages as robust electrocatalysts towards ethylene glycol oxidation

    Xu, Hui; Yan, Bo; Li, Shumin; Wang, Jin; Song, Pingping; Wang, Caiqin; Guo, Jun; Du, Yukou

    2018-04-01

    A novel combined seed-mediated and galvanic replacement method has been demonstrated to synthesize a new class of trimetallic PtAuAg nanocatalysts with a highly open bowl-like nanocage structure. The newly generated PtAuAg nanocage catalysts exhibit superior electrocatalytic performance towards ethylene glycol oxidation, with a mass activity of 6357.1 mA mg^-1, 5.5 times higher than that of commercial Pt/C (1151.1 mA mg^-1). This work demonstrates the first example of designing shape-controlled architectures of trimetallic bowl-like PtAuAg nanocages for liquid fuel electrooxidation.

  12. Cholesterol modulates open probability and desensitization of NMDA receptors

    Kořínek, Miloslav; Vyklický, Vojtěch; Borovská, Jiřina; Lichnerová, Katarina; Kaniaková, Martina; Krausová, Barbora; Krůšek, Jan; Balík, Aleš; Smejkalová, Tereza; Horák, Martin; Vyklický ml., Ladislav

    2015-01-01

    Vol. 593, No. 10 (2015), pp. 2279-2293 ISSN 0022-3751 R&D Projects: GA ČR(CZ) GPP303/11/P391; GA ČR(CZ) GAP303/12/1464; GA ČR(CZ) GBP304/12/G069; GA ČR(CZ) GA14-02219S; GA ČR(CZ) GP14-09220P; GA TA ČR(CZ) TE01020028; GA MŠk(CZ) ED1.1.00/02.0109 Institutional support: RVO:67985823 Keywords: NMDA receptor * glutamate-gated * cholesterol Subject RIV: ED - Physiology Impact factor: 4.731, year: 2015

  13. A high-order doubly asymptotic open boundary for scalar waves in semi-infinite layered systems

    Prempramote, S; Song, Ch; Birk, C

    2010-01-01

    Wave propagation in semi-infinite layered systems is of interest in earthquake engineering, acoustics, electromagnetism, etc. The numerical modelling of this problem is particularly challenging as evanescent waves exist below the cut-off frequency. Most of the high-order transmitting boundaries are unable to model the evanescent waves. As a result, spurious reflection occurs at late time. In this paper, a high-order doubly asymptotic open boundary is developed for scalar waves propagating in semi-infinite layered systems. It is derived from the equation of dynamic stiffness matrix obtained in the scaled boundary finite-element method in the frequency domain. A continued-fraction solution of the dynamic stiffness matrix is determined recursively by satisfying the scaled boundary finite-element equation at both high- and low-frequency limits. In the time domain, the continued-fraction solution permits the force-displacement relationship to be formulated as a system of first-order ordinary differential equations. Standard time-step schemes in structural dynamics can be directly applied to evaluate the response history. Examples of a semi-infinite homogeneous layer and a semi-infinite two-layered system are investigated herein. The displacement results obtained from the open boundary converge rapidly as the order of continued fractions increases. Accurate results are obtained at early time and late time.

  14. Brittle materials at high-loading rates: an open area of research

    2017-01-01

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as: blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures and so on. With all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (few tens to several tens of thousands per second) leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining pressure sensitivity that justify expending greater research efforts to understand these complex features. Currently, the most popular dynamic testing techniques used for this are based on the use of split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods do have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956517

  15. Brittle materials at high-loading rates: an open area of research

    Forquin, Pascal

    2017-01-01

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as: blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures and so on. With all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (few tens to several tens of thousands per second) leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining pressure sensitivity that justify expending greater research efforts to understand these complex features. Currently, the most popular dynamic testing techniques used for this are based on the use of split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods do have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'.

  16. Open questions in the magnetic behaviour of high-temperature superconductors

    Cohen, L.F.; Jensen, Henrik Jeldtoft

    1997-01-01

    A principally experimental review of vortex behaviour in high-temperature superconductors is presented. The reader is first introduced to the basic concepts needed to understand the magnetic properties of type II superconductors. The concepts of vortex melting, the vortex glass, vortex creep, etc are also discussed briefly. The bulk part of the review relates the theoretical predictions proposed for the vortex system in high temperature superconductors to experimental findings. The review ends with an attempt to direct the reader to those areas which still require further clarification. (author)

  17. COVAL, Compound Probability Distribution for Function of Probability Distribution

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions

  18. A CAD Open Platform for High Performance Reconfigurable Systems in the EXTRA Project

    Rabozzi, M.; Brondolin, R.; Natale, G.; Del Sozzo, E.; Huebner, M.; Brokalakis, A.; Ciobanu, C.; Stroobandt, D.; Santambrogio, M.D.; Hübner, M.; Reis, R.; Stan, M.; Voros, N.

    2017-01-01

    As the power wall has become one of the main limiting factors for the performance of general purpose processors, the trend in High Performance Computing (HPC) is moving towards application-specific accelerators in order to meet the stringent performance requirements for exascale computing while

  19. High-throughput open source computational methods for genetics and genomics

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly

  20. High CO2 Primes Plant Biotic Stress Defences through Redox-Linked Pathways

    2016-01-01

    Industrial activities have caused tropospheric CO2 concentrations to increase over the last two centuries, a trend that is predicted to continue for at least the next several decades. Here, we report that growth of plants in a CO2-enriched environment activates responses that are central to defense against pathogenic attack. Salicylic acid accumulation was triggered by growth at high CO2 in Arabidopsis (Arabidopsis thaliana) and other plants such as bean (Phaseolus vulgaris). A detailed analysis in Arabidopsis revealed that elevated CO2 primes multiple defense pathways, leading to increased resistance to bacterial and fungal challenge. Analysis of gene-specific mutants provided no evidence that activation of plant defense pathways by high CO2 was caused by stomatal closure. Rather, the activation is partly linked to metabolic effects involving redox signaling. In support of this, genetic modification of redox components (glutathione contents and NADPH-generating enzymes) prevents full priming of the salicylic acid pathway and associated resistance by high CO2. The data point to a particularly influential role for the nonphosphorylating glyceraldehyde-3-phosphate dehydrogenase, a cytosolic enzyme whose role in plants remains unclear. Our observations add new information on relationships between high CO2 and oxidative signaling and provide novel insight into plant stress responses in conditions of increased CO2. PMID:27578552

  1. Clinical potentials of methylator phenotype in stage 4 high-risk neuroblastoma: an open challenge.

    Barbara Banelli

    Full Text Available Approximately 20% of stage 4 high-risk neuroblastoma patients are alive and disease-free 5 years after disease onset, while the remainder experience rapid and fatal progression. Numerous findings underline the prognostic role of methylation of defined target genes in neuroblastoma without taking into account the clinical and biological heterogeneity of this disease. In this report we have investigated the methylation of the PCDHB cluster, the most informative member of the "Methylator Phenotype" in neuroblastoma, hypothesizing that if this epigenetic mark can predict overall and progression-free survival in high-risk stage 4 neuroblastoma, it could be utilized to improve the risk stratification of the patients, alone or in conjunction with the previously identified methylation of the SFN gene (14.3.3sigma), which can accurately predict outcome in these patients. We have utilized univariate and multivariate models to compare the prognostic power of PCDHB methylation, quantitatively determined by pyrosequencing, in terms of overall and progression-free survival with that of other markers utilized for patient stratification, using methylation thresholds calculated on neuroblastomas at stages 1-4 and on stage 4, high-risk patients only. Our results indicate that PCDHB accurately distinguishes between high- and intermediate/low-risk stage 4 neuroblastoma in agreement with the established risk stratification criteria. However, PCDHB cannot predict outcome in the subgroup of stage 4 patients at high risk, whereas methylation levels of SFN point to a "methylation gradient" associated with tumor aggressiveness, as suggested by the finding of a higher threshold that defines a subset of patients with extremely severe disease (OS <24 months). Because of the heterogeneity of neuroblastoma we believe that clinically relevant methylation markers should be selected and tested on homogeneous groups of patients rather than on patients at all stages.

  2. Improvement of the knee center of rotation during walking after opening wedge high tibial osteotomy.

    Kim, Kyungsoo; Feng, Jun; Nha, Kyung Wook; Park, Won Man; Kim, Yoon Hyuk

    2015-06-01

    Accurate measurement of the center of rotation of the knee joint is indispensable for prediction of joint kinematics and kinetics in musculoskeletal models. However, no study has yet identified the knee centers of rotation during several daily activities before and after high tibial osteotomy surgery, which is one surgical option for treating knee osteoarthritis. In this study, an estimation method for determining the knee joint center of rotation was developed by applying the optimal common shape technique and the symmetrical axis of rotation approach to motion-capture data, and validated for typical activities (walking, squatting, climbing up stairs, walking down stairs) of 10 normal subjects. The locations of the knee joint centers of rotation for the injured and contralateral knees of eight subjects with osteoarthritis, both before and after high tibial osteotomy surgery, were then calculated during walking. It was shown that high tibial osteotomy surgery improved the knee joint center of rotation, since the centers of rotation for the injured knee after surgery were significantly closer to those of the normal healthy population. The difference between the injured and contralateral knees was also generally reduced after surgery, demonstrating increased symmetry. These results indicate that symmetry in both knees can be recovered in many cases after high tibial osteotomy surgery. Moreover, recovery of the center of rotation in the injured knee preceded the recovery of symmetry. This study has the potential to provide fundamental information that can be applied to understand abnormal kinematics in patients, diagnose knee joint disease, and design novel implants for knee joint surgeries. © IMechE 2015.

  3. High Temperature Thermoplastic Additive Manufacturing Using Low-Cost, Open-Source Hardware

    Gardner, John M.; Stelter, Christopher J.; Yashin, Edward A.; Siochi, Emilie J.

    2016-01-01

    Additive manufacturing (or 3D printing) via Fused Filament Fabrication (FFF), also known as Fused Deposition Modeling (FDM), is a process where material is placed in specific locations layer-by-layer to create a complete part. Printers designed for FFF build parts by extruding a thermoplastic filament from a nozzle in a predetermined path. Originally developed for commercial printers, 3D printing via FFF has become accessible to a much larger community of users since the introduction of Reprap printers. These low-cost, desktop machines are typically used to print prototype parts or novelty items. As the adoption of desktop sized 3D printers broadens, there is increased demand for these machines to produce functional parts that can withstand harsher conditions such as high temperature and mechanical loads. Materials meeting these requirements tend to possess better mechanical properties and higher glass transition temperatures (Tg), thus requiring printers with high temperature printing capability. This report outlines the problems and solutions, and includes a detailed description of the machine design, printing parameters, and processes specific to high temperature thermoplastic 3D printing.

  4. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.

  5. Analytic methods in applied probability in memory of Fridrikh Karpelevich

    Suhov, Yu M

    2002-01-01

    This volume is dedicated to F. I. Karpelevich, an outstanding Russian mathematician who made important contributions to applied probability theory. The book contains original papers focusing on several areas of applied probability and its uses in modern industrial processes, telecommunications, computing, mathematical economics, and finance. It opens with a review of Karpelevich's contributions to applied probability theory and includes a bibliography of his works. Other articles discuss queueing network theory, in particular, in heavy traffic approximation (fluid models). The book is suitable

  6. An experimental and simulation study of novel channel designs for open-cathode high-temperature polymer electrolyte membrane fuel cells

    Thomas, Sobi; Bates, Alex; Park, Sam

    2016-01-01

    A minimum balance of plant (BOP) is desired for an open-cathode high temperature polymer electrolyte membrane (HTPEM) fuel cell to ensure low parasitic losses and a compact design. The advantage of an open-cathode system is the elimination of the coolant plate and incorporation of a blower for ox...

  7. High floral bud abscission and lack of open flower abscission in Dendrobium cv. Miss Teen: rapid reduction of ethylene sensitivity in the abscission zone

    Bunya-atichart, K.; Ketsa, S.; Doorn, van W.G.

    2006-01-01

    We studied the abscission of floral buds and open flowers in cut Dendrobium inflorescences. Abscission of floral buds was high and sensitive to ethylene in all cultivars studied. Many open flowers abscised in most cultivars, but cv. Willie exhibited only a small amount of floral fall and cv. Miss Teen

  8. Assessing the clinical probability of pulmonary embolism

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as the pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38% and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
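
    The score-based stratification described above translates directly into a small lookup. In this minimal sketch, the cut-offs (score <= 4 low, 5-8 intermediate, >= 9 high) and the observed PE prevalences (10%, 38%, 81%) are taken from the abstract; everything else is illustrative.

        def pe_clinical_probability(score):
            """Map a clinical score to the probability class and the PE prevalence reported above."""
            if score <= 4:
                return "low (about 10% had PE)"
            elif score <= 8:
                return "intermediate (about 38% had PE)"
            return "high (about 81% had PE)"

        for s in (3, 6, 10):
            print(s, "->", pe_clinical_probability(s))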

  9. Development of Thin Film Amorphous Silicon Tandem Junction Based Photocathodes Providing High Open-Circuit Voltages for Hydrogen Production

    F. Urbain

    2014-01-01

    Full Text Available Hydrogenated amorphous silicon thin film tandem solar cells (a-Si:H/a-Si:H) have been developed with a focus on high open-circuit voltages for direct application as photocathodes in photoelectrochemical water splitting devices. By varying the temperature during deposition of the intrinsic a-Si:H absorber layers, the band gap energy of the absorber layers, which correlates with the hydrogen content of the material, can be adjusted and combined in such a way that a-Si:H/a-Si:H tandem solar cells provide open-circuit voltages up to 1.87 V. The applicability of the tandem solar cells as photocathodes was investigated in a photoelectrochemical cell (PEC) measurement set-up. With platinum as a catalyst, the a-Si:H/a-Si:H based photocathodes exhibit a high photocurrent onset potential of 1.76 V versus the reversible hydrogen electrode (RHE) and a photocurrent of 5.3 mA/cm2 at 0 V versus RHE (under halogen lamp illumination). Our results provide evidence that direct application of thin film silicon based photocathodes fulfills the main thermodynamic requirements to generate hydrogen. Furthermore, the presented approach may provide an efficient and low-cost route to solar hydrogen production.

  10. Identification of novel KCNQ4 openers by a high-throughput fluorescence-based thallium flux assay.

    Li, Qunyi; Rottländer, Mario; Xu, Mingkai; Christoffersen, Claus Tornby; Frederiksen, Kristen; Wang, Ming-Wei; Jensen, Henrik Sindal

    2011-11-01

    To develop a real-time thallium flux assay for high-throughput screening (HTS) of human KCNQ4 (Kv7.4) potassium channel openers, we used CHO-K1 cells stably expressing human KCNQ4 channel protein and a thallium-sensitive dye, based on the permeability of thallium through potassium channels. The electrophysiological and pharmacological properties of the cell line expressing the KCNQ4 protein were found to be in agreement with those reported elsewhere. The EC50 values of the positive control compound (retigabine) determined by the thallium and rubidium-86 flux assays were comparable to and consistent with those documented in the literature. The signal-to-background (S/B) ratio and Z factor of the thallium influx assay system were assessed to be 8.82 and 0.63, respectively. In a large-scale screening of 98,960 synthetic and natural compounds using the thallium influx assay, 76 compounds displayed consistent KCNQ4 activation; of these, 6 compounds demonstrated EC50 values of less than 20 μmol/L and 2 demonstrated EC50 values of less than 1 μmol/L. Taken together, the fluorescence-based thallium flux assay is a highly efficient, automatable, and robust tool to screen potential KCNQ4 openers. This approach may also be expanded to identify and evaluate potential modulators of other potassium channels. Copyright © 2011 Elsevier Inc. All rights reserved.
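
    The two assay-quality metrics quoted above follow standard definitions: the signal-to-background ratio is the mean positive-control signal divided by the mean negative-control signal, and the Z factor of Zhang et al. (1999) is Z = 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|. The sketch below is a generic illustration with made-up plate readings, not the authors' screening pipeline.

        import statistics

        def assay_quality(positive_wells, negative_wells):
            """Return (Z factor, signal-to-background ratio) for two sets of control wells."""
            mean_p, mean_n = statistics.mean(positive_wells), statistics.mean(negative_wells)
            sd_p, sd_n = statistics.stdev(positive_wells), statistics.stdev(negative_wells)
            z_factor = 1.0 - 3.0 * (sd_p + sd_n) / abs(mean_p - mean_n)
            return z_factor, mean_p / mean_n

        # Illustrative fluorescence readings (arbitrary units), not data from the study.
        print(assay_quality([9200, 8800, 9100, 8950], [1020, 980, 1005, 995]))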

  11. Yesterday's Students in Today's World—Open and Guided Inquiry Through the Eyes of Graduated High School Biology Students

    Dorfman, Bat-Shahar; Issachar, Hagit; Zion, Michal

    2017-12-01

    Educational policy bodies worldwide have argued that practicing inquiry as a part of the K-12 curriculum would help prepare students for their lives as adults in today's world. This study investigated adults who graduated high school 9 years earlier with a major in biology, to determine how they perceive the inquiry project they experienced and its contribution to their lives. We characterized dynamic inquiry performances and the retrospective perceptions of the inquiry project. Data was collected by interviews with 17 individuals—nine who engaged in open inquiry and eight who engaged in guided inquiry in high school. Both groups shared similar expressions of the affective point of view and procedural understanding criteria of dynamic inquiry, but the groups differed in the expression of the criteria changes occurring during inquiry and learning as a process. Participants from both groups described the contribution of the projects to their lives as adults, developing skills and positive attitudes towards science and remembering the content knowledge and activities in which they were involved. They also described the support they received from their teachers. Results of this study imply that inquiry, and particularly open inquiry, helps develop valuable skills and personal attributes, which may help the students in their lives as future adults. This retrospective point of view may contribute to a deeper understanding of the long-term influences of inquiry-based learning on students.

  12. A Tale of Two Probabilities

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  13. A game with rules in the making - how the high probability of waiting games in nanomedicine is being mitigated through distributed regulation and responsible innovation

    D'Silva, J.J.F.; Robinson, D.K.R.; Shelley Egan, Clare

    2012-01-01

    The potential benefits of nanotechnologies in healthcare are widely expected to be enormous and a considerable amount of investment is already pouring into public research in this area. These high expectations of benefits are coupled with uncertainty surrounding the potential risks of the

  14. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    Yin, Bing; Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li; Wen, Zhen-Yi; Jiang, Zhen-Yi

    2014-03-01

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg2(CN)5]- clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN-1 were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties.
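
    A vertical electron detachment energy is simply the energy difference between the neutral cluster evaluated at the anion geometry and the anion itself. The minimal sketch below shows that bookkeeping with placeholder single-point energies in hartree; the numbers are not taken from the paper.

        HARTREE_TO_EV = 27.2114

        def vde_ev(e_anion, e_neutral_at_anion_geometry):
            """VDE = E(neutral at the anion geometry) - E(anion), both in hartree."""
            return (e_neutral_at_anion_geometry - e_anion) * HARTREE_TO_EV

        # Placeholder energies, chosen only so the difference is about 8.67 eV.
        print(f"{vde_ev(-598.7421, -598.4235):.2f} eV")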

  15. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    Yin, Bing; Wen, Zhen-Yi; Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li; Jiang, Zhen-Yi

    2014-01-01

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg2(CN)5]− clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN−1 were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties.

  16. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    Yin, Bing, E-mail: rayinyin@gmail.com; Wen, Zhen-Yi [MOE Key Laboratory of Synthetic and Natural Functional Molecule Chemistry, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry and Materials Science, Northwest University, Xi'an 710069 (China); Institute of Modern Physics, Northwest University, Xi'an 710069 (China); Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li [MOE Key Laboratory of Synthetic and Natural Functional Molecule Chemistry, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry and Materials Science, Northwest University, Xi'an 710069 (China); Jiang, Zhen-Yi [Institute of Modern Physics, Northwest University, Xi'an 710069 (China)

    2014-03-07

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg2(CN)5]− clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN−1 were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties.

  17. Introduction to probability with R

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  18. Improvements of high-power diode laser line generators open up new application fields

    Meinschien, J.; Bayer, A.; Bruns, P.; Aschke, L.; Lissotschenko, V. N.

    2009-02-01

    Beam shaping improvements of line generators based on high power diode lasers lead to new application fields such as hardening, annealing or cutting of various materials. Of special interest is the laser treatment of silicon. An overview of the wide variety of applications is presented, with special emphasis on the relevance of unique laser beam parameters such as power density and beam uniformity. Complementary to vision applications and plastic processing, these new application markets are becoming more and more important and can now be addressed by high power diode laser line generators. Here, a family of high power diode laser line generators is presented that covers this wide spectrum of application fields with very different requirements, including new applications such as cutting of silicon or glass, as well as the beam shaping concepts behind it. A laser that generates a 5 m long and 4 mm wide homogeneous laser line with peak intensities of 0.2 W/cm2 is shown for the inspection of railway catenaries, as well as a laser that generates a homogeneous intensity distribution of 60 mm x 2 mm with peak intensities of 225 W/cm2 for plastic processing. For the annealing of silicon surfaces, a laser was designed that generates an extraordinarily uniform intensity distribution with residual inhomogeneities (contrast ratio) of less than 3% over a line length of 11 mm and peak intensities of up to 75 kW/cm2. Ultimately, a laser line with a peak intensity of 250 kW/cm2 used for cutting applications is shown. Results of various application tests performed with the above mentioned lasers are discussed, particularly the surface treatment of silicon and the cutting of glass.

  19. On the gate of Arctic footsteps: Doors open to foreign high schools

    Manno, C.; Pecchiar, I.

    2012-12-01

    With the increased attention on the changing Arctic region, effective science education, outreach and communication need to be higher priorities within the scientific community. In order to encourage the dissemination of polar research at the educational level, foreign high school students and teachers visited Tromso University for a week. The project highlights the role of universities as a link between research and outreach. The first aim of this project was to increase the awareness of foreign schools of major topics concerning Arctic issues (from the economic/social to the environmental/climatic point of view). Forty-three Italian high school students were involved in the laboratory activities running at the UiT and participated in seminars. Topics of focus were Ocean Acidification, Global Warming and their combined effects with other anthropogenic stressors. During their stay, students interviewed several scientists in order to compile a "visiting report" and to work up all the material collected. Back in Italy they performed an itinerant exhibition (presentation of a short movie, posters, and pictures) in various Italian schools in order to pass on their Arctic education experience. The project highlights the role of the university as a communicator of "climate related issues" in the international frame of the "new generation" of students.

  20. A first course in probability

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  1. Lectures on probability and statistics

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  2. Large Eddy simulation of flat plate film cooling at high blowing ratio using OpenFOAM

    Baagherzadeh Hushmandi, Narmin

    2018-06-01

    In this work, numerical analysis was performed to predict the behaviour of high Reynolds number turbulent cross-flows used in film cooling applications. The geometry included one row of three discrete coolant holes inclined at 30 degrees to the main flow. In the computational model, only one sixth of the channel width was included, with symmetry boundaries applied at the centreline of the coolant hole and along the line of symmetry between two adjacent holes. One of the main factors that affect the performance of film cooling is the blowing ratio of coolant to the main flow. A blowing ratio equal to two was chosen in this study. The analysis showed that common-practice CFD models that employ RANS equations together with turbulence modelling under-predict the film cooling effectiveness by up to a factor of four. The LES method, however, showed better agreement with the experimental film cooling effectiveness, both in trend and in absolute values.
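
    The blowing ratio referred to above is conventionally defined as the coolant-to-mainstream mass-flux ratio, M = (rho_c * U_c)/(rho_inf * U_inf). The sketch below evaluates this definition with assumed densities and velocities chosen only to reproduce M close to 2, the value studied here.

        def blowing_ratio(rho_c, u_c, rho_inf, u_inf):
            """M = (rho_c * u_c) / (rho_inf * u_inf), the coolant-to-mainstream mass-flux ratio."""
            return (rho_c * u_c) / (rho_inf * u_inf)

        # Assumed values for illustration only.
        print(blowing_ratio(rho_c=1.8, u_c=13.3, rho_inf=1.2, u_inf=10.0))   # ~2.0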

  3. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
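
    As a rough illustration of what "theoretical isotope distribution modelling" involves, the sketch below computes a carbon-only isotope pattern from a binomial distribution over 13C substitutions. This simplified toy version is not Decon2LS's own routine, which handles full elemental compositions; it only conveys the idea.

        from math import comb

        P_13C = 0.0107   # natural abundance of carbon-13

        def carbon_isotope_pattern(n_carbons, max_peaks=5):
            """Relative intensities of the first isotopic peaks for a carbon-only skeleton."""
            return [comb(n_carbons, k) * P_13C**k * (1.0 - P_13C)**(n_carbons - k)
                    for k in range(max_peaks)]

        # e.g. a 60-carbon backbone; intensities of the monoisotopic peak and the next four peaks
        print([round(p, 3) for p in carbon_isotope_pattern(60)])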

  4. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background: Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results: With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  5. Detection probability of Campylobacter

    Evers, E.G.; Post, J.; Putirulan, F.F.; Wal, van der F.J.

    2010-01-01

    A rapid presence/absence test for Campylobacter in chicken faeces is being evaluated to support the scheduling of highly contaminated broiler flocks as a measure to reduce public health risks [Nauta, M. J., & Havelaar, A. H. (2008). Risk-based standards for Campylobacter in the broiler meat

  6. Probability an introduction with statistical applications

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  7. High Efficiency Robust Open Tubular Capillary Electrochromatography Column for the Separation of Peptides

    Ali, Faiz; Cheong, Won Jo [Inha University, Incheon (Korea, Republic of)

    2016-08-15

    In this study, a carefully designed tri-component copolymer layer was fabricated onto the inner surface of a pretreated silica capillary (52 cm effective length, 50 μm id). The initiator moieties were incorporated onto the capillary inner surface by reaction with 4-chloromethylphenyl isocyanate followed by sodium diethyl dithiocarbamate. Next, RAFT copolymerization was carried out from the initiator moieties, forming a thin polymer film. The observed peak capacity was, of course, lower than that of state-of-the-art gradient HPLC systems. A UPLC system operated in long-gradient elution mode with a long, narrow column packed with sub-3 μm particles can achieve an impressively high peak capacity of ca. 1000. On the other hand, a system with a 20 cm column of 0.8 μm particles could achieve a peak capacity of 220 (comparable to our result) under a pressure of 20 000 psi in a gradient time of 20 min. It should be noted that the operational conditions of this study have been optimized to obtain the best column separation efficiency. It was also operated in the isocratic elution mode. A better peak capacity is expected if the operational conditions are tuned to the optimum peak capacity.

  8. The high opening of the right bronchial artery with a non-typical course.

    Maciejewski, R; Madej, B; Anasiewicz, A

    1995-01-01

    Authors describing the bronchial vessels agree that they are characterised by great variability with regard to their number and the place where they leave the aorta (1, 2, 6). The characteristic feature of the right bronchial artery is that it often forms common trunks with other vessels (mainly with the first right aortic intercostal branch or with one of the upper oesophageal arteries). It can also share a common originating trunk with the left upper bronchial artery (4). Bearing in mind that operations on the trachea and bronchi are difficult, and that it is very important to maintain the blood supply to the walls of the operated organs, we have decided to publish our observations. They refer to a case, not described before, in which the right bronchial artery left the aortic arch in a high position, supplying the anterior lower half of the trachea and its bifurcation. Then, it went down to the membranous part of the right bronchus.

  9. Economic choices reveal probability distortion in macaque monkeys.

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
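
    The inverted-S weighting recovered from the monkeys' choices can be illustrated with a standard one-parameter form. The sketch below uses the Tversky-Kahneman parameterization as an assumed example; it is not necessarily the function fitted in the study, and the gamma value is illustrative.

        def weight(p, gamma=0.6):
            """Tversky-Kahneman one-parameter weighting function; gamma < 1 gives the inverted-S shape."""
            return p**gamma / (p**gamma + (1.0 - p)**gamma)**(1.0 / gamma)

        for p in (0.01, 0.1, 0.5, 0.9, 0.99):
            print(f"p = {p:4.2f}  ->  w(p) = {weight(p):.3f}")
        # Low probabilities come out overweighted (w(p) > p), high ones underweighted (w(p) < p).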

  10. A brief introduction to probability.

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
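
    As a minimal numeric illustration of the ideas in this abstract (a generic example, not material from the paper), the normal distribution can be evaluated and probabilities computed from its cumulative distribution function:

        # Generic illustration: the normal density and a probability from its CDF.
        from scipy.stats import norm

        mu, sigma = 0.0, 1.0                           # standard normal parameters
        pdf_at_zero = norm.pdf(0.0, mu, sigma)         # density at the mean (~0.399)
        p_within_1sd = norm.cdf(1.0, mu, sigma) - norm.cdf(-1.0, mu, sigma)

        print(f"f(0) = {pdf_at_zero:.3f}")
        print(f"P(-1 < X < 1) = {p_within_1sd:.3f}")   # ~0.683 for the normal distribution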

  11. Measuring Methane from Cars, Ships, Airplanes, Helicopters and Drones Using High-Speed Open-Path Technology

    Burba, George; Anderson, Tyler; Biraud, Sebastien; Caulton, Dana; von Fischer, Joe; Gioli, Beniamino; Hanson, Chad; Ham, Jay; Kohnert, Katrin; Larmanou, Eric; Levy, Peter; Polidori, Andrea; Pikelnaya, Olga; Sachs, Torsten; Serafimovich, Andrei; Zaldei, Alessandro; Zondlo, Mark; Zulueta, Rommel

    2017-04-01

    Methane plays a critical role in the radiation balance, chemistry of the atmosphere, and air quality. The major anthropogenic sources of methane include oil and gas development sites, natural gas distribution networks, landfill emissions, and agricultural production. The majority of oil and gas and urban methane emissions occur via variable-rate point sources or diffuse spots in topographically challenging terrains (e.g., street tunnels, elevated locations at water treatment plants, vents, etc.). Locating and measuring such methane emissions is challenging with traditional micrometeorological techniques and requires the development of novel approaches. Landfill methane emissions, traditionally assessed at monthly or longer time intervals, are subject to large uncertainties because of the snapshot nature of the measurements and the barometric pumping phenomenon. The majority of agricultural and natural methane production occurs in areas with little infrastructure or easily available grid power (e.g., rice fields, arctic and boreal wetlands, tropical mangroves, etc.). A lightweight, high-speed, high-resolution, open-path technology was recently developed for eddy covariance measurements of methane flux, with power consumption 30-150 times below other available technologies. It was designed to run on solar panels or a small generator and be placed in the middle of the methane-producing ecosystem without a need for grid power. Lately, this instrumentation has increasingly been utilized outside of its traditional use on stationary flux towers. These novel approaches include measurements from various moving platforms, such as cars, aircraft, and ships. Projects included mapping of concentrations and vertical profiles, leak detection and quantification, mobile emission detection from natural gas-powered cars, soil methane flux surveys, etc. This presentation will describe the latest state of the key projects utilizing the novel lightweight, low-power, high-speed open-path technology.
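
    The eddy covariance principle mentioned above computes the flux as the time-averaged covariance of vertical wind speed and gas density fluctuations. The sketch below uses synthetic numbers and is only a schematic of that idea, not the analyzer's actual processing chain:

        # Schematic illustration of the eddy covariance principle (synthetic data):
        # the methane flux is the mean product of vertical wind fluctuations (w')
        # and methane density fluctuations (c').
        import numpy as np

        rng = np.random.default_rng(0)
        n = 10 * 60 * 10                                  # 10 min of hypothetical 10 Hz samples
        w = rng.normal(0.0, 0.3, n)                       # vertical wind speed, m s-1
        c = 12.0 + 0.5 * w + rng.normal(0.0, 0.2, n)      # CH4 density, mmol m-3 (toy coupling)

        w_prime = w - w.mean()
        c_prime = c - c.mean()
        flux = np.mean(w_prime * c_prime)                 # mmol m-2 s-1 in this toy unit system
        print(f"CH4 flux ~ {flux:.3f} mmol m-2 s-1")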

  12. High-resolution paleolimnology opens new management perspectives for lakes adaptation to climate warming

    Marie-Elodie ePerga

    2015-07-01

    Varved lake sediments provide opportunities for high-resolution paleolimnological investigations that may extend monitoring surveys in order to target priority management actions under climate warming. This paper provides the synthesis of an international research program relying on >150-year-long varved records for three managed perialpine lakes in Europe (Lakes Geneva, Annecy and Bourget). The dynamics of the dominant, local human pressures, as well as the ecological responses in the pelagic, benthic and littoral habitats, were reconstructed using classical and newly developed paleo-proxies. Statistical modelling achieved the hierarchization of the drivers of their ecological trajectories. All three lakes underwent different levels of eutrophication in the first half of the 20th century, followed by re-oligotrophication. Climate warming came along with a 2°C increase in air temperature over the last century, to which the lakes were unequally thermally vulnerable. Unsurprisingly, phosphorus concentration has been the dominant ecological driver over the last century. Yet, other human-influenced, local environmental drivers (fisheries management practices, river regulations) have also significantly inflected ecological trajectories. Climate change has been impacting all habitats at rates that, in some cases, exceeded those of local factors. The amplitude of the ecological responses to similar climate change varied between lakes but, at least for pelagic habitats, depended more on the intensity of local human pressures than on the thermal effect of climate change. Deep habitats, however, showed higher sensitivity to climate change but also a substantial influence of river flows. As a consequence, adapted local management strategies, fully integrating nutrient inputs, fisheries management and hydrological regulations, may help mitigate the deleterious consequences of ongoing climate change on these ecosystems.

  13. High-volume plasma exchange in patients with acute liver failure: An open randomised controlled trial.

    Larsen, Fin Stolze; Schmidt, Lars Ebbe; Bernsmeier, Christine; Rasmussen, Allan; Isoniemi, Helena; Patel, Vishal C; Triantafyllou, Evangelos; Bernal, William; Auzinger, Georg; Shawcross, Debbie; Eefsen, Martin; Bjerring, Peter Nissen; Clemmesen, Jens Otto; Hockerstedt, Krister; Frederiksen, Hans-Jørgen; Hansen, Bent Adel; Antoniades, Charalambos G; Wendon, Julia

    2016-01-01

    Acute liver failure (ALF) often results in cardiovascular instability, renal failure, brain oedema and death, either due to irreversible shock, cerebral herniation or development of multiple organ failure. High-volume plasma exchange (HVP), defined as exchange of 8-12 L or 15% of ideal body weight with fresh frozen plasma, has been shown in case series to improve systemic, cerebral and splanchnic parameters. In this prospective, randomised, controlled, multicentre trial we randomly assigned 182 patients with ALF to receive either standard medical therapy (SMT; 90 patients) or SMT plus HVP for three days (92 patients). The baseline characteristics of the groups were similar. The primary endpoint was liver transplantation-free survival during hospital stay. Secondary endpoints included survival after liver transplantation with or without HVP with intention-to-treat analysis. A proof-of-principle study evaluating the effect of HVP on immune cell function was also undertaken. For the entire patient population, overall hospital survival was 58.7% for patients treated with HVP vs. 47.8% for the control group (hazard ratio (HR), with stratification for liver transplantation: 0.56; 95% confidence interval (CI), 0.36-0.86; p=0.0083). HVP prior to transplantation did not improve survival compared with patients who received SMT alone (CI 0.37 to 3.98; p=0.75). The incidence of severe adverse events was similar in the two groups. Systemic inflammatory response syndrome (SIRS) and sequential organ failure assessment (SOFA) scores fell in the treated group compared to the control group over the study period. In conclusion, HVP improves outcome in patients with ALF by increasing liver transplant-free survival. This is attributable to attenuation of innate immune activation and amelioration of multi-organ dysfunction. Copyright © 2015 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  14. Evaluation of e-liquid toxicity using an open-source high-throughput screening assay

    Keating, James E.; Zorn, Bryan T.; Kochar, Tavleen K.; Wolfgang, Matthew C.; Glish, Gary L.; Tarran, Robert

    2018-01-01

    The e-liquids used in electronic cigarettes (E-cigs) consist of propylene glycol (PG), vegetable glycerin (VG), nicotine, and chemical additives for flavoring. There are currently over 7,700 e-liquid flavors available, and while some have been tested for toxicity in the laboratory, most have not. Here, we developed a 3-phase, 384-well, plate-based, high-throughput screening (HTS) assay to rapidly triage and validate the toxicity of multiple e-liquids. Our data demonstrated that the PG/VG vehicle adversely affected cell viability and that a large number of e-liquids were more toxic than PG/VG. We also performed gas chromatography–mass spectrometry (GC-MS) analysis on all tested e-liquids. Subsequent nonmetric multidimensional scaling (NMDS) analysis revealed that e-liquids are an extremely heterogeneous group. Furthermore, these data indicated that (i) the more chemicals contained in an e-liquid, the more toxic it was likely to be and (ii) the presence of vanillin was associated with higher toxicity values. Further analysis of common constituents by electron ionization revealed that the concentration of cinnamaldehyde and vanillin, but not triacetin, correlated with toxicity. We have also developed a publicly available searchable website (www.eliquidinfo.org). Given the large numbers of available e-liquids, this website will serve as a resource to facilitate dissemination of this information. Our data suggest that an HTS approach to evaluate the toxicity of multiple e-liquids is feasible. Such an approach may serve as a roadmap to enable bodies such as the Food and Drug Administration (FDA) to better regulate e-liquid composition. PMID:29584716
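
    The NMDS step described above can be sketched with scikit-learn's non-metric MDS on a precomputed dissimilarity matrix. The "chemical profiles" below are made-up toy vectors, not data from the study, and the distance metric is only an assumption for illustration:

        # Hedged sketch of an NMDS ordination on toy GC-MS-style composition vectors.
        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from sklearn.manifold import MDS

        profiles = np.array([           # rows: e-liquids, columns: relative peak areas (toy values)
            [0.7, 0.2, 0.1, 0.0],
            [0.1, 0.6, 0.2, 0.1],
            [0.0, 0.1, 0.3, 0.6],
            [0.5, 0.3, 0.1, 0.1],
        ])
        dissim = squareform(pdist(profiles, metric="braycurtis"))

        nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
        coords = nmds.fit_transform(dissim)
        print(coords)                   # 2-D ordination; distant points = dissimilar liquids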

  15. Knots: attractive places with high path tortuosity in mouse open field exploration.

    Anna Dvorkin

    2010-01-01

    When introduced into a novel environment, mammals establish in it a preferred place marked by the highest number of visits and highest cumulative time spent in it. Examination of exploratory behavior in reference to this "home base" highlights important features of its organization. It might therefore be fruitful to search for other types of marked places in mouse exploratory behavior and examine their influence on overall behavior. Examination of path curvatures of mice exploring a large empty arena revealed the presence of circumscribed locales marked by the performance of tortuous paths full of twists and turns. We term these places knots, and the behavior performed in them: knot-scribbling. There is typically no more than one knot per session; it has distinct boundaries and it is maintained both within and across sessions. Knots are mostly situated in the place of introduction into the arena, here away from walls. Knots are not characterized by the features of a home base, except for a high speed during inbound and a low speed during outbound paths. The establishment of knots is enhanced by injecting the mouse with saline and placing it in an exposed portion of the arena, suggesting that stress and the arousal associated with it consolidate a long-term contingency between a particular locale and knot-scribbling. In an environment devoid of proximal cues, mice mark a locale associated with arousal by twisting and turning in it. This creates a self-generated, often centrally located landmark. The tortuosity of the path traced during the behavior implies almost concurrent multiple views of the environment. Knot-scribbling could therefore function as a way to obtain an overview of the entire environment, allowing re-calibration of the mouse's locale map and compass directions. The rich vestibular input generated by scribbling could improve the interpretation of the visual scene.
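
    One simple way to quantify the path tortuosity discussed above is the ratio of travelled path length to net displacement within a sliding window. This is a generic illustration on a toy trajectory, not the authors' algorithm:

        # Generic tortuosity measure: travelled distance / net displacement over a
        # short window of an (x, y) trajectory. Straight runs give values near 1;
        # tight twisting ("knot-scribbling") gives much larger values.
        import numpy as np

        def tortuosity(x, y, window=20):
            steps = np.hypot(np.diff(x), np.diff(y))
            tort = np.full(len(x), np.nan)
            for i in range(len(x) - window):
                travelled = steps[i:i + window].sum()
                net = np.hypot(x[i + window] - x[i], y[i + window] - y[i])
                tort[i] = travelled / max(net, 1e-9)
            return tort

        # Toy trajectory: a straight segment followed by a tight spiral ("knot")
        t = np.linspace(0, 4 * np.pi, 200)
        x = np.concatenate([np.linspace(0, 10, 200), 10 + 0.5 * np.cos(t)])
        y = np.concatenate([np.zeros(200), 0.5 * np.sin(t)])
        tort = tortuosity(x, y)
        print(np.nanmean(tort[:150]), np.nanmean(tort[220:350]))  # low vs. high tortuosity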

  16. EVALUATION OF FUNCTIONAL RESULTS OF MEDIAL OPENING WEDGE HIGH TIBIAL OSTEOTOMY FOR UNICOMPARTMENTAL OSTEOARTHRITIS VARUS KNEE

    Shyam Sundar Bakki

    2017-01-01

    BACKGROUND Osteoarthritis commonly affects the medial compartment of the knee, giving rise to varus deformity in the majority of cases. Significant varus deformity further aggravates the pathology due to medialisation of the weight-bearing line. Osteotomy of the proximal tibia realigns this weight-bearing axis, thereby relieving pressure on the damaged medial compartment. OWHTO is a promising option in this scenario because it is associated with high accuracy in correcting the deformity and fewer complications when compared to lateral closing wedge HTO or UKA. In this study, we evaluate the functional outcome of HTO in patients with unicompartmental osteoarthritis. MATERIALS AND METHODS This is a prospective study of patients who attended the orthopaedic outpatient clinic in Government Hospital, Kakinada, between August 2013 and August 2015. The patients were evaluated by clinical examination and weight-bearing radiographs. The patients who were found to have unicompartmental osteoarthritis with knee pain not relieved by conservative management and who satisfied the inclusion criteria were selected. RESULTS Excellent results can be achieved by appropriate selection criteria and planning with long-limb weight-bearing radiographs. There is excellent relief of pain, which can be achieved within the first few months postoperatively, as assessed by the VAS score. The KSS knee score is excellent in 35%, good in 40%, fair in 20% and poor in 5%. The KSS function score is excellent in 30%, good in 45%, fair in 20% and poor in 5%. There is significant improvement in the range of movement of the knee joint postoperatively. CONCLUSION In this study, we conclude that medial OWHTO is the preferred modality for unicompartmental OA in those aged <60 years; in developing nations like India, where squatting is an important function, it has a major role as it can restore near-normal knee function without disturbing the anatomy.

  17. Symmetry-Breaking Charge Transfer in a Zinc Chlorodipyrrin Acceptor for High Open Circuit Voltage Organic Photovoltaics

    Bartynski, Andrew N.; Gruber, Mark; Das, Saptaparna; Rangan, Sylvie; Mollinger, Sonya; Trinh, Cong; Bradforth, Stephen E.; Vandewal, Koen; Salleo, Alberto; Bartynski, Robert A.; Bruetting, Wolfgang; Thompson, Mark E.

    2015-01-01

    © 2015 American Chemical Society. Low open-circuit voltages significantly limit the power conversion efficiency of organic photovoltaic devices. Typical strategies to enhance the open-circuit voltage involve tuning the HOMO and LUMO positions

  18. Open access to high-level data and analysis tools in the CMS experiment at the LHC

    Calderon, A; Rodriguez-Marrero, A; Colling, D; Huffman, A; Lassila-Perini, K; McCauley, T; Rao, A; Sexton-Kennedy, E

    2015-01-01

    The CMS experiment, in recognition of its commitment to data preservation and open access as well as to education and outreach, has made its first public release of high-level data under the CC0 waiver: up to half of the proton-proton collision data (by volume) at 7 TeV from 2010 in CMS Analysis Object Data format. CMS has prepared, in collaboration with CERN and the other LHC experiments, an open-data web portal based on Invenio. The portal provides access to CMS public data as well as to analysis tools and documentation for the public. The tools include an event display and histogram application that run in the browser. In addition a virtual machine containing a CMS software environment along with XRootD access to the data is available. Within the virtual machine the public can analyse CMS data; example code is provided. We describe the accompanying tools and documentation and discuss the first experiences of data use. (paper)

  19. Studying open innovation collaboration between the high-tech industry and science with linguistic ethnography : Battling over the status of knowledge in a setting of distrust

    De Maeijer, E.D.R.; Van Hout, T.; Weggeman, M.C.D.P.; Post, G.

    2016-01-01

    Open Innovation collaborations often pit academia against industry. Such inter-organizational collaborations can be troublesome due to different organizational backgrounds. This paper investigates what kind of knowledge a multinational high tech company and a research institute share with each

  20. Suppression of the high-frequency disturbances in low-voltage circuits caused by disconnector operation in high-voltage open-air substations

    Savic, M.S.

    1986-07-01

    The switching off and on of small capacitive currents that charge busbar capacitances, connection conductors and open circuit breakers by means of disconnectors causes high-frequency transients in high-voltage networks. In low-voltage circuits, these transient processes induce overvoltages that are dangerous for the electronic equipment in the substation. A modified disconnector construction with a damping resistor was investigated. Digital simulations of the transient process in a high-voltage network during the arcing period between the disconnector contacts, with and without the damping resistor, were performed. A significant decrease in the arcing duration and in the magnitude of the electromagnetic field in the vicinity of the operating disconnector was observed. In the low-voltage circuit protected with a surge arrester, the overvoltage magnitude was not affected by the damping resistor because of the arrester's protective effect.

  1. K-forbidden transition probabilities

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, using the K-forbidden transitions compiled in the present work, in which high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of this identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)
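
    For background (standard definitions from the K-isomer literature, not results of this paper), the reduced hindrance compiled in such analyses is conventionally defined as:

        % F_W  : hindrance factor, the partial gamma-ray half-life relative to the
        %        Weisskopf single-particle estimate
        % f_nu : reduced hindrance per degree of K forbiddenness, nu = Delta K - lambda
        F_W = \frac{T_{1/2}^{\gamma}}{T_{1/2}^{W}}, \qquad
        f_{\nu} = \left(F_W\right)^{1/\nu}, \qquad \nu = \Delta K - \lambda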

  2. High-Fidelity Simulation of Pediatric Emergency Care: An Eye-Opening Experience for Baccalaureate Nursing Students.

    Small, Sandra P; Colbourne, Peggy A; Murray, Cynthia L

    2018-01-01

    Background: Little attention has been given to in-depth examination of what high-fidelity simulation is like for nursing students within the context of a pediatric emergency, such as a cardiopulmonary arrest. It is possible that such high-fidelity simulation could provoke intense psychological reactions in nursing students. Purpose: The purpose of this study was to learn about baccalaureate nursing students' lived experience of high-fidelity simulation of pediatric cardiopulmonary arrest. Method: Phenomenological methods were used. Twenty-four interviews were conducted with 12 students and were analyzed for themes. Results: The essence of the experience is that it was eye-opening. The students found the simulation to be a surprisingly realistic nursing experience, as reflected in their perceiving the manikin as a real patient, thinking that they were saving their patient's life, feeling like a real nurse, and feeling relief after mounting stress. It was also a surprisingly valuable learning experience: the students had an increased awareness of the art and science of nursing, an increased understanding of the importance of teamwork, and felt more prepared for clinical practice and eager for more simulation experiences. Conclusion: Educators should capitalize on the benefits of high-fidelity simulation as a pedagogy, while endeavoring to provide psychologically safe learning.

  3. Theoretical analysis of open aperture reflection Z-scan on materials with high-order optical nonlinearities

    Petris, Adrian I.; Vlad, Valentin I.

    2010-03-01

    We present a theoretical analysis of open-aperture reflection Z-scan in nonlinear media with third-, fifth-, and higher-order nonlinearities. A general analytical expression for the normalized reflectance when third-, fifth- and higher-order optical nonlinearities are excited is derived, and its consequences for RZ-scan in media with high-order nonlinearities are discussed. We show that by performing RZ-scan experiments at different incident intensities it is possible to put in evidence the excitation of different-order nonlinearities in the medium. Their contributions to the overall nonlinear response can be discriminated by using the formulas derived here. An RZ-scan numerical simulation using these formulas and data taken from the literature, measured by another method, for the third-, fifth-, and seventh-order nonlinear refractive indices of As2S3 chalcogenide glass is performed. (author)
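
    The paper's analytical expression is not reproduced here. As a rough, clearly labelled illustration of why different-order nonlinearities become distinguishable at different incident intensities, one can model the surface index as n(I) = n0 + n2*I + n4*I^2 and propagate it through the normal-incidence Fresnel reflectance along a Gaussian-beam Z-scan. All numerical values below are arbitrary and are not material parameters of As2S3:

        # Rough illustration only (not the expression derived in the paper).
        import numpy as np

        n0, n2, n4 = 2.4, 1e-5, 1e-11      # arbitrary units (I in arbitrary intensity units)
        I0, z0 = 1e3, 1.0                  # peak on-axis intensity and Rayleigh range

        def reflectance(n):
            # normal-incidence Fresnel reflectance from air
            return ((n - 1.0) / (n + 1.0)) ** 2

        z = np.linspace(-5, 5, 11)
        I = I0 / (1.0 + (z / z0) ** 2)     # on-axis intensity along the scan
        R = reflectance(n0 + n2 * I + n4 * I ** 2)
        R_norm = R / reflectance(n0)       # normalized reflectance R(z)/R(far from focus)
        print(np.round(R_norm, 5))         # peaks at z = 0, where the nonlinear terms dominate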

  4. MR-Guided Laser Ablation of Osteoid Osteoma in an Open High-Field System (1.0 T)

    Streitparth, F.; Gebauer, B.; Melcher, I.; Schaser, K.; Philipp, C.; Rump, J.; Hamm, B.; Teichgraeber, U.

    2009-01-01

    Computed tomography is the standard imaging modality to minimize the extent of surgical or ablative treatment in osteoid osteomas. In the last 15 years, since a description of thermal ablation of osteoid osteomas was first published, this technique has become a treatment of choice for this tumor. We report the case of a 20-year-old man with an osteoid osteoma treated with laser ablation in an open high-field magnetic resonance imaging scanner (1.0 T). The tumor, located in the right fibula, was safely and effectively ablated under online monitoring. We describe the steps of this interventional procedure and discuss related innovative guidance and monitoring features and potential benefits compared with computed tomographic guidance.

  5. Propensity, Probability, and Quantum Theory

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  6. Image-guided spinal injection procedures in open high-field MRI with vertical field orientation: feasibility and technical features

    Streitparth, F.; Walter, T.; Wonneberger, U.; Wagner, M.; Hermann, K.G.; Hamm, B.; Teichgraeber, U. [Charite, Humboldt-Universitaet zu Berlin, Department of Radiology, Berlin (Germany); Chopra, S. [Charite-Universitaetsmedizin Berlin, Campus Virchow Klinikum, Department of General, Visceral, and Transplantation Surgery, Berlin (Germany); Wichlas, F. [Charite-Universitaetsmedizin Berlin, Campus Virchow Klinikum, Center for Musculoskeletal Surgery, Berlin (Germany)

    2010-02-15

    We prospectively evaluated the feasibility and technical features of MR-guided lumbosacral injection procedures in open high-field MRI at 1.0 T. In a CuSO4·5H2O phantom and five human cadaveric spines, fluoroscopy sequences (proton-density-weighted turbo spin-echo (PDw TSE), T1w TSE, T2w TSE; balanced steady-state free precession (bSSFP), T1w gradient echo (GE), T2w GE) were evaluated using two MRI-compatible 20-G Chiba-type needles. Artefacts were analysed by varying needle orientation to B0, frequency-encoding direction and slice orientation. Image quality was described using the contrast-to-noise ratio (CNR). Subsequently, a total of 183 MR-guided nerve root (107), facet (53) and sacroiliac joint (23) injections were performed in 53 patients. In vitro, the PDw TSE sequence yielded the best needle-tissue contrasts (CNR = 45, 18, 15, 9, and 8 for needle vs. fat, muscle, root, bone and sclerosis, respectively) and optimal artefact sizes (width and tip shift less than 5 mm). In vivo, the PDw TSE sequence was sufficient in all cases. The acquisition time of 2 s facilitated near-real-time MRI guidance. Drug delivery was technically successful in 100% (107/107), 87% (46/53) and 87% (20/23) of nerve root, facet and sacroiliac joint injections, respectively. No major complications occurred. The mean procedure time was 29 min (range 19-67 min). MR-guided spinal injections in open high-field MRI are feasible and accurate using fast TSE sequence designs. (orig.)

  7. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high-resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS and highlight their interoperability. Figure 1. Isosurfaces representing evolution of the shoreline and a z=4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
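
    A fragment of the kind of GRASS GIS scripting workflow described above might look as follows. This is an illustrative sketch only: it assumes an active GRASS GIS 7+ session, and the LAS file names and the 1 m resolution are hypothetical, not the study's actual data.

        # Illustrative GRASS GIS workflow fragment (hypothetical file names).
        import grass.script as gs

        for year in (1997, 2011):
            # Bin lidar returns into a raster DEM (mean return elevation per cell)
            gs.run_command("r.in.lidar", input=f"lidar_{year}.las",
                           output=f"dem_{year}", method="mean",
                           resolution=1, flags="o", overwrite=True)

        # Elevation change between the two surveys (e.g. dune migration, shoreline change)
        gs.mapcalc("dem_diff = dem_2011 - dem_1997", overwrite=True)
        gs.run_command("r.colors", map="dem_diff", color="differences")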

  8. Failure probability analysis of optical grid

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based analysis method for the application failure probability in optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. When an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can then be achieved in optical grid.
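
    The task-based idea can be illustrated with a toy calculation (hypothetical numbers, not the paper's model): assuming independent task failures, an application fails if any of its tasks fails, and running a backup replica of a task means that task fails only if both replicas fail.

        # Toy illustration of application failure probability under backup strategies.
        from math import prod

        task_fail = {"t1": 0.02, "t2": 0.05, "t3": 0.01}   # per-task failure probabilities

        def app_failure(p_fail, backed_up=()):
            # a backed-up task fails only if both its replicas fail (independent failures)
            probs = {t: (p**2 if t in backed_up else p) for t, p in p_fail.items()}
            return 1.0 - prod(1.0 - p for p in probs.values())

        print(f"no backup:    {app_failure(task_fail):.4f}")
        print(f"backup on t2: {app_failure(task_fail, backed_up={'t2'}):.4f}")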

  9. Prediction and probability in sciences

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'Physics and Fundamental Questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  10. Carotid Stenting in Patients With High Risk Versus Standard Risk for Open Carotid Endarterectomy (REAL-1 Trial).

    De Haro, Joaquin; Michel, Ignacio; Bleda, Silvia; Cañibano, Cristina; Acin, Francisco

    2017-07-15

    Carotid stenting (CAS) has been mainly offered to those patients considered at "high risk" for open carotid endarterectomy based on available data from large randomized clinical trials. However, several recent studies have called medical "high risk" into question as an indication for CAS. The REAL-1 trial evaluated the safety and the perioperative and long-term effectiveness of CAS with a proximal protection device (MOMA) in patients with significant carotid artery stenosis and "high-risk" criteria, compared with those with standard surgical-risk features. This nonrandomized double-arm registry included 125 patients (40% symptomatic), 71 (56%) with "standard-risk" and 54 (44%) with "high-risk" criteria. The primary end point was the cumulative incidence of any major adverse event, a composite of stroke, myocardial infarction, and death within 30 days after the intervention or ipsilateral stroke after 30 days and up to 4 years. There was no significant difference in primary end point rate at 30 days between patients at "standard risk" and those at "high risk" (1.4% vs 1.9%, respectively; hazard ratio for "standard risk" 1.1; 95% CI 0.8 to 1.2, p = 0.77), nor in the estimated 4-year rate of ipsilateral stroke (1.3% vs 1.8%; hazard ratio for "standard risk" 1.05, 95% CI 0.86 to 1.14, p = 0.9). In conclusion, 4-year postprocedure results demonstrated that CAS with a proximal protection device (MOMA) is safe and effective for patients with and without "high-risk" criteria for carotid endarterectomy. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Poisson Processes in Free Probability

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisson process.
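
    For background (a standard result in free probability, not a result of this paper), the one-dimensional free Poisson law with rate λ and jump size 1 is the Marchenko-Pastur distribution:

        % Background: the free Poisson (Marchenko-Pastur) law with rate lambda
        \pi_{\lambda} \;=\; \max(1-\lambda,\,0)\,\delta_0
          \;+\; \frac{\sqrt{(x-a)(b-x)}}{2\pi x}\,\mathbf{1}_{[a,b]}(x)\,dx,
        \qquad a=(1-\sqrt{\lambda})^2,\;\; b=(1+\sqrt{\lambda})^2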

  12. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  13. Photoelectrochemical Complexes of Fucoxanthin-Chlorophyll Protein for Bio-Photovoltaic Conversion with a High Open-Circuit Photovoltage.

    Zhang, Tianning; Liu, Cheng; Dong, Wenjing; Wang, Wenda; Sun, Yan; Chen, Xin; Yang, Chunhong; Dai, Ning

    2017-12-05

    Open-circuit photovoltage (Voc) is among the critical parameters for achieving an efficient light-to-charge conversion in existing solar photovoltaic devices. Natural photosynthesis exploits light-harvesting chlorophyll (Chl) protein complexes to transfer sunlight energy efficiently. We describe the exploitation of photosynthetic fucoxanthin-chlorophyll protein (FCP) complexes for realizing photoelectrochemical cells with a high Voc. An antenna-dependent photocurrent response and a Voc of up to 0.72 V are observed and demonstrated in the bio-photovoltaic devices fabricated with photosynthetic FCP complexes and TiO2 nanostructures. Such a high Voc is determined by fucoxanthin in FCP complexes, and is rarely found in photoelectrochemical cells with other natural light-harvesting antennae. We think that the FCP-based bio-photovoltaic conversion will provide an opportunity to fabricate environmentally benign photoelectrochemical cells with high Voc, and also help improve the understanding of the essential physics behind the light-to-charge conversion in photosynthetic complexes. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Trifluoromethyl-Substituted Large Band-Gap Polytriphenylamines for Polymer Solar Cells with High Open-Circuit Voltages

    Shuwang Yi

    2018-01-01

    Two large band-gap polymers (PTPACF and PTPA2CF) based on polytriphenylamine derivatives with the introduction of electron-withdrawing trifluoromethyl groups were designed and prepared by Suzuki polycondensation reaction. The chemical structures, thermal, optical and electrochemical properties were characterized in detail. From the UV-visible absorption spectra, PTPACF and PTPA2CF showed optical band gaps of 2.01 and 2.07 eV, respectively. Cyclic voltammetry (CV) measurement displayed deep highest occupied molecular orbital (HOMO) energy levels of −5.33 and −5.38 eV for PTPACF and PTPA2CF, respectively. The hole mobilities, determined by field-effect transistor characterization, were 2.5 × 10−3 and 1.1 × 10−3 cm2 V−1 s−1 for PTPACF and PTPA2CF, respectively. The polymer solar cells (PSCs) were tested under the conventional device structure of ITO/PEDOT:PSS/polymer:PC71BM/PFN/Al. All of the PSCs showed high open circuit voltages (Vocs) with values approaching 1 V. The PTPACF- and PTPA2CF-based PSCs gave power conversion efficiencies (PCEs) of 3.24% and 2.40%, respectively. Hence, it is a reliable methodology to develop high-performance large band-gap polymer donors with high Vocs through feasible side-chain modification.

  15. Determining probabilities of geologic events and processes

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  16. Detection of open water dynamics with ENVISAT ASAR in support of land surface modelling at high latitudes

    A. Bartsch

    2012-02-01

    Wetlands are generally accepted as being the largest but least well quantified single source of methane (CH4). The extent of wetland or inundation is a key factor controlling methane emissions, both in nature and in the parameterisations used in large-scale land surface and climate models. Satellite-derived datasets of wetland extent are available on the global scale, but the resolution is rather coarse (>25 km). The purpose of the present study is to assess the capability of active microwave sensors to derive inundation dynamics for use in land surface and climate models of the boreal and tundra environments. The focus is on synthetic aperture radar (SAR) operating in C-band since, among microwave systems, it has comparably high spatial resolution and data availability, and long-term continuity is expected.

    C-band data from ENVISAT ASAR (Advanced SAR) operating in wide swath mode (150 m resolution) were investigated and an automated detection procedure for deriving open water fraction has been developed. More than 4000 samples (single acquisitions tiled onto 0.5° grid cells) have been analysed for July and August in 2007 and 2008 for a study region in Western Siberia. Simple classification algorithms were applied and found to be robust when the water surface was smooth. Modification of input parameters results in differences below 1 % open water fraction. The major issue to address was the frequent occurrence of waves due to wind and precipitation, which reduces the separability of the water class from other land cover classes. Statistical measures of the backscatter distribution were applied in order to retrieve suitable classification data. The Pearson correlation between each sample dataset and a location specific representation of the bimodal distribution was used. On average only 40 % of acquisitions allow a separation of the open water class. Although satellite data are available every 2–3 days over the Western Siberian
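
    The basic idea behind the simple classification step (calm open water appearing as a distinct low-backscatter mode in the histogram) can be sketched with a fixed threshold on a synthetic backscatter tile. The study's operational procedure, based on correlation with a location-specific bimodal reference, is more involved; this is only an illustration:

        # Simplified sketch of threshold-based open-water mapping (synthetic array).
        import numpy as np

        rng = np.random.default_rng(1)
        land = rng.normal(-8.0, 1.5, size=(200, 200))     # dB, rough land surfaces
        water = rng.normal(-17.0, 1.0, size=(200, 50))    # dB, smooth open water
        sigma0 = np.hstack([land, water])                 # toy wide-swath tile

        threshold_db = -13.0                              # illustrative fixed threshold
        open_water = sigma0 < threshold_db
        print(f"open water fraction: {open_water.mean():.2%}")   # ~20% in this toy tile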

  17. Reliability of the imaging software in the preoperative planning of the open-wedge high tibial osteotomy.

    Lee, Yong Seuk; Kim, Min Kyu; Byun, Hae Won; Kim, Sang Bum; Kim, Jin Goo

    2015-03-01

    The purpose of this study was to verify a recently developed picture-archiving and communications system-photoshop method by comparing reliabilities between the real-size paper template and PACS-photoshop methods in preoperative planning of open-wedge high tibial osteotomy. A prospective case series was conducted, including patients with medial osteoarthritis undergoing open-wedge high tibial osteotomy. In the preoperative planning, the picture-archiving and communications system-photoshop method and the real-size paper template method were used simultaneously in all patients. Preoperative hip-knee-ankle angle, height, and angle of the osteotomy were evaluated. The reliability of this newly devised method was evaluated, and the consistency between the two methods was also evaluated using the intra-class correlation coefficient. Using the picture-archiving and communications system-photoshop method, the mean correction angle and height of the osteotomy gap for rater-1 were 11.7° ± 3.6° and 10.7 ± 3.6 mm, respectively. The mean correction angle and height of the osteotomy gap for rater-2 were 12.0° ± 2.6° and 10.8 ± 3.6 mm, respectively. The inter- and intra-rater reliabilities of the correction angle were 0.956 ~ 0.979 and 0.980 ~ 0.992, respectively. The inter- and intra-rater reliabilities of the height of the osteotomy gap were 0.968 ~ 0.985 and 0.971 ~ 0.994, respectively. Using the real-size paper template method, mean values of the correction angle and height of the osteotomy gap were 11.9° ± 3.6° and 10.8 ± 3.6 mm, respectively. Consistency between the two methods, assessed by comparing the means of the correction angle and the height of the osteotomy gap, was 0.985 and 0.985, respectively. The picture-archiving and communications system-photoshop method enables direct measurement of the height of the osteotomy gap with high reliability.

  18. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    David K Brown

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  1. Probability inequalities for decomposition integrals

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248. ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: Decomposition integral; Superdecomposition integral; Probability inequalities. Subject RIV: BA - General Mathematics. OECD field: Statistics and probability. Impact factor: 1.357, year: 2016. http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  2. Expected utility with lower probabilities

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  3. Invariant probabilities of transition functions

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  4. Introduction to probability with Mathematica

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  5. Linear positivity and virtual probability

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
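
    For reference, the Goldstein-Page linear positivity condition discussed above is usually written as follows (standard form from the decoherent-histories literature, not quoted from this abstract): a set of histories may be assigned probabilities if, for every history α in the set,

        % C_alpha : chain of Heisenberg-picture projections defining history alpha
        % rho     : the initial state of the closed system
        p(\alpha) \;=\; \mathrm{Re}\,\mathrm{Tr}\!\left[ C_\alpha\,\rho \right] \;\ge\; 0,
        \qquad
        C_\alpha \;=\; P^{n}_{\alpha_n}(t_n)\cdots P^{1}_{\alpha_1}(t_1)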

  6. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  7. Opening lecture

    Thomas, J.B.

    1997-01-01

    The opening lecture, on the results of fifty years in the nuclear energy field, deals with the main principles underlying the CEA policy concerning the transformation of fission nuclear energy, i.e. the design of a nuclear industry that is a safe, high-performance and reliable source of electric power, the development of an adaptive power generation tool with the capacity to progress according to new constraints, and the necessary anticipation in preparing for the effects of the technological leaps of the next 50 years.

  8. Optimization of Rear Local Al-Contacts on High Efficiency Commercial PERC Solar Cells with Dot and Line Openings

    Peisheng Liu

    2014-01-01

    Crystalline silicon PERCs with dot or line openings on the rear surface were studied here. By measuring the minority carrier lifetimes of the PERCs with dot and line openings, the passivation effects of the rear surface with dot and line openings were discussed. The performance affected by dot and line openings was analyzed in detail by testing the open-circuit voltages, short-circuit current densities, fill factors, and conversion efficiencies of the PERCs. The results show that a wider opening spacing resulted in better minority carrier lifetimes on the rear surface. The cells with a line opening spacing of 0.5 mm had an average improvement of 0.22% in conversion efficiency compared with the cells with full-area Al-BSF. On the other hand, the dot-opening PERCs exhibited a conversion efficiency of only 17.4%, although the rear surface reflectivity was good. The poor Al-Si alloy layer and the high density of voids in the dot Al-contacts resulted in the poor performance of the PERCs with dot openings.

  9. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
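
    The paper points to existing R packages for this estimation; the minimal Python sketch below (scikit-learn, synthetic data) shows the same idea of a learning machine returning per-subject probabilities rather than hard classifications. It is an illustration, not the authors' code.

        # A random forest used as a "probability machine": individual class
        # probabilities instead of 0/1 classifications (synthetic data).
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        forest = RandomForestClassifier(n_estimators=500, random_state=0)
        forest.fit(X_train, y_train)

        proba = forest.predict_proba(X_test)[:, 1]   # individual P(y = 1 | x) estimates
        print(proba[:5].round(3))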

  10. Probable Inference and Quantum Mechanics

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  11. Failure probability under parameter uncertainty.

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
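    A hedged simulation sketch of the effect described above follows: when the control threshold is set at the estimated (1 - p) quantile of a fitted log-normal loss distribution, estimation error in the parameters makes the expected failure frequency exceed the nominal p. The sample size, nominal level, and true parameters are arbitrary choices for the illustration, not values from the article.

    ```python
    # Expected failure probability under parameter uncertainty (log-normal example).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    mu, sigma = 1.0, 0.5              # true (unknown to the decision maker) parameters
    n, p_nominal, reps = 30, 0.01, 5000

    achieved = []
    for _ in range(reps):
        sample = rng.lognormal(mu, sigma, size=n)
        mu_hat = np.mean(np.log(sample))
        sigma_hat = np.std(np.log(sample), ddof=1)
        # Threshold set at the estimated (1 - p) quantile of the fitted log-normal.
        threshold = np.exp(mu_hat + sigma_hat * norm.ppf(1 - p_nominal))
        # True probability that a new loss exceeds this threshold.
        achieved.append(1 - norm.cdf((np.log(threshold) - mu) / sigma))

    print(f"nominal p = {p_nominal:.4f}, expected failure probability = {np.mean(achieved):.4f}")
    ```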

  12. Open Science Training Handbook

    Sonja Bezjak; April Clyburne-Sherin; Philipp Conzett; Pedro Fernandes; Edit Görögh; Kerstin Helbig; Bianca Kramer; Ignasi Labastida; Kyle Niemeyer; Fotis Psomopoulos; Tony Ross-Hellauer; René Schneider; Jon Tennant; Ellen Verbakel; Helene Brinken

    2018-01-01

    For a readable version of the book, please visit https://book.fosteropenscience.eu A group of fourteen authors came together in February 2018 at the TIB (German National Library of Science and Technology) in Hannover to create an open, living handbook on Open Science training. High-quality trainings are fundamental when aiming at a cultural change towards the implementation of Open Science principles. Teaching resources provide great support for Open Science instructors and trainers. The ...

  13. Genefer: Programs for Finding Large Probable Generalized Fermat Primes

    Iain Arthur Bethune

    2015-11-01

    Genefer is a suite of programs for performing Probable Primality (PRP) tests of Generalised Fermat numbers b^(2^n)+1 (GFNs) using a Fermat test. Optimised implementations are available for modern CPUs using single instruction, multiple data (SIMD) instructions, as well as for GPUs using CUDA or OpenCL. Genefer has been extensively used by PrimeGrid – a volunteer computing project searching for large prime numbers of various kinds, including GFNs. Genefer’s architecture separates the high-level logic, such as checkpointing and the user interface, from the architecture-specific performance-critical parts of the implementation, which are suitable for re-use. Genefer is released under the MIT license. Source and binaries are available from www.assembla.com/spaces/genefer.
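    As a minimal illustration of the kind of test Genefer performs, the Python sketch below applies a plain Fermat probable-primality test to a small generalised Fermat number. Genefer itself relies on heavily optimised transform-based arithmetic on CPUs and GPUs; this sketch only shows the underlying idea.

    ```python
    # Fermat PRP test for a generalised Fermat number b^(2^n) + 1 (illustrative only).
    def gfn(b: int, n: int) -> int:
        """Return the generalised Fermat number b^(2^n) + 1."""
        return b ** (2 ** n) + 1

    def is_fermat_prp(N: int, base: int = 3) -> bool:
        """N passes the Fermat test to the given base if base^(N-1) == 1 (mod N)."""
        return pow(base, N - 1, N) == 1

    N = gfn(2, 4)                     # 2^16 + 1 = 65537, a small example
    print(N, "is a Fermat PRP" if is_fermat_prp(N) else "fails the Fermat test")
    ```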

  14. A philosophical essay on probabilities

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by ""the Newton of France"" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences.Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory ""to the most important questions of life, which are, in effect, for the most part, problems in probability."" Thus, without the use of higher mathematics, he demonstrates the application

  15. Stress analysis of the tibial plateau according to the difference of blade path entry in opening wedge high tibial osteotomy

    Lee, Jun Woo; Xin, YuanZhu; Yang, Seok Jo [Chungnam National University, Daejeon (Korea, Republic of); Ji, Jong Hun; Panchal, Karnav; Kwon, Oh Soo [The Catholic University of Korea, Daejeon (Korea, Republic of)

    2015-03-15

    High tibial osteotomy (HTO) has been used to successfully treat patients with genu varus deformities, improving mechanical function and condition in the knee joint. Clinical studies have reported that bow legs often occur with a concentrated load on the varus side of the tibia. This study aimed to analyze and verify the clinical test data by means of the three-dimensional (3D) static finite element method (FEM). The 3D model of the lower extremities, which includes the femur, tibia, meniscus, and knee articular cartilage, was created from computed tomography scan and magnetic resonance images. In this report, we compared changes in stress distribution and reaction force on the tibial plateau arising from unexpected changes in the tibial posterior-slope angle caused by HTO. The results showed that the 5 degree wedge-angle virtual opening wedge HTO without and with the posterior-slope angle has a load concentration of approximately 60% and 45% in the medial region, respectively.

  16. Diploma in Seismology for High-School Teachers in Mexico Through an Open-Source Learning Platform

    Perez-Campos, X.; Bello, D.; Dominguez, J.; Pérez, J.; Cruz, J. L.; Navarro Estrada, F.; Mendoza Carvajal, A. D. J.

    2017-12-01

    The high school physics programs in Mexico do not consider the immediate application of the concepts learned by the students. According to some pedagogical theories, much of the acquired knowledge is assimilated through experimenting, expressing, interacting and developing projects. It is in high school that young people are exploring and looking for experiences to decide the area in which they want to focus their studies. The areas of science and engineering are chosen mainly out of motivation by technology and outer space. There is little interest in Earth science, reflected in the number of students in those areas. This may be due mainly to the lack of exposure and examples at the high school level. With this in mind, we are working on a project that seeks, through the preparation of teachers at this level, to bring their students to seismology and awaken their curiosity about issues related to it. Based on the above, and taking as examples the successful programs "Seismographs in Schools" from IRIS and "Geoscience Information For Teachers" from EGU, the Mexican National Seismological Service has launched a project with three stages. The first consists of the design and delivery of a diploma course addressed to high school teachers. The second involves the installation of short-period seismographs in each of the participating schools' facilities. Finally, the third involves the active participation of teachers and their students in research projects based on the data collected by the instruments installed in their schools. This work presents the first phase. The diploma course has been designed to offer teachers, in 170 hours, an introduction to topics related to seismology and to provide them with tools and examples that they can share with their students in the classroom. It is offered both online, through Moodle, an open-source learning platform, and in 12 classroom sessions. The first class started in June 2017 and will finish in November 2017. We

  17. High frequency jet ventilation and intermittent positive pressure ventilation. Effect of cerebral blood flow in patients after open heart surgery

    Pittet, J.F.; Forster, A.; Suter, P.M.

    1990-01-01

    Attenuation of ventilator-synchronous fluctuations of intracranial pressure has been demonstrated during high frequency ventilation in animal and human studies, but the consequences of this effect on cerebral blood flow (CBF) have not been investigated in man. We compared the effects of high frequency jet ventilation (HFJV) and intermittent positive pressure ventilation (IPPV) on CBF in 24 patients investigated three hours after completion of open-heart surgery. The patients were studied during three consecutive periods under standard sedation (morphine, pancuronium): (a) IPPV; (b) HFJV; (c) IPPV. Partial pressure of arterial CO2 (PaCO2: 4.5-5.5 kPa) and rectal temperature (35.5-37.5 °C) were kept constant during the study. CBF was measured by the intravenous 133Xe washout technique. The following variables were derived from the cerebral clearance of 133Xe: the rapid compartment flow, the initial slope index (a combination of the rapid and slow compartment flows), and the ratio of fast compartment flow to total CBF (FF). Compared to IPPV, HFJV applied so as to produce the same mean airway pressure did not change pulmonary gas exchange, mean systemic arterial pressure, or cardiac index. Similarly, CBF was not significantly altered by HFJV. However, large variations of CBF values were observed in three patients, although the classic main determinants of CBF (PaCO2, cerebral perfusion pressure, airway pressure, temperature) remained unchanged. Our results suggest that in patients with normal systemic hemodynamics, the effects of HFJV and IPPV on CBF are comparable at identical levels of mean airway pressure.

  18. Logic, probability, and human reasoning.

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Introduction to probability and measure

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  20. Joint probabilities and quantum cognition

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  2. Default probabilities and default correlations

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
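    A hedged numerical sketch of the relationship stated above follows: under a Merton-style Gaussian copula with a fixed asset correlation, the implied default correlation rises as the individual default probabilities rise. The asset correlation and the default probabilities used below are illustrative values, not taken from the paper.

    ```python
    # Default correlation between two loans under a Gaussian copula (illustrative).
    import numpy as np
    from scipy.stats import norm, multivariate_normal

    def default_correlation(p1, p2, asset_rho):
        """Correlation of the two Bernoulli default indicators."""
        joint = multivariate_normal.cdf(
            [norm.ppf(p1), norm.ppf(p2)],
            mean=[0.0, 0.0],
            cov=[[1.0, asset_rho], [asset_rho, 1.0]],
        )
        return (joint - p1 * p2) / np.sqrt(p1 * (1 - p1) * p2 * (1 - p2))

    for p in (0.005, 0.02, 0.10):     # higher default probability -> higher default correlation
        print(f"p = {p:.3f}: default correlation = {default_correlation(p, p, asset_rho=0.3):.4f}")
    ```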

  3. The Probabilities of Unique Events

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA; Washington, DC, USA; August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  4. Probability Matching, Fast and Slow

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  5. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  6. Probability & Statistics: Modular Learning Exercises. Student Edition

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  7. Jump probabilities in the non-Markovian quantum jump method

    Haerkoenen, Kari

    2010-01-01

    The dynamics of a non-Markovian open quantum system described by a general time-local master equation is studied. The propagation of the density operator is constructed in terms of two processes: (i) deterministic evolution and (ii) evolution of a probability density functional in the projective Hilbert space. The analysis provides a derivation for the jump probabilities used in the recently developed non-Markovian quantum jump (NMQJ) method (Piilo et al 2008 Phys. Rev. Lett. 100 180402).

  8. Probably not future prediction using probability and statistical inference

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  9. Long open-path TDL based system for monitoring background concentration for deployment at Jungfraujoch High Altitude Research Station- Switzerland

    Simeonov, Valentin; van den Bergh, Hubert; Parlange, Marc

    2010-05-01

    A new, long open-path instrument for monitoring path-averaged methane and water vapor concentrations will be presented. The instrument is built on the monostatic scheme (transceiver - distant retroreflector). A VCSEL tunable diode laser (TDL) with a central wavelength of 1654 nm is used as a light source. A specially designed, single-cell, hollow-cube retroreflector with 150 mm aperture will be installed 1200 m from the transceiver in the final deployment at Jungfraujoch, and 100 mm retroreflectors will be used in the other applications. The receiver is built around a 20 cm Newtonian telescope. To avoid distortions in the shape of a methane line caused by atmospheric turbulence, the line is scanned within 1 µs. Fast InGaAs photodiodes with 200 MHz bandwidth are used to achieve this scanning rate. The expected concentration resolution for the above-mentioned path lengths is of the order of 2 ppb. The instrument is developed at the Swiss Federal Institute of Technology - Lausanne (EPFL), Switzerland, and will be used within the GAW+ CH program for long-term monitoring of background methane concentration in the Swiss Alps. After completing the initial tests at EPFL, the instrument will be installed in 2012 at the High Altitude Research Station Jungfraujoch (HARSJ) located at 3580 m ASL. The HARSJ is one of the 24 global GAW stations and carries out continuous observations of a number of trace gases, including methane. One of the goals of the project is to compare path-averaged to ongoing point measurements of methane in order to identify a possible influence of the station. Future deployments of a copy of the instrument include the Colombian part of Amazonia and Siberian wetlands.

  10. A predictive factor for acquiring an ideal lower limb realignment after opening-wedge high tibial osteotomy.

    Bito, Haruhiko; Takeuchi, Ryohei; Kumagai, Ken; Aratake, Masato; Saito, Izumi; Hayashi, Riku; Sasaki, Yohei; Aota, Yoichi; Saito, Tomoyuki

    2009-04-01

    Obtaining a correct postoperative limb alignment is an important factor in achieving a successful clinical outcome after an opening-wedge high tibial osteotomy (OWHTO). To better predict some of the aspects that impact upon the clinical outcomes following this procedure, including postoperative correction loss and overcorrection, we examined the changes in the frontal plane of the lower limb in a cohort of patients who had undergone OWHTO using radiography. Forty-two knees from 33 patients (23 cases of osteoarthritis and 10 of osteonecrosis) underwent a valgus realignment OWHTO procedure and were radiographically assessed for changes that occurred pre- and post-surgery. The mean femorotibial angle (FTA) was found to be 182.1 +/- 2.0 degrees (2.1 +/- 2.0 degrees anatomical varus angulation) preoperatively and 169.6 +/- 2.4 degrees (10.4 +/- 2.4 degrees anatomical valgus angulation) postoperatively. These measurements thus revealed significant changes in the weight bearing line ratio (WBL), femoral axis angle (FA), tibial axis angle (TA), tibia plateau angle (TP), tibia vara angle (TV) and talar tilt angle (TT) following OWHTO. In contrast, no significant change was found in the weight bearing line angle (WBLA) after these treatments. To assess the relationship between the correction angle and these indexes, the 42 knees were divided into the following three groups according to the postoperative FTA: a normal correction group (168 degrees ≤ FTA ≤ 172 degrees), an over-correction group (FTA < 168 degrees), and an under-correction group (FTA > 172 degrees). There were significant differences in the delta angle [DA; calculated as (pre FTA - post FTA) - (pre TV - post TV)] among the three groups of patients. Our results thus indicate a negative correlation between the DA and the preoperative TA (R(2) = 0.148, p < 0.05). Hence, given that the correction errors in our patients appear to negatively correlate with the preoperative TA, postoperative malalignments are likely to be predictable prior to surgery.
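    The delta angle defined in the abstract is simple arithmetic on the pre- and postoperative angles; the short sketch below just evaluates DA = (pre FTA - post FTA) - (pre TV - post TV). The input values are made-up numbers of roughly the reported magnitude, not patient data from the study.

    ```python
    # Delta angle (DA) from the abstract: correction of the femorotibial angle (FTA)
    # minus the change in the tibia vara angle (TV). Inputs are illustrative only.
    def delta_angle(pre_fta, post_fta, pre_tv, post_tv):
        return (pre_fta - post_fta) - (pre_tv - post_tv)

    print(delta_angle(pre_fta=182.0, post_fta=170.0, pre_tv=5.0, post_tv=3.0))  # 10.0 degrees
    ```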

  11. The Fire INventory from NCAR (FINN: a high resolution global model to estimate the emissions from open burning

    C. Wiedinmyer

    2011-07-01

    The Fire INventory from NCAR version 1.0 (FINNv1) provides daily, 1 km resolution, global estimates of the trace gas and particle emissions from open burning of biomass, which includes wildfire, agricultural fires, and prescribed burning, and does not include biofuel use and trash burning. Emission factors used in the calculations have been updated with recent data, particularly for the non-methane organic compounds (NMOC). The resulting global annual NMOC emission estimates are as much as a factor of 5 greater than some prior estimates. Chemical speciation profiles, necessary to allocate the total NMOC emission estimates to lumped species for use by chemical transport models, are provided for three widely used chemical mechanisms: SAPRC99, GEOS-CHEM, and MOZART-4. Using these profiles, FINNv1 also provides global estimates of key organic compounds, including formaldehyde and methanol. Uncertainties in the emissions estimates arise from several of the method steps. The use of fire hot spots, assumed area burned, land cover maps, biomass consumption estimates, and emission factors all introduce error into the model estimates. The uncertainty in the FINNv1 emission estimates is about a factor of two, but the global estimates agree reasonably well with other global inventories of biomass burning emissions for CO, CO2, and other species with less variable emission factors. FINNv1 emission estimates have been developed specifically for modeling atmospheric chemistry and air quality in a consistent framework at scales from local to global. The product is unique because of the high temporal and spatial resolution, global coverage, and the number of species estimated. FINNv1 can be used for both hindcast and forecast or near-real time model applications, and the results are being critically evaluated with models and observations whenever possible.
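    Inventories of this kind rest on a simple per-fire bookkeeping relation: emitted mass equals burned area times fuel loading times the fraction of fuel combusted times a species-specific emission factor. The Python sketch below shows that generic calculation with placeholder numbers; it is not FINNv1 code and does not use FINNv1 input data.

    ```python
    # Generic open-burning emission calculation (hedged sketch, placeholder values).
    def fire_emission(area_burned_m2, fuel_load_kg_per_m2, combustion_fraction, emission_factor_g_per_kg):
        """Return the emitted mass (g) of one species for one fire detection."""
        dry_matter_burned_kg = area_burned_m2 * fuel_load_kg_per_m2 * combustion_fraction
        return dry_matter_burned_kg * emission_factor_g_per_kg

    # Example: a 1 km^2 grassland detection with a CO emission factor of ~60 g per kg dry matter.
    print(f"{fire_emission(1.0e6, 0.9, 0.85, 60.0):.3e} g of CO")
    ```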

  12. Esophagectomy - open

    ... Lewis esophagectomy; Blunt esophagectomy; Esophageal cancer - esophagectomy - open; Cancer of the esophagus - esophagectomy - open ... lining of the esophagus that can lead to cancer (Barrett esophagus); severe trauma; destroyed esophagus; severely damaged stomach

  13. Opening Talk: Opening Talk

    Doebner, H.-D.

    2008-02-01

    Ladies and Gentlemen Dear Friends and Colleagues I welcome you at the 5th International Symposium `Quantum Theory and Symmetries, QTS5' in Valladolid as Chairman of the Conference Board of this biannual series. The aim of the series is to arrange an international meeting place for scientists working in theoretical and mathematical physics, in mathematics, in mathematical biology and chemistry and in other sciences for the presentation and discussion of recent developments in connection with quantum physics and chemistry, material science and related further fields, like life sciences and engineering, which are based on mathematical methods which can be applied to model and to understand microphysical and other systems through inherent symmetries in their widest sense. These systems include, e.g., foundations and extensions of quantum theory; quantum probability; quantum optics and quantum information; the description of nonrelativistic, finite dimensional and chaotic systems; quantum field theory, particle physics, string theory and quantum gravity. Symmetries in their widest sense describe properties of a system which could be modelled, e.g., through geometry, group theory, topology, algebras, differential geometry, noncommutative geometry, functional analysis and approximation methods; numerical evaluation techniques are necessary to connect such symmetries with experimental results. If you ask for a more detailed characterisation of this notion a hand waving indirect answer is: Collect titles and contents of the contributions of the proceedings of QTS4 and get a characterisation through semantic closure. Quantum theory and its Symmetries was and is a diversified and rapidly growing field. The number of and the types of systems with an internal symmetry and the corresponding mathematical models develop fast. This is reflected in the content of the five former international symposia of this series: The first symposium, QTS1-1999, was organized in Goslar (Germany

  14. Normal probability plots with confidence.

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
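    The idea of simultaneous intervals for all plotted points can be imitated by Monte Carlo calibration. The hedged Python sketch below builds a conservative (Bonferroni-calibrated) envelope for the order statistics of a standard normal sample and checks its simultaneous coverage by simulation; it illustrates the concept only and is not the authors' exact construction.

    ```python
    # Simulation-based simultaneous envelope for a normal probability (Q-Q) plot.
    import numpy as np

    rng = np.random.default_rng(1)
    n, alpha, reps = 50, 0.05, 20000

    # Order statistics of standard normal samples simulated under the null hypothesis.
    sims = np.sort(rng.standard_normal((reps, n)), axis=1)

    # Bonferroni-style pointwise level, aiming for >= 1 - alpha simultaneous coverage
    # via a union bound over the n positions (two tails of alpha/(2n) each).
    k = alpha / (2 * n)
    lo = np.quantile(sims, k, axis=0)
    hi = np.quantile(sims, 1 - k, axis=0)

    coverage = np.mean(np.all((sims >= lo) & (sims <= hi), axis=1))
    print(f"pointwise level {k:.4f} per order statistic -> simultaneous coverage ~ {coverage:.3f}")
    # A sample "passes" the graphical test if all of its ordered values lie inside [lo, hi].
    ```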

  15. Policy on synthetic biology: deliberation, probability, and the precautionary paradox.

    Wareham, Christopher; Nardini, Cecilia

    2015-02-01

    Synthetic biology is a cutting-edge area of research that holds the promise of unprecedented health benefits. However, in tandem with these large prospective benefits, synthetic biology projects entail a risk of catastrophic consequences whose severity may exceed that of most ordinary human undertakings. This is due to the peculiar nature of synthetic biology as a 'threshold technology' which opens doors to opportunities and applications that are essentially unpredictable. Fears about these potentially unstoppable consequences have led to declarations from civil society groups calling for the use of a precautionary principle to regulate the field. Moreover, the principle is prevalent in law and international agreements. Despite widespread political recognition of a need for caution, the precautionary principle has been extensively criticized as a guide for regulatory policy. We examine a central objection to the principle: that its application entails crippling inaction and incoherence, since whatever action one takes there is always a chance that some highly improbable cataclysm will occur. In response to this difficulty, which we call the 'precautionary paradox,' we outline a deliberative means for arriving at a threshold of probability below which potential dangers can be disregarded. In addition, we describe a Bayesian mechanism with which to assign probabilities to harmful outcomes. We argue that these steps resolve the paradox. The rehabilitated PP can thus provide a viable policy option to confront the uncharted waters of synthetic biology research. © 2013 John Wiley & Sons Ltd.

  16. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Stefan Lukits

    2015-03-01

    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.
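    Jeffrey's updating principle (JUP), mentioned above, replaces a prior by a mixture of the conditional distributions over a partition, weighted by the new probabilities of the partition cells: P'(A) = sum_i P(A | E_i) q_i. The Python sketch below evaluates that rule on a toy prior; the numbers are arbitrary and not drawn from the paper.

    ```python
    # Jeffrey conditionalisation on a two-cell partition (toy example).
    prior = {                        # joint prior P(A, E_i) with A in {a, not_a}
        ("a", "E1"): 0.30, ("not_a", "E1"): 0.20,
        ("a", "E2"): 0.10, ("not_a", "E2"): 0.40,
    }
    q = {"E1": 0.7, "E2": 0.3}       # new probabilities of the partition cells

    def p_a_given(e):
        """P(A = a | E_i) under the prior."""
        return prior[("a", e)] / (prior[("a", e)] + prior[("not_a", e)])

    p_a_updated = sum(p_a_given(e) * q[e] for e in q)
    print(round(p_a_updated, 4))     # 0.6 * 0.7 + 0.2 * 0.3 = 0.48
    ```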

  17. OpenSubspace

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan

    2009-01-01

    Subspace clustering and projected clustering are recent research areas for clustering in high dimensional spaces. As the field is rather young, there is a lack of comparative studies on the advantages and disadvantages of the different algorithms. Part of the underlying problem is the lack of available open source implementations that could be used by researchers to understand, compare, and extend subspace and projected clustering algorithms. In this paper, we discuss the requirements for open source evaluation software. We propose OpenSubspace, an open source framework that meets these requirements. OpenSubspace integrates state-of-the-art performance measures and visualization techniques to foster research in subspace and projected clustering.

  18. Open access

    Valkenburg, P.M.

    2015-01-01

    Open access week. From 19 to 25 October 2015, Open Access Week took place worldwide. During this week, events in which open access plays a role were organised all over the world. In the Netherlands, too, various symposia, workshops and debates were organised, such as the debate at

  19. Probability theory a foundational course

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  20. Approximation methods in probability theory

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  1. Model uncertainty: Probabilities for models?

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  2. Knowledge typology for imprecise probabilities.

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  3. Probability, Statistics, and Stochastic Processes

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  4. Statistical probability tables CALENDF program

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is (1) to obtain a dense data representation and (2) to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  5. Probability, statistics, and queueing theory

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  6. Probability and Statistics: 5 Questions

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  7. Comparing open innovation of innovative food SMEs with SMEs in the seed and high- tech industries - an analysis of 15 SMEs in the Netherlands

    Omta, Onno; Fortuin, Frances; Dijkman, Niels

    2018-01-01

    Various studies have shown that open innovation (OI) has become a basic requirement for the long-term survival of high-tech companies. However, OI has also become increasingly important in an artisanal sector like the food industry. To discover the extent to which innovative food and seed

  8. “Opening” a New Kind of School: The Story of the Open High School of Utah

    DeLaina Tonks

    2013-03-01

    The use of online learning at the primary and secondary school level is growing exponentially in the United States. Much of this growth is with full-time online schools, most of which are operated by for-profit companies that use proprietary online course content. In this article we trace the development of, and philosophy behind, a full-time online school that uses open access software and open educational resources for course content. As more nations begin to put in place plans for primary and secondary education in the event of natural disasters (e.g., the Christchurch earthquakes) or pandemics (e.g., avian flu or H1N1), the availability of open online content is of critical importance.

  9. Ultra high open circuit voltage (>1 V) of poly-3-hexylthiophene based organic solar cells with concentrated light

    Tromholt, Thomas; Madsen, Morten Vesterager; Krebs, Frederik C

    2013-01-01

    One approach to increasing polymer solar cell efficiency is to blend poly-(3-hexyl-thiophene) with poorly electron accepting fullerene derivatives to obtain higher open circuit voltage (Voc). In this letter concentrated light is used to study the electrical properties of cell operation at up to 2000 solar intensities of these photoactive blends. Comparison of solar cells based on five different fullerene derivatives shows that at both short circuit and open circuit conditions, recombination remains unchanged up to 50 suns. Determination of Voc at 2000 suns demonstrated that the same...

  10. Open-ended Laboratory Investigations in a High School Physics Course: The difficulties and rewards of implementing inquiry-based learning in a physics lab

    Szott, Aaron

    2014-01-01

    Laboratory investigations in high school physics courses are often closed-ended: the outcomes are known in advance and students replicate procedures recommended by the teacher. Over the years, I have come to appreciate the great opportunities created by allowing students investigative freedom in physics laboratories. I have realized that a laboratory environment in which students are free to conduct investigations using procedures of their own design can provide them with varied and rich opportunities for discovery. This paper describes what open-ended laboratory investigations have added to my high school physics classes. I will provide several examples of open-ended laboratories and discuss the benefits they conferred on students and teacher alike.

  11. Open Content in Open Context

    Kansa, Sarah Whitcher; Kansa, Eric C.

    2007-01-01

    This article presents the challenges and rewards of sharing research content through a discussion of Open Context, a new open access data publication system for field sciences and museum collections. Open Context is the first data repository of its kind, allowing self-publication of research data, community commentary through tagging, and clear…

  12. Dynamic SEP event probability forecasts

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
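    The abstract describes lowering the event probability as time passes without an SEP onset. One hedged way to sketch that behaviour is a Bayesian update in which the initial forecast P0 is discounted by the chance that, were an event coming, its onset would already have been observed. The exponential delay-time model, the initial probability, and the median delay below are made-up illustration values, not the algorithm or data of the paper.

    ```python
    # Time-decaying SEP event probability (illustrative Bayesian sketch).
    # P(t) = P0 * S(t) / (P0 * S(t) + (1 - P0)), where S(t) = P(onset delay > t).
    import math

    P0 = 0.6                              # initial event probability issued at the X-ray peak
    median_delay_h = 6.0                  # assumed median peak-to-onset delay (hours)
    lam = math.log(2) / median_delay_h    # exponential delay-time model for the sketch

    def dynamic_probability(t_hours):
        survival = math.exp(-lam * t_hours)          # chance the onset is still to come
        return P0 * survival / (P0 * survival + (1 - P0))

    for t in (0, 6, 12, 24):
        print(f"{t:>2} h after the flare peak: P = {dynamic_probability(t):.2f}")
    ```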

  13. Conditional Independence in Applied Probability.

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  14. Stretching Probability Explorations with Geoboards

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  15. GPS: Geometry, Probability, and Statistics

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  16. Swedish earthquakes and acceleration probabilities

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that the 1904 earthquake 100 km south of Oslo is an exception among Swedish earthquakes of the area and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  17. DECOFF Probabilities of Failed Operations

    Gintautas, Tomas

    2015-01-01

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against standard Alpha-factor...

  18. Risk estimation using probability machines

    2014-01-01

    Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
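    A hedged Python sketch of the "risk machine" idea follows: estimate conditional probabilities with a nonparametric learner and read off an effect size as the difference in predicted risk when the exposure of interest is switched between its two levels. The simulated data, learner settings, and the risk-difference summary are illustrative choices, not the authors' implementation.

    ```python
    # Effect size (risk difference) from a probability machine (illustrative sketch).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 4000
    x1 = rng.integers(0, 2, n)            # binary exposure of interest
    x2 = rng.normal(size=n)               # a continuous covariate
    logit = -1.0 + 1.2 * x1 + 0.8 * x2
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([x1, x2])
    machine = RandomForestClassifier(n_estimators=400, min_samples_leaf=25, random_state=0)
    machine.fit(X, y)

    # Average predicted P(Y = 1) with the exposure set to 1 versus set to 0.
    X1, X0 = X.copy(), X.copy()
    X1[:, 0], X0[:, 0] = 1, 0
    risk_diff = machine.predict_proba(X1)[:, 1].mean() - machine.predict_proba(X0)[:, 1].mean()
    print(f"estimated risk difference for the exposure: {risk_diff:.3f}")
    ```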

  19. Probability and statistics: A reminder

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  20. Nash equilibrium with lower probabilities

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...