WorldWideScience

Sample records for maximum expected background

  1. How long do centenarians survive? Life expectancy and maximum lifespan.

    Science.gov (United States)

    Modig, K; Andersson, T; Vaupel, J; Rau, R; Ahlbom, A

    2017-08-01

    The purpose of this study was to explore the pattern of mortality above the age of 100 years. In particular, we aimed to examine whether Scandinavian data support the theory that mortality reaches a plateau at particularly old ages. Whether the maximum length of life increases with time was also investigated. The analyses were based on individual level data on all Swedish and Danish centenarians born from 1870 to 1901; in total 3006 men and 10 963 women were included. Birth cohort-specific probabilities of dying were calculated. Exact ages were used for calculations of maximum length of life. Whether maximum age changed over time was analysed taking into account increases in cohort size. The results confirm that there has not been any improvement in mortality amongst centenarians in the past 30 years and that the current rise in life expectancy is driven by reductions in mortality below the age of 100 years. The death risks seem to reach a plateau of around 50% at the age 103 years for men and 107 years for women. Despite the rising life expectancy, the maximum age does not appear to increase, in particular after accounting for the increasing number of individuals of advanced age. Mortality amongst centenarians is not changing despite improvements at younger ages. An extension of the maximum lifespan and a sizeable extension of life expectancy both require reductions in mortality above the age of 100 years. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  2. Scalar field vacuum expectation value induced by gravitational wave background

    Science.gov (United States)

    Jones, Preston; McDougall, Patrick; Ragsdale, Michael; Singleton, Douglas

    2018-06-01

    We show that a massless scalar field in a gravitational wave background can develop a non-zero vacuum expectation value. We draw comparisons to the generation of a non-zero vacuum expectation value for a scalar field in the Higgs mechanism and with the dynamical Casimir vacuum. We propose that this vacuum expectation value, generated by a gravitational wave, can be connected with particle production from gravitational waves and may have consequences for the early Universe where scalar fields are thought to play an important role.

  3. College for some to college for all: social background, occupational expectations, and educational expectations over time.

    Science.gov (United States)

    Goyette, Kimberly A

    2008-06-01

    The educational expectations of 10th-graders have dramatically increased from 1980 to 2002. Their rise is attributable in part to the changing educational composition of students' parents and related to the educational profiles of their expected occupations. Students whose parents have gone to college are more likely to attend college themselves, and students expect occupations that are more prestigious in 2002 than in 1980. The educational requirements of particular occupation categories have risen only slightly. These analyses also reveal that educational expectations in recent cohorts are more loosely linked to social background and occupational plans than they were in 1980. The declining importance of parents' background and the decoupling of educational and occupational plans, in addition to a strong and significant effect of cohort on educational expectations, suggest that the expectation of four-year college attainment is indeed becoming the norm.

  4. Text Processing: The Role of Reader Expectations and Background Knowledge.

    Science.gov (United States)

    1987-08-01

    …essay test were expected, but spend more time processing lower-level information than if a recognition test were expected. Furthermore, processing … shifts in the amount of time devoted to reading information at various levels in a text structure, rather than dramatic differences in processing patterns … structures (Craik & Lockhart, 1972; Goetz, Schallert, Reynolds, & Radin, 1983). If new information is compatible with existing memory structures, it is …

  5. Microarray background correction: maximum likelihood estimation for the normal-exponential convolution

    DEFF Research Database (Denmark)

    Silver, Jeremy D; Ritchie, Matthew E; Smyth, Gordon K

    2009-01-01

    exponentially distributed, representing background noise and signal, respectively. Using a saddle-point approximation, Ritchie and others (2007) found normexp to be the best background correction method for 2-color microarray data. This article develops the normexp method further by improving the estimation...... is developed for exact maximum likelihood estimation (MLE) using high-quality optimization software and using the saddle-point estimates as starting values. "MLE" is shown to outperform heuristic estimators proposed by other authors, both in terms of estimation accuracy and in terms of performance on real data...
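
    The normal + exponential convolution at the heart of normexp can be written down and maximized directly. Below is a minimal Python sketch under simplifying assumptions — it is not the authors' implementation (which lives in the limma R package), and the saddle-point starting values of the paper are replaced by crude moment-based guesses:

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def normexp_negloglik(params, x):
          # x = background (Normal(mu, sigma^2)) + signal (Exponential with mean alpha)
          mu, log_sigma, log_alpha = params
          sigma, alpha = np.exp(log_sigma), np.exp(log_alpha)
          z = (x - mu - sigma**2 / alpha) / sigma
          logpdf = -np.log(alpha) + (mu - x) / alpha + sigma**2 / (2 * alpha**2) + norm.logcdf(z)
          return -np.sum(logpdf)

      def normexp_mle(x):
          # crude moment-based starting values (the paper uses saddle-point estimates instead)
          mu0 = np.quantile(x, 0.05)
          sigma0 = max(np.std(x[x < np.quantile(x, 0.25)]), 1e-3)
          alpha0 = max(np.mean(x) - mu0, 1e-3)
          res = minimize(normexp_negloglik, [mu0, np.log(sigma0), np.log(alpha0)],
                         args=(np.asarray(x, dtype=float),), method="Nelder-Mead")
          mu, log_sigma, log_alpha = res.x
          return mu, np.exp(log_sigma), np.exp(log_alpha)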

  6. MAXIMUM NUMBER OF REPETITIONS, TOTAL WEIGHT LIFTED AND NEUROMUSCULAR FATIGUE IN INDIVIDUALS WITH DIFFERENT TRAINING BACKGROUNDS

    Directory of Open Access Journals (Sweden)

    Valeria Panissa

    2013-04-01

    Full Text Available The aim of this study was to evaluate the performance, as well as the neuromuscular activity, in a strength task in subjects with different training backgrounds. Participants (n = 26) were divided into three groups according to their training backgrounds (aerobic, strength or mixed) and submitted to three sessions: (1) determination of the maximum oxygen uptake during an incremental treadmill test to exhaustion and familiarization with the evaluation of maximum strength (1RM) for the half squat; (2) 1RM determination; and (3) strength exercise, four sets at 80% of the 1RM, in which the maximum number of repetitions (MNR), the total weight lifted (TWL), and the root mean square (RMS) and median frequency (MF) of the electromyographic (EMG) activity for the second and last repetition were computed. There was an effect of group for MNR, with the aerobic group performing a higher MNR compared to the strength group (P = 0.045), and an effect on MF with a higher value in the second repetition than in the last repetition (P = 0.016). These results demonstrated that individuals with better aerobic fitness were more fatigue resistant than strength trained individuals. The absence of differences in EMG signals indicates that individuals with different training backgrounds have a similar pattern of motor unit recruitment during a resistance exercise performed until failure, and that the greater capacity to perform the MNR probably can be explained by peripheral adaptations.
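
    For readers unfamiliar with the two EMG summary measures used above, the sketch below shows one common way to compute the RMS amplitude and the median frequency of an EMG epoch. These are illustrative definitions only; the study's windowing, filtering and normalization choices are not specified here:

      import numpy as np

      def emg_rms(x):
          """Root mean square amplitude of an EMG epoch."""
          x = np.asarray(x, dtype=float)
          return np.sqrt(np.mean(x ** 2))

      def emg_median_frequency(x, fs):
          """Median frequency (Hz): the frequency that splits the power spectrum into equal halves."""
          x = np.asarray(x, dtype=float)
          freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
          power = np.abs(np.fft.rfft(x - x.mean())) ** 2
          cumulative = np.cumsum(power)
          idx = np.searchsorted(cumulative, cumulative[-1] / 2.0)
          return freqs[idx]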

  7. Praying for Mr. Right? Religion, Family Background, and Marital Expectations among College Women

    Science.gov (United States)

    Ellison, Christopher G.; Burdette, Amy M.; Glenn, Norval D.

    2011-01-01

    This study explores the relationship between multiple aspects of religious involvement--affiliation, church attendance, subjective religiosity--and marital expectations among college women. In addition, the authors investigate whether religious involvement mediates the link between family background and marital expectations. These issues are…

  8. Effect of background music on maximum acceptable weight of manual lifting tasks.

    Science.gov (United States)

    Yu, Ruifeng

    2014-01-01

    This study used the psychophysical approach to investigate the impact of tempo and volume of background music on the maximum acceptable weight of lift (MAWL), heart rate (HR) and rating of perceived exertion (RPE) of participants engaged in lifting. Ten male college students participated in this study. They lifted a box from the floor, walked 1-2 steps as required, placed the box on a table and walked back twice per minute. The results showed that the tempo of music had a significant effect on both MAWL and HR. Fast tempo background music resulted in higher MAWL and HR values than those resulting from slow tempo music. The effects of both the tempo and volume on the RPE were insignificant. The results of this study suggest fast tempo background music may be used in manual materials handling tasks to increase performance without increasing perceived exertion because of its ergogenic effect on human psychology and physiology.

  9. Maximum Simulated Likelihood and Expectation-Maximization Methods to Estimate Random Coefficients Logit with Panel Data

    DEFF Research Database (Denmark)

    Cherchi, Elisabetta; Guevara, Cristian

    2012-01-01

    The random coefficients logit model allows a more realistic representation of agents' behavior. However, the estimation of that model may involve simulation, which may become impractical with many random coefficients because of the curse of dimensionality. In this paper, the traditional maximum simulated likelihood (MSL) method is compared with the alternative expectation-maximization (EM) method, which does not require simulation. Previous literature had shown that for cross-sectional data, MSL outperforms the EM method in the ability to recover the true parameters and estimation time … with cross-sectional or with panel data, and (d) EM systematically attained more efficient estimators than the MSL method. The results imply that if the purpose of the estimation is only to determine the ratios of the model parameters (e.g., the value of time), the EM method should be preferred. For all…
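
    As background to the comparison, a bare-bones maximum simulated likelihood objective for a panel mixed logit looks roughly as follows. This is a sketch under simplifying assumptions — independent normal coefficients, plain pseudo-random draws held fixed across iterations rather than Halton draws, and no EM counterpart shown:

      import numpy as np
      from scipy.optimize import minimize

      def msl_negloglik(theta, X, y, base_draws):
          """Simulated negative log-likelihood of a random-coefficients logit with panel data.
          X: (N, T, J, K) alternative attributes, y: (N, T) chosen alternative indices,
          base_draws: (R, K) fixed standard normal draws, theta: [means, log std devs]."""
          N, T, J, K = X.shape
          b, log_s = theta[:K], theta[K:]
          betas = b + np.exp(log_s) * base_draws            # (R, K) coefficient draws
          negll = 0.0
          for n in range(N):
              v = np.einsum('tjk,rk->rtj', X[n], betas)     # utilities, (R, T, J)
              v -= v.max(axis=2, keepdims=True)             # stabilize the softmax
              p = np.exp(v) / np.exp(v).sum(axis=2, keepdims=True)
              chosen = p[:, np.arange(T), y[n]]             # probability of observed choices, (R, T)
              negll -= np.log(chosen.prod(axis=1).mean() + 1e-300)
          return negll

      # usage sketch (K = number of attributes):
      # draws = np.random.default_rng(0).standard_normal((200, K))
      # fit = minimize(msl_negloglik, np.zeros(2 * K), args=(X, y, draws), method="BFGS")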

  10. Wobbling and LSF-based maximum likelihood expectation maximization reconstruction for wobbling PET

    International Nuclear Information System (INIS)

    Kim, Hang-Keun; Son, Young-Don; Kwon, Dae-Hyuk; Joo, Yohan; Cho, Zang-Hee

    2016-01-01

    Positron emission tomography (PET) is a widely used imaging modality; however, the PET spatial resolution is not yet satisfactory for precise anatomical localization of molecular activities. Detector size is the most important factor because it determines the intrinsic resolution, which is approximately half of the detector size and determines the ultimate PET resolution. Detector size, however, cannot be made too small because both the decreased detection efficiency and the increased septal penetration effect degrade the image quality. A wobbling and line spread function (LSF)-based maximum likelihood expectation maximization (WL-MLEM) algorithm, which combined the MLEM iterative reconstruction algorithm with wobbled sampling and LSF-based deconvolution using the system matrix, was proposed for improving the spatial resolution of PET without reducing the scintillator or detector size. The new algorithm was evaluated using a simulation, and its performance was compared with that of the existing algorithms, such as conventional MLEM and LSF-based MLEM. Simulations demonstrated that the WL-MLEM algorithm yielded higher spatial resolution and image quality than the existing algorithms. The WL-MLEM algorithm with wobbling PET yielded substantially improved resolution compared with conventional algorithms with stationary PET. The algorithm can be easily extended to other iterative reconstruction algorithms, such as maximum a posteriori (MAP) and ordered subset expectation maximization (OSEM). The WL-MLEM algorithm with wobbling PET may offer improvements in both sensitivity and resolution, the two most sought-after features in PET design. - Highlights: • This paper proposed WL-MLEM algorithm for PET and demonstrated its performance. • WL-MLEM algorithm effectively combined wobbling and line spread function based MLEM. • WL-MLEM provided improvements in the spatial resolution and the PET image quality. • WL-MLEM can be easily extended to other iterative reconstruction algorithms.
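
    For orientation, the plain MLEM update that the WL-MLEM method builds on is only a few lines; the sketch below shows it for a generic Poisson emission model. The wobbled sampling and LSF-based system matrix of the paper would enter through how the matrix A is constructed, which is not reproduced here:

      import numpy as np

      def mlem(A, y, n_iter=50):
          """Basic maximum likelihood expectation maximization (MLEM) for emission tomography:
          y ~ Poisson(A @ lam), with A the (n_bins, n_voxels) system matrix and y the measured counts."""
          lam = np.ones(A.shape[1])              # uniform non-negative start
          sensitivity = A.sum(axis=0)            # A^T 1
          for _ in range(n_iter):
              expected = A @ lam
              ratio = y / np.clip(expected, 1e-12, None)
              lam *= (A.T @ ratio) / np.clip(sensitivity, 1e-12, None)
          return lam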

  11. Calculation of the maximum expected dose for radiophysics technicians at a cobalt machine

    International Nuclear Information System (INIS)

    Avila Avila, Rafael; Perez Velasquez, Reytel; Gonzalez Lapez, Nadia

    2009-01-01

    The daily operations carried out by the radiophysics technicians of the Medical Radiophysics Service, Department of Radiation Oncology, V. I. Lenin General Teaching Hospital in the city of Holguin, over a working week (Monday to Friday) are taken as the basis for calculating the maximum expected dose (MDE). Starting from the exponential decay law governing the source activity, corrections to the dose accumulated over the weekly period are proposed, leading to a formula that accounts for the dose accumulated on working days and excludes accumulation on rest days (Saturday and Sunday). The correction factor is estimated from a convergent power series expansion truncated at the n-th term, which coincides with the period of the week for which the dose is to be calculated. The ambient dose equivalent rate at a given time is adopted as the initial condition, which allows the MDE to be estimated at moments before or after it. For the calculations, the use of an Excel spreadsheet is proposed, which allows simple and accessible processing of the formula obtained. (author)
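
    The idea of accumulating dose only over working days while the Co-60 source decays can be sketched in a few lines. This is a toy illustration, not the paper's formula; the truncated power-series correction and the hospital's concrete rates and units are not reproduced, and the working-hours parameter is hypothetical:

      import numpy as np

      CO60_HALF_LIFE_DAYS = 5.27 * 365.25          # Co-60 half-life, in days
      DECAY_CONST = np.log(2) / CO60_HALF_LIFE_DAYS

      def weekly_expected_dose(h_dot_0, hours_per_day, work_days=(0, 1, 2, 3, 4)):
          """Dose accumulated over one calendar week for a decaying Co-60 source.
          h_dot_0: ambient dose equivalent rate (uSv/h) at the start of the week (Monday 00:00).
          Only the listed working days (0=Mon ... 6=Sun) contribute; weekends add nothing."""
          dose = 0.0
          for day in range(7):
              if day in work_days:
                  rate = h_dot_0 * np.exp(-DECAY_CONST * day)   # rate at the start of that day
                  dose += rate * hours_per_day                  # decay within a single day is negligible
          return dose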

  12. MADmap: A Massively Parallel Maximum-Likelihood Cosmic Microwave Background Map-Maker

    Energy Technology Data Exchange (ETDEWEB)

    Cantalupo, Christopher; Borrill, Julian; Jaffe, Andrew; Kisner, Theodore; Stompor, Radoslaw

    2009-06-09

    MADmap is a software application used to produce maximum-likelihood images of the sky from time-ordered data which include correlated noise, such as those gathered by Cosmic Microwave Background (CMB) experiments. It works efficiently on platforms ranging from small workstations to the most massively parallel supercomputers. Map-making is a critical step in the analysis of all CMB data sets, and the maximum-likelihood approach is the most accurate and widely applicable algorithm; however, it is a computationally challenging task. This challenge will only increase with the next generation of ground-based, balloon-borne and satellite CMB polarization experiments. The faintness of the B-mode signal that these experiments seek to measure requires them to gather enormous data sets. MADmap is already being run on up to O(10^11) time samples, O(10^8) pixels and O(10^4) cores, with ongoing work to scale to the next generation of data sets and supercomputers. We describe MADmap's algorithm based around a preconditioned conjugate gradient solver, fast Fourier transforms and sparse matrix operations. We highlight MADmap's ability to address problems typically encountered in the analysis of realistic CMB data sets and describe its application to simulations of the Planck and EBEX experiments. The massively parallel and distributed implementation is detailed and scaling complexities are given for the resources required. MADmap is capable of analysing the largest data sets now being collected on computing resources currently available, and we argue that, given Moore's Law, MADmap will be capable of reducing the most massive projected data sets.
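
    The maximum-likelihood map estimate that MADmap computes, m = (A^T N^-1 A)^-1 A^T N^-1 d, can be illustrated with a toy conjugate gradient solve. This is a sketch only: white-noise weights, no preconditioner, and none of the FFT-based treatment of correlated noise or the massive parallelism described above:

      import numpy as np
      from scipy.sparse import csr_matrix
      from scipy.sparse.linalg import LinearOperator, cg

      def ml_map(pointing, tod, noise_var, n_pix):
          """Maximum likelihood map-making, m = (A^T N^-1 A)^-1 A^T N^-1 d, solved with CG.
          pointing: pixel index hit by each time sample; tod: time-ordered data;
          noise_var: per-sample noise variance (white noise stands in for MADmap's correlated N)."""
          n_samp = tod.size
          A = csr_matrix((np.ones(n_samp), (np.arange(n_samp), pointing)), shape=(n_samp, n_pix))
          Ninv = 1.0 / noise_var
          rhs = A.T @ (Ninv * tod)                 # A^T N^-1 d
          def matvec(m):
              return A.T @ (Ninv * (A @ m))        # (A^T N^-1 A) m
          op = LinearOperator((n_pix, n_pix), matvec=matvec)
          m, info = cg(op, rhs, atol=1e-10)
          return m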

  13. Optical flare of HDE 245770-A0535+26 during the expected X-ray maximum

    International Nuclear Information System (INIS)

    Maslennikov, K.L.

    1986-01-01

    UBV photometry of the optical component of the X-ray binary HDE 245770-A0535+26 was carried out on April 12-18, 1985. A brightness increase (by 0.25 mag in the U band) was observed four days before an X-ray maximum of A0535+26 predicted from the 111-day period

  14. Confidence limits with multiple channels and arbitrary probability distributions for sensitivity and expected background

    CERN Document Server

    Perrotta, A

    2002-01-01

    A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and expected background content are not Gaussian distributed or not small enough to apply usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branchings, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate systematics into the calculation of cross-section upper limits. One of these searches is described as an example. (6 refs).

  15. Self-reported maternal expectations and child-rearing practices : Disentangling the associations with ethnicity, immigration, and educational background

    NARCIS (Netherlands)

    Durgel, E.S.; van de Vijver, F.J.R.; Yagmurlu, B.

    2013-01-01

    This study aimed at: (1) disentangling the associations between ethnicity, immigration, educational background, and mothers’ developmental expectations and (self-reported) child-rearing practices; and (2) identifying the cross-cultural differences and similarities in developmental expectations and child-rearing practices.

  16. Simultaneous determination of exponential background and Gaussian peak functions in gamma ray scintillation spectrometers by maximum likelihood technique

    International Nuclear Information System (INIS)

    Eisler, P.; Youl, S.; Lwin, T.; Nelson, G.

    1983-01-01

    Simultaneous fitting of peak and background functions from gamma-ray spectrometry using multichannel pulse height analysis is considered. The specific case of a Gaussian peak and an exponential background is treated in detail with respect to simultaneous estimation of both functions, using a technique which incorporates the maximum likelihood method as well as a graphical method. Theoretical expressions for the standard errors of the estimates are also obtained. The technique is demonstrated for two experimental data sets. (orig.)
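
    A minimal version of such a simultaneous fit — a Gaussian peak plus exponential background fitted to channel counts by maximizing a Poisson likelihood — might look like the sketch below. The paper's graphical method and its analytic standard-error expressions are not reproduced here:

      import numpy as np
      from scipy.optimize import minimize

      def model_counts(channel, params):
          """Expected counts per channel: Gaussian peak on an exponential background."""
          area, centroid, sigma, b0, slope = params
          peak = area * np.exp(-0.5 * ((channel - centroid) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
          background = b0 * np.exp(-slope * channel)
          return peak + background

      def neg_log_likelihood(params, channel, counts):
          """Poisson negative log-likelihood (constant terms dropped), suited to counting spectra."""
          mu = np.clip(model_counts(channel, params), 1e-12, None)
          return np.sum(mu - counts * np.log(mu))

      # usage sketch: x0 = rough guesses for (area, centroid, sigma, b0, slope)
      # result = minimize(neg_log_likelihood, x0, args=(channel, counts), method="Nelder-Mead")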

  17. Reconstruction of electrical impedance tomography (EIT) images based on the expectation maximum (EM) method.

    Science.gov (United States)

    Wang, Qi; Wang, Huaxiang; Cui, Ziqiang; Yang, Chengyi

    2012-11-01

    Electrical impedance tomography (EIT) calculates the internal conductivity distribution within a body using electrical contact measurements. The image reconstruction for EIT is an inverse problem, which is both non-linear and ill-posed. The traditional regularization method cannot avoid introducing negative values in the solution. The negativity of the solution produces artifacts in reconstructed images in the presence of noise. A statistical method, namely, the expectation maximization (EM) method, is used to solve the inverse problem for EIT in this paper. The mathematical model of EIT is transformed to a non-negatively constrained likelihood minimization problem. The solution is obtained by the gradient projection-reduced Newton (GPRN) iteration method. This paper also discusses strategies for choosing parameters. Simulation and experimental results indicate that reconstructed images of higher quality can be obtained by the EM method, compared with the traditional Tikhonov and conjugate gradient (CG) methods, even with non-negative processing. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Media background and future expectations as well as regulations when facing digitalization in Spain

    Directory of Open Access Journals (Sweden)

    Lic. Núria Reguero i Jiménez; nuria.reguero@uab.cat

    2009-01-01

    Full Text Available The article aims to describe and analyse the background, regulation, and future prospects of non-profit radio and television in Spain in the face of digitalization. The object of study is approached inductively, using documentary analysis, consultation of several sources and, occasionally, interviews. As a complementary objective, the article seeks to contribute to the reflection on the social role that these community media play in society.

  19. Expectations

    DEFF Research Database (Denmark)

    depend on the reader’s own experiences, individual feelings, personal associations or on conventions of reading, interpretive communities and cultural conditions? This volume brings together narrative theory, fictionality theory and speech act theory to address such questions of expectations...

  20. It is time to abandon "expected bladder capacity." Systematic review and new models for children's normal maximum voided volumes.

    Science.gov (United States)

    Martínez-García, Roberto; Ubeda-Sansano, Maria Isabel; Díez-Domingo, Javier; Pérez-Hoyos, Santiago; Gil-Salom, Manuel

    2014-09-01

    There is agreement to use simple formulae (expected bladder capacity and other age-based linear formulae) as bladder capacity benchmarks. But the real normal child's bladder capacity is unknown. To offer a systematic review of children's normal bladder capacity, to measure children's normal maximum voided volumes (MVVs), to construct models of MVVs and to compare them with the usual formulae. Computerized, manual and grey literature were reviewed until February 2013. Epidemiological, observational, transversal, multicenter study. A consecutive sample of healthy children aged 5-14 years, attending Primary Care centres with no urologic abnormality, were selected. Participants filled in a 3-day frequency-volume chart. Variables were MVVs: maximum of 24 hr, nocturnal, and daytime maximum voided volumes; diuresis and its daytime and nighttime fractions; body-measure data; and gender. The consecutive steps method was used in a multivariate regression model. Twelve articles accomplished the systematic review's criteria. Five hundred and fourteen cases were analysed. Three models, one for each of the MVVs, were built. All of them were better adjusted to exponential equations. Diuresis (not age) was the most significant factor. There was poor agreement between MVVs and the usual formulae. Nocturnal and daytime maximum voided volumes depend on several factors and are different. Nocturnal and daytime maximum voided volumes should be used with different meanings in clinical settings. Diuresis is the main factor for bladder capacity. This is the first model for benchmarking normal MVVs with diuresis as its main factor. Current formulae are not suitable for clinical use. © 2013 Wiley Periodicals, Inc.

  1. Using the IEC standard to describe low-background detectors -- What can you expect?

    International Nuclear Information System (INIS)

    Keyser, R.M.; Wagner, S.

    1998-01-01

    Many measurements of environmental levels of radioactivity require that the gamma-ray detector be low background, that is, free of any radioactive content. This is, of course, not possible, but the radioactivity in the detector must be reduced to as low a value as possible. A description or specification of the background spectrum necessary to achieve the desired results is therefore needed. The new International Electrotechnical Commission (IEC) standard for describing the background makes the specification of the background in a high-purity germanium (HPGe) detector simple, unambiguous, and related to how the detector will be used. Users and manufacturers will finally be speaking the same language on this subject. Because this standard extends the specification of the performance of an HPGe detector, there is little history available for comparison and thus no means of determining a good value. To develop a history, the background spectra of 500 low-background ORTEC HPGe detectors were counted in similar low-background shields. These detectors were in a variety of mechanical cryostat and endcap configurations. The continuum background is a function of energy and detector size/configuration. The peak area for the peak energies listed in the standard is a function of detector size and configuration. The results thus give practical guidance for obtaining the most appropriate low-background detector for a specific measurement problem.

  2. Belief Shift or Only Facilitation: How Semantic Expectancy Affects Processing of Speech Degraded by Background Noise.

    Science.gov (United States)

    Simeon, Katherine M; Bicknell, Klinton; Grieco-Calub, Tina M

    2018-01-01

    Individuals use semantic expectancy - applying conceptual and linguistic knowledge to speech input - to improve the accuracy and speed of language comprehension. This study tested how adults use semantic expectancy in quiet and in the presence of speech-shaped broadband noise at -7 and -12 dB signal-to-noise ratio. Twenty-four adults (22.1 ± 3.6 years, mean ± SD ) were tested on a four-alternative-forced-choice task whereby they listened to sentences and were instructed to select an image matching the sentence-final word. The semantic expectancy of the sentences was unrelated to (neutral), congruent with, or conflicting with the acoustic target. Congruent expectancy improved accuracy and conflicting expectancy decreased accuracy relative to neutral, consistent with a theory where expectancy shifts beliefs toward likely words and away from unlikely words. Additionally, there were no significant interactions of expectancy and noise level when analyzed in log-odds, supporting the predictions of ideal observer models of speech perception.

  3. Using the IEC Standard to Describe Low-Background Detectors-What Can You Expect?

    International Nuclear Information System (INIS)

    Ronald M. Keyser; Sanford Wagner

    1998-01-01

    The new International Electrotechnical Commission (IEC) standard for describing the background makes the specification of the background in a high-purity germanium (HPGe) detector simple, unambiguous, and related to how the detector will be used. Users and manufacturers will finally be speaking the same language on this subject. Because this standard extends the specification of the performance of an HPGe detector, there is little history available for comparison and thus no means of determining a "good" value. To develop a history, the background spectra of 500 low-background ORTEC HPGe detectors were counted in similar low-background shields. These detectors were in a variety of mechanical cryostat and endcap configurations. The continuum background is a function of energy and detector size/configuration. The peak area for the peak energies listed in the standard is a function of detector size and configuration. The results thus give practical guidance for obtaining the most appropriate low-background detector for a specific measurement problem.

  4. Direct reconstruction of the source intensity distribution of a clinical linear accelerator using a maximum likelihood expectation maximization algorithm.

    Science.gov (United States)

    Papaconstadopoulos, P; Levesque, I R; Maglieri, R; Seuntjens, J

    2016-02-07

    Direct determination of the source intensity distribution of clinical linear accelerators is still a challenging problem for small field beam modeling. Current techniques most often involve special equipment and are difficult to implement in the clinic. In this work we present a maximum-likelihood expectation-maximization (MLEM) approach to the source reconstruction problem utilizing small fields and a simple experimental set-up. The MLEM algorithm iteratively ray-traces photons from the source plane to the exit plane and extracts corrections based on photon fluence profile measurements. The photon fluence profiles were determined by dose profile film measurements in air using a high density thin foil as build-up material and an appropriate point spread function (PSF). The effect of other beam parameters and scatter sources was minimized by using the smallest field size ([Formula: see text] cm²). The source occlusion effect was reproduced by estimating the position of the collimating jaws during this process. The method was first benchmarked against simulations for a range of typical accelerator source sizes. The sources were reconstructed with an accuracy better than 0.12 mm in the full width at half maximum (FWHM) relative to the respective electron sources incident on the target. The estimated jaw positions agreed within 0.2 mm with the expected values. The reconstruction technique was also tested against measurements on a Varian Novalis Tx linear accelerator and compared to a previously commissioned Monte Carlo model. The reconstructed FWHM of the source agreed within 0.03 mm and 0.11 mm with the commissioned electron source in the crossplane and inplane orientations, respectively. The impact of the jaw positioning, experimental and PSF uncertainties on the reconstructed source distribution was evaluated, with the former presenting the dominant effect.

  5. Self-reported maternal expectations and child-rearing practices: Disentangling the associations with ethnicity, immigration, and educational background

    OpenAIRE

    Durgel, Elif S.; Van de Vijver, Fons J.R.; Yagmurlu, Bilge

    2013-01-01

    This study aimed at: (1) disentangling the associations between ethnicity, immigration, educational background, and mothers’ developmental expectations and (self-reported) child-rearing practices; and (2) identifying the cross-cultural differences and similarities in developmental expectations and child-rearing practices. Participants were 111 Dutch and 111 Turkish immigrant mothers in the Netherlands, and 242 Turkish mothers living in Turkey. Dutch and higher-educated mothers had a ...

  6. Family Background, Students' Academic Self-Efficacy, and Students' Career and Life Success Expectations

    Science.gov (United States)

    Kim, Mihyeon

    2014-01-01

    This study examined the relationship of family background on students' academic self-efficacy and the impact of students' self-efficacy on their career and life success expectations. The study used the national dataset of the Educational Longitudinal Study of 2002 (ELS: 2002), funded by the U.S. Department of Education. Based on a path…

  7. Extragalactic Background Light expected from photon-photon absorption on spectra of distant Active Galactic Nuclei

    International Nuclear Information System (INIS)

    Sinitsyna, V. G.; Sinitsyna, V. Y.

    2013-01-01

    Extragalactic background radiation blocks the propagation of TeV gamma-rays over large distances by producing e+e− pairs. As a result, the primary spectrum of a gamma-ray source is changed, depending on the spectrum of the background light, so hard spectra of Active Galactic Nuclei with high redshifts allow the determination of an EBL spectrum. The redshifts of SHALON TeV gamma-ray sources range from 0.018 to 1.375; their spectra are resolved at energies from 800 GeV to 30 TeV. The spectral energy distribution of the EBL constrained from observations of Mkn421 (z=0.031), Mkn501 (z=0.034), Mkn180 (z=0.046), OJ287 (z=0.306), 3c454.3 (z=0.859) and 1739+522 (z=1.375), together with models and measurements, is presented. (authors)

  8. Maximum Expected Wall Heat Flux and Maximum Pressure After Sudden Loss of Vacuum Insulation on the Stratospheric Observatory for Infrared Astronomy (SOFIA) Liquid Helium (LHe) Dewars

    Science.gov (United States)

    Ungar, Eugene K.

    2014-01-01

    The aircraft-based Stratospheric Observatory for Infrared Astronomy (SOFIA) is a platform for multiple infrared observation experiments. The experiments carry sensors cooled to liquid helium (LHe) temperatures. A question arose regarding the heat input and peak pressure that would result from a sudden loss of the dewar vacuum insulation. Owing to concerns about the adequacy of dewar pressure relief in the event of a sudden loss of the dewar vacuum insulation, the SOFIA Program engaged the NASA Engineering and Safety Center (NESC). This report summarizes and assesses the experiments that have been performed to measure the heat flux into LHe dewars following a sudden vacuum insulation failure, describes the physical limits of heat input to the dewar, and provides an NESC recommendation for the wall heat flux that should be used to assess the sudden loss of vacuum insulation case. This report also assesses the methodology used by the SOFIA Program to predict the maximum pressure that would occur following a loss of vacuum event.

  9. Can diligent and extensive mapping of faults provide reliable estimates of the expected maximum earthquakes at these faults? No. (Invited)

    Science.gov (United States)

    Bird, P.

    2010-12-01

    The hope expressed in the title question above can be contradicted in 5 ways, listed below. To summarize, an earthquake rupture can be larger than anticipated either because the fault system has not been fully mapped, or because the rupture is not limited to the pre-existing fault network. 1. Geologic mapping of faults is always incomplete due to four limitations: (a) Map-scale limitation: Faults below a certain (scale-dependent) apparent offset are omitted; (b) Field-time limitation: The most obvious fault(s) get(s) the most attention; (c) Outcrop limitation: You can't map what you can't see; and (d) Lithologic-contrast limitation: Intra-formation faults can be tough to map, so they are often assumed to be minor and omitted. If mapping is incomplete, fault traces may be longer and/or better-connected than we realize. 2. Fault trace “lengths” are unreliable guides to maximum magnitude. Fault networks have multiply-branching, quasi-fractal shapes, so fault “length” may be meaningless. Naming conventions for main strands are unclear, and rarely reviewed. Gaps due to Quaternary alluvial cover may not reflect deeper seismogenic structure. Mapped kinks and other “segment boundary asperities” may be only shallow structures. Also, some recent earthquakes have jumped and linked “separate” faults (Landers, California 1992; Denali, Alaska, 2002) [Wesnousky, 2006; Black, 2008]. 3. Distributed faulting (“eventually occurring everywhere”) is predicted by several simple theories: (a) Viscoelastic stress redistribution in plate/microplate interiors concentrates deviatoric stress upward until they fail by faulting; (b) Unstable triple-junctions (e.g., between 3 strike-slip faults) in 2-D plate theory require new faults to form; and (c) Faults which appear to end (on a geologic map) imply distributed permanent deformation. This means that all fault networks evolve and that even a perfect fault map would be incomplete for future ruptures. 4. A recent attempt

  10. Parent and Staff Expectations for Continuity of Home Practices in the Child Care Setting for Families with Diverse Cultural Backgrounds

    Science.gov (United States)

    De Gioia, Katey

    2009-01-01

    The use of childcare services for very young children (birth to three years) has increased dramatically in the past two decades (Department of Families, Community Services and Indigenous Affairs, 2004). This article investigates the expectations for cultural continuity of caregiving practices (with particular emphasis on sleep and feeding) between…

  11. Vacuum expectation value of the stress tensor in an arbitrary curved background: The covariant point-separation method

    International Nuclear Information System (INIS)

    Christensen, S.M.

    1976-01-01

    A method known as covariant geodesic point separation is developed to calculate the vacuum expectation value of the stress tensor for a massive scalar field in an arbitrary gravitational field. The vacuum expectation value will diverge because the stress-tensor operator is constructed from products of field operators evaluated at the same space-time point. To remedy this problem, one of the field operators is taken to a nearby point. The resultant vacuum expectation value is finite and may be expressed in terms of the Hadamard elementary function. This function is calculated using a curved-space generalization of Schwinger's proper-time method for calculating the Feynman Green's function. The expression for the Hadamard function is written in terms of the biscalar of geodetic interval which gives a measure of the square of the geodesic distance between the separated points. Next, using a covariant expansion in terms of the tangent to the geodesic, the stress tensor may be expanded in powers of the length of the geodesic. Covariant expressions for each divergent term and for certain terms in the finite portion of the vacuum expectation value of the stress tensor are found. The properties, uses, and limitations of the results are discussed
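
    Schematically, the point-separation prescription described above has the following structure (standard notation only; the precise second-order differential operator depends on the curvature coupling and mass and is deliberately left unspecified here):

      % point-split regularization of the stress tensor (schematic)
      G^{(1)}(x,x') = \langle 0 | \{ \phi(x), \phi(x') \} | 0 \rangle            % Hadamard elementary function
      \langle 0 | T_{\mu\nu}(x) | 0 \rangle
          = \tfrac{1}{2} \lim_{x' \to x} \mathcal{D}_{\mu\nu'}\, G^{(1)}(x,x')   % differential operator acting at x and x'
      \sigma(x,x') = \tfrac{1}{2}\, s(x,x')^{2}                                  % biscalar of geodetic interval, s = geodesic distance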

  12. Best Practice Life Expectancy

    DEFF Research Database (Denmark)

    Medford, Anthony

    2017-01-01

    Background: Whereas the rise in human life expectancy has been extensively studied, the evolution of maximum life expectancies, i.e., the rise in best-practice life expectancy in a group of populations, has not been examined to the same extent. The linear rise in best-practice life expectancy has been reported previously by various authors. Though remarkable, this is simply an empirical observation. Objective: We examine best-practice life expectancy more formally by using extreme value theory. Methods: Extreme value distributions are fit to the time series (1900 to 2012) of maximum life expectancies at birth and age 65, for both sexes, using data from the Human Mortality Database and the United Nations. Conclusions: Generalized extreme value distributions offer a theoretically justified way to model best-practice life expectancies. Using this framework one can straightforwardly obtain probability estimates of best-practice life expectancy levels or make projections about future maximum life expectancy.

  13. Evaluation of tomographic image quality of extended and conventional parallel hole collimators using maximum likelihood expectation maximization algorithm by Monte Carlo simulations.

    Science.gov (United States)

    Moslemi, Vahid; Ashoor, Mansour

    2017-10-01

    One of the major problems associated with parallel hole collimators (PCs) is the trade-off between their resolution and sensitivity. To solve this problem, a novel PC - namely, the extended parallel hole collimator (EPC) - was proposed, in which particular trapezoidal denticles were added upon the septa on the side of the detector. In this study, an EPC was designed and its performance was compared with that of two PCs, PC35 and PC41, with a hole size of 1.5 mm and hole lengths of 35 and 41 mm, respectively. The Monte Carlo method was used to calculate important parameters such as resolution, sensitivity, scattering, and penetration ratio. A Jaszczak phantom was also simulated to evaluate the resolution and contrast of tomographic images, which were produced by the EPC6, PC35, and PC41 using the Monte Carlo N-particle version 5 code, and tomographic images were reconstructed by using the maximum likelihood expectation maximization algorithm. Sensitivity of the EPC6 was increased by 20.3% in comparison with that of the PC41 at identical spatial resolution and full width at tenth maximum. Moreover, the penetration and scattering ratio of the EPC6 was 1.2% less than that of the PC41. The simulated phantom images show that the EPC6 increases contrast-resolution and contrast-to-noise ratio compared with those of PC41 and PC35. When compared with PC41 and PC35, EPC6 improved the trade-off between resolution and sensitivity, reduced penetration and scattering ratios, and produced images with higher quality. EPC6 can be used to increase detectability of more details in nuclear medicine images.

  14. STUDY LINKS SOLVING THE MAXIMUM TASK OF LINEAR CONVOLUTION «EXPECTED RETURNS-VARIANCE» AND THE MINIMUM VARIANCE WITH RESTRICTIONS ON RETURNS

    Directory of Open Access Journals (Sweden)

    Maria S. Prokhorova

    2014-01-01

    Full Text Available The article deals with the problem of finding the optimal portfolio of securities using convolutions of the expected portfolio return and the portfolio variance. The value of the risk coefficient at which the problem of maximizing the variance-limited yield is equivalent to maximizing a linear convolution of the «expected returns-variance» criteria is obtained. An automated method for finding the optimal portfolio is proposed, and the results of the study are demonstrated on its basis.
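
    The two formulations discussed — maximizing the linear convolution of expected return and variance versus minimizing variance under a return constraint — can be illustrated with a small numerical sketch. This is generic mean-variance algebra, not the authors' automated method, and no short-sale or other inequality constraints are imposed:

      import numpy as np

      def max_linear_convolution(mu, cov, risk_coeff):
          """Maximize the linear convolution  mu' w - risk_coeff * w' Sigma w  over weights w
          (unconstrained toy version; the first-order condition gives a linear system)."""
          return np.linalg.solve(2.0 * risk_coeff * cov, mu)

      def min_variance_with_target_return(mu, cov, target):
          """Minimize w' Sigma w subject to mu' w = target and sum(w) = 1 (Lagrange/KKT system)."""
          n = mu.size
          kkt = np.zeros((n + 2, n + 2))
          kkt[:n, :n] = 2.0 * cov
          kkt[:n, n], kkt[n, :n] = mu, mu
          kkt[:n, n + 1], kkt[n + 1, :n] = 1.0, 1.0
          rhs = np.concatenate([np.zeros(n), [target, 1.0]])
          return np.linalg.solve(kkt, rhs)[:n]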

  15. Experiments expectations

    OpenAIRE

    Gorini, B; Meschi, E

    2014-01-01

    This paper presents the expectations and the constraints of the experiments relative to the commissioning procedure and the running conditions for the 2015 data-taking period. The views about the various beam parameters for the p-p period, like beam energy, maximum pileup, bunch spacing and luminosity limitation in IP2 and IP8, are discussed. The goals and the constraints of the 2015 physics program are also presented, including the heavy-ion period as well as the special...

  16. Best Practice Life Expectancy:An Extreme value Approach

    OpenAIRE

    Medford, Anthony

    2017-01-01

    Background: Whereas the rise in human life expectancy has been extensively studied, the evolution of maximum life expectancies, i.e., the rise in best-practice life expectancy in a group of populations, has not been examined to the same extent. The linear rise in best-practice life expectancy has been reported previously by various authors. Though remarkable, this is simply an empirical observation. Objective: We examine best-practice life expectancy more formally by using extreme value theory. …

  17. Extragalactic Background Light expected from photon-photon absorption on spectra of Active Galactic Nuclei at distances from z=0.018 to z=1.375

    International Nuclear Information System (INIS)

    Sinitsyna, V Y; Sinitsyna, V G

    2013-01-01

    Extragalactic background radiation blocks the propagation of TeV gamma-rays over large distances by producing e+e− pairs. As a result, the primary spectrum of a gamma-ray source is changed, depending on the spectrum of the background light, so hard spectra of Active Galactic Nuclei with high redshifts allow the determination of an EBL spectrum. The redshifts of SHALON TeV gamma-ray sources range from 0.018 to 1.375; their spectra are resolved at energies from 800 GeV to about 50 TeV. The spectral energy distribution of the EBL constrained from observations of Mkn421, Mkn501, Mkn180, OJ287, 3c454.3 and 1739+522, together with models and measurements, is presented.

  18. Determinants of use and non-use of a web-based communication system in cerebral palsy care: evaluating the association between professionals' system use and their a priori expectancies and background

    Directory of Open Access Journals (Sweden)

    van Harten Wim H

    2011-06-01

    Full Text Available Abstract. Background: Previously we described parents' and professionals' experiences with a web-based communication system in a 6-month pilot in three Dutch cerebral palsy care settings. We found that half of the participating professionals had not used the system, and of those who had used the system one third had used it only once. The present study aimed to evaluate whether professionals' system use was associated with their a priori expectancies and background. Methods: Professionals who had not used the system (n = 54) were compared with professionals who had used the system more than once (n = 46) on the basis of their questionnaire responses before the pilot, their affiliation and the number of patients which they represented in the study. The questionnaire items comprised professionals' expectancies regarding the system's performance and ease of use, as well as the expected time availability and integration into daily care practice. Results: Overall, users had higher a priori expectancies than non-users. System use was associated with expected ease of use (p = .046) and time availability (p = .005): 50% of the users (vs. 31% of the non-users) expected that the system would be easy to use, and 93% of the users (vs. 72% of the non-users) expected that they would be able to reserve a time slot each week for responding to submitted questions. With respect to professionals' affiliation, system use was associated with professionals' institution (p = .003) and discipline (p = .001), with more (para)medical professionals among users (93% vs. 63% among non-users) and more education professionals among non-users (37% vs. 7% among users). In addition, users represented more patients (mean 2, range 1-8) than non-users (mean 1.1, range 1-2) (p = .000). Conclusions: Professionals' system use was associated with expected ease of use and time availability, professionals' affiliation and the number of represented patients, while no association was found with expected …

  19. Population distribution of flexible molecules from maximum entropy analysis using different priors as background information: application to the Φ, Ψ-conformational space of the α-(1-->2)-linked mannose disaccharide present in N- and O-linked glycoproteins.

    Science.gov (United States)

    Säwén, Elin; Massad, Tariq; Landersjö, Clas; Damberg, Peter; Widmalm, Göran

    2010-08-21

    The conformational space available to the flexible molecule α-D-Manp-(1-->2)-α-D-Manp-OMe, a model for the α-(1-->2)-linked mannose disaccharide in N- or O-linked glycoproteins, is determined using experimental data and molecular simulation combined with a maximum entropy approach that leads to a converged population distribution utilizing different input information. A database survey of the Protein Data Bank where structures having the constituent disaccharide were retrieved resulted in an ensemble with >200 structures. Subsequent filtering removed erroneous structures and gave the database (DB) ensemble having three classes of mannose-containing compounds, viz., N- and O-linked structures, and ligands to proteins. A molecular dynamics (MD) simulation of the disaccharide revealed a two-state equilibrium with a major and a minor conformational state, i.e., the MD ensemble. These two different conformation ensembles of the disaccharide were compared to measured experimental spectroscopic data for the molecule in water solution. However, neither of the two populations was compatible with experimental data from optical rotation, NMR ¹H,¹H cross-relaxation rates as well as homo- and heteronuclear ³J couplings. The conformational distributions were subsequently used as background information to generate priors that were used in a maximum entropy analysis. The resulting posteriors, i.e., the population distributions after the application of the maximum entropy analysis, still showed notable deviations that were not anticipated based on the prior information. Therefore, reparameterization of homo- and heteronuclear Karplus relationships for the glycosidic torsion angles Φ and Ψ were carried out in which the importance of electronegative substituents on the coupling pathway was deemed essential, resulting in four derived equations, two ³J(COCC) and two ³J(COCH), being different for the Φ and Ψ torsions, respectively. These Karplus relationships are denoted
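
    The maximum entropy step described above — reweighting a prior ensemble (from MD or the PDB survey) so that ensemble-averaged observables match the experimental data — has a standard exponential-family form; a schematic sketch is shown below. This is an assumed-form illustration only: experimental uncertainties, the J-coupling and NOE forward models, and the reparameterized Karplus equations are not included:

      import numpy as np
      from scipy.optimize import root

      def maxent_posterior(prior, A, targets):
          """Maximum entropy reweighting of a prior conformational ensemble.
          prior: (N,) prior populations; A: (N, M) observables predicted per conformer
          (e.g. Karplus-derived 3J values); targets: (M,) experimental averages.
          The posterior has the form w_i ~ prior_i * exp(-lambda . A_i)."""
          prior = prior / prior.sum()

          def weights(lam):
              logw = np.log(prior) - A @ lam
              logw -= logw.max()                      # numerical stability
              w = np.exp(logw)
              return w / w.sum()

          def constraint_residual(lam):
              return weights(lam) @ A - targets       # zero when averages match experiment

          sol = root(constraint_residual, x0=np.zeros(A.shape[1]))
          return weights(sol.x)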

  20. Best-practice life expectancy: An extreme value approach

    Directory of Open Access Journals (Sweden)

    Anthony Medford

    2017-03-01

    Full Text Available Background: Whereas the rise in human life expectancy has been extensively studied, the evolution of maximum life expectancies, i.e., the rise in best-practice life expectancy in a group of populations, has not been examined to the same extent. The linear rise in best-practice life expectancy has been reported previously by various authors. Though remarkable, this is simply an empirical observation. Objective: We examine best-practice life expectancy more formally by using extreme value theory. Methods: Extreme value distributions are fit to the time series (1900 to 2012 of maximum life expectancies at birth and age 65, for both sexes, using data from the Human Mortality Database and the United Nations. Conclusions: Generalized extreme value distributions offer a theoretically justified way to model best-practice life expectancies. Using this framework one can straightforwardly obtain probability estimates of best-practice life expectancy levels or make projections about future maximum life expectancy. Comments: Our findings may be useful for policymakers and insurance/pension analysts who would like to obtain estimates and probabilities of future maximum life expectancies.
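
    In practice the extreme value fit described above amounts to a few lines with standard tools; the sketch below uses a synthetic stand-in series (made-up numbers in place of the Human Mortality Database / UN data, and a stationary fit that ignores the strong linear trend the paper deals with):

      import numpy as np
      from scipy.stats import genextreme

      # Stand-in series of annual best-practice life expectancy at birth (years), 1900-2012.
      years = np.arange(1900, 2013)
      best_practice_e0 = 45.0 + 0.24 * (years - 1900) + np.random.default_rng(1).normal(0, 0.4, years.size)

      # Fit a generalized extreme value (GEV) distribution to the series of maxima.
      shape, loc, scale = genextreme.fit(best_practice_e0)

      # With the fitted GEV one can attach probabilities to best-practice levels,
      # e.g. the chance of exceeding 88 years under this simple stationary fit.
      p_exceed = genextreme.sf(88.0, shape, loc=loc, scale=scale)
      print(shape, loc, scale, p_exceed)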

  1. Evolutionary Expectations

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover, they are correlated among people who share environments, because these individuals satisfice within their cognitive bounds by using cues in order of validity, as opposed to using cues arbitrarily. Any difference in expectations thereby arises from differences in cognitive ability, because two individuals with identical cognitive bounds will perceive business opportunities identically. In addition, because cues provide information about latent causal structures of the environment, changes in causality must be accompanied by changes in cognitive representations if adaptation is to be maintained. The concept of evolutionary …

  2. Unequal Expectations

    DEFF Research Database (Denmark)

    Karlson, Kristian Bernt

    …the role of causal inference in social science; and it discusses the potential of the findings of the dissertation to inform educational policy. In Chapters II and III, constituting the substantive contribution of the dissertation, I examine the process through which students form expectations for their educational futures. Focusing on the causes rather than the consequences of educational expectations, I argue that students shape their expectations in response to the signals about their academic performance they receive from institutionalized performance indicators in schools. Chapter II considers … of the relation between the self and educational prospects; evaluations that are socially bounded in that students take their family's social position into consideration when forming their educational expectations. One important consequence of this learning process is that equally talented students tend to make …

  3. Great Expectations

    NARCIS (Netherlands)

    Dickens, Charles

    2005-01-01

    One of Dickens's most renowned and enjoyable novels, Great Expectations tells the story of Pip, an orphan boy who wishes to transcend his humble origins and finds himself unexpectedly given the opportunity to live a life of wealth and respectability. Over the course of the tale, in which Pip

  4. Background Material

    DEFF Research Database (Denmark)

    Zandersen, Marianne; Hyytiäinen, Kari; Saraiva, Sofia

    This document serves as background material for the BONUS Pilot Scenario Workshop, which aims to develop harmonised regional storylines of socio-ecological futures in the Baltic Sea region in a collaborative effort together with other BONUS projects and stakeholders.

  5. Community expectations

    International Nuclear Information System (INIS)

    Kraemer, L.

    2004-01-01

    Historically, the relationship between the nuclear generator and the local community has been one of stability and co-operation. However, in more recent times (2000-2003) the nuclear landscape has seen several major issues that directly affect the local nuclear host communities. The association's mandate is to be supportive of the nuclear industry through ongoing dialogue, mutual cooperation and education, and to strengthen community representation with the nuclear industry and politically through networking with other nuclear host communities. As a result of these issues, the Mayors of a number of communities started having informal meetings to discuss the issues at hand and how they affect their constituents. These meetings led to the official formation of the CANHC with representation from: … In Canada it is almost impossible to discuss decommissioning and dismantling of nuclear facilities without also discussing nuclear waste disposal, for reasons that I will soon make clear. I would also like to briefly touch on how and why the expectations of communities may differ by geography and circumstance. (author)

  6. The Cosmic Infrared Background Experiment

    Science.gov (United States)

    Bock, James; Battle, J.; Cooray, A.; Hristov, V.; Kawada, M.; Keating, B.; Lee, D.; Matsumoto, T.; Matsuura, S.; Nam, U.; Renbarger, T.; Sullivan, I.; Tsumura, K.; Wada, T.; Zemcov, M.

    2009-01-01

    We are developing the Cosmic Infrared Background ExpeRiment (CIBER) to search for signatures of first-light galaxy emission in the extragalactic background. The first generation of stars produces characteristic signatures in the near-infrared extragalactic background, including a redshifted Lyman cutoff feature and a characteristic fluctuation power spectrum, that may be detectable with a specialized instrument. CIBER consists of two wide-field cameras to measure the fluctuation power spectrum, and a low-resolution and a narrow-band spectrometer to measure the absolute background. The cameras will search for fluctuations on angular scales from 7 arcseconds to 2 degrees, where the first-light galaxy spatial power spectrum peaks. The cameras have the necessary combination of sensitivity, wide field of view, spatial resolution, and multiple bands to make a definitive measurement. CIBER will determine if the fluctuations reported by Spitzer arise from first-light galaxies. The cameras observe in a single wide field of view, eliminating systematic errors associated with mosaicing. Two bands are chosen to maximize the first-light signal contrast, at 1.6 um near the expected spectral maximum, and at 1.0 um; the combination is a powerful discriminant against fluctuations arising from local sources. We will observe regions of the sky surveyed by Spitzer and Akari. The low-resolution spectrometer will search for the redshifted Lyman cutoff feature in the 0.7 - 1.8 um spectral region. The narrow-band spectrometer will measure the absolute Zodiacal brightness using the scattered 854.2 nm Ca II Fraunhofer line. The spectrometers will test if reports of a diffuse extragalactic background in the 1 - 2 um band continue into the optical, or are caused by an underestimation of the Zodiacal foreground. We report performance of the assembled and tested instrument as we prepare for a first sounding rocket flight in early 2009. CIBER is funded by the NASA/APRA sub-orbital program.

  7. Modeling predicted that tobacco control policies targeted at lower educated will reduce the differences in life expectancy

    NARCIS (Netherlands)

    Bemelmans, W.J.E.; Lenthe, F. van; Hoogenveen, R.; Kunst, A.; Deeg, D.J.H.; Brandt, P.A. van den; Goldbohm, R.A.; Verschuren, W.M.M.

    2006-01-01

    Background and Objective: To estimate the effects of reducing the prevalence of smoking in lower educated groups on educational differences in life expectancy. Methods: A dynamic Markov-type multistate transition model estimated the effects on life expectancy of two scenarios. A "maximum scenario"

  8. Background radiation

    International Nuclear Information System (INIS)

    Arnott, D.

    1985-01-01

    The effects of background radiation, whether natural or caused by man's activities, are discussed. The known biological effects of radiation in causing cancers or genetic mutations are explained. The statement that there is a threshold below which there is no risk is examined critically. (U.K.)

  9. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.
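
    For orientation, the classic small-parsimony computation on a fixed tree (Fitch's algorithm) is sketched below. Note this is only background: the paper is about approximating the much harder problems in which the tree itself is unknown, via Steiner-tree formulations:

      def fitch_parsimony(tree, leaf_states):
          """Fitch's small-parsimony count for one character on a fixed rooted binary tree.
          tree: nested tuples of leaf names, e.g. (("A", "B"), ("C", "D"));
          leaf_states: dict mapping leaf name -> character state.
          Returns (candidate state set at the root, number of changes required)."""
          if isinstance(tree, str):                       # leaf node
              return {leaf_states[tree]}, 0
          left, right = tree
          left_set, left_cost = fitch_parsimony(left, leaf_states)
          right_set, right_cost = fitch_parsimony(right, leaf_states)
          common = left_set & right_set
          if common:
              return common, left_cost + right_cost
          return left_set | right_set, left_cost + right_cost + 1

      # usage: states, changes = fitch_parsimony((("A", "B"), ("C", "D")),
      #                                          {"A": "G", "B": "G", "C": "T", "D": "G"})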

  10. Maximum permissible dose

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    This chapter presents a historic overview of the establishment of radiation guidelines by various national and international agencies. The use of maximum permissible dose and maximum permissible body burden limits to derive working standards is discussed

  11. Spontaneous Radiation Background Calculation for LCLS

    CERN Document Server

    Reiche, Sven

    2004-01-01

    The intensity of undulator radiation, not amplified by the FEL interaction, can be larger than the maximum FEL signal in the case of an X-ray FEL. In the commissioning of a SASE FEL it is essential to extract an amplified signal early in order to diagnose possible misalignment of undulator modules or errors in the undulator field strength. We developed a numerical code to calculate the radiation pattern at any position behind a multi-segmented undulator with arbitrary spacing and field profiles. The output can be run through numerical spatial and frequency filters to model the radiation beam transport and diagnostic. In this presentation we estimate the expected background signal for the FEL diagnostic and at what point along the undulator the FEL signal can be separated from the background. We also discuss how much information on the undulator field and alignment can be obtained from the incoherent radiation signal itself.

  12. The cosmic microwave background

    International Nuclear Information System (INIS)

    Silk, J.

    1991-01-01

    Recent limits on spectral distortions and angular anisotropies in the cosmic microwave background are reviewed. The various backgrounds are described, and the theoretical implications are assessed. Constraints on inflationary cosmology dominated by cold dark matter (CDM) and on open cosmological models dominated by baryonic dark matter (BDM), with, respectively, primordial random-phase scale-invariant curvature fluctuations or non-gaussian isocurvature fluctuations, are described. More exotic theories are addressed, and I conclude with the 'bottom line': what theorists expect experimentalists to be measuring within the next two to three years without having to abandon their most cherished theories. (orig.)

  13. Backgrounds and characteristics of arsonists

    NARCIS (Netherlands)

    Labree, W.; Nijman, H.L.I.; Marle, H.J.C. van; Rassin, E.

    2010-01-01

    The aim of this study was to gain more insight in the backgrounds and characteristics of arsonists. For this, the psychiatric, psychological, personal, and criminal backgrounds of all arsonists (n = 25), sentenced to forced treatment in the maximum security forensic hospital “De Kijvelanden”, were

  14. Maximum Acceleration Recording Circuit

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1995-01-01

    Coarsely digitized maximum levels recorded in blown fuses. Circuit feeds power to accelerometer and makes nonvolatile record of maximum level to which output of accelerometer rises during measurement interval. In comparison with inertia-type single-preset-trip-point mechanical maximum-acceleration-recording devices, circuit weighs less, occupies less space, and records accelerations within narrower bands of uncertainty. In comparison with prior electronic data-acquisition systems designed for same purpose, circuit is simpler, less bulky, consumes less power, and costs less than systems based on recording and analysis of data in magnetic or electronic memory devices. Circuit used, for example, to record accelerations to which commodities subjected during transportation on trucks.

  15. Maximum Quantum Entropy Method

    OpenAIRE

    Sim, Jae-Hoon; Han, Myung Joon

    2018-01-01

    Maximum entropy method for analytic continuation is extended by introducing quantum relative entropy. This new method is formulated in terms of matrix-valued functions and therefore invariant under arbitrary unitary transformation of input matrix. As a result, the continuation of off-diagonal elements becomes straightforward. Without introducing any further ambiguity, the Bayesian probabilistic interpretation is maintained just as in the conventional maximum entropy method. The applications o...

  16. Maximum power demand cost

    International Nuclear Information System (INIS)

    Biondi, L.

    1998-01-01

    The charging for a service is a supplier's remuneration for the expenses incurred in providing it. There are currently two charges for electricity: consumption and maximum demand. While no problem arises about the former, the issue is more complicated for the latter, and the analysis in this article tends to show that the annual charge for maximum demand arbitrarily discriminates among consumer groups, to the disadvantage of some.

  17. Maximum entropy analysis of EGRET data

    DEFF Research Database (Denmark)

    Pohl, M.; Strong, A.W.

    1997-01-01

    EGRET data are usually analysed on the basis of the Maximum-Likelihood method \cite{ma96} in a search for point sources in excess of a model for the background radiation (e.g. \cite{hu97}). This method depends strongly on the quality of the background model, and thus may have high systematic uncertainties in regions of strong and uncertain background like the Galactic Center region. Here we show images of such regions obtained by the quantified Maximum-Entropy method. We also discuss a possible further use of MEM in the analysis of problematic regions of the sky.

  18. The Cosmic Microwave Background

    Directory of Open Access Journals (Sweden)

    Jones Aled

    1998-01-01

    Full Text Available We present a brief review of current theory and observations of the cosmic microwave background (CMB). New predictions for cosmological defect theories and an overview of the inflationary theory are discussed. Recent results from various observations of the anisotropies of the microwave background are described and a summary of the proposed experiments is presented. A new analysis technique based on Bayesian statistics that can be used to reconstruct the underlying sky fluctuations is summarised. Current CMB data is used to set some preliminary constraints on the values of the fundamental cosmological parameters $\Omega$ and $H_0$ using the maximum likelihood technique. In addition, secondary anisotropies due to the Sunyaev-Zel'dovich effect are described.

  19. Maximum likely scale estimation

    DEFF Research Database (Denmark)

    Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo

    2005-01-01

    A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or ...

  20. Robust Maximum Association Estimators

    NARCIS (Netherlands)

    A. Alfons (Andreas); C. Croux (Christophe); P. Filzmoser (Peter)

    2017-01-01

    The maximum association between two multivariate variables X and Y is defined as the maximal value that a bivariate association measure between one-dimensional projections αX and αY can attain. Taking the Pearson correlation as projection index results in the first canonical correlation

  1. Maximum power point tracking

    International Nuclear Information System (INIS)

    Enslin, J.H.R.

    1990-01-01

    A well engineered renewable remote energy system, utilizing the principle of Maximum Power Point Tracking, can be more cost effective, has a higher reliability and can improve the quality of life in remote areas. This paper reports that a high-efficiency power electronic converter, for converting the output voltage of a solar panel, or wind generator, to the required DC battery bus voltage has been realized. The converter is controlled to track the maximum power point of the input source under varying input and output parameters. Maximum power point tracking for relatively small systems is achieved by maximization of the output current in a battery charging regulator, using an optimized hill-climbing, inexpensive microprocessor based algorithm. Through practical field measurements it is shown that a minimum input source saving of 15% on 3-5 kWh/day systems can easily be achieved. A total cost saving of at least 10-15% on the capital cost of these systems is achievable for relatively small rating Remote Area Power Supply systems. The advantages at larger temperature variations and larger power rated systems are much higher. Other advantages include optimal sizing and system monitoring and control.
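    The hill-climbing idea above is easy to sketch: perturb the operating point, keep the perturbation direction while power rises, and reverse it when power falls. The following is a minimal perturb-and-observe sketch on a toy panel curve, not the paper's current-maximizing battery-charging regulator; the I-V model, step size, and starting voltage are illustrative assumptions.

```python
# Minimal perturb-and-observe (hill-climbing) MPPT sketch.
# The PV panel model and step size are illustrative assumptions,
# not the controller described in the paper.

def panel_current(voltage, i_sc=5.0, v_oc=22.0):
    """Toy PV I-V curve: roughly constant current that collapses near V_oc."""
    if voltage >= v_oc:
        return 0.0
    return i_sc * (1.0 - (voltage / v_oc) ** 8)

def track_maximum_power_point(v_start=10.0, step=0.2, iterations=200):
    """Perturb the operating voltage; keep moving in the direction that raises power."""
    v = v_start
    direction = +1.0
    last_power = v * panel_current(v)
    for _ in range(iterations):
        v += direction * step
        power = v * panel_current(v)
        if power < last_power:      # power dropped: reverse the perturbation
            direction = -direction
        last_power = power
    return v, last_power

if __name__ == "__main__":
    v_mpp, p_mpp = track_maximum_power_point()
    print(f"Estimated MPP: {v_mpp:.2f} V, {p_mpp:.2f} W")
```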

  2. Maximum entropy methods

    International Nuclear Information System (INIS)

    Ponman, T.J.

    1984-01-01

    For some years now two different expressions have been in use for maximum entropy image restoration and there has been some controversy over which one is appropriate for a given problem. Here two further entropies are presented and it is argued that there is no single correct algorithm. The properties of the four different methods are compared using simple 1D simulations with a view to showing how they can be used together to gain as much information as possible about the original object. (orig.)

  3. The last glacial maximum

    Science.gov (United States)

    Clark, P.U.; Dyke, A.S.; Shakun, J.D.; Carlson, A.E.; Clark, J.; Wohlfarth, B.; Mitrovica, J.X.; Hostetler, S.W.; McCabe, A.M.

    2009-01-01

    We used 5704 14C, 10Be, and 3He ages that span the interval from 10,000 to 50,000 years ago (10 to 50 ka) to constrain the timing of the Last Glacial Maximum (LGM) in terms of global ice-sheet and mountain-glacier extent. Growth of the ice sheets to their maximum positions occurred between 33.0 and 26.5 ka in response to climate forcing from decreases in northern summer insolation, tropical Pacific sea surface temperatures, and atmospheric CO2. Nearly all ice sheets were at their LGM positions from 26.5 ka to 19 to 20 ka, corresponding to minima in these forcings. The onset of Northern Hemisphere deglaciation 19 to 20 ka was induced by an increase in northern summer insolation, providing the source for an abrupt rise in sea level. The onset of deglaciation of the West Antarctic Ice Sheet occurred between 14 and 15 ka, consistent with evidence that this was the primary source for an abrupt rise in sea level ~14.5 ka.

  4. Resolution and Efficiency of the ATLAS Muon Drift-Tube Chambers at High Background Rates

    CERN Document Server

    Deile, M.; Horvat, S.; Kortner, O.; Kroha, H.; Manz, A.; Mohrdieck-Mock, S.; Rauscher, F.; Richter, Robert; Staude, A.; Stiller, W.

    2016-01-01

    The resolution and efficiency of a precision drift-tube chamber for the ATLAS muon spectrometer with final read-out electronics was tested at the Gamma Irradiation Facility at CERN in a 100 GeV muon beam and at photon irradiation rates of up to 990 Hz/square cm which corresponds to twice the highest background rate expected in ATLAS. A silicon strip detector telescope was used as external reference in the beam. The pulse-height measurement of the read-out electronics was used to perform time-slewing corrections which lead to an improvement of the average drift-tube resolution from 104 microns to 82 microns without irradiation and from 128 microns to 108 microns at the maximum expected rate. The measured drift-tube efficiency agrees with the expectation from the dead time of the read-out electronics up to the maximum expected rate.

  5. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
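    The Mean Energy Model mentioned above has a concrete numerical counterpart: among all distributions on a finite alphabet with a prescribed mean "energy", the entropy maximizer is a Gibbs (exponential-family) distribution whose Lagrange multiplier can be found by one-dimensional root finding. A minimal sketch follows; the energy values and target mean are illustrative assumptions, not taken from the paper.

```python
# Maximum entropy distribution on a finite alphabet subject to a mean-"energy"
# constraint: p_i proportional to exp(-beta * E_i), with beta chosen so that
# the expected energy matches the target. Energies and target are assumptions.
import math

def gibbs(beta, energies):
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

def mean_energy(beta, energies):
    p = gibbs(beta, energies)
    return sum(pi * e for pi, e in zip(p, energies))

def solve_beta(energies, target, lo=-50.0, hi=50.0, tol=1e-10):
    """Bisection: the mean energy is monotonically decreasing in beta."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(mid, energies) > target:
            lo = mid        # beta too small, mean energy still too high
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

energies = [1.0, 2.0, 3.0, 4.0]   # assumed "energy" of each symbol
target = 1.8                      # assumed observed mean energy
beta = solve_beta(energies, target)
p = gibbs(beta, energies)
entropy = -sum(pi * math.log(pi) for pi in p)
print(beta, p, entropy)
```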

  6. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility.

  7. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1988-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. We review the need for such methods in data analysis and show, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. We conclude with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab

  8. Solar maximum observatory

    International Nuclear Information System (INIS)

    Rust, D.M.

    1984-01-01

    The successful retrieval and repair of the Solar Maximum Mission (SMM) satellite by Shuttle astronauts in April 1984 permitted continuance of solar flare observations that began in 1980. The SMM carries a soft X ray polychromator, gamma ray, UV and hard X ray imaging spectrometers, a coronagraph/polarimeter and particle counters. The data gathered thus far indicated that electrical potentials of 25 MeV develop in flares within 2 sec of onset. X ray data show that flares are composed of compressed magnetic loops that have come too close together. Other data have been taken on mass ejection, impacts of electron beams and conduction fronts with the chromosphere and changes in the solar radiant flux due to sunspots. 13 references

  9. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1989-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. The author reviews the need for such methods in data analysis and shows, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. He concludes with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab

  10. Functional Maximum Autocorrelation Factors

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Nielsen, Allan Aasbjerg

    2005-01-01

    Purpose: We aim at data where samples of an underlying function are observed in a spatial or temporal layout. Examples of underlying functions are reflectance spectra and biological shapes. We apply functional models based on smoothing splines and generalize the functional PCA of [ramsay97] to functional maximum autocorrelation factors (MAF) [switzer85, larsen2001d]. We apply the method to biological shapes as well as reflectance spectra. Methods: MAF seeks linear combinations of the original variables that maximize autocorrelation between neighbouring observations. Results: Functional MAF outperforms functional PCA in concentrating the 'interesting' spectra/shape variation in one end of the eigenvalue spectrum and allows for easier interpretation of effects. Conclusions: Functional MAF analysis is a useful method for extracting low-dimensional models of temporally or spatially correlated data.
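    Numerically, plain (non-functional) MAF reduces to a generalized eigenproblem between the covariance of first differences and the covariance of the data: directions whose difference-variance is small relative to their total variance have high autocorrelation (autocorrelation = 1 - lambda/2). The sketch below illustrates that multivariate version, not the smoothing-spline functional MAF of the paper; the simulated data are an assumption.

```python
# Plain (non-functional) maximum autocorrelation factors: a sketch.
# Rows of X are observations ordered along time or space; columns are variables.
import numpy as np

def maf(X):
    """Return MAF directions sorted from highest to lowest autocorrelation."""
    Xc = X - X.mean(axis=0)
    D = np.diff(Xc, axis=0)                  # first differences along the ordering
    cov_x = np.cov(Xc, rowvar=False)
    cov_d = np.cov(D, rowvar=False)
    # Solve cov_d w = lambda cov_x w; the autocorrelation of w'X is 1 - lambda/2,
    # so small generalized eigenvalues correspond to smooth, autocorrelated factors.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(cov_x, cov_d))
    order = np.argsort(eigvals.real)         # smallest lambda first = highest autocorrelation
    return eigvecs.real[:, order], 1.0 - eigvals.real[order] / 2.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 300)
    smooth = np.sin(2 * np.pi * t)           # slowly varying underlying signal
    noise = rng.normal(size=(300, 3))
    X = np.column_stack([smooth + 0.1 * noise[:, 0],
                         0.5 * smooth + noise[:, 1],
                         noise[:, 2]])
    W, autocorr = maf(X)
    print("estimated autocorrelations:", np.round(autocorr, 3))
```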

  11. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yunji; Jing, Bing-Yi; Gao, Xin

    2015-01-01

    In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing transitional loss functions are sensitive to the noisy and outlying labels of training samples, because the transitional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with transitional loss functions.
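    The correntropy objective described in the abstract can be written compactly for a linear predictor: maximize a Gaussian-kernel similarity between predictions and (possibly noisy) labels, minus an L2 penalty on the weights. The code below is a generic gradient-ascent illustration of that regularized MCC idea, not the authors' alternating optimization; the kernel width, learning rate, and synthetic data are assumptions.

```python
# Regularized maximum correntropy criterion (MCC) for a linear predictor: a sketch.
# Objective: (1/n) sum_i exp(-(y_i - w.x_i)^2 / (2 sigma^2)) - reg * ||w||^2
# Samples with very large residuals contribute almost nothing to the objective,
# which is what makes the criterion robust to noisy and outlying labels.
import numpy as np

def fit_mcc(X, y, sigma=1.0, reg=0.01, lr=0.5, iters=500):
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(iters):
        r = y - X @ w
        weights = np.exp(-r**2 / (2 * sigma**2))          # per-sample robustness weights
        grad = (X.T @ (weights * r)) / (n * sigma**2) - 2 * reg * w
        w += lr * grad                                     # gradient ascent on the objective
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))
    y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=200)
    y[:10] += 15.0                                         # gross label noise / outliers
    print("MCC fit:      ", fit_mcc(X, y))
    print("least squares:", np.linalg.lstsq(X, y, rcond=None)[0])
```

Running the comparison shows the least-squares weights being pulled toward the corrupted labels while the MCC fit stays near the clean solution, which is the behaviour the abstract describes.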

  12. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan

    2015-02-12

    In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing transitional loss functions are sensitive to the noisy and outlying labels of training samples, because the transitional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with transitional loss functions.

  13. Maximum Credible Incidents

    CERN Document Server

    Strait, J

    2009-01-01

    Following the incident in sector 34, considerable effort has been made to improve the systems for detecting similar faults and to improve the safety systems to limit the damage if a similar incident should occur. Nevertheless, even after the consolidation and repairs are completed, other faults may still occur in the superconducting magnet systems, which could result in damage to the LHC. Such faults include both direct failures of a particular component or system, or an incorrect response to a “normal” upset condition, for example a quench. I will review a range of faults which could be reasonably expected to occur in the superconducting magnet systems, and which could result in substantial damage and down-time to the LHC. I will evaluate the probability and the consequences of such faults, and suggest what mitigations, if any, are possible to protect against each.

  14. Solar maximum mission

    International Nuclear Information System (INIS)

    Ryan, J.

    1981-01-01

    By understanding the sun, astrophysicists hope to expand this knowledge to understanding other stars. To study the sun, NASA launched a satellite on February 14, 1980. The project is named the Solar Maximum Mission (SMM). The satellite conducted detailed observations of the sun in collaboration with other satellites and ground-based optical and radio observations until its failure 10 months into the mission. The main objective of the SMM was to investigate one aspect of solar activity: solar flares. A brief description of the flare mechanism is given. The SMM satellite was valuable in providing information on where and how a solar flare occurs. A sequence of photographs of a solar flare taken from SMM satellite shows how a solar flare develops in a particular layer of the solar atmosphere. Two flares especially suitable for detailed observations by a joint effort occurred on April 30 and May 21 of 1980. These flares and observations of the flares are discussed. Also discussed are significant discoveries made by individual experiments

  15. Expected years ever married

    Directory of Open Access Journals (Sweden)

    Ryohei Mogi

    2018-04-01

    Full Text Available Background: In the second half of the 20th century, remarkable marriage changes were seen: a great proportion of never married population, high average age at first marriage, and large variance in first marriage timing. Although it is theoretically possible to separate these three elements, disentangling them analytically remains a challenge. Objective: This study's goal is to answer the following questions: Which of the three effects, nonmarriage, delayed marriage, or expansion, has the most impact on nuptiality changes? How does the most influential factor differ by time periods, birth cohorts, and countries? Methods: To quantify nuptiality changes over time, we define the measure 'expected years ever married' (EYEM). We illustrate the use of EYEM, looking at time trends in 15 countries (six countries for cohort analysis), and decompose these trends into three components: scale (the changes in the proportion of never married - nonmarriage), location (the changes in timing of first marriage - delayed marriage), and variance (the changes in the standard deviation of first marriage age - expansion). We used population counts by sex, age, and marital status from national statistical offices and the United Nations database. Results: Results show that delayed marriage is the most influential factor on period EYEM's changes, while nonmarriage has recently begun to contribute to the change in North and West Europe and Canada. Period and cohort analysis complement each other. Conclusions: This study introduces a new index of nuptiality and decomposes its change into the contribution of three components: scale, location, and variance. The decomposition steps presented here offer an open possibility for more elaborate parametric marriage models.
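    As a rough illustration of the measure, expected years ever married between two ages can be read as the age-integrated proportion ever married in a synthetic cohort that ignores mortality. The sketch below follows that simplified reading and is not the authors' exact formulas or decomposition; the age range and proportions are invented.

```python
# Simplified "expected years ever married" (EYEM) sketch: integrate the
# age-specific proportion ever married over an age range, assuming a synthetic
# cohort with no mortality. Ages and proportions below are invented examples.
def expected_years_ever_married(ages, prop_ever_married):
    """Trapezoidal integration of the proportion ever married over age."""
    total = 0.0
    for (a0, p0), (a1, p1) in zip(zip(ages, prop_ever_married),
                                  zip(ages[1:], prop_ever_married[1:])):
        total += 0.5 * (p0 + p1) * (a1 - a0)
    return total

ages = [15, 20, 25, 30, 35, 40, 45, 50]
prop = [0.01, 0.10, 0.35, 0.60, 0.72, 0.78, 0.80, 0.81]   # invented example schedule
print(round(expected_years_ever_married(ages, prop), 1),
      "years ever married between ages 15 and 50")
```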

  16. Expecting the unexpected

    DEFF Research Database (Denmark)

    Mcneill, Ilona M.; Dunlop, Patrick D.; Heath, Jonathan B.

    2013-01-01

    People who live in wildfire-prone communities tend to form their own hazard-related expectations, which may influence their willingness to prepare for a fire. Past research has already identified two important expectancy-based factors associated with people's intentions to prepare for a natural......) and measured actual rather than intended preparedness. In addition, we tested the relation between preparedness and two additional threat-related expectations: the expectation that one can rely on an official warning and the expectation of encountering obstacles (e.g., the loss of utilities) during a fire...

  17. Determining health expectancies

    National Research Council Canada - National Science Library

    Robine, Jean-Marie

    2003-01-01

    Contents (excerpt): Introduction (Jean-Marie Robine); 1. Increase in Life Expectancy and Concentration of Ages at Death (France Meslé and Jacques Vallin); 2. Compression of Morbidity ...

  18. Performance appraisal of expectations

    Directory of Open Access Journals (Sweden)

    Russkikh G.A.

    2016-11-01

    Full Text Available This article provides basic concepts to help teachers assess and meet planned student expectations; it describes the functions and elements of expectations, the nature of external and internal assessment, and technology for assessing results, and gives recommendations on how to create diagnostic assignments.

  19. Spiking the expectancy profiles

    DEFF Research Database (Denmark)

    Hansen, Niels Chr.; Loui, Psyche; Vuust, Peter

    Melodic expectations are generated with different degrees of certainty. Given distributions of expectedness ratings for multiple continuations of each context, as obtained with the probe-tone paradigm, this certainty can be quantified in terms of Shannon entropy. Because expectations arise from s...

  20. Credal Networks under Maximum Entropy

    OpenAIRE

    Lukasiewicz, Thomas

    2013-01-01

    We apply the principle of maximum entropy to select a unique joint probability distribution from the set of all joint probability distributions specified by a credal network. In detail, we start by showing that the unique joint distribution of a Bayesian tree coincides with the maximum entropy model of its conditional distributions. This result, however, does not hold anymore for general Bayesian networks. We thus present a new kind of maximum entropy models, which are computed sequentially. ...

  1. Life expectancy and education

    DEFF Research Database (Denmark)

    Hansen, Casper Worm; Strulik, Holger

    2017-01-01

    , we find that US states with higher mortality rates from cardiovascular disease prior to the 1970s experienced greater increases in adult life expectancy and higher education enrollment. Our estimates suggest that a one-standard deviation higher treatment intensity is associated with an increase...... in adult life expectancy of 0.37 years and 0.07–0.15 more years of higher education....

  2. Expected Classification Accuracy

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2005-08-01

    Full Text Available Every time we make a classification based on a test score, we should expect some number of misclassifications. Some examinees whose true ability is within a score range will have observed scores outside of that range. A procedure for providing a classification table of true and expected scores is developed for polytomously scored items under item response theory and applied to state assessment data. A simplified procedure for estimating the table entries is also presented.
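    The idea of an expected classification table can be illustrated with a simplified measurement model: treat each examinee's observed score as approximately normal around the true score with a known conditional standard error, and accumulate the probabilities of falling in each score category into a true-by-observed table. This is only a sketch of the general logic, not the article's IRT-based procedure; the cut score, true scores, and standard errors are invented.

```python
# Expected classification table sketch: observed scores are modeled as
# approximately normal around each true score with a conditional standard
# error (CSEM). Cut score, true scores, and CSEMs below are invented.
from math import erf, sqrt

def normal_cdf(x, mean, sd):
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

def expected_classification_table(true_scores, csems, cut):
    """Accumulate P(observed category | true category) into a 2x2 table."""
    table = {("fail", "fail"): 0.0, ("fail", "pass"): 0.0,
             ("pass", "fail"): 0.0, ("pass", "pass"): 0.0}
    for theta, se in zip(true_scores, csems):
        true_cat = "pass" if theta >= cut else "fail"
        p_obs_fail = normal_cdf(cut, theta, se)       # P(observed score below the cut)
        table[(true_cat, "fail")] += p_obs_fail
        table[(true_cat, "pass")] += 1.0 - p_obs_fail
    return table

true_scores = [480, 495, 505, 520, 540]               # invented true scores
csems = [12, 12, 11, 10, 10]                          # invented conditional SEMs
table = expected_classification_table(true_scores, csems, cut=500)
accuracy = (table[("pass", "pass")] + table[("fail", "fail")]) / len(true_scores)
print(table)
print("expected classification accuracy:", round(accuracy, 3))
```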

  3. Expected utility without utility

    OpenAIRE

    Castagnoli, E.; Licalzi, M.

    1996-01-01

    This paper advances an interpretation of Von Neumann–Morgenstern’s expected utility model for preferences over lotteries which does not require the notion of a cardinal utility over prizes and can be phrased entirely in the language of probability. According to it, the expected utility of a lottery can be read as the probability that this lottery outperforms another given independent lottery. The implications of this interpretation for some topics and models in decision theory are considered....

  4. Weak scale from the maximum entropy principle

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2015-03-01

    The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the $S^3$ universe at the final stage, $S_{rad}$, becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM we can check whether $S_{rad}$ actually becomes maximum at the observed values. In this paper, we regard $S_{rad}$ at the final stage as a function of the weak scale (the Higgs expectation value) $v_h$, and show that it becomes maximum around $v_h = \mathcal{O}(300\,\mathrm{GeV})$ when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings are fixed. Roughly speaking, we find that the weak scale is given by $v_h \sim T_{BBN}^2 / (M_{pl} y_e^5)$, where $y_e$ is the Yukawa coupling of the electron, $T_{BBN}$ is the temperature at which the Big Bang nucleosynthesis starts, and $M_{pl}$ is the Planck mass.

  5. Sex and life expectancy.

    Science.gov (United States)

    Seifarth, Joshua E; McGowan, Cheri L; Milne, Kevin J

    2012-12-01

    A sexual dimorphism in human life expectancy has existed in almost every country for as long as records have been kept. Although human life expectancy has increased each year, females still live longer, on average, than males. Undoubtedly, the reasons for the sex gap in life expectancy are multifaceted, and it has been discussed from both sociological and biological perspectives. However, even if biological factors make up only a small percentage of the determinants of the sex difference in this phenomenon, parity in average life expectancy should not be anticipated. The aim of this review is to highlight biological mechanisms that may underlie the sexual dimorphism in life expectancy. Using PubMed, ISI Web of Knowledge, and Google Scholar, as well as cited and citing reference histories of articles through August 2012, English-language articles were identified, read, and synthesized into categories that could account for biological sex differences in human life expectancy. The examination of biological mechanisms accounting for the female-based advantage in human life expectancy has been an active area of inquiry; however, it is still difficult to prove the relative importance of any 1 factor. Nonetheless, biological differences between the sexes do exist and include differences in genetic and physiological factors such as progressive skewing of X chromosome inactivation, telomere attrition, mitochondrial inheritance, hormonal and cellular responses to stress, immune function, and metabolic substrate handling among others. These factors may account for at least a part of the female advantage in human life expectancy. Despite noted gaps in sex equality, higher body fat percentages and lower physical activity levels globally at all ages, a sex-based gap in life expectancy exists in nearly every country for which data exist. There are several biological mechanisms that may contribute to explaining why females live longer than men on average, but the complexity of the

  6. Maximum Parsimony on Phylogenetic networks

    Science.gov (United States)

    2012-01-01

    Background Phylogenetic networks are generalizations of phylogenetic trees, that are used to model evolutionary events in various contexts. Several different methods and criteria have been introduced for reconstructing phylogenetic trees. Maximum Parsimony is a character-based approach that infers a phylogenetic tree by minimizing the total number of evolutionary steps required to explain a given set of data assigned on the leaves. Exact solutions for optimizing parsimony scores on phylogenetic trees have been introduced in the past. Results In this paper, we define the parsimony score on networks as the sum of the substitution costs along all the edges of the network; and show that certain well-known algorithms that calculate the optimum parsimony score on trees, such as Sankoff and Fitch algorithms extend naturally for networks, barring conflicting assignments at the reticulate vertices. We provide heuristics for finding the optimum parsimony scores on networks. Our algorithms can be applied for any cost matrix that may contain unequal substitution costs of transforming between different characters along different edges of the network. We analyzed this for experimental data on 10 leaves or fewer with at most 2 reticulations and found that for almost all networks, the bounds returned by the heuristics matched with the exhaustively determined optimum parsimony scores. Conclusion The parsimony score we define here does not directly reflect the cost of the best tree in the network that displays the evolution of the character. However, when searching for the most parsimonious network that describes a collection of characters, it becomes necessary to add additional cost considerations to prefer simpler structures, such as trees over networks. The parsimony score on a network that we describe here takes into account the substitution costs along the additional edges incident on each reticulate vertex, in addition to the substitution costs along the other edges which are
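    For the tree case that the network extension builds on, the Fitch algorithm computes the minimum number of substitutions for one character by intersecting or uniting child state sets on a bottom-up pass. The sketch below handles a rooted binary tree with unit substitution costs and invented character states; it does not implement the reticulate-vertex rules or unequal cost matrices discussed in the paper.

```python
# Fitch small-parsimony sketch for one character on a rooted binary tree.
# A node is either a leaf state (string) or a tuple (left_subtree, right_subtree).
# Unit substitution costs; the example tree and states are invented.
def fitch(node):
    """Return (state_set, parsimony_score) for the subtree rooted at node."""
    if isinstance(node, str):                 # leaf: observed character state
        return {node}, 0
    left, right = node
    left_set, left_cost = fitch(left)
    right_set, right_cost = fitch(right)
    common = left_set & right_set
    if common:                                # non-empty intersection: no extra substitution
        return common, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1

# ((A, C), (A, (G, A))) needs 2 substitutions for this character.
tree = (("A", "C"), ("A", ("G", "A")))
states, score = fitch(tree)
print("root state set:", states, "parsimony score:", score)
```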

  7. Anomalous vacuum expectation values

    International Nuclear Information System (INIS)

    Suzuki, H.

    1986-01-01

    The anomalous vacuum expectation value is defined as the expectation value of a quantity that vanishes by means of the field equations. Although this value is expected to vanish in quantum systems, regularization in general produces a finite value of this quantity. Calculation of this anomalous vacuum expectation value can be carried out in the general framework of field theory. The result is derived by subtraction of divergences and by zeta-function regularization. Various anomalies are included in these anomalous vacuum expectation values. This method is useful for deriving not only the conformal, chiral, and gravitational anomalies but also the supercurrent anomaly. The supercurrent anomaly is obtained in the case of N = 1 supersymmetric Yang-Mills theory in four, six, and ten dimensions. The original form of the energy-momentum tensor and the supercurrent have anomalies in their conservation laws. But the modification of these quantities to be equivalent to the original one on-shell causes no anomaly in their conservation laws and gives rise to anomalous traces

  8. Background sources at PEP

    International Nuclear Information System (INIS)

    Lynch, H.; Schwitters, R.F.; Toner, W.T.

    1988-01-01

    Important sources of background for PEP experiments are studied. Background particles originate from high-energy electrons and positrons which have been lost from stable orbits, γ-rays emitted by the primary beams through bremsstrahlung in the residual gas, and synchrotron radiation x-rays. The effect of these processes on the beam lifetime are calculated and estimates of background rates at the interaction region are given. Recommendations for the PEP design, aimed at minimizing background are presented. 7 figs., 4 tabs

  9. Cosmic Microwave Background Timeline

    Science.gov (United States)

    Cosmic Microwave Background Timeline (fragmentary excerpt): 1934 - Richard Tolman shows that blackbody radiation in an expanding universe ...; ... will have a blackbody cosmic microwave background with temperature about 5 K; 1955 - Tigran Shmaonov ...; ... anisotropy in the cosmic microwave background; this strongly supports the big bang model with gravitational ...

  10. Multiperiod Maximum Loss is time unit invariant.

    Science.gov (United States)

    Kovacevic, Raimund M; Breuer, Thomas

    2016-01-01

    Time unit invariance is introduced as an additional requirement for multiperiod risk measures: for a constant portfolio under an i.i.d. risk factor process, the multiperiod risk should equal the one period risk of the aggregated loss, for an appropriate choice of parameters and independent of the portfolio and its distribution. Multiperiod Maximum Loss over a sequence of Kullback-Leibler balls is time unit invariant. This is also the case for the entropic risk measure. On the other hand, multiperiod Value at Risk and multiperiod Expected Shortfall are not time unit invariant.

  11. Performance expectation plan

    Energy Technology Data Exchange (ETDEWEB)

    Ray, P.E.

    1998-09-04

    This document outlines the significant accomplishments of fiscal year 1998 for the Tank Waste Remediation System (TWRS) Project Hanford Management Contract (PHMC) team. Opportunities for improvement to better meet some performance expectations have been identified. The PHMC has performed at an excellent level in administration of leadership, planning, and technical direction. The contractor has met, and made notable improvement in attaining, customer satisfaction in mission execution. This document includes the team's recommendation that the PHMC TWRS Performance Expectation Plan evaluation rating for fiscal year 1998 be an Excellent.

  12. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  13. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  14. New shower maximum trigger for electrons and photons at CDF

    International Nuclear Information System (INIS)

    Amidei, D.; Burkett, K.; Gerdes, D.; Miao, C.; Wolinski, D.

    1994-01-01

    For the 1994 Tevatron collider run, CDF has upgraded the electron and photon trigger hardware to make use of shower position and size information from the central shower maximum detector. For electrons, the upgrade has resulted in a 50% reduction in backgrounds while retaining approximately 90% of the signal. The new trigger also eliminates the background to photon triggers from single-phototube spikes

  15. New shower maximum trigger for electrons and photons at CDF

    International Nuclear Information System (INIS)

    Gerdes, D.

    1994-08-01

    For the 1994 Tevatron collider run, CDF has upgraded the electron and photon trigger hardware to make use of shower position and size information from the central shower maximum detector. For electrons, the upgrade has resulted in a 50% reduction in backgrounds while retaining approximately 90% of the signal. The new trigger also eliminates the background to photon triggers from single-phototube discharge

  16. Gamma-Gompertz life expectancy at birth

    OpenAIRE

    Trifon I. Missov

    2013-01-01

    BACKGROUND: The gamma-Gompertz multiplicative frailty model is the most common parametric model applied to human mortality data at adult and old ages. The resulting life expectancy has been calculated so far only numerically. OBJECTIVE: Properties of the gamma-Gompertz distribution have not been thoroughly studied. The focus of the paper is to shed light onto its first moment or, demographically speaking, characterize life expectancy resulting from a gamma-Gompertz force of mortality. The paper prov...

  17. Behavior, Expectations and Status

    Science.gov (United States)

    Webster, Jr, Murray; Rashotte, Lisa Slattery

    2010-01-01

    We predict effects of behavior patterns and status on performance expectations and group inequality using an integrated theory developed by Fisek, Berger and Norman (1991). We next test those predictions using new experimental techniques we developed to control behavior patterns as independent variables. In a 10-condition experiment, predictions…

  18. Life Expectancy in 2040

    DEFF Research Database (Denmark)

    Canudas-Romo, Vladimir; DuGoff, Eva H; Wu, Albert W.

    2016-01-01

    We use expert clinical and public health opinion to estimate likely changes in the prevention and treatment of important disease conditions and how they will affect future life expectancy. Focus groups were held including clinical and public health faculty with expertise in the six leading causes...

  19. Objective Bayesianism and the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Jon Williamson

    2013-09-01

    Full Text Available Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes’ Theorem.

  20. Spiking the expectancy profiles

    DEFF Research Database (Denmark)

    Hansen, Niels Chr.; Loui, Psyche; Vuust, Peter

    Melodic expectations have long been quantified using expectedness ratings. Motivated by statistical learning and sharper key profiles in musicians, we model musical learning as a process of reducing the relative entropy between listeners' prior expectancy profiles and probability distributions...... of a given musical style or of stimuli used in short-term experiments. Five previous probe-tone experiments with musicians and non-musicians are revisited. Exp. 1-2 used jazz, classical and hymn melodies. Exp. 3-5 collected ratings before and after exposure to 5, 15 or 400 novel melodies generated from...... a finite-state grammar using the Bohlen-Pierce scale. We find group differences in entropy corresponding to degree and relevance of musical training and within-participant decreases after short-term exposure. Thus, whereas inexperienced listeners make high-entropy predictions by default, statistical...

  1. Chinese students' great expectations

    DEFF Research Database (Denmark)

    Thøgersen, Stig

    2013-01-01

    The article focuses on Chinese students' hopes and expectations before leaving to study abroad. The national political environment for their decision to go abroad is shaped by an official narrative of China's transition to a more creative and innovative economy. Students draw on this narrative to...... system, they think of themselves as having a role in the transformation of Chinese attitudes to education and parent-child relations....

  2. Expectancy Theory Modeling

    Science.gov (United States)

    1982-08-01

    accomplish the task, (2) the instrumentality of task performance for job outcomes, and (3) the instrumentality of outcomes for need satisfaction. We ... in this discussion: effort, performance, outcomes, and needs. In order to present briefly the conventional approach to the Vroom models, another ... Presumably, this is the final event in the sequence of effort, performance, outcome, and need satisfaction. The actual research reported in expectancy

  3. Great expectations: large wind turbines

    International Nuclear Information System (INIS)

    De Vries, E.

    2001-01-01

    This article focuses on wind turbine product development, and traces the background to wind turbines from the first generation 1.5 MW machines in 1995-6, plans for the second generation 3-5 MW class turbines to meet the expected boom in offshore wind projects, to the anticipated installation of a 4.5 MW turbine, and offshore wind projects planned for 2000-2002. The switch by the market leader Vestas to variable speed operation in 2000, the new product development and marketing strategy taken by the German Pro + Pro consultancy in their design of a 1.5 MW variable speed pitch control concept, the possible limiting of the size of turbines due to logistical difficulties, opportunities offered by air ships for large turbines, and the commissioning of offshore wind farms are discussed. Details of some 2-5 MW offshore wind turbine design specifications are tabulated

  4. Expectations from the child

    Directory of Open Access Journals (Sweden)

    Erdal Atabek

    2018-05-01

    Full Text Available Societies have passed from the agricultural society to the industrial society, and from the industrial society to the knowledge society, and expectations of children have changed with each of them. In the agricultural society, human labour is based on muscle power, so children are expected to add to the family's work power; having more children, and especially boys, is valued in this community because it increases that work power. In the industrial society, the power of the arm gave way to the power of the machine, and the knowledgeable person is no longer a family elder but a foreman. Childhood became a distinct stage in this period, and it came to be recognized that the child has a development of its own. In the information society, communication and information have never moved as fast as in this period; the widespread use of the Internet and of social networks such as Facebook and Twitter belongs to this period. In this society, families panic to prepare, in their own heads, a future for their children: because the parents think for their children, they decide about the child's life instead of the child making these decisions. This has had a negative impact on children's sense of autonomy and their ability to take responsibility. To change this, parents should train their children in self-control, develop children's impulse control skills, and help children understand their emotions and make decisions by reasoning.

  5. Gamma-Gompertz life expectancy at birth

    Directory of Open Access Journals (Sweden)

    Trifon I. Missov

    2013-02-01

    Full Text Available BACKGROUND: The gamma-Gompertz multiplicative frailty model is the most common parametric model applied to human mortality data at adult and old ages. The resulting life expectancy has been calculated so far only numerically. OBJECTIVE: Properties of the gamma-Gompertz distribution have not been thoroughly studied. The focus of the paper is to shed light onto its first moment or, demographically speaking, characterize life expectancy resulting from a gamma-Gompertz force of mortality. The paper provides an exact formula for gamma-Gompertz life expectancy at birth and a simpler high-accuracy approximation that can be used in practice for computational convenience. In addition, the article compares actual (life-table) to model-based (gamma-Gompertz) life expectancy to assess on aggregate how many years of life expectancy are not captured (or overestimated) by the gamma-Gompertz mortality mechanism. COMMENTS: A closed-form expression for gamma-Gompertz life expectancy at birth contains a special (hypergeometric) function. It aids assessing the impact of gamma-Gompertz parameters on life expectancy values. The paper shows that a high-accuracy approximation can be constructed by assuming an integer value for the shape parameter of the gamma distribution. A historical comparison between model-based and actual life expectancy for Swedish females reveals a gap that decreases to around 2 years from 1950 onwards. Looking at remaining life expectancies at ages 30 and 50, we see this gap almost disappearing.
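    The life expectancy discussed above can be reproduced numerically by integrating the gamma-Gompertz survival function; the paper's closed-form expression with the hypergeometric function should agree with such an integral. The sketch below does the direct integration; the parameter values are illustrative assumptions, not fitted estimates.

```python
# Numerical gamma-Gompertz life expectancy at birth: integrate the population
# survival function S(x) = [1 + (gamma*a/b)*(exp(b*x) - 1)]**(-1/gamma).
# Parameter values are illustrative assumptions, not fitted to any data.
import math
from scipy.integrate import quad

def gamma_gompertz_survival(x, a, b, gamma):
    """Population survival under a Gompertz baseline a*exp(b*x) with gamma frailty of variance gamma."""
    return (1.0 + (gamma * a / b) * (math.exp(b * x) - 1.0)) ** (-1.0 / gamma)

def life_expectancy(a, b, gamma, upper=150.0):
    """e0 as the integral of S(x); the paper's closed form uses a hypergeometric function instead."""
    e0, _ = quad(gamma_gompertz_survival, 0.0, upper, args=(a, b, gamma))
    return e0

# Assumed parameters: baseline level a, senescence rate b, frailty variance gamma.
print(round(life_expectancy(a=1e-4, b=0.1, gamma=0.2), 2), "years")
```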

  6. Maximum Entropy in Drug Discovery

    Directory of Open Access Journals (Sweden)

    Chih-Yuan Tseng

    2014-07-01

    Full Text Available Drug discovery applies multidisciplinary approaches either experimentally, computationally or both ways to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes’ pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.

  7. Optimal background matching camouflage.

    Science.gov (United States)

    Michalis, Constantine; Scott-Samuel, Nicholas E; Gibson, David P; Cuthill, Innes C

    2017-07-12

    Background matching is the most familiar and widespread camouflage strategy: avoiding detection by having a similar colour and pattern to the background. Optimizing background matching is straightforward in a homogeneous environment, or when the habitat has very distinct sub-types and there is divergent selection leading to polymorphism. However, most backgrounds have continuous variation in colour and texture, so what is the best solution? Not all samples of the background are likely to be equally inconspicuous, and laboratory experiments on birds and humans support this view. Theory suggests that the most probable background sample (in the statistical sense), at the size of the prey, would, on average, be the most cryptic. We present an analysis, based on realistic assumptions about low-level vision, that estimates the distribution of background colours and visual textures, and predicts the best camouflage. We present data from a field experiment that tests and supports our predictions, using artificial moth-like targets under bird predation. Additionally, we present analogous data for humans, under tightly controlled viewing conditions, searching for targets on a computer screen. These data show that, in the absence of predator learning, the best single camouflage pattern for heterogeneous backgrounds is the most probable sample. © 2017 The Authors.
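    The central prediction, that the most probable background sample at the prey's size is on average the most cryptic, can be sketched as a density estimate over background patches: describe each patch with a few low-level features and pick the patch whose feature vector has the highest estimated density. This is only a schematic illustration with assumed features and synthetic data, not the authors' low-level vision model.

```python
# "Most probable background sample" sketch: describe patches with simple
# low-level features and pick the one with the highest estimated density.
# The features and synthetic background are assumptions, not the paper's model.
import numpy as np
from scipy.stats import gaussian_kde

def patch_features(patch):
    """Crude low-level description: mean intensity and local contrast."""
    return np.array([patch.mean(), patch.std()])

def most_probable_patch(background, patch_size=8, step=4):
    h, w = background.shape
    patches, feats = [], []
    for i in range(0, h - patch_size + 1, step):
        for j in range(0, w - patch_size + 1, step):
            p = background[i:i + patch_size, j:j + patch_size]
            patches.append(p)
            feats.append(patch_features(p))
    feats = np.array(feats).T                      # shape (n_features, n_patches)
    density = gaussian_kde(feats)(feats)           # estimated density of each patch's features
    return patches[int(np.argmax(density))]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    background = rng.normal(0.5, 0.1, size=(128, 128))   # synthetic textured background
    best = most_probable_patch(background)
    print("chosen patch mean/contrast:", best.mean().round(3), best.std().round(3))
```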

  8. Expected Term Structures

    DEFF Research Database (Denmark)

    Buraschi, Andrea; Piatti, Ilaria; Whelan, Paul

    We construct and study the cross-sectional properties of survey-based bond risk premia and compare them to their traditional statistical counterparts. We document large heterogeneity in skill, identify top forecasters, and learn about the importance of subjective risk premia in long-term bonds...... dynamics. The consensus is not a sufficient statistic of the cross-section of expectations and we propose an alternative real-time aggregate measure of risk premia consistent with Friedman's market selection hypothesis. We then use this measure to evaluate structural models and find support

  9. Referral expectations of radiology

    International Nuclear Information System (INIS)

    Smith, W.L.; Altmaier, E.; Berberoglu, L.; Morris, K.

    1989-01-01

    The expectation of the referring physician are key to developing a successful practice in radiology. Structured interviews with 17 clinicians in both community care and academic practice documented that accuracy of the radiologic report was the single most important factor in clinician satisfaction. Data intercorrelation showed that accuracy of report correlated with frequency of referral (r = .49). Overall satisfaction of the referring physician with radiology correlated with accuracy (r = .69), patient satisfaction (r = .36), and efficiency in archiving (r = .42). These data may be weighted by departmental managers to allocate resources for improving referring physician satisfaction

  10. Agreeing on expectations

    DEFF Research Database (Denmark)

    Nielsen, Christian; Bentsen, Martin Juul

    Commitment and trust are often mentioned as important aspects of creating a perception of reliability between counterparts. In the context of university-industry collaborations (UICs), agreeing on ambitions and expectations is essential to achieving outcomes that are equally valuable to all parties... involved. Despite this, our initial probing indicated that such covenants rarely exist. As such, this paper draws on project management theory and proposes the possibility of structuring assessments of potential partners before university-industry collaborations are brought to life. Our analysis suggests...

  11. Accelerated maximum likelihood parameter estimation for stochastic biochemical systems

    Directory of Open Access Journals (Sweden)

    Daigle Bernie J

    2012-05-01

    Full Text Available Abstract Background: A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, a.k.a. the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence. Results: We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast-polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods
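    Monte Carlo EM itself is easy to demonstrate on a toy missing-data problem: estimating the rate of an exponential lifetime when some observations are right-censored, with the E-step replaced by simulated draws from the truncated distribution. The sketch below is that generic MCEM loop, not the MCEM2 method or a stochastic biochemical model; all values are synthetic.

```python
# Toy Monte Carlo EM: maximum likelihood rate of an exponential lifetime when
# some observations are right-censored at c. The Monte Carlo E-step imputes
# censored lifetimes by sampling from the truncated exponential; the M-step is
# the complete-data MLE. This is a generic illustration, not the paper's MCEM2.
import numpy as np

def mcem_exponential(observed, n_censored, c, n_samples=2000, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    rate = 1.0                                          # crude starting value
    for _ in range(iters):
        # E-step (Monte Carlo): draw censored lifetimes from Exp(rate) truncated to (c, inf);
        # by memorylessness this is c plus a fresh exponential draw.
        imputed = c + rng.exponential(1.0 / rate, size=(n_samples, n_censored))
        expected_total = observed.sum() + imputed.sum(axis=1).mean()
        # M-step: complete-data MLE of an exponential rate.
        rate = (len(observed) + n_censored) / expected_total
    return rate

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_rate, c = 0.5, 3.0
    lifetimes = rng.exponential(1.0 / true_rate, size=500)
    observed = lifetimes[lifetimes <= c]
    n_censored = int((lifetimes > c).sum())
    print("MCEM estimate of the rate:", round(mcem_exponential(observed, n_censored, c), 3))
```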

  12. Backgrounds and characteristics of arsonists.

    Science.gov (United States)

    Labree, Wim; Nijman, Henk; van Marle, Hjalmar; Rassin, Eric

    2010-01-01

    The aim of this study was to gain more insight into the backgrounds and characteristics of arsonists. For this, the psychiatric, psychological, personal, and criminal backgrounds of all arsonists (n=25), sentenced to forced treatment in the maximum security forensic hospital "De Kijvelanden", were compared to the characteristics of a control group of patients (n=50), incarcerated at the same institution for other severe crimes. Apart from DSM-IV Axis I and Axis II disorders, family backgrounds, level of education, treatment history, intelligence (WAIS scores), and PCL-R scores were included in the comparisons. Furthermore, the apparent motives for the arson offences were explored. It was found that, in comparison to the controls, arsonists had more often received psychiatric treatment prior to committing their index offence and more often had a history of severe alcohol abuse. The arsonists turned out to be less likely to suffer from a major psychotic disorder. The two groups did not differ significantly on the other variables, among which were the PCL-R total scores and factor scores. Exploratory analyses, however, did suggest that arsonists may be differentiated from non-arsonists on three items of the PCL-R, namely impulsivity (higher scores), superficial charm (lower scores), and juvenile delinquency (lower scores). Although the number of arsonists with a major psychotic disorder was relatively low (28%), delusional thinking of some form was judged to play a role in causing arson crimes in about half of the cases (52%).

  13. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  14. Gender Roles and Expectations

    Directory of Open Access Journals (Sweden)

    Susana A. Eisenchlas

    2013-09-01

    Full Text Available One consequence of the advent of cyber communication is that increasing numbers of people go online to ask for, obtain, and presumably act upon advice dispensed by unknown peers. Just as advice seekers may not have access to information about the identities, ideologies, and other personal characteristics of advice givers, advice givers are equally ignorant about their interlocutors except for the bits of demographic information that the latter may offer freely. In the present study, that information concerns sex. As the sex of the advice seeker may be the only, or the predominant, contextual variable at hand, it is expected that that identifier will guide advice givers in formulating their advice. The aim of this project is to investigate whether and how the sex of advice givers and receivers affects the type of advice, through the empirical analysis of a corpus of web-based Spanish language forums on personal relationship difficulties. The data revealed that, in the absence of individuating information beyond that implicit in the advice request, internalized gender expectations along the lines of agency and communality are the sources from which advice givers draw to guide their counsel. This is despite the trend in discursive practices used in formulating advice, suggesting greater language convergence across sexes.

  15. ATLAS: Exceeding all expectations

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    “One year ago it would have been impossible for us to guess that the machine and the experiments could achieve so much so quickly”, says Fabiola Gianotti, ATLAS spokesperson. The whole chain – from collision to data analysis – has worked remarkably well in ATLAS.   The first LHC proton run undoubtedly exceeded expectations for the ATLAS experiment. “ATLAS has worked very well since the beginning. Its overall data-taking efficiency is greater than 90%”, says Fabiola Gianotti. “The quality and maturity of the reconstruction and simulation software turned out to be better than we expected for this initial stage of the experiment. The Grid is a great success, and right from the beginning it has allowed members of the collaboration all over the world to participate in the data analysis in an effective and timely manner, and to deliver physics results very quickly”. In just a few months of data taking, ATLAS has observed t...

  16. Maximum stellar iron core mass

    Indian Academy of Sciences (India)

    Vol. 60, No. 3, March 2003, pp. 415-422. Maximum stellar iron core mass. F. W. Giacobbe, Chicago Research Center/American Air Liquide. (Only abstract fragments survive extraction: "... iron core compression due to the weight of non-ferrous matter overlying the iron cores within large ..."; "... thermal equilibrium velocities will tend to be non-relativistic.")

  17. Maximum entropy beam diagnostic tomography

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1985-01-01

    This paper reviews the formalism of maximum entropy beam diagnostic tomography as applied to the Fusion Materials Irradiation Test (FMIT) prototype accelerator. The same formalism has also been used with streak camera data to produce an ultrahigh speed movie of the beam profile of the Experimental Test Accelerator (ETA) at Livermore. 11 refs., 4 figs

  19. A portable storage maximum thermometer

    International Nuclear Information System (INIS)

    Fayart, Gerard.

    1976-01-01

    A clinical thermometer storing the voltage corresponding to the maximum temperature in an analog memory is described. The end of the measurement is indicated by a lamp switching off. The measurement time is shortened by means of a low thermal inertia platinum probe. This portable thermometer is fitted with a cell test and calibration system. [fr]

  20. Neutron spectra unfolding with maximum entropy and maximum likelihood

    International Nuclear Information System (INIS)

    Itoh, Shikoh; Tsunoda, Toshiharu

    1989-01-01

    A new unfolding theory has been established on the basis of the maximum entropy principle and the maximum likelihood method. This theory correctly embodies the Poisson statistics of neutron detection, and always yields a positive solution over the whole energy range. Moreover, the theory unifies the overdetermined and underdetermined problems. For the latter, the ambiguity in assigning a prior probability, i.e. the initial guess in the Bayesian sense, is eliminated by virtue of the principle. An approximate expression of the covariance matrix for the resultant spectra is also presented. An efficient algorithm to solve the nonlinear system, which appears in the present study, has been established. Results of computer simulation showed the effectiveness of the present theory. (author)
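
    To make the flavour of such an unfolding concrete, the sketch below maximizes a Poisson log-likelihood for counts c ≈ R·φ with a relative-entropy (maximum entropy) regularizer toward a default spectrum, enforcing positivity over the whole energy range. It is a generic penalized-likelihood illustration, not the authors' formulation; the response matrix, exposure, and entropy weight are toy assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy setup (all values invented): 8 detector channels, 12 energy bins.
n_det, n_bins = 8, 12
R = rng.uniform(0.1, 1.0, size=(n_det, n_bins))      # toy detector response matrix
phi_true = np.exp(-np.linspace(0.0, 3.0, n_bins))    # toy "true" spectrum
exposure = 50.0                                       # toy exposure factor
counts = rng.poisson(exposure * (R @ phi_true))       # simulated Poisson counts
default = np.full(n_bins, phi_true.mean())            # default (prior) spectrum
alpha = 1.0                                           # entropy weight (tuning choice)

def neg_objective(phi):
    """Negative of: Poisson log-likelihood + alpha * entropy relative to the default."""
    mu = exposure * (R @ phi)
    loglike = np.sum(counts * np.log(mu) - mu)
    entropy = -np.sum(phi * np.log(phi / default) - phi + default)
    return -(loglike + alpha * entropy)

res = minimize(neg_objective, x0=default.copy(), method="L-BFGS-B",
               bounds=[(1e-9, None)] * n_bins)        # positivity over the whole range
print("unfolded spectrum:", np.round(res.x, 3))
```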

  1. Cosmic microwave background radiation

    International Nuclear Information System (INIS)

    Wilson, R.W.

    1979-01-01

    The 20-ft horn-reflector antenna at Bell Laboratories is discussed in detail with emphasis on the 7.35 cm radiometer. The circumstances leading to the detection of the cosmic microwave background radiation are explored

  2. Zambia Country Background Report

    DEFF Research Database (Denmark)

    Hampwaye, Godfrey; Jeppesen, Søren; Kragelund, Peter

    This paper provides background data and general information for the Zambia studies focusing on the local food processing sub-sector and the local suppliers to the mines as part of the SAFIC project (Successful African Firms and Institutional Change).

  3. PENGARUH BACKGROUND MAHASISWA TERHADAP KINERJA AKADEMIK

    Directory of Open Access Journals (Sweden)

    Trianasari Angkawijaya

    2014-09-01

    Full Text Available Abstract: The Effect of Students' Background on Academic Performance. This study examines the effect of background variables on the academic performance of accounting students in a private university in Surabaya. The background variables under study included previous academic performance, prior knowledge of accounting, sex, motivation, preparedness, and expectations. The results show that previous academic performance, motivation, and expectations have positive and significant effects on the students' overall academic performance in accounting, while preparedness affects only the students' performance in management accounting. In contrast, prior knowledge of accounting and sex do not have significant effects on the students' overall academic performance. These findings indicate the importance of previous academic performance as well as motivation and expectations as background variables in current academic performance. Keywords: students' background, academic performance, accounting. Abstrak: The Effect of Students' Background on Academic Performance. This study examines the effect of background variables on the academic performance of accounting students at Universitas Surabaya. The main background variables were previous academic performance, prior accounting knowledge, sex, motivation, preparedness, and expectations. Hypotheses were tested using OLS multiple linear regression with robust standard errors. The results show that previous academic performance, motivation, and expectations have a significant positive effect on overall academic performance, while preparedness has a positive effect only on performance in management accounting. In contrast, prior accounting knowledge and sex have no significant effect on overall academic performance. These findings indicate that previous academic performance, together with motivation and expectations, are background variables

  4. Energy providers: customer expectations

    International Nuclear Information System (INIS)

    Pridham, N.F.

    1997-01-01

    The deregulation of the gas and electric power industries, and how it will impact on customer service and pricing rates was discussed. This paper described the present situation, reviewed core competencies, and outlined future expectations. The bottom line is that major energy consumers are very conscious of energy costs and go to great lengths to keep them under control. At the same time, solutions proposed to reduce energy costs must benefit all classes of consumers, be they industrial, commercial, institutional or residential. Deregulation and competition at an accelerated pace is the most likely answer. This may be forced by external forces such as foreign energy providers who are eager to enter the Canadian energy market. It is also likely that the competition and convergence between gas and electricity is just the beginning, and may well be overshadowed by other deregulated industries as they determine their core competencies

  5. Customer experiences and expectations

    International Nuclear Information System (INIS)

    Morton, C. R.

    1997-01-01

    Customer experiences and expectations from competition and cogeneration in the power industry were reviewed by Charles Morton, Director of Energy at CPC International, who described Casco's decision to get into cogeneration in the early 1990s in three small corn milling plants in Cardinal, London and Port Colborne, Ontario, mainly as a result of the threat of a 40 per cent increase in power prices. He stressed that the cost competitiveness of cogeneration is entirely site-specific, but it is generally more attractive in larger facilities that operate 24 hours a day, where grid power is expensive or unreliable. Because it is reliable, cogeneration holds out the prospect of increased production up-time, as well as offering a hedge against higher energy costs, reducing the company's variable costs when incoming revenues fall short of costs, and providing an additional tool in head-to-head competition.

  6. Macro Expectations, Aggregate Uncertainty, and Expected Term Premia

    DEFF Research Database (Denmark)

    Dick, Christian D.; Schmeling, Maik; Schrimpf, Andreas

    Based on individual expectations from the Survey of Professional Forecasters, we construct a realtime proxy for expected term premium changes on long-term bonds. We empirically investigate the relation of these bond term premium expectations with expectations about key macroeconomic variables as ...

  7. A comparative study of expectant parents ' childbirth expectations.

    Science.gov (United States)

    Kao, Bi-Chin; Gau, Meei-Ling; Wu, Shian-Feng; Kuo, Bih-Jaw; Lee, Tsorng-Yeh

    2004-09-01

    The purpose of this study was to understand childbirth expectations and differences in childbirth expectations among expectant parents. For convenience sampling, 200 couples willing to participate in this study were chosen from two hospitals in central Taiwan. Inclusion criteria were at least 36 weeks of gestation, aged 18 and above, no prenatal complications, and willing to consent to participate in this study. Instruments used to collect data included basic demographic data and the Childbirth Expectations Questionnaire. Findings of the study revealed that (1) five factors were identified by expectant parents regarding childbirth expectations including the caregiving environment, expectation of labor pain, spousal support, control and participation, and medical and nursing support; (2) no general differences were identified in the childbirth expectations between expectant fathers and expectant mothers; and (3) expectant fathers with a higher socioeconomic status and who had received prenatal (childbirth) education had higher childbirth expectations, whereas mothers displayed no differences in demographic characteristics. The study results may help clinical healthcare providers better understand differences in expectations during labor and birth and childbirth expectations by expectant parents in order to improve the medical and nursing system and promote positive childbirth experiences and satisfaction for expectant parents.

  8. Direct maximum parsimony phylogeny reconstruction from genotype data

    OpenAIRE

    Sridhar, Srinath; Lam, Fumei; Blelloch, Guy E; Ravi, R; Schwartz, Russell

    2007-01-01

    Abstract Background Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data more commonly is available in the form of ge...

  9. On Maximum Entropy and Inference

    Directory of Open Access Journals (Sweden)

    Luigi Gresele

    2017-11-01

    Full Text Available Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate a model from data, that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is not set by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method showing its ability to recover the correct model in a few prototype cases and discuss its application on a real dataset.
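
    A minimal, self-contained illustration of maximum entropy model fitting for binary (spin) data is sketched below: fields and pairwise couplings are adjusted by moment matching, with model averages computed by exact enumeration over a handful of spins. This is only the standard maximum likelihood fit of a pairwise model on toy data, not the model-selection procedure discussed in the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
n = 4                                                          # few spins: exact enumeration is feasible
states = np.array(list(itertools.product([-1, 1], repeat=n)))  # all 2^n configurations

# Toy data: independent spins with a bias, standing in for real observations.
data = rng.choice([-1, 1], size=(2000, n), p=[0.4, 0.6])
emp_m = data.mean(axis=0)                                      # empirical magnetizations <s_i>
emp_C = (data.T @ data) / len(data)                            # empirical pair correlations <s_i s_j>

h = np.zeros(n)                                                # fields
J = np.zeros((n, n))                                           # couplings (symmetric, zero diagonal)

def model_moments(h, J):
    """Exact averages under p(s) proportional to exp(h.s + 0.5 * s.J.s)."""
    energies = states @ h + 0.5 * np.einsum("ki,ij,kj->k", states, J, states)
    p = np.exp(energies - energies.max())
    p /= p.sum()
    return p @ states, states.T @ (states * p[:, None])

# Moment matching by gradient ascent on the log-likelihood.
for step in range(2000):
    m, C = model_moments(h, J)
    h += 0.1 * (emp_m - m)
    dJ = 0.1 * (emp_C - C)
    np.fill_diagonal(dJ, 0.0)
    J += dJ

print("fitted fields:", np.round(h, 3))
print("fitted couplings:\n", np.round(J, 3))
```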

  10. Maximum Water Hammer Sensitivity Analysis

    OpenAIRE

    Jalil Emadi; Abbas Solemani

    2011-01-01

    Pressure waves and Water Hammer occur in a pumping system when valves are closed or opened suddenly or in the case of sudden failure of pumps. Determination of maximum water hammer is considered one of the most important technical and economical items of which engineers and designers of pumping stations and conveyance pipelines should take care. Hammer Software is a recent application used to simulate water hammer. The present study focuses on determining significance of ...

  11. Maximum Gene-Support Tree

    Directory of Open Access Journals (Sweden)

    Yunfeng Shan

    2008-01-01

    Full Text Available Genomes and genes diversify during evolution; however, it is unclear to what extent genes still retain the relationship among species. Model species for molecular phylogenetic studies include yeasts and viruses whose genomes were sequenced as well as plants that have the fossil-supported true phylogenetic trees available. In this study, we generated single gene trees of seven yeast species as well as single gene trees of nine baculovirus species using all the orthologous genes among the species compared. Homologous genes among seven known plants were used for validation of the finding. Four algorithms—maximum parsimony (MP), minimum evolution (ME), maximum likelihood (ML), and neighbor-joining (NJ)—were used. Trees were reconstructed before and after weighting the DNA and protein sequence lengths among genes. Rarely can a single gene generate the "true tree" with all four algorithms. However, the most frequent gene tree, termed "maximum gene-support tree" (MGS tree, or WMGS tree for the weighted one), in yeasts, baculoviruses, or plants was consistently found to be the "true tree" among the species. The results provide insights into the overall degree of divergence of orthologous genes of the genomes analyzed and suggest the following: (1) the true tree relationship among the species studied is still maintained by the largest group of orthologous genes; (2) there are usually more orthologous genes with higher similarities between genetically closer species than between genetically more distant ones; and (3) the maximum gene-support tree reflects the phylogenetic relationship among species in comparison.
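
    Operationally, the maximum gene-support tree is the most frequent single-gene topology. The toy tally below assumes the gene trees are already reduced to a canonical Newick form so that identical topologies compare equal as strings; real analyses would rely on phylogenetic software for topology comparison, and the trees listed are purely hypothetical.

```python
from collections import Counter

# Hypothetical single-gene trees, assumed already written in a canonical Newick form
# (consistent leaf ordering), so identical topologies are identical strings.
gene_trees = [
    "((A,B),(C,D));",
    "((A,B),(C,D));",
    "((A,C),(B,D));",
    "((A,B),(C,D));",
    "((A,D),(B,C));",
]

tally = Counter(gene_trees)
mgs_tree, support = tally.most_common(1)[0]
print(f"maximum gene-support tree: {mgs_tree} (supported by {support} of {len(gene_trees)} genes)")
```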

  12. LCLS Maximum Credible Beam Power

    International Nuclear Information System (INIS)

    Clendenin, J.

    2005-01-01

    The maximum credible beam power is defined as the highest credible average beam power that the accelerator can deliver to the point in question, given the laws of physics, the beam line design, and assuming all protection devices have failed. For a new accelerator project, the official maximum credible beam power is determined by project staff in consultation with the Radiation Physics Department, after examining the arguments and evidence presented by the appropriate accelerator physicist(s) and beam line engineers. The definitive parameter becomes part of the project's safety envelope. This technical note will first review the studies that were done for the Gun Test Facility (GTF) at SSRL, where a photoinjector similar to the one proposed for the LCLS is being tested. In Section 3 the maximum charge out of the gun for a single rf pulse is calculated. In Section 4, PARMELA simulations are used to track the beam from the gun to the end of the photoinjector. Finally in Section 5 the beam through the matching section and injected into Linac-1 is discussed
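
    Once the bounding numbers are agreed, the maximum credible average beam power is a simple product of charge per pulse, beam energy, and repetition rate. The sketch below performs that arithmetic with placeholder values; they are not the official LCLS safety-envelope figures.

```python
# Illustrative calculation of a maximum credible average beam power.
# All numbers are placeholders, not the official LCLS safety-envelope values.
max_charge_per_pulse_nC = 2.0        # assumed maximum charge out of the gun per RF pulse
beam_energy_GeV = 14.0               # assumed beam energy at the point of interest
rep_rate_Hz = 120.0                  # assumed maximum repetition rate

# Energy per pulse [J] = charge [C] * kinetic energy per unit charge [V] (E in eV per electron).
energy_per_pulse_J = (max_charge_per_pulse_nC * 1e-9) * (beam_energy_GeV * 1e9)
max_credible_power_W = energy_per_pulse_J * rep_rate_Hz
print(f"maximum credible average beam power = {max_credible_power_W:.1f} W")
```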

  13. Expectations from ethics

    International Nuclear Information System (INIS)

    Fleming, P.

    2008-01-01

    Prof. Patricia Fleming, centred her presentation on ethical expectations in regulating safety for future generations. The challenge is to find a just solution, one that provides for a defensible approach to inter-generational equity. The question on equity is about whether we are permitted to treat generations differently and to still meet the demands of justice. And the question must be asked regarding these differences: 'in what ways do they make a moral difference?' She asked the question regarding the exact meaning of the ethical principle 'Radioactive waste shall be managed in such a way that predicted impacts on the health of future generations will not be greater than relevant levels of impact that are acceptable today'. Some countries have proposed different standards for different time periods, either implicitly or explicitly. In doing so, have they preserved our standards of justice or have they abandoned them? Prof. Fleming identified six points to provide with some moral maps which might be used to negotiate our way to a just solution to the disposal of nuclear waste. (author)

  14. Expected Signal Observability at Future Experiments

    CERN Document Server

    Bartsch, Valeria

    2005-01-01

    Several methods to quantify the "significance" of an expected signal at future experiments have been used or suggested in the literature. In this note, comparisons are presented with a method based on the likelihood ratio of the "background hypothesis" and the "signal-plus-background hypothesis". A large number of Monte Carlo experiments are performed to investigate the properties of the various methods and to check whether the probability of a background fluctuation having produced the claimed significance of the discovery is properly described. In addition, the best possible separation between the two hypotheses should be provided; in other words, the discovery potential of a future experiment should be maximal. Finally, a practical method to apply a likelihood-based definition of the significance is suggested in this note. Signal and background contributions are determined from a likelihood fit based on shapes only, and the probability density distributions of the significance thus determined are found to be o...
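
    A widely used closed-form approximation to a likelihood-ratio significance for a single counting channel is Z = sqrt(2[(s+b)·ln(1+s/b) − s]), shown below next to the naive s/sqrt(b) figure. This is a generic illustration of the likelihood-ratio idea, not necessarily the exact prescription studied in the note, and the signal and background expectations are invented.

```python
import math

def asymptotic_significance(s, b):
    """Likelihood-ratio-based expected significance for one counting channel,
    using the common asymptotic formula Z = sqrt(2*((s+b)*ln(1+s/b) - s))."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Illustrative numbers only: 10 expected signal events on a background of 50.
s, b = 10.0, 50.0
print(f"expected significance: {asymptotic_significance(s, b):.2f} sigma")
print(f"naive s/sqrt(b):       {s / math.sqrt(b):.2f} sigma")
```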

  15. The natural radiation background

    International Nuclear Information System (INIS)

    Duggleby, J.C.

    1982-01-01

    The components of the natural background radiation and their variations are described. Cosmic radiation is a major contributor to the external dose to the human body whilst naturally-occurring radionuclides of primordial and cosmogenic origin contribute to both the external and internal doses, with the primordial radionuclides being the major contributor in both cases. Man has continually modified the radiation dose to which he has been subjected. The two traditional methods of measuring background radiation, ionisation chamber measurements and scintillation counting, are looked at and the prospect of using thermoluminescent dosimetry is considered

  16. Effects of background radiation

    International Nuclear Information System (INIS)

    Knox, E.G.; Stewart, A.M.; Gilman, E.A.; Kneale, G.W.

    1987-01-01

    The primary objective of this investigation is to measure the relationship between exposure to different levels of background gamma radiation in different parts of the country, and different Relative Risks for leukaemias and cancers in children. The investigation is linked to an earlier analysis of the effects of prenatal medical x-rays upon leukaemia and cancer risk; the prior hypothesis on which the background-study was based, is derived from the earlier results. In a third analysis, the authors attempted to measure varying potency of medical x-rays delivered at different stages of gestation and the results supply a link between the other two estimates. (author)

  17. The Cosmic Background Explorer

    Science.gov (United States)

    Gulkis, Samuel; Lubin, Philip M.; Meyer, Stephan S.; Silverberg, Robert F.

    1990-01-01

    The Cosmic Background Explorer (CBE), NASA's cosmological satellite which will observe a radiative relic of the big bang, is discussed. The major questions connected to the big bang theory which may be clarified using the CBE are reviewed. The satellite instruments and experiments are described, including the Differential Microwave Radiometer, which measures the difference between microwave radiation emitted from two points on the sky, the Far-Infrared Absolute Spectrophotometer, which compares the spectrum of radiation from the sky at wavelengths from 100 microns to one cm with that from an internal blackbody, and the Diffuse Infrared Background Experiment, which searches for the radiation from the earliest generation of stars.

  18. Maximum physical capacity testing in cancer patients undergoing chemotherapy

    DEFF Research Database (Denmark)

    Knutsen, L.; Quist, M; Midtgaard, J

    2006-01-01

    BACKGROUND: Over the past few years there has been a growing interest in the field of physical exercise in rehabilitation of cancer patients, leading to requirements for objective maximum physical capacity measurement (maximum oxygen uptake (VO(2max)) and one-repetition maximum (1RM)) to determin...... early in the treatment process. However, the patients were self-referred and thus highly motivated and as such are not necessarily representative of the whole population of cancer patients treated with chemotherapy....... in performing maximum physical capacity tests as these motivated them through self-perceived competitiveness and set a standard that served to encourage peak performance. CONCLUSION: The positive attitudes in this sample towards maximum physical capacity open the possibility of introducing physical testing...

  19. Expectations from Society

    International Nuclear Information System (INIS)

    Blowers, A.

    2008-01-01

    Prof. A. Blowers observed that the social context within which radioactive waste management is considered has evolved over time. The early period, when radioactive waste was a non-issue, was succeeded by a period of intense conflict over solutions. The contemporary context is more consensual, in which solutions are sought that are both technically sound and socially acceptable. Among the major issues is that of inter-generational equity, embraced in the question: how long can or should our responsibility to the future extend? He pointed out the differences in timescales. On the one hand, geo-scientific timescales are very long term, emphasizing the issue of how far into the future it is possible to make predictions about repository safety. By contrast, socio-cultural timescales are much shorter, focusing on the foreseeable future of one or two generations and raising the issue of how far into the future we should be concerned. He listed the primary expectations from society, which are: safety and security, to alleviate undue burdens on future generations, and flexibility, in order to enable future generations to have a stake in decision making. The need to reconcile the two has led to a contemporary emphasis on phased geological disposal incorporating retrievability. However, the long timescales for implementation of disposal provide for sufficient flexibility without the need for retrievability. Future generations would inevitably have a stake in decision making. Prof. A. Blowers pointed out that society is also concerned with participation in decision making for implementation. The key elements for success are: openness and transparency, a staged process, participation, partnership, benefits to enhance the well-being of communities, and a democratic framework for decision making, including the ratification of key decisions and the right for communities to withdraw from the process up to a predetermined point. This approach for decision making may also have

  20. Generic maximum likely scale selection

    DEFF Research Database (Denmark)

    Pedersen, Kim Steenstrup; Loog, Marco; Markussen, Bo

    2007-01-01

    The fundamental problem of local scale selection is addressed by means of a novel principle, which is based on maximum likelihood estimation. The principle is generally applicable to a broad variety of image models and descriptors, and provides a generic scale estimation methodology. The focus in this work is on applying this selection principle under a Brownian image model. This image model provides a simple scale invariant prior for natural images and we provide illustrative examples of the behavior of our scale estimation on such images. In these illustrative examples, estimation is based...

  1. Thermal background noise limitations

    Science.gov (United States)

    Gulkis, S.

    1982-01-01

    Modern detection systems are increasingly limited in sensitivity by the background thermal photons which enter the receiving system. Expressions for the fluctuations of detected thermal radiation are derived. Incoherent and heterodyne detection processes are considered. References to the subject of photon detection statistics are given.

  2. Berkeley Low Background Facility

    International Nuclear Information System (INIS)

    Thomas, K. J.; Norman, E. B.; Smith, A. R.; Poon, A. W. P.; Chan, Y. D.; Lesko, K. T.

    2015-01-01

    The Berkeley Low Background Facility (BLBF) at Lawrence Berkeley National Laboratory (LBNL) in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities; locally within a carefully-constructed, low background laboratory on the surface at LBNL and at the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products; active screening via neutron activation analysis for U,Th, and K as well as a variety of stable isotopes; and neutron flux/beam characterization measurements through the use of monitors. A general overview of the facilities, services, and sensitivities will be presented. Recent activities and upgrades will also be described including an overview of the recently installed counting system at SURF (recently relocated from Oroville, CA in 2014), the installation of a second underground counting station at SURF in 2015, and future plans. The BLBF is open to any users for counting services or collaboration on a wide variety of experiments and projects

  3. Optimal Constellation Design for Maximum Continuous Coverage of Targets Against a Space Background

    Science.gov (United States)

    2012-05-31

    Case B2B: R sin 2γ > h and h > 0. The cutting plane... (Figure residue: Figures 50 and 51, both captioned "Case B2B", illustrate the cutting-plane geometry in the x̂-ẑ and x̂-ŷ planes for the sub-cases |φ| + γ < π/2 and |φ| + γ > π/2; only axis labels and captions survive extraction.)

  4. Extreme Maximum Land Surface Temperatures.

    Science.gov (United States)

    Garratt, J. R.

    1992-09-01

    There are numerous reports in the literature of observations of land surface temperatures. Some of these, almost all made in situ, reveal maximum values in the 50°-70°C range, with a few, made in desert regions, near 80°C. Consideration of a simplified form of the surface energy balance equation, utilizing likely upper values of absorbed shortwave flux (1000 W m⁻²) and screen air temperature (55°C), suggests that surface temperatures in the vicinity of 90°-100°C may occur for dry, darkish soils of low thermal conductivity (0.1-0.2 W m⁻¹ K⁻¹). Numerical simulations confirm this and suggest that temperature gradients in the first few centimeters of soil may reach 0.5°-1°C mm⁻¹ under these extreme conditions. The study bears upon the intrinsic interest of identifying extreme maximum temperatures and yields interesting information regarding the comfort zone of animals (including man).
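
    The quoted order of magnitude can be reproduced from a stripped-down surface energy balance (absorbed shortwave plus downwelling longwave equals emitted longwave plus sensible and ground heat fluxes). The sketch below solves such a balance numerically; the sky emissivity, transfer coefficient, and ground heat flux are rough assumptions, not the parameterization used in the paper.

```python
from scipy.optimize import brentq

SIGMA = 5.67e-8          # Stefan-Boltzmann constant [W m^-2 K^-4]

# Rough, illustrative parameter choices (not the paper's model):
S_abs   = 1000.0         # absorbed shortwave flux [W m^-2], upper value quoted in the abstract
T_air   = 55.0 + 273.15  # screen air temperature [K]
eps_sfc = 0.95           # surface emissivity
eps_sky = 0.75           # effective clear-sky emissivity (assumption)
h_sens  = 10.0           # sensible heat transfer coefficient [W m^-2 K^-1] (light wind, assumption)
G_soil  = 50.0           # ground heat flux into a low-conductivity soil [W m^-2] (assumption)

def residual(T_s):
    """Energy balance residual: gains (shortwave + downwelling longwave)
    minus losses (emitted longwave + sensible heat + ground heat flux)."""
    gains  = S_abs + eps_sky * SIGMA * T_air**4
    losses = eps_sfc * SIGMA * T_s**4 + h_sens * (T_s - T_air) + G_soil
    return gains - losses

T_surface = brentq(residual, T_air, T_air + 100.0)   # root between T_air and T_air + 100 K
print(f"equilibrium surface temperature: {T_surface - 273.15:.0f} C")
```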

  5. Cosmic shear measurement with maximum likelihood and maximum a posteriori inference

    Science.gov (United States)

    Hall, Alex; Taylor, Andy

    2017-06-01

    We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with promising results. We find that the introduction of an intrinsic shape prior can help with mitigation of noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely subdominant. We show how biases propagate to shear estimates, demonstrating in our simple set-up that shear biases can be reduced by orders of magnitude and potentially to within the requirements of planned space-based surveys at mild signal-to-noise ratio. We find that second-order terms can exhibit significant cancellations at low signal-to-noise ratio when Gaussian noise is assumed, which has implications for inferring the performance of shear-measurement algorithms from simplified simulations. We discuss the viability of our point estimators as tools for lensing inference, arguing that they allow for the robust measurement of ellipticity and shear.

  6. Expectations from implementers

    International Nuclear Information System (INIS)

    Biurrun, E.; Zuidema, P.

    2008-01-01

    Enrique Biurrun (DBE) presented the expectations from the implementer. He explained that the implementer needs a framework to successfully develop a repository, which means the definition of requirements and guidance (for repository system development, analysis, licences, etc.) as well as the decision-making process (stepwise approach, roles of different players, etc.). He also needs a reasonable stability of the regulatory system. The regulatory framework should be developed in a clear, reasonable and consistent manner. In the context of the long duration of the project (100 years) there will be technological progress. In that context E. Biurrun asked what is the meaning of best practice. How can one deal with judgmental issues in a step-wise approach? Regulatory criteria and guidance must deal with the repository system, for which an iterative process is necessary where dialogue is needed with the regulator despite the need to maintain his independence. The safety case, which is a periodic documentation of the status of the project, must provide a synthesis of the underlying scientific understanding and evidence and becomes part of the design process through feedback. E. Biurrun pointed out that safety is not calculated or assessed, but designed and built into the repository system (by geological and engineered barriers). He stressed the importance of the operational aspects since the implementer has to build and operate the repository safely. He asked the question: is it 'ethical' to buy 'peace of mind' for some stakeholders at the cost of casualties among the implementer's staff in mining accidents if the repository is left open during a phase of reversibility? The implementer needs dependable criteria, legal security and investment security. He interpreted the 'precautionary principle' as meaning 'do it now'. Long-lasting solutions are very uncertain. Will we have the money and the technology to do it later? He made some reflections regarding the ethical need to

  7. A background of risks

    International Nuclear Information System (INIS)

    Griffiths, R.F.

    1981-01-01

    The subject is reviewed under the headings: introduction (historical and general description of harm, hazards and risk and attempts to define them); expressions of risk (individual risk; fatal accident frequency rate; expressions of risk in terms of deaths suffered per unit of activity; loss of life expectancy; frequency vs consequence lines); comparability of risks. The examples include some references to radiation hazards and reactor accidents. (U.K.)

  8. Making maps of the cosmic microwave background: The MAXIMA example

    Science.gov (United States)

    Stompor, Radek; Balbi, Amedeo; Borrill, Julian D.; Ferreira, Pedro G.; Hanany, Shaul; Jaffe, Andrew H.; Lee, Adrian T.; Oh, Sang; Rabii, Bahman; Richards, Paul L.; Smoot, George F.; Winant, Celeste D.; Wu, Jiun-Huei Proty

    2002-01-01

    This work describes cosmic microwave background (CMB) data analysis algorithms and their implementations, developed to produce a pixelized map of the sky and a corresponding pixel-pixel noise correlation matrix from time ordered data for a CMB mapping experiment. We discuss in turn algorithms for estimating noise properties from the time ordered data, techniques for manipulating the time ordered data, and a number of variants of the maximum likelihood map-making procedure. We pay particular attention to issues pertinent to real CMB data, and present ways of incorporating them within the framework of maximum likelihood map making. Making a map of the sky is shown to be not only an intermediate step rendering an image of the sky, but also an important diagnostic stage, when tests for and/or removal of systematic effects can efficiently be performed. The case under study is the MAXIMA-I data set. However, the methods discussed are expected to be applicable to the analysis of other current and forthcoming CMB experiments.
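
    For a stationary noise model with pointing matrix A, time-ordered data d, and noise covariance N, the maximum likelihood map is the generalized least-squares solution m = (AᵀN⁻¹A)⁻¹AᵀN⁻¹d, with pixel-pixel noise covariance (AᵀN⁻¹A)⁻¹. The toy sketch below uses dense matrices and white noise purely to illustrate the algebra; a realistic pipeline such as the one described would handle correlated noise and use iterative solvers.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_pixels = 2000, 25

# Toy pointing matrix A: each time sample hits exactly one sky pixel.
hit_pixel = rng.integers(0, n_pixels, size=n_samples)
A = np.zeros((n_samples, n_pixels))
A[np.arange(n_samples), hit_pixel] = 1.0

# Toy sky and toy white noise (a real analysis would also model correlated 1/f noise).
sky_true = rng.normal(0.0, 100.0, size=n_pixels)
sigma_n = 50.0
d = A @ sky_true + rng.normal(0.0, sigma_n, size=n_samples)

# Maximum likelihood (GLS) map and its pixel-pixel noise covariance.
N_inv = np.eye(n_samples) / sigma_n**2                 # diagonal here; dense only for clarity
cov_pix = np.linalg.inv(A.T @ N_inv @ A)               # pixel-pixel noise covariance
m_hat = cov_pix @ (A.T @ N_inv @ d)                    # ML map estimate

print("rms map error:", np.sqrt(np.mean((m_hat - sky_true) ** 2)))
```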

  9. [Cosmic Microwave Background (CMB) Anisotropies

    Science.gov (United States)

    Silk, Joseph

    1998-01-01

    One of the main areas of research is the theory of cosmic microwave background (CMB) anisotropies and analysis of CMB data. Using the four-year COBE data we were able to improve existing constraints on global shear and vorticity. We found that, in the flat case (which allows for the greatest anisotropy), (ω/H)₀ < 10⁻⁷, where ω is the vorticity and H is the Hubble constant. This is two orders of magnitude lower than the tightest previous constraint. We have defined a new set of statistics which quantify the amount of non-Gaussianity in small-field cosmic microwave background maps. By looking at the distribution of power around rings in Fourier space, and at the correlations between adjacent rings, one can identify non-Gaussian features which are masked by large-scale Gaussian fluctuations. This may be particularly useful for identifying unresolved localized sources and line-like discontinuities. Levin and collaborators devised a method to determine the global geometry of the universe through observations of patterns in the hot and cold spots of the CMB. We have derived properties of the peaks (maxima) of the CMB anisotropies expected in flat and open CDM models. We present results for angular resolutions ranging from 5 arcmin to 20 arcmin (antenna FWHM), scales that are relevant for the MAP and COBRA/SAMBA space missions and the ground-based interferometer. Results related to galaxy formation and evolution are also discussed.

  10. Social gradient in life expectancy and health expectancy in Denmark

    DEFF Research Database (Denmark)

    Brønnum-Hansen, Henrik; Andersen, Otto; Kjøller, Mette

    2004-01-01

    Health status of a population can be evaluated by health expectancy, expressed as average lifetime in various states of health. The purpose of the study was to compare health expectancy in population groups at high, medium and low educational levels.

  11. Fractal and topological sustainable methods of overcoming expected uncertainty in the radiolocation of low-contrast targets and in the processing of weak multi-dimensional signals on the background of high-intensity noise: A new direction in the statistical decision theory

    Science.gov (United States)

    Potapov, A. A.

    2017-11-01

    The main purpose of this work is to interpret the main directions of radio physics, radio engineering and radiolocation in a “fractal” language, which opens up new approaches and generalizations for future promising radio systems. We introduce a new kind of modern radiolocation: fractal-scaling, or scale-invariant, radiolocation. New topological signatures and methods for detecting low-contrast objects against a high-intensity noise background are presented. This leads to fundamental changes in the structure of radiolocation theory itself and in its mathematical apparatus. The fractal radio systems conception, sampling topology, the global fractal-scaling approach and the fractal paradigm underlie the scientific direction established by the author, for the first time ever, in Russia and worldwide.

  12. System for memorizing maximum values

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1992-08-01

    The invention discloses a system capable of memorizing maximum sensed values. The system includes conditioning circuitry which receives the analog output signal from a sensor transducer. The conditioning circuitry rectifies and filters the analog signal and provides an input signal to a digital driver, which may be either linear or logarithmic. The driver converts the analog signal to discrete digital values, which in turn trigger an output signal on one of a plurality of n driver output lines. The particular output line selected depends on the converted digital value. A microfuse memory device with n segments connects across the driver output lines. Each segment is associated with one driver output line, and includes a microfuse that is blown when a signal appears on the associated driver output line.

  13. Remarks on the maximum luminosity

    Science.gov (United States)

    Cardoso, Vitor; Ikeda, Taishi; Moore, Christopher J.; Yoo, Chul-Moon

    2018-04-01

    The quest for fundamental limitations on physical processes is old and venerable. Here, we investigate the maximum possible power, or luminosity, that any event can produce. We show, via full nonlinear simulations of Einstein's equations, that there exist initial conditions which give rise to arbitrarily large luminosities. However, the requirement that there is no past horizon in the spacetime seems to limit the luminosity to below the Planck value, L_P = c^5/G. Numerical relativity simulations of critical collapse yield the largest luminosities observed to date, ≈ 0.2 L_P. We also present an analytic solution to the Einstein equations which seems to give an unboundedly large luminosity; this will guide future numerical efforts to investigate super-Planckian luminosities.

  14. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
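
    One way to see the idea is to add a plug-in mutual-information term, estimated from a coarse histogram of classifier responses versus labels, to a standard regularized logistic loss and optimize the combined objective numerically. The sketch below does exactly that on toy data; the discretization, term weights, and optimizer are illustrative choices, not the authors' exact objective or algorithm.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Toy two-class data in 2D, plus a bias column.
n = 200
X = np.vstack([rng.normal(-1, 1, size=(n // 2, 2)), rng.normal(+1, 1, size=(n // 2, 2))])
y = np.hstack([np.zeros(n // 2), np.ones(n // 2)])
Xb = np.hstack([X, np.ones((n, 1))])

def mutual_information(scores, labels, n_bins=10):
    """Plug-in MI estimate between binned classifier responses and class labels."""
    edges = np.linspace(scores.min(), scores.max(), n_bins)
    bins = np.digitize(scores, edges)
    joint = np.zeros((n_bins + 2, 2))
    for b, lab in zip(bins, labels.astype(int)):
        joint[b, lab] += 1
    joint /= joint.sum()
    pr = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (pr @ py)[nz])))

def objective(w, lam=0.1, gamma=1.0):
    """Logistic loss + L2 complexity penalty - gamma * MI(response; label)."""
    s = Xb @ w
    loss = np.mean(np.log1p(np.exp(-(2 * y - 1) * s)))
    return loss + lam * np.dot(w, w) - gamma * mutual_information(s, y)

res = minimize(objective, x0=np.zeros(3), method="Nelder-Mead")
accuracy = np.mean((Xb @ res.x > 0) == (y == 1))
print(f"training accuracy of the MI-regularized linear classifier: {accuracy:.2f}")
```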

  15. Scintillation counter, maximum gamma aspect

    International Nuclear Information System (INIS)

    Thumim, A.D.

    1975-01-01

    A scintillation counter, particularly for counting gamma ray photons, includes a massive lead radiation shield surrounding a sample-receiving zone. The shield is disassembleable into a plurality of segments to allow facile installation and removal of a photomultiplier tube assembly, the segments being so constructed as to prevent straight-line access of external radiation through the shield into radiation-responsive areas. Provisions are made for accurately aligning the photomultiplier tube with respect to one or more sample-transmitting bores extending through the shield to the sample receiving zone. A sample elevator, used in transporting samples into the zone, is designed to provide a maximum gamma-receiving aspect to maximize the gamma detecting efficiency. (U.S.)

  16. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin

    2014-01-01

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.

  17. Multiple chronic conditions and life expectancy

    DEFF Research Database (Denmark)

    DuGoff, Eva H; Canudas-Romo, Vladimir; Buttorff, Christine

    2014-01-01

    BACKGROUND: The number of people living with multiple chronic conditions is increasing, but we know little about the impact of multimorbidity on life expectancy. OBJECTIVE: We analyze life expectancy in Medicare beneficiaries by number of chronic conditions. RESEARCH DESIGN: A retrospective cohort study using single-decrement period life tables. SUBJECTS: Medicare fee-for-service beneficiaries (N=1,372,272) aged 67 and older as of January 1, 2008. MEASURES: Our primary outcome measure is life expectancy. We categorize study subjects by sex, race, selected chronic conditions (heart disease, cancer... and increasing numbers of comorbid conditions. CONCLUSIONS: Social Security and Medicare actuaries should account for the growing number of beneficiaries with multiple chronic conditions when determining population projections and trust fund solvency.
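
    The single-decrement period life table computation itself is short: age-specific death probabilities are turned into survivorship, deaths, and person-years, which are then summed into remaining life expectancy. The sketch below runs that mechanics on a made-up mortality schedule starting at age 67; it does not use the study's Medicare data.

```python
import numpy as np

# Made-up annual death probabilities q_x for ages 67, 68, ... (illustrative only).
ages = np.arange(67, 111)
qx = np.clip(0.012 * np.exp(0.085 * (ages - 67)), 0.0, 1.0)   # toy Gompertz-like schedule
qx[-1] = 1.0                                                  # close out the table at the last age

radix = 100_000.0
lx = np.concatenate([[radix], radix * np.cumprod(1.0 - qx)[:-1]])  # survivors at each exact age
dx = lx * qx                                                  # deaths within each age interval
Lx = lx - 0.5 * dx                                            # person-years lived (mid-interval assumption)
Tx = Lx[::-1].cumsum()[::-1]                                  # person-years remaining above each age
ex = Tx / lx                                                  # remaining life expectancy

print(f"toy remaining life expectancy at age 67: {ex[0]:.1f} years")
```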

  18. Ethanol Transportation Backgrounder

    OpenAIRE

    Denicoff, Marina R.

    2007-01-01

    For the first 6 months of 2007, U.S. ethanol production totaled nearly 3 billion gallons—32 percent higher than the same period last year. As of August 29, there were 128 ethanol plants with annual production capacity totaling 6.78 billion gallons, and an additional 85 plants were under construction. U.S. ethanol production capacity is expanding rapidly and is currently expected to exceed 13 billion gallons per year by early 2009, if not sooner. Ethanol demand has increased corn prices and le...

  19. Fertility expectations and residential mobility in Britain

    Directory of Open Access Journals (Sweden)

    John Ermisch

    2016-12-01

    Full Text Available Background: It is plausible that people take into account anticipated changes in family size in choosing where to live. But estimation of the impact of anticipated events on current transitions in an event history framework is challenging because expectations must be measured in some way and, like indicators of past childbearing, expected future childbearing may be endogenous with respect to housing decisions. Objective: The objective of the study is to estimate how expected changes in family size affect residential movement in Great Britain in a way which addresses these challenges. Methods: We use longitudinal data from a mature 18-wave panel survey, the British Household Panel Survey, which incorporates a direct measure of fertility expectations. The statistical methods allow for the potential endogeneity of expectations in our estimation and testing framework. Results: We produce evidence consistent with the idea that past childbearing mainly affects residential mobility through expectations of future childbearing, not directly through the number of children in the household. But there is heterogeneity in response. In particular, fertility expectations have a much greater effect on mobility among women who face lower costs of mobility, such as private tenants. Conclusions: Our estimates indicate that expecting to have a(nother) child in the future increases the probability of moving by about 0.036 on average, relative to an average mobility rate of 0.14 per annum in our sample. Contribution: Our contribution is to incorporate anticipation of future events into an empirical model of residential mobility. We also shed light on how childbearing affects mobility.

  20. Gains in Life Expectancy Associated with Higher Education in Men

    NARCIS (Netherlands)

    Bijwaard, G.E.; van Poppel, F.W.A.; Ekamper, Peter; Lumey, L.H.

    2015-01-01

    Background Many studies show large differences in life expectancy across the range of education, intelligence, and socio-economic status. As educational attainment, intelligence, and socio-economic status are highly interrelated, appropriate methods are required to disentangle their separate

  1. Family Background and Entrepreneurship

    DEFF Research Database (Denmark)

    Lindquist, Matthew J.; Sol, Joeri; Van Praag, Mirjam

    Vast amounts of money are currently being spent on policies aimed at promoting entrepreneurship. The success of such policies, however, rests in part on the assumption that individuals are not ‘born entrepreneurs’. In this paper, we assess the importance of family background and neighborhood effects as determinants of entrepreneurship. We start by estimating sibling correlations in entrepreneurship. We find that between 20 and 50 percent of the variance in different entrepreneurial outcomes is explained by factors that siblings share. The average is 28 percent. Allowing for differential... entrepreneurship does play a large role, as do shared genes.

  2. Malaysia; Background Paper

    OpenAIRE

    International Monetary Fund

    1996-01-01

    This Background Paper on Malaysia examines developments and trends in the labor market since the mid-1980s. The paper describes the changes in the employment structure and the labor force. It reviews wages and productivity trends and their effects on unit labor cost. The paper highlights that Malaysia’s rapid growth, sustained since 1987, has had a major impact on the labor market. The paper outlines the major policy measures to address the labor constraints. It also analyzes Malaysia’s recen...

  3. Maximum entropy and Bayesian methods

    International Nuclear Information System (INIS)

    Smith, C.R.; Erickson, G.J.; Neudorfer, P.O.

    1992-01-01

    Bayesian probability theory and Maximum Entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. The title workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this book. There are tutorial and theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. Contributions contained in this volume present a state-of-the-art overview that will be influential and useful for many years to come

  4. The projected background for the CUORE experiment

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Hickerson, K.P.; Huang, H.Z.; Sakai, M.; Schmidt, J.; Trentalange, S.; Zhu, B.X. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Banks, T.I.; Drobizhev, A.; Freedman, S.J.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S.; Laubenstein, M. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Carbone, L.; Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Pettinacci, V.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Fernandes, G.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V.; Tenconi, M. 
[Universit Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Zaragoza (Spain); Moggi, N. [INFN-Sezione di Bologna, Bologna (Italy); Alma Mater Studiorum-Universita di Bologna, Dipartimento di Scienze per la Qualita della Vita, Bologna (Italy); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (France); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (United States); University of California, Department of Nuclear Engineering, Berkeley, CA (United States); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (United States); Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Wise, T. [Yale University, Department of Physics, New Haven, CT (United States); University of Wisconsin, Department of Physics, Madison, WI (United States); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (United Kingdom); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (United States); Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Alma Mater Studiorum-Universita di Bologna, Dipartimento di Fisica e Astronomia, Bologna (Italy)

    2017-08-15

    The Cryogenic Underground Observatory for Rare Events (CUORE) is designed to search for neutrinoless double beta decay of {sup 130}Te with an array of 988 TeO{sub 2} bolometers operating at temperatures around 10 mK. The experiment is currently being commissioned in Hall A of Laboratori Nazionali del Gran Sasso, Italy. The goal of CUORE is to reach a 90% C.L. exclusion sensitivity on the {sup 130}Te decay half-life of 9 x 10{sup 25} years after 5 years of data taking. The main issue to be addressed to accomplish this aim is the rate of background events in the region of interest, which must not be higher than 10{sup -2} counts/keV/kg/year. We developed a detailed Monte Carlo simulation, based on results from a campaign of material screening, radioassays, and bolometric measurements, to evaluate the expected background. This was used over the years to guide the construction strategies of the experiment and we use it here to project a background model for CUORE. In this paper we report the results of our study and our expectations for the background rate in the energy region where the peak signature of neutrinoless double beta decay of {sup 130}Te is expected. (orig.)

  5. The projected background for the CUORE experiment

    Science.gov (United States)

    Alduino, C.; Alfonso, K.; Artusa, D. R.; Avignone, F. T.; Azzolini, O.; Banks, T. I.; Bari, G.; Beeman, J. W.; Bellini, F.; Benato, G.; Bersani, A.; Biassoni, M.; Branca, A.; Brofferio, C.; Bucci, C.; Camacho, A.; Caminata, A.; Canonica, L.; Cao, X. G.; Capelli, S.; Cappelli, L.; Carbone, L.; Cardani, L.; Carniti, P.; Casali, N.; Cassina, L.; Chiesa, D.; Chott, N.; Clemenza, M.; Copello, S.; Cosmelli, C.; Cremonesi, O.; Creswick, R. J.; Cushman, J. S.; D'Addabbo, A.; Dafinei, I.; Davis, C. J.; Dell'Oro, S.; Deninno, M. M.; Di Domizio, S.; Di Vacri, M. L.; Drobizhev, A.; Fang, D. Q.; Faverzani, M.; Fernandes, G.; Ferri, E.; Ferroni, F.; Fiorini, E.; Franceschi, M. A.; Freedman, S. J.; Fujikawa, B. K.; Giachero, A.; Gironi, L.; Giuliani, A.; Gladstone, L.; Gorla, P.; Gotti, C.; Gutierrez, T. D.; Haller, E. E.; Han, K.; Hansen, E.; Heeger, K. M.; Hennings-Yeomans, R.; Hickerson, K. P.; Huang, H. Z.; Kadel, R.; Keppel, G.; Kolomensky, Yu. G.; Leder, A.; Ligi, C.; Lim, K. E.; Ma, Y. G.; Maino, M.; Marini, L.; Martinez, M.; Maruyama, R. H.; Mei, Y.; Moggi, N.; Morganti, S.; Mosteiro, P. J.; Napolitano, T.; Nastasi, M.; Nones, C.; Norman, E. B.; Novati, V.; Nucciotti, A.; O'Donnell, T.; Ouellet, J. L.; Pagliarone, C. E.; Pallavicini, M.; Palmieri, V.; Pattavina, L.; Pavan, M.; Pessina, G.; Pettinacci, V.; Piperno, G.; Pira, C.; Pirro, S.; Pozzi, S.; Previtali, E.; Rosenfeld, C.; Rusconi, C.; Sakai, M.; Sangiorgio, S.; Santone, D.; Schmidt, B.; Schmidt, J.; Scielzo, N. D.; Singh, V.; Sisti, M.; Smith, A. R.; Taffarello, L.; Tenconi, M.; Terranova, F.; Tomei, C.; Trentalange, S.; Vignati, M.; Wagaarachchi, S. L.; Wang, B. S.; Wang, H. W.; Welliver, B.; Wilson, J.; Winslow, L. A.; Wise, T.; Woodcraft, A.; Zanotti, L.; Zhang, G. Q.; Zhu, B. X.; Zimmermann, S.; Zucchelli, S.; Laubenstein, M.

    2017-08-01

    The Cryogenic Underground Observatory for Rare Events (CUORE) is designed to search for neutrinoless double beta decay of ^{130}Te with an array of 988 TeO_2 bolometers operating at temperatures around 10 mK. The experiment is currently being commissioned in Hall A of Laboratori Nazionali del Gran Sasso, Italy. The goal of CUORE is to reach a 90% C.L. exclusion sensitivity on the ^{130}Te decay half-life of 9 × 10^{25} years after 5 years of data taking. The main issue to be addressed to accomplish this aim is the rate of background events in the region of interest, which must not be higher than 10^{-2} counts/keV/kg/year. We developed a detailed Monte Carlo simulation, based on results from a campaign of material screening, radioassays, and bolometric measurements, to evaluate the expected background. This was used over the years to guide the construction strategies of the experiment and we use it here to project a background model for CUORE. In this paper we report the results of our study and our expectations for the background rate in the energy region where the peak signature of neutrinoless double beta decay of ^{130}Te is expected.

  6. Patient (customer) expectations in hospitals.

    Science.gov (United States)

    Bostan, Sedat; Acuner, Taner; Yilmaz, Gökhan

    2007-06-01

    Patient expectations are one of the determining factors of healthcare service. The purpose of this study was to measure patients' expectations, based on patients' rights. The study was carried out with a Likert-type survey in the Trabzon population. The analyses showed that patients' expectations were high for the factor of receiving information and at an acceptable level for the other factors. Statistically significant associations were found between the expectations of the patients and age, sex, education, health insurance, and family income. According to this study, the current legal regulations set higher standards than the expectations of the patients; the high level of patient satisfaction is therefore interpreted as a consequence of the low level of expectation. It is suggested that educational and public awareness work on patients' rights should be undertaken in order to raise patients' expectations.

  7. Robust estimation of the noise variance from background MR data

    NARCIS (Netherlands)

    Sijbers, J.; Den Dekker, A.J.; Poot, D.; Bos, R.; Verhoye, M.; Van Camp, N.; Van der Linden, A.

    2006-01-01

    In the literature, many methods are available for estimation of the variance of the noise in magnetic resonance (MR) images. A commonly used method, based on the maximum of the background mode of the histogram, is revisited and a new, robust, and easy to use method is presented based on maximum
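
    For a concrete sense of what such background-based estimates look like, the sketch below is an illustrative stand-in (not the estimator of the paper above). It uses the standard fact that signal-free magnitude MR background is approximately Rayleigh distributed, so the histogram mode equals the noise standard deviation and a maximum-likelihood estimate follows from the mean squared value; the function name and bin count are assumptions.

```python
import numpy as np

def sigma_from_background(background_pixels, bins=200):
    """Two simple noise estimates from background-only magnitude data,
    assuming the signal-free background follows a Rayleigh distribution.
    Illustrative sketch only, not the exact estimator of the cited paper."""
    x = np.asarray(background_pixels, dtype=float)

    # Maximum-likelihood estimate: sigma^2 = sum(x^2) / (2N) for Rayleigh data.
    sigma_ml = np.sqrt(np.mean(x**2) / 2.0)

    # Histogram-mode estimate: the mode of a Rayleigh distribution equals sigma.
    counts, edges = np.histogram(x, bins=bins)
    k = np.argmax(counts)
    sigma_mode = 0.5 * (edges[k] + edges[k + 1])

    return sigma_ml, sigma_mode
```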

  8. Gender difference on patients' satisfaction and expectation towards ...

    African Journals Online (AJOL)

    Background: Recognizing patient satisfaction and expectation is considered as important components of assessing quality of care. Aim: The aim of this study was to determine the gender difference on the patient satisfaction with psychiatrists and explore their expectation from physicians to mental health care needs. Design: ...

  9. The development of the Patient Expectations of Shoulder Surgery survey

    NARCIS (Netherlands)

    Koorevaar, Rinco C T; Haanstra, Tsjitske; Van't Riet, Esther; Lambers Heerspink, Okke F O; Bulstra, Sjoerd K

    2017-01-01

    BACKGROUND: Patient satisfaction after a surgical procedure is dependent on meeting preoperative expectations. There is currently no patient expectations survey available for patients undergoing shoulder surgery that is validated, reliable, and easy to use in daily practice. The aim of this study

  10. Maximum entropy principal for transportation

    International Nuclear Information System (INIS)

    Bilich, F.; Da Silva, R.

    2008-01-01

    In this work we deal with modeling of the transportation phenomenon for use in the transportation planning process and policy-impact studies. The model developed is based on the dependence concept, i.e., the notion that the probability of a trip starting at origin i is dependent on the probability of a trip ending at destination j given that the factors (such as travel time, cost, etc.) which affect travel between origin i and destination j assume some specific values. The derivation of the solution of the model employs the maximum entropy principle combining a priori multinomial distribution with a trip utility concept. This model is utilized to forecast trip distributions under a variety of policy changes and scenarios. The dependence coefficients are obtained from a regression equation where the functional form is derived based on conditional probability and perception of factors from experimental psychology. The dependence coefficients encode all the information that was previously encoded in the form of constraints. In addition, the dependence coefficients encode information that cannot be expressed in the form of constraints for practical reasons, namely, computational tractability. The equivalence between the standard formulation (i.e., objective function with constraints) and the dependence formulation (i.e., without constraints) is demonstrated. The parameters of the dependence-based trip-distribution model are estimated, and the model is also validated using commercial air travel data in the U.S. In addition, policy impact analyses (such as allowance of supersonic flights inside the U.S. and user surcharge at noise-impacted airports) on air travel are performed.
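
    As an illustration of the "standard formulation" that the dependence model above is contrasted with, the sketch below solves a doubly constrained maximum-entropy (gravity-type) trip distribution by iterative balancing of the origin and destination factors. The function name, the exponential deterrence function and the parameter beta are assumptions made for this example, not quantities from the paper.

```python
import numpy as np

def gravity_trip_distribution(origins, destinations, cost, beta=0.1, n_iter=100):
    """Doubly-constrained maximum-entropy (gravity) trip distribution,
    solved by iterative balancing: T_ij = A_i*O_i * B_j*D_j * exp(-beta*c_ij)."""
    O = np.asarray(origins, dtype=float)
    D = np.asarray(destinations, dtype=float)
    f = np.exp(-beta * np.asarray(cost, dtype=float))  # deterrence function
    A = np.ones_like(O)
    B = np.ones_like(D)
    for _ in range(n_iter):
        A = 1.0 / (f @ (B * D))      # rebalance so row sums match origin totals
        B = 1.0 / (f.T @ (A * O))    # rebalance so column sums match destination totals
    return (A * O)[:, None] * (B * D)[None, :] * f

# Example with 3 origins, 3 destinations and a symmetric travel-time matrix
T = gravity_trip_distribution([100, 200, 150], [180, 120, 150],
                              cost=[[5, 10, 15], [10, 5, 10], [15, 10, 5]])
print(T.round(1), T.sum(axis=1), T.sum(axis=0))
```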

  11. Macro Expectations, Aggregate Uncertainty, and Expected Term Premia

    DEFF Research Database (Denmark)

    Dick, Christian D.; Schmeling, Maik; Schrimpf, Andreas

    2013-01-01

    as well as aggregate macroeconomic uncertainty at the level of individual forecasters. We find that expected term premia are (i) time-varying and reasonably persistent, (ii) strongly related to expectations about future output growth, and (iii) positively affected by uncertainty about future output growth...... and inflation rates. Expectations about real macroeconomic variables seem to matter more than expectations about nominal factors. Additional findings on term structure factors suggest that the level and slope factor capture information related to uncertainty about real and nominal macroeconomic prospects...

  12. Backgrounded but not peripheral

    DEFF Research Database (Denmark)

    Hovmark, Henrik

    2013-01-01

    In this paper I take a closer look at the use of the CENTRE-PERIPHERY schema in context. I address two specific issues: first, I show how the CENTRE-PERIPHERY schema, encoded in the DDAs, enters into discourses that conceptualize and characterize a local community as both CENTRE and PERIPHERY, i.e. the schema enters into apparently contradictory constructions of the informants’ local home-base and, possibly, of their identity (cf. Hovmark, 2010). Second, I discuss the status and role of the specific linguistic category in question, i.e. the directional adverbs. On the one hand we claim that the DDAs ...; furthermore, the DDAs are backgrounded in discourse. Is it reasonable to claim, rather boldly, that “the informants express their identity in the use of the directional adverb ud ‘out’ etc.”? In the course of this article, however, I suggest that the DDAs in question do contribute to the socio...

  13. OCRWM Backgrounder, January 1987

    International Nuclear Information System (INIS)

    1987-01-01

    The Nuclear Waste Policy Act of 1982 (NWPA) assigns to the US Department of Energy (DOE) responsibility for developing a system to safely and economically transport spent nuclear fuel and high-level radioactive waste from various storage sites to geologic repositories or other facilities that constitute elements of the waste management program. This transportation system will evolve from technologies and capabilities already developed. Shipments of spent fuel to a monitored retrievable storage (MRS) facility could begin as early as 1996 if Congress authorizes its construction. Shipments of spent fuel to a geologic repository are scheduled to begin in 1998. The backgrounder provides an overview of DOE's cask development program. Transportation casks are a major element in the DOE nuclear waste transportation system because they are the primary protection against any potential radiation exposure to the public and transportation workers in the event an accident occurs

  14. Monitored background radiometer

    International Nuclear Information System (INIS)

    Ruel, C.

    1988-01-01

    A monitored background radiometer is described comprising: a thermally conductive housing; low conductivity support means mounted on the housing; a sensing plate mounted on the low conductivity support means and spaced from the housing so as to be thermally insulated from the housing and having an outwardly facing first surface; the sensing plate being disposed relative to the housing to receive direct electromagnetic radiation from sources exterior to the radiometer upon the first surface only; means for controllably heating the sensing plate; first temperature sensitive means to measure the temperature of the housing; and second temperature sensitive means to measure the temperature of the sensing plate, so that the heat flux at the sensing plate may be determined from the temperatures of the housing and sensing plate after calibration of the radiometer by measuring the temperatures of the housing and sensing plate while controllably heating the sensing plate

  15. Unexpected Expectations The Curiosities of a Mathematical Crystal Ball

    CERN Document Server

    Wapner, Leonard M

    2012-01-01

    Unexpected Expectations: The Curiosities of a Mathematical Crystal Ball explores how paradoxical challenges involving mathematical expectation often necessitate a reexamination of basic premises. The author takes you through mathematical paradoxes associated with seemingly straightforward applications of mathematical expectation and shows how these unexpected contradictions may push you to reconsider the legitimacy of the applications. The book requires only an understanding of basic algebraic operations and includes supplemental mathematical background in chapter appendices. After a history o

  16. Introduction and background

    International Nuclear Information System (INIS)

    Kittel, J.H.

    1989-01-01

    Near-surface land disposal of low-level and intermediate-level radioactive wastes has been practiced since the early 1940's. Near-surface disposal is the terminal emplacement of radioactive wastes in facilities that are on or near the earth's surface. The maximum depth to disposal facilities that are below grade is typically 30 meters or less. Near-surface disposal facilities can be broadly classed as unlined earthen trenches covered typically with one to three meters of soil or clay (shallow land burial), earth-covered tumuli, above-ground vaults, below-ground vaults, some abandoned mines and rock cavities, modular concrete canisters, or augered shafts. Increasing concerns by regulatory agencies, environmental groups, and the general public are being raised over satisfying performance objectives, particularly in shallow land burial and discharge of liquids to storage ponds. This has led to increased attention by both the technical community and public interest groups to the issues that apply to the selection of the technology for near-surface disposal of radioactive wastes. Various disposal methods are examined, focusing on performance objectives and current practices and trends

  17. Maximum entropy deconvolution of low count nuclear medicine images

    International Nuclear Information System (INIS)

    McGrath, D.M.

    1998-12-01

    Maximum entropy is applied to the problem of deconvolving nuclear medicine images, with special consideration for very low count data. The physics of the formation of scintigraphic images is described, illustrating the phenomena which degrade planar estimates of the tracer distribution. Various techniques which are used to restore these images are reviewed, outlining the relative merits of each. The development and theoretical justification of maximum entropy as an image processing technique is discussed. Maximum entropy is then applied to the problem of planar deconvolution, highlighting the question of the choice of error parameters for low count data. A novel iterative version of the algorithm is suggested which allows the errors to be estimated from the predicted Poisson mean values. This method is shown to produce the exact results predicted by combining Poisson statistics and a Bayesian interpretation of the maximum entropy approach. A facility for total count preservation has also been incorporated, leading to improved quantification. In order to evaluate this iterative maximum entropy technique, two comparable methods, Wiener filtering and a novel Bayesian maximum likelihood expectation maximisation technique, were implemented. The comparison of results obtained indicated that this maximum entropy approach may produce equivalent or better measures of image quality than the compared methods, depending upon the accuracy of the system model used. The novel Bayesian maximum likelihood expectation maximisation technique was shown to be preferable over many existing maximum a posteriori methods due to its simplicity of implementation. A single parameter is required to define the Bayesian prior, which suppresses noise in the solution and may reduce the processing time substantially. Finally, maximum entropy deconvolution was applied as a pre-processing step in single photon emission computed tomography reconstruction of low count data. Higher contrast results were
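
    For reference, the classical maximum entropy restoration problem that this kind of work builds on can be written as maximizing the image entropy subject to a chi-squared consistency constraint. The notation below is assumed for illustration (f the restored image, m a default model, A the system response, d the data), and the last relation only echoes the abstract's idea of taking the error variances from the predicted Poisson means; it is not presented as the thesis's exact formulation.

    $$ S(f) = -\sum_j f_j \ln\frac{f_j}{m_j}, \qquad \chi^2(f) = \sum_i \frac{\big(d_i - \sum_j A_{ij} f_j\big)^2}{\sigma_i^2}, $$

    $$ \hat f = \arg\max_{f \ge 0}\; \big[\, S(f) - \lambda\, \chi^2(f) \,\big], \qquad \sigma_i^2 \simeq \sum_j A_{ij} \hat f_j \ \ \text{(Poisson)} . $$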

  18. Last Glacial Maximum Salinity Reconstruction

    Science.gov (United States)

    Homola, K.; Spivack, A. J.

    2016-12-01

    It has been previously demonstrated that salinity can be reconstructed from sediment porewater. The goal of our study is to reconstruct high precision salinity during the Last Glacial Maximum (LGM). Salinity is usually determined at high precision via conductivity, which requires a larger volume of water than can be extracted from a sediment core, or via chloride titration, which yields lower than ideal precision. It has been demonstrated for water column samples that high precision density measurements can be used to determine salinity at the precision of a conductivity measurement using the equation of state of seawater. However, water column seawater has a relatively constant composition, in contrast to porewater, where variations from standard seawater composition occur. These deviations, which affect the equation of state, must be corrected for through precise measurements of each ion's concentration and knowledge of apparent partial molar density in seawater. We have developed a density-based method for determining porewater salinity that requires only 5 mL of sample, achieving density precisions of 10-6 g/mL. We have applied this method to porewater samples extracted from long cores collected along a N-S transect across the western North Atlantic (R/V Knorr cruise KN223). Density was determined to a precision of 2.3x10-6 g/mL, which translates to salinity uncertainty of 0.002 gms/kg if the effect of differences in composition is well constrained. Concentrations of anions (Cl-, and SO4-2) and cations (Na+, Mg+, Ca+2, and K+) were measured. To correct salinities at the precision required to unravel LGM Meridional Overturning Circulation, our ion precisions must be better than 0.1% for SO4-/Cl- and Mg+/Na+, and 0.4% for Ca+/Na+, and K+/Na+. Alkalinity, pH and Dissolved Inorganic Carbon of the porewater were determined to precisions better than 4% when ratioed to Cl-, and used to calculate HCO3-, and CO3-2. Apparent partial molar densities in seawater were
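
    A rough back-of-the-envelope check of how density precision maps to salinity precision, under an assumed linear haline sensitivity of seawater density (about 0.78 kg m-3 per g/kg); this is only an order-of-magnitude illustration, not the full equation-of-state treatment with composition corrections described above.

```python
# Illustrative numbers, not the paper's calculation: a density precision of
# a few 1e-3 kg/m^3 (i.e. ~1e-6 g/mL) maps to a salinity precision of a few
# thousandths of a g/kg, the same order as the precision quoted above.
drho_dS = 0.78            # assumed haline sensitivity, kg m^-3 per (g/kg)
rho_precision = 2.3e-3    # kg m^-3, i.e. 2.3e-6 g/mL as quoted above
print(rho_precision / drho_dS)   # ~0.003 g/kg
```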

  19. Heterogeneous inflation expectations and learning

    OpenAIRE

    Madeira, Carlos; Zafar, Basit

    2012-01-01

    Using the panel component of the Michigan Survey of Consumers, we estimate a learning model of inflation expectations, allowing for heterogeneous use of both private information and lifetime inflation experience. “Life-experience inflation” has a significant impact on individual expectations, but only for one-year-ahead inflation. Public information is substantially more relevant for longer-horizon expectations. Even controlling for life-experience inflation and public information, idiosyncra...

  20. Low background infrared (LBIR) facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Low background infrared (LBIR) facility was originally designed to calibrate user supplied blackbody sources and to characterize low-background IR detectors and...

  1. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

    Volume two contains the following appendices: Description of soil sampling sites; sampling narrative; raw data soil background; background data analysis; sitewide background soil sampling plan; and use of soil background data for the detection of contamination at waste management unit on the Hanford Site

  2. Expectations on Track? High School Tracking and Adolescent Educational Expectations

    DEFF Research Database (Denmark)

    Karlson, Kristian Bernt

    2015-01-01

    This paper examines the role of adaptation in expectation formation processes by analyzing how educational tracking in high schools affects adolescents' educational expectations. I argue that adolescents view track placement as a signal about their academic abilities and respond to it in terms...... of modifying their educational expectations. Applying a difference-in-differences approach to the National Educational Longitudinal Study of 1988, I find that being placed in an advanced or honors class in high school positively affects adolescents’ expectations, particularly if placement is consistent across...... subjects and if placement contradicts tracking experiences in middle school. My findings support the hypothesis that adolescents adapt their educational expectations to ability signals sent by schools....

  3. Immigrant Students' Educational Expectations: The Role of Religious Affiliation and Practice

    Science.gov (United States)

    Hemmerechts, Kenneth; Kavadias, Dimokritos; Agirdag, Orhan

    2018-01-01

    A body of scholarly work has emerged on educational expectations. More recently, the relationship between educational expectations and immigrant background in Western Europe has been investigated. Although the results of this type of inquiry show that students with an immigrant background tend to have higher educational expectations, potential…

  4. Concerning background from calorimeter ports

    International Nuclear Information System (INIS)

    Digiacomo, N.J.

    1985-01-01

    Any detector system viewing a port or slit in a calorimeter wall will see, in addition to the primary particles of interest, a background of charged and neutral particles and photons generated by scattering from the port walls and by leakage from incompletely contained primary particle showers in the calorimeter near the port. The signal to noise ratio attainable outside the port is a complex function of the primary source spectrum, the calorimeter and port design and, of course, the nature and acceptance of the detector system that views the port. Rather than making general statements about the overall suitability (or lack thereof) of calorimeter ports, we offer here a specific example based on the external spectrometer and slit of the NA34 experiment. This combination of slit and spectrometer is designed for fixed-target work, so that the primary particle momentum spectrum contains higher momentum particles than expected in a heavy ion colliding beam environment. The results are, nevertheless, quite relevant for the collider case

  5. Northern pipelines : backgrounder

    International Nuclear Information System (INIS)

    2002-04-01

    Most analysts agree that demand for natural gas in North America will continue to grow. Favourable market conditions created by rising demand and declining production have sparked renewed interest in northern natural gas development. The 2002 Annual Energy Outlook forecasted U.S. consumption to increase at an annual average rate of 2 per cent from 22.8 trillion cubic feet to 33.8 TCF by 2020, mostly due to rapid growth in demand for electric power generation. Natural gas prices are also expected to increase at an annual average rate of 1.6 per cent, reaching $3.26 per thousand cubic feet in 2020. There are currently 3 proposals for pipelines to move northern gas to US markets. They include a stand-alone Mackenzie Delta Project, the Alaska Highway Pipeline Project, and an offshore route that would combine Alaskan and Canadian gas in a pipeline across the floor of the Beaufort Sea. Current market conditions and demand suggest that the projects are not mutually exclusive, but complementary. The factors that differentiate northern pipeline proposals are reserves, preparedness for market, costs, engineering, and environmental differences. Canada has affirmed its role to provide the regulatory and fiscal certainty needed by industry to make investment decisions. The Government of the Yukon does not believe that the Alaska Highway Project will shut in Mackenzie Delta gas, but will instead pave the way for development of a new northern natural gas industry. The Alaska Highway Pipeline Project will bring significant benefits for the Yukon, the Northwest Territories and the rest of Canada. Unresolved land claims are among the challenges that have to be addressed for both Yukon and the Northwest Territories, as the proposed Alaska Highway Pipeline will travel through traditional territories of several Yukon First Nations. 1 tab., 4 figs

  6. Iterative estimation of the background in noisy spectroscopic data

    International Nuclear Information System (INIS)

    Zhu, M.H.; Liu, L.G.; Cheng, Y.S.; Dong, T.K.; You, Z.; Xu, A.A.

    2009-01-01

    In this paper, we present an iterative filtering method to estimate the background of noisy spectroscopic data. The proposed method avoids the calculation of the average full width at half maximum (FWHM) of the whole spectrum and the peak regions, and it can estimate the background efficiently, especially for spectroscopic data with the Compton continuum.
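
    The sketch below shows a generic iterative smoothing-and-clipping background estimator in the same spirit, in which peaks are progressively peeled off a slowly varying continuum; it is not the authors' specific filter, and the window width and iteration count are assumed values.

```python
import numpy as np

def iterative_background(spectrum, width=15, n_iter=30):
    """Generic iterative background estimate for a 1-D spectrum: repeatedly
    smooth and clip the data to the running estimate so that peaks are
    progressively removed and only the slowly varying continuum remains.
    Illustrative sketch, not the filter proposed in the paper above."""
    y = np.asarray(spectrum, dtype=float).copy()
    kernel = np.ones(width) / width
    for _ in range(n_iter):
        smooth = np.convolve(y, kernel, mode="same")
        y = np.minimum(y, smooth)   # clip peaks down to the smoothed level
    return y
```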

  7. MEGA: A Low-Background Radiation Detector

    International Nuclear Information System (INIS)

    Kazkaz, Kareem; Aalseth, Craig E.; Hossbach, Todd W.; Gehman, Victor M.; Kephart, Jeremy; Miley, Harry S.

    2004-01-01

    The multiple-element gamma assay (MEGA) is a low-background detector designed to support environmental monitoring and national security applications. MEGA also demonstrates technology needed for Majorana, a next-generation neutrino mass experiment, and will exploit multicoincidence signatures to identify specific radioactive isotopes. MEGA is expected to begin testing in late 2003 for eventual installation at the Waste Isolation Pilot Plant, Carlsbad, NM.

  8. Comparative Study of the Influence of the Home Background on ...

    African Journals Online (AJOL)

    Administrator

    between parental involvement and academic achievement of children. It was found ... was a three- page questionnaire titled “Students' Home. Background on .... higher educational expectations, enrolment in gifted and talented programs, and.

  9. Note on bouncing backgrounds

    Science.gov (United States)

    de Haro, Jaume; Pan, Supriya

    2018-05-01

    The theory of inflation is one of the fundamental and revolutionary developments of modern cosmology that became able to explain many issues of the early Universe in the context of the standard cosmological model (SCM). However, the initial singularity of the Universe, where physics is indefinite, is still obscure in the combined SCM +inflation scenario. An alternative to SCM +inflation without the initial singularity is thus always welcome, and bouncing cosmology is an attempt at that. The current work is thus motivated to investigate the bouncing solutions in modified gravity theories when the background universe is described by the spatially flat Friedmann-Lemaître-Robertson-Walker (FLRW) geometry. We show that the simplest way to obtain the bouncing cosmologies in such spacetime is to consider some kind of Lagrangian whose gravitational sector depends only on the square of the Hubble parameter of the FLRW universe. For these modified Lagrangians, the corresponding Friedmann equation, a constraint in the dynamics of the Universe, depicts a curve in the phase space (H ,ρ ), where H is the Hubble parameter and ρ is the energy density of the Universe. As a consequence, a bouncing cosmology is obtained when this curve is closed and crosses the axis H =0 at least twice, and whose simplest particular example is the ellipse depicting the well-known holonomy corrected Friedmann equation in loop quantum cosmology (LQC). Sometimes, a crucial point in such theories is the appearance of the Ostrogradski instability at the perturbative level; however, fortunately enough, in the present work, as long as the linear level of perturbations is concerned, this instability does not appear, although it may appear at the higher order of perturbations.
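
    For concreteness, the holonomy-corrected Friedmann equation of loop quantum cosmology mentioned above can be written (with rho_c the critical density) as

    $$ H^{2} \;=\; \frac{8\pi G}{3}\,\rho\left(1-\frac{\rho}{\rho_{c}}\right) \quad\Longleftrightarrow\quad \frac{H^{2}}{2\pi G\rho_{c}/3} \;+\; \frac{\left(\rho-\rho_{c}/2\right)^{2}}{\left(\rho_{c}/2\right)^{2}} \;=\; 1 , $$

    which indeed traces a closed curve (an ellipse) in the (H, rho) phase space, crossing H = 0 at rho = 0 and at the bounce rho = rho_c.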

  10. Improved radiological/nuclear source localization in variable NORM background: An MLEM approach with segmentation data

    Energy Technology Data Exchange (ETDEWEB)

    Penny, Robert D., E-mail: robert.d.penny@leidos.com [Leidos Inc., 10260 Campus Point Road, San Diego, CA (United States); Crowley, Tanya M.; Gardner, Barbara M.; Mandell, Myron J.; Guo, Yanlin; Haas, Eric B.; Knize, Duane J.; Kuharski, Robert A.; Ranta, Dale; Shyffer, Ryan [Leidos Inc., 10260 Campus Point Road, San Diego, CA (United States); Labov, Simon; Nelson, Karl; Seilhan, Brandon [Lawrence Livermore National Laboratory, Livermore, CA (United States); Valentine, John D. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2015-06-01

    A novel approach and algorithm have been developed to rapidly detect and localize both moving and static radiological/nuclear (R/N) sources from an airborne platform. Current aerial systems with radiological sensors are limited in their ability to compensate for variable naturally occurring radioactive material (NORM) background. The proposed approach suppresses the effects of NORM background by incorporating additional information to segment the survey area into regions over which the background is likely to be uniform. The method produces pixelated Source Activity Maps (SAMs) of both target and background radionuclide activity over the survey area. The task of producing the SAMs requires (1) the development of a forward model which describes the transformation of radionuclide activity to detector measurements and (2) the solution of the associated inverse problem. The inverse problem is ill-posed as there are typically fewer measurements than unknowns. In addition the measurements are subject to Poisson statistical noise. The Maximum-Likelihood Expectation-Maximization (MLEM) algorithm is used to solve the inverse problem as it is well suited for under-determined problems corrupted by Poisson noise. A priori terrain information is incorporated to segment the reconstruction space into regions within which we constrain NORM background activity to be uniform. Descriptions of the algorithm and examples of performance with and without segmentation on simulated data are presented.
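
    A minimal sketch of the MLEM iteration for the Poisson inverse problem described above; the optional region-averaging step is only a crude stand-in for the paper's segmentation constraint, and the function name, defaults and data layout are assumptions.

```python
import numpy as np

def mlem(A, y, n_iter=100, segments=None, eps=1e-12):
    """Maximum-Likelihood Expectation-Maximization for a Poisson linear model
    y ~ Poisson(A @ lam). Optionally, 'segments' assigns each pixel to a
    region index within which activity is forced to be uniform, a crude
    stand-in for the NORM segmentation constraint described above."""
    m, n = A.shape
    lam = np.ones(n)                         # initial activity estimate
    sens = A.sum(axis=0) + eps               # sensitivity (back-projected ones)
    for _ in range(n_iter):
        proj = A @ lam + eps                 # expected counts for current estimate
        lam *= (A.T @ (y / proj)) / sens     # multiplicative MLEM update
        if segments is not None:
            for s in np.unique(segments):    # enforce uniform activity per region
                lam[segments == s] = lam[segments == s].mean()
    return lam
```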

  11. Expectation-based intelligent control

    International Nuclear Information System (INIS)

    Zak, Michail

    2006-01-01

    New dynamics paradigms-negative diffusion and terminal attractors-are introduced to control noise and chaos. The applied control forces are composed of expectations governed by the associated Fokker-Planck and Liouville equations. The approach is expanded to a general concept of intelligent control via expectations. Relevance to control in livings is emphasized and illustrated by neural nets with mirror neurons

  12. Decomposing change in life expectancy

    DEFF Research Database (Denmark)

    Vaupel, James W.; Canudas Romo, Vladimir

    2003-01-01

    We extend Nathan Keyfitz's research on continuous change in life expectancy over time by presenting and proving a new formula for decomposing such change. The formula separates change in life expectancy over time into two terms. The first term captures the general effect of reduction in death rates...... in Sweden and Japan....

  13. Sibling Status Effects: Adult Expectations.

    Science.gov (United States)

    Baskett, Linda Musun

    1985-01-01

    This study attempted to determine what expectations or beliefs adults might hold about a child based on his or her sibling status alone. Ratings on 50 adjective pairs for each of three sibling status types, only, oldest, and youngest child, were assessed in relation to adult expectations, birth order, and parental status of rater. (Author/DST)

  14. Brain systems underlying encounter expectancy bias in spider phobia.

    Science.gov (United States)

    Aue, Tatjana; Hoeppli, Marie-Eve; Piguet, Camille; Hofstetter, Christoph; Rieger, Sebastian W; Vuilleumier, Patrik

    2015-06-01

    Spider-phobic individuals are characterized by exaggerated expectancies to be faced with spiders (so-called encounter expectancy bias). Whereas phobic responses have been linked to brain systems mediating fear, little is known about how the recruitment of these systems relates to exaggerated expectancies of threat. We used fMRI to examine spider-phobic and control participants while they imagined visiting different locations in a forest after having received background information about the likelihood of encountering different animals (spiders, snakes, and birds) at these locations. Critically, imagined encounter expectancies modulated brain responses differently in phobics as compared with controls. Phobics displayed stronger negative modulation of activity in the lateral prefrontal cortex, precuneus, and visual cortex by encounter expectancies for spiders, relative to snakes or birds (within-participants analysis); these effects were not seen in controls. Between-participants correlation analyses within the phobic group further corroborated the hypothesis that these phobia-specific modulations may underlie irrationality in encounter expectancies (deviations of encounter expectancies from objective background information) in spider phobia; the greater the negative modulation a phobic participant displayed in the lateral prefrontal cortex, precuneus, and visual cortex, the stronger was her bias in encounter expectancies for spiders. Interestingly, irrationality in expectancies reflected in frontal areas relied on right rather than left hemispheric deactivations. Our data accord with the idea that expectancy biases in spider phobia may reflect deficiencies in cognitive control and contextual integration that are mediated by right frontal and parietal areas.

  15. Mass mortality of the vermetid gastropod Ceraesignum maximum

    Science.gov (United States)

    Brown, A. L.; Frazer, T. K.; Shima, J. S.; Osenberg, C. W.

    2016-09-01

    Ceraesignum maximum (G.B. Sowerby I, 1825), formerly Dendropoma maximum, was subject to a sudden, massive die-off in the Society Islands, French Polynesia, in 2015. On Mo'orea, where we have detailed documentation of the die-off, these gastropods were previously found in densities up to 165 m-2. In July 2015, we surveyed shallow back reefs of Mo'orea before, during and after the die-off, documenting their swift decline. All censused populations incurred 100% mortality. Additional surveys and observations from Mo'orea, Tahiti, Bora Bora, and Huahine (but not Taha'a) suggested a similar, and approximately simultaneous, die-off. The cause(s) of this cataclysmic mass mortality are currently unknown. Given the previously documented negative effects of C. maximum on corals, we expect the die-off will have cascading effects on the reef community.

  16. Two-dimensional maximum entropy image restoration

    International Nuclear Information System (INIS)

    Brolley, J.E.; Lazarus, R.B.; Suydam, B.R.; Trussell, H.J.

    1977-07-01

    An optical check problem was constructed to test p log p maximum entropy restoration of an extremely distorted image. Useful recovery of the original image was obtained. Comparison with maximum a posteriori restoration is made. 7 figures

  17. What controls the maximum magnitude of injection-induced earthquakes?

    Science.gov (United States)

    Eaton, D. W. S.

    2017-12-01

    Three different approaches for estimation of maximum magnitude are considered here, along with their implications for managing risk. The first approach is based on a deterministic limit for seismic moment proposed by McGarr (1976), which was originally designed for application to mining-induced seismicity. This approach has since been reformulated for earthquakes induced by fluid injection (McGarr, 2014). In essence, this method assumes that the upper limit for seismic moment release is constrained by the pressure-induced stress change. A deterministic limit is given by the product of shear modulus and the net injected fluid volume. This method is based on the assumptions that the medium is fully saturated and in a state of incipient failure. An alternative geometrical approach was proposed by Shapiro et al. (2011), who postulated that the rupture area for an induced earthquake falls entirely within the stimulated volume. This assumption reduces the maximum-magnitude problem to one of estimating the largest potential slip surface area within a given stimulated volume. Finally, van der Elst et al. (2016) proposed that the maximum observed magnitude, statistically speaking, is the expected maximum value for a finite sample drawn from an unbounded Gutenberg-Richter distribution. These three models imply different approaches for risk management. The deterministic method proposed by McGarr (2014) implies that a ceiling on the maximum magnitude can be imposed by limiting the net injected volume, whereas the approach developed by Shapiro et al. (2011) implies that the time-dependent maximum magnitude is governed by the spatial size of the microseismic event cloud. Finally, the sample-size hypothesis of Van der Elst et al. (2016) implies that the best available estimate of the maximum magnitude is based upon observed seismicity rate. The latter two approaches suggest that real-time monitoring is essential for effective management of risk. A reliable estimate of maximum
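
    As a worked example of the first (deterministic) approach, the sketch below converts the McGarr-type bound M0 <= G * dV into a moment magnitude via the standard Hanks-Kanamori relation; the shear modulus and injected volume are assumed illustrative numbers, not values from the abstract.

```python
import math

def mcgarr_max_magnitude(injected_volume_m3, shear_modulus_pa=3.0e10):
    """Deterministic upper bound in the spirit of McGarr (2014): maximum
    seismic moment M0 <= G * dV (shear modulus times net injected volume),
    converted to moment magnitude with Mw = (2/3) * (log10(M0) - 9.1).
    The shear modulus here is a typical assumed crustal value."""
    m0_max = shear_modulus_pa * injected_volume_m3   # N*m
    return (2.0 / 3.0) * (math.log10(m0_max) - 9.1)

# e.g. 1e5 m^3 of net injection -> about Mw 4.25 under these assumptions
print(round(mcgarr_max_magnitude(1.0e5), 2))
```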

  18. Testing exact rational expectations in cointegrated vector autoregressive models

    DEFF Research Database (Denmark)

    Johansen, Søren; Swensen, Anders Rygh

    1999-01-01

    This paper considers the testing of restrictions implied by rational expectations hypotheses in a cointegrated vector autoregressive model for I(1) variables. If the rational expectations involve one-step-ahead observations only and the coefficients are known, an explicit parameterization...... of the restrictions is found, and the maximum-likelihood estimator is derived by regression and reduced rank regression. An application is given to a present value model....

  19. Background modeling for the GERDA experiment

    Science.gov (United States)

    Becerici-Schmidt, N.; Gerda Collaboration

    2013-08-01

    The neutrinoless double beta (0νββ) decay experiment GERDA at the LNGS of INFN has started physics data taking in November 2011. This paper presents an analysis aimed at understanding and modeling the observed background energy spectrum, which plays an essential role in searches for a rare signal like 0νββ decay. A very promising preliminary model has been obtained, with the systematic uncertainties still under study. Important information can be deduced from the model such as the expected background and its decomposition in the signal region. According to the model the main background contributions around Qββ come from 214Bi, 228Th, 42K, 60Co and α emitting isotopes in the 226Ra decay chain, with a fraction depending on the assumed source positions.

  20. Background modeling for the GERDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Becerici-Schmidt, N. [Max-Planck-Institut für Physik, München (Germany); Collaboration: GERDA Collaboration

    2013-08-08

    The neutrinoless double beta (0νββ) decay experiment GERDA at the LNGS of INFN has started physics data taking in November 2011. This paper presents an analysis aimed at understanding and modeling the observed background energy spectrum, which plays an essential role in searches for a rare signal like 0νββ decay. A very promising preliminary model has been obtained, with the systematic uncertainties still under study. Important information can be deduced from the model such as the expected background and its decomposition in the signal region. According to the model the main background contributions around Q{sub ββ} come from {sup 214}Bi, {sup 228}Th, {sup 42}K, {sup 60}Co and α emitting isotopes in the {sup 226}Ra decay chain, with a fraction depending on the assumed source positions.

  1. Estimated radiological doses to the maximumly exposed individual and downstream populations from releases of tritium, strontium-90, ruthenium-106, and cesium-137 from White Oak Dam

    International Nuclear Information System (INIS)

    Little, C.A.; Cotter, S.J.

    1980-01-01

    Concentrations of tritium, 90 Sr, 106 Ru, and 137 Cs in the Clinch River for 1978 were estimated by using the known 1978 releases of these nuclides from the White Oak Dam and diluting them by the integrated annual flow rate of the Clinch River. Estimates of the 50-year dose commitment to a maximumly exposed individual were calculated for both aquatic and terrestrial pathways of exposure. The maximumly exposed individual was assumed to reside at the mouth of White Oak Creek where it enters the Clinch River and to obtain all foodstuffs and drinking water at that location. The estimated total-body dose from all pathways to the maximumly exposed individual as a result of 1978 releases was less than 1% of the dose expected from natural background. Using appropriate concentrations of the subject radionuclides diluted downstream, the doses to populations residing at Harriman, Kingston, Rockwood, Spring City, Soddy-Daisy, and Chattanooga were calculated for aquatic exposure pathways. The total-body dose estimated for aquatic pathways for the six cities was about 0.0002 times the expected dose from natural background. For the pathways considered in this report, the nuclide which contributed the largest fraction of the dose was 90 Sr; the largest dose delivered by 90 Sr was to the bone of the subject individual or community.
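
    A minimal illustration of the dilution step described above, with made-up numbers rather than the 1978 release and flow data:

```python
# Hypothetical illustration: the average river concentration is the annual
# activity released at the dam divided by the integrated annual river flow.
release_ci_per_yr = 500.0        # assumed annual release of one nuclide, Ci
annual_flow_l = 4.0e12           # assumed integrated annual river flow, litres
concentration = release_ci_per_yr / annual_flow_l   # Ci per litre
print(f"{concentration:.2e} Ci/L")
```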

  2. Neural correlates of rhythmic expectancy

    Directory of Open Access Journals (Sweden)

    Theodore P. Zanto

    2006-01-01

    Full Text Available Temporal expectancy is thought to play a fundamental role in the perception of rhythm. This review summarizes recent studies that investigated rhythmic expectancy by recording neuroelectric activity with high temporal resolution during the presentation of rhythmic patterns. Prior event-related brain potential (ERP) studies have uncovered auditory evoked responses that reflect detection of onsets, offsets, sustains, and abrupt changes in acoustic properties such as frequency, intensity, and spectrum, in addition to indexing higher-order processes such as auditory sensory memory and the violation of expectancy. In our studies of rhythmic expectancy, we measured emitted responses - a type of ERP that occurs when an expected event is omitted from a regular series of stimulus events - in simple rhythms with temporal structures typical of music. Our observations suggest that middle-latency gamma band (20-60 Hz) activity (GBA) plays an essential role in auditory rhythm processing. Evoked (phase-locked) GBA occurs in the presence of physically presented auditory events and reflects the degree of accent. Induced (non-phase-locked) GBA reflects temporally precise expectancies for strongly and weakly accented events in sound patterns. Thus far, these findings support theories of rhythm perception that posit temporal expectancies generated by active neural processes.

  3. Receiver function estimated by maximum entropy deconvolution

    Institute of Scientific and Technical Information of China (English)

    吴庆举; 田小波; 张乃铃; 李卫平; 曾融生

    2003-01-01

    Maximum entropy deconvolution is presented to estimate receiver functions, with maximum entropy as the rule for determining the auto-correlation and cross-correlation functions. The Toeplitz equations and the Levinson algorithm are used to calculate the iterative formula of the error-prediction filter, from which the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps the maximum entropy deconvolution stable. Maximizing the entropy of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method for estimating receiver functions in the time domain.
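
    A generic textbook implementation of the Levinson-Durbin recursion for the Toeplitz equations mentioned above is sketched below (not the authors' code); note how the error update keeps the prediction error positive as long as the reflection coefficient stays below 1 in magnitude.

```python
import numpy as np

def levinson_durbin(r, order):
    """Solve the Toeplitz normal equations for a prediction-error filter,
    given the autocorrelation sequence r[0..order]. Returns the filter a
    (with a[0] = 1), the reflection coefficients and the final error."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    refl = np.zeros(order)
    for k in range(1, order + 1):
        acc = r[k] + np.dot(a[1:k], r[k - 1:0:-1])
        refl[k - 1] = -acc / err
        a_prev = a.copy()
        for i in range(1, k + 1):
            a[i] = a_prev[i] + refl[k - 1] * a_prev[k - i]
        err *= (1.0 - refl[k - 1] ** 2)   # stays positive while |refl| < 1
    return a, refl, err
```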

  4. D4.1 Learning analytics: theoretical background, methodology and expected results

    NARCIS (Netherlands)

    Tammets, Kairit; Laanpere, Mart; Eradze, Maka; Brouns, Francis; Padrón-Nápoles, Carmen; De Rosa, Rosanna; Ferrari, Chiara

    2014-01-01

    The purpose of the EMMA project is to showcase excellence in innovative teaching methodologies and learning approaches through the large-scale piloting of MOOCs on different subjects. The main objectives related with the implementation of learning analytics in EMMa project are to: ● develop the

  5. Maximum Power from a Solar Panel

    Directory of Open Access Journals (Sweden)

    Michael Miller

    2010-01-01

    Full Text Available Solar energy has become a promising alternative to conventional fossil fuel sources. Solar panels are used to collect solar radiation and convert it into electricity. One of the techniques used to maximize the effectiveness of this energy alternative is to maximize the power output of the solar collector. In this project the maximum power is calculated by determining the voltage and the current of maximum power. These quantities are determined by finding the maximum value for the equation for power using differentiation. After the maximum values are found for each time of day, each individual quantity, voltage of maximum power, current of maximum power, and maximum power is plotted as a function of the time of day.
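
    A small numerical counterpart of the procedure described above, using an assumed single-diode panel model instead of measured data; all parameter values and the function name are illustrative assumptions.

```python
import numpy as np

def max_power_point(i_l=5.0, i_0=1e-9, n=1.3, v_t=0.02585, n_cells=36):
    """Locate the maximum power point of a single-diode solar panel model
    I(V) = I_L - I_0*(exp(V/(n*N*V_T)) - 1). Parameter values are assumed
    for illustration; the project above works from measured data instead."""
    v = np.linspace(0.0, 30.0, 30001)
    i = i_l - i_0 * (np.exp(v / (n * n_cells * v_t)) - 1.0)
    p = v * i
    k = np.argmax(p)                 # where dP/dV changes sign
    return v[k], i[k], p[k]

v_mp, i_mp, p_mp = max_power_point()
print(f"V_mp = {v_mp:.2f} V, I_mp = {i_mp:.2f} A, P_mp = {p_mp:.1f} W")
```

    Scanning the power curve and taking its maximum is equivalent, on a fine grid, to locating the zero of dP/dV obtained by differentiation.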

  6. Executive Summary - Historical background

    International Nuclear Information System (INIS)

    2005-01-01

    matter physics experiments at the High Flux Reactor of The Laue Langevin Institute and the ISIS spallation source at Rutherford-Appleton. Recently, we very actively entered the ICARUS neutrino collaboration and were invited to the PIERRE AUGER collaboration which will search for the highest energies in the Universe. Having close ties with CERN we are very actively engaged in CROSS-GRID, a large computer network project. To better understand the historical background of the INP development, it is necessary to add a few comments on financing of science in Poland. During the 70's and the 80's, research was financed through the so-called Central Research Projects for Science and Technical Development. The advantage of this system was that state-allocated research funds were divided only by a few representatives of the scientific community, which allowed realistic allocation of money to a small number of projects. After 1989 we were able to purchase commercially available equipment, which led to the closure of our large and very experienced electronic workshop. We also considerably reduced our well equipped mechanical shop. During the 90's the reduced state financing of science was accompanied by a newly established Committee of Scientific Research which led to the creation of a system of small research projects. This precluded the development of more ambitious research projects and led to the dispersion of equipment among many smaller laboratories and universities. A large research establishment, such as our Institute, could not develop properly under such conditions. In all, between 1989 and 2004 we reduced our personnel from about 800 to 470 and our infrastructure became seriously undercapitalised. However, with energetic search for research funds, from European rather than national research programs, we hope to improve and modernize our laboratories and their infrastructure in the coming years

  7. Reliability and concurrent validity of the Dutch hip and knee replacement expectations surveys

    NARCIS (Netherlands)

    van den Akker-Scheek, Inge; van Raay, Jos J. A. M.; Reininga, Inge H. F.; Bulstra, Sjoerd K.; Zijlstra, Wiebren; Stevens, Martin

    2010-01-01

    Background: Preoperative expectations of outcome of total hip and knee arthroplasty are important determinants of patients' satisfaction and functional outcome. Aims of the study were (1) to translate the Hospital for Special Surgery Hip Replacement Expectations Survey and Knee Replacement

  8. Dialysis centers - what to expect

    Science.gov (United States)

    ... kidneys - dialysis centers; Dialysis - what to expect; Renal replacement therapy - dialysis centers; End-stage renal disease - dialysis ... to a tube that connects to the dialysis machine. Your blood will flow through the tube, into ...

  9. Life expectancy in bipolar disorder

    DEFF Research Database (Denmark)

    Kessing, Lars Vedel; Vradi, Eleni; Andersen, Per Kragh

    2015-01-01

    OBJECTIVE: Life expectancy in patients with bipolar disorder has been reported to be decreased by 11 to 20 years. These calculations are based on data for individuals at the age of 15 years. However, this may be misleading for patients with bipolar disorder in general as most patients have a later...... onset of illness. The aim of the present study was to calculate the remaining life expectancy for patients of different ages with a diagnosis of bipolar disorder. METHODS: Using nationwide registers of all inpatient and outpatient contacts to all psychiatric hospitals in Denmark from 1970 to 2012 we...... remaining life expectancy in bipolar disorder and that of the general population decreased with age, indicating that patients with bipolar disorder start losing life-years during early and mid-adulthood. CONCLUSIONS: Life expectancy in bipolar disorder is decreased substantially, but less so than previously...

  10. FastStats: Life Expectancy

    Science.gov (United States)

    Life expectancy at birth, at 65, and at 75 years of age, by sex, race, and Hispanic origin. Health, United States, 2016, table 15 [PDF – 9.8 MB].

  11. Physical activity extends life expectancy

    Science.gov (United States)

    Leisure-time physical activity is associated with longer life expectancy, even at relatively low levels of activity and regardless of body weight, according to a study by a team of researchers led by the NCI.

  12. Subjective expected utility without preferences

    OpenAIRE

    Bouyssou , Denis; Marchant , Thierry

    2011-01-01

    This paper proposes a theory of subjective expected utility based on primitives only involving the fact that an act can be judged either "attractive" or "unattractive". We give conditions implying that there are a utility function on the set of consequences and a probability distribution on the set of states such that attractive acts have a subjective expected utility above some threshold. The numerical representation that is obtained has strong uniqueness properties.
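
    Schematically, the kind of representation described above can be written (with symbols assumed here: S the set of states, p the subjective probability, u the utility on consequences and theta the threshold) as

    $$ a \ \text{is judged attractive} \;\iff\; \sum_{s\in S} p(s)\,u\big(a(s)\big) \;\ge\; \theta . $$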

  13. A maximum likelihood framework for protein design

    Directory of Open Access Journals (Sweden)

    Philippe Hervé

    2006-06-01

    Full Text Available Abstract Background The aim of protein design is to predict amino-acid sequences compatible with a given target structure. Traditionally envisioned as a purely thermodynamic question, this problem can also be understood in a wider context, where additional constraints are captured by learning the sequence patterns displayed by natural proteins of known conformation. In this latter perspective, however, we still need a theoretical formalization of the question, leading to general and efficient learning methods, and allowing for the selection of fast and accurate objective functions quantifying sequence/structure compatibility. Results We propose a formulation of the protein design problem in terms of model-based statistical inference. Our framework uses the maximum likelihood principle to optimize the unknown parameters of a statistical potential, which we call an inverse potential to contrast with classical potentials used for structure prediction. We propose an implementation based on Markov chain Monte Carlo, in which the likelihood is maximized by gradient descent and is numerically estimated by thermodynamic integration. The fit of the models is evaluated by cross-validation. We apply this to a simple pairwise contact potential, supplemented with a solvent-accessibility term, and show that the resulting models have a better predictive power than currently available pairwise potentials. Furthermore, the model comparison method presented here allows one to measure the relative contribution of each component of the potential, and to choose the optimal number of accessibility classes, which turns out to be much higher than classically considered. Conclusion Altogether, this reformulation makes it possible to test a wide diversity of models, using different forms of potentials, or accounting for other factors than just the constraint of thermodynamic stability. Ultimately, such model-based statistical analyses may help to understand the forces
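
    For orientation, a maximum-likelihood fit of a Boltzmann-form inverse potential typically relies on a gradient of the familiar "observed minus expected" form; the notation below (sequence s, target conformation c, parameters theta, energy E_theta) is assumed for illustration and is not taken from the paper.

    $$ P_{\theta}(s\mid c) \;=\; \frac{e^{-E_{\theta}(s,c)}}{\sum_{s'} e^{-E_{\theta}(s',c)}}, \qquad \nabla_{\theta}\log P_{\theta}(s\mid c) \;=\; -\,\nabla_{\theta}E_{\theta}(s,c) \;+\; \mathbb{E}_{s'\sim P_{\theta}}\!\left[\nabla_{\theta}E_{\theta}(s',c)\right], $$

    with the model expectation estimated by Monte Carlo sampling, which is where the Markov chain Monte Carlo and thermodynamic-integration machinery mentioned above enters.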

  14. Simultaneous alcohol and cannabis expectancies predict simultaneous use

    Directory of Open Access Journals (Sweden)

    Earleywine Mitch

    2006-10-01

    Full Text Available Abstract Background Simultaneous use of alcohol and cannabis predicts increased negative consequences for users beyond individual or even concurrent use of the two drugs. Given the widespread use of the drugs and common simultaneous consumption, problems unique to simultaneous use may bear important implications for many substance users. Cognitive expectancies offer a template for future drug use behavior based on previous drug experiences, accurately predicting future use and problems. Studies reveal similar mechanisms underlying both alcohol and cannabis expectancies, but little research examines simultaneous expectancies for alcohol and cannabis use. Whereas research has demonstrated unique outcomes associated with simultaneous alcohol and cannabis use, this study hypothesized that unique cognitive expectancies may underlie simultaneous alcohol and cannabis use. Results: This study examined a sample of 2600 (66% male; 34% female Internet survey respondents solicited through advertisements with online cannabis-related organizations. The study employed known measures of drug use and expectancies, as well as a new measure of simultaneous drug use expectancies. Expectancies for simultaneous use of alcohol and cannabis predicted simultaneous use over and above expectancies for each drug individually. Discussion Simultaneous expectancies may provide meaningful information not available with individual drug expectancies. These findings bear potential implications on the assessment and treatment of substance abuse problems, as well as researcher conceptualizations of drug expectancies. Policies directing the treatment of substance abuse and its funding ought to give unique consideration to simultaneous drug use and its cognitive underlying factors.

  15. Radiation background with the CMS RPCs at the LHC

    CERN Document Server

    Costantini, Silvia; Cai, J.; Li, Q.; Liu, S.; Qian, S.; Wang, D.; Xu, Z.; Zhang, F.; Choi, Y.; Goh, J.; Kim, D.; Choi, S.; Hong, B.; Kang, J.W.; Kang, M.; Kwon, J.H.; Lee, K.S.; Lee, S.K.; Park, S.K.; Pant, L.M.; Mohanty, A.K.; Chudasama, R.; Singh, J.B.; Bhatnagar, V.; Mehta, A.; Kumar, R.; Cauwenbergh, S.; Cimmino, A.; Crucy, S.; Fagot, A.; Garcia, G.; Ocampo, A.; Poyraz, D.; Salva, S.; Thyssen, F.; Tytgat, M.; Zaganidis, N.; Doninck, W.V.; Cabrera, A.; Chaparro, L.; Gomez, J.P.; Gomez, B.; Sanabria, J.C.; Avila, C.; Ahmad, A.; Muhammad, S.; Shoaib, M.; Hoorani, H.; Awan, I.; Ali, I.; Ahmed, W.; Asghar, M.I.; Shahzad, H.; Sayed, A.; Ibrahim, A.; Aly, S.; Assran, Y.; Radi, A.; Elkafrawy, T.; Sharma, A.; Colafranceschi, S.; Abbrescia, M.; Calabria, C.; Colaleo, A.; Iaselli, G.; Loddo, F.; Maggi, M.; Nuzzo, S.; Pugliese, G.; Radogna, R.; Venditti, R.; Verwilligen, P.; Benussi, L.; Bianco, S.; Piccolo, D.; Paolucci, P.; Buontempo, S.; Cavallo, N.; Merola, M.; Fabozzi, F.; Iorio, O.M.; Braghieri, A.; Montagna, P.; Riccardi, C.; Salvini, P.; Vitulo, P.; Vai, I.; Magnani, A.; Dimitrov, A.; Litov, L.; Pavlov, B.; Petkov, P.; Aleksandrov, A.; Genchev, V.; Iaydjiev, P.; Rodozov, M.; Sultanov, G.; Vutova, M.; Stoykova, S.; Hadjiiska, R.; Ibargüen, H.S.; Morales, M.I.P.; Bernardino, S.C.; Bagaturia, I.; Tsamalaidze, Z.; Crotty, I.; Kim, M.S.

    2015-05-28

    The Resistive Plate Chambers (RPCs) are employed in the CMS experiment at the LHC as a dedicated trigger system, both in the barrel and in the endcap. This note presents results of the radiation background measurements performed with the 2011 and 2012 proton-proton collision data collected by CMS. Emphasis is given to the measurements of the background distribution inside the RPCs. The expected background rates during future running of the LHC are estimated both from extrapolated measurements and from simulation.
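
For readers unfamiliar with such extrapolations, a minimal sketch is shown below. It assumes, purely for illustration, that the average hit rate scales linearly with instantaneous luminosity; the fit form and all numbers are placeholders and are not taken from the note.

```python
# Illustrative sketch only: extrapolating a measured background rate to a higher
# luminosity by fitting rate = a + b * L.  Numbers are placeholders, not CMS data.
import numpy as np

lumi = np.array([2.0, 4.0, 6.0, 7.5])        # 10^33 cm^-2 s^-1 (hypothetical)
rate = np.array([1.1, 2.0, 2.9, 3.6])        # Hz/cm^2 (hypothetical)

b, a = np.polyfit(lumi, rate, 1)             # linear fit: rate = a + b * lumi
target = 20.0                                # hypothetical future luminosity
print(f"expected rate at L={target}: {a + b * target:.1f} Hz/cm^2")
```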

  16. Background noise levels in Europe

    OpenAIRE

    Gjestland, Truls

    2008-01-01

    This report gives a brief overview of typical background noise levels in Europe, and suggests a procedure for the prediction of background noise levels based on population density. A proposal for the production of background noise maps for Europe is included.

  17. Background reduction in the SNO+ experiment

    Energy Technology Data Exchange (ETDEWEB)

    Segui, L. [University of Oxford, Denys Wilkinson Building, Keble Road, OX1 Oxford (United Kingdom)

    2015-08-17

    SNO+ is a large multi-purpose liquid scintillator experiment whose first aim is to detect the neutrinoless double beta decay of 130Te. It is located at SNOLAB, at a depth of 6000 m.w.e., and is based on the SNO infrastructure. SNO+ will contain approximately 780 tonnes of liquid scintillator loaded with 130Te inside an acrylic vessel (AV), surrounded by an external volume of ultra-pure water to reduce the external backgrounds. Light produced in the scintillator by the interaction of particles will be detected with about 9,000 photomultipliers. For the neutrinoless double beta decay phase, because of the extremely low rate expected, the control, knowledge and reduction of the background are essential. This work will also benefit the other phases of the experiment, focused on the study of solar neutrinos, nucleon decay, geoneutrinos and supernovae. In order to reduce the internal background level, a novel purification technique for tellurium-loaded scintillators has been developed by the collaboration that reduces the U/Th concentration and several cosmogenically activated isotopes by at least a factor of 10^2-10^3 in a single pass. In addition, different rejection techniques have been developed for the remaining internal backgrounds based on Monte Carlo simulations. In this work, the scintillator purification technique and the levels obtained with it are discussed. Furthermore, an overview of the different backgrounds for the double-beta phase is presented, highlighting some of the techniques developed to reject the remaining decays based on their expected timing differences.

  18. Identification of simulated microcalcifications in white noise and mammographic backgrounds

    International Nuclear Information System (INIS)

    Reiser, Ingrid; Nishikawa, Robert M.

    2006-01-01

    This work investigates human performance in discriminating between differently shaped simulated microcalcifications embedded in white noise or mammographic backgrounds. Human performance was determined through two-alternative forced-choice (2-AFC) experiments. The signals used were computer-generated simple shapes that were designed to have equal signal energy. This assured equal detectability. For experiments involving mammographic backgrounds, the signals were blurred to account for the imaging system modulation transfer function (MTF). White noise backgrounds were computer generated; anatomic background patches were extracted from normal mammograms. We compared human performance levels as a function of the signal energy in the expected difference template. In the discrimination task, the expected difference template is the difference between the two signals shown. In white noise backgrounds, human performance in the discrimination task was degraded compared to the detection task. In mammographic backgrounds, human performance in the discrimination task exceeded that of the detection task. This indicates that human observers do not follow the optimum decision strategy of correlating the expected signal template with the image. Human observer performance was qualitatively reproduced by non-prewhitening with eye filter (NPWE) model observer calculations, in which spatial uncertainty was explicitly included by shifting the locations of the expected difference templates. The results indicate that the human strategy in the discrimination task may be to match each individual signal template with the image, rather than to perform template matching between the expected difference template and the image.
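
The decision strategies discussed above can be made concrete with a small simulation. The sketch below implements the "optimum" strategy the authors refer to, correlating the expected difference template with each image in a 2-AFC discrimination trial; the signals, noise levels and image sizes are synthetic stand-ins, not the stimuli of the study.

```python
# Minimal sketch of the decision strategy discussed above: in a 2-AFC
# discrimination trial the observer correlates the expected difference template
# (signal1 - signal2) with each image and picks the alternative with the larger
# response.  All arrays here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)

signal1 = np.zeros(shape); signal1[28:36, 28:36] = 1.0          # square "calcification"
signal2 = np.zeros(shape); signal2[24:40, 31:33] = 1.0          # elongated one
signal2 *= np.sqrt((signal1**2).sum() / (signal2**2).sum())     # equalize signal energy

diff_template = signal1 - signal2

def trial(noise_sigma=2.0):
    """One 2-AFC trial: image A contains signal1, image B contains signal2."""
    img_a = signal1 + rng.normal(0, noise_sigma, shape)         # white-noise background
    img_b = signal2 + rng.normal(0, noise_sigma, shape)
    resp_a = (img_a * diff_template).sum()
    resp_b = (img_b * diff_template).sum()
    return resp_a > resp_b                                      # correct if A scores higher

pc = np.mean([trial() for _ in range(2000)])
print(f"proportion correct: {pc:.2f}")
```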

  19. Natural background radiation in Saudi Arabia

    International Nuclear Information System (INIS)

    Al-Hussan, K.A.; Al-Suliman, K.M.; Wafa, N.F.

    1993-01-01

    Natural background radiation measurements have been made at numerous locations throughout the world. Little work in this field has been done in developing countries. In this study, the external exposure rates due to natural background radiation sources have been measured for different Saudi Arabian cities. Thermoluminescence dosimeters, CaF2:Dy (TLD-200), have been used for the field measurements. Exposure-to-response correlations were obtained for each TLD using a 137Cs source. A correction for TLD response fading under continuous radiation exposure was obtained and applied to the field measurements. The measurements were taken every two months, for a total of six intervals during the whole year. The average outdoor external exposure rates were found to vary between a minimum of 5.29 μR/h in Dammam city and a maximum of 11.59 μR/h in Al-Khamis city. (1 fig., 1 tab.)

  20. Broken Expectations: Violation of Expectancies, Not Novelty, Captures Auditory Attention

    Science.gov (United States)

    Vachon, Francois; Hughes, Robert W.; Jones, Dylan M.

    2012-01-01

    The role of memory in behavioral distraction by auditory attentional capture was investigated: We examined whether capture is a product of the novelty of the capturing event (i.e., the absence of a recent memory for the event) or its violation of learned expectancies on the basis of a memory for an event structure. Attentional capture--indicated…

  1. Maximum permissible voltage of YBCO coated conductors

    Energy Technology Data Exchange (ETDEWEB)

    Wen, J.; Lin, B.; Sheng, J.; Xu, J.; Jin, Z. [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Hong, Z., E-mail: zhiyong.hong@sjtu.edu.cn [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Wang, D.; Zhou, H.; Shen, X.; Shen, C. [Qingpu Power Supply Company, State Grid Shanghai Municipal Electric Power Company, Shanghai (China)

    2014-06-15

    Highlights: • We examine the maximum permissible voltage of three kinds of tapes. • We examine the relationship between quenching duration and maximum permissible voltage. • Continuous I_c degradation is observed under repetitive quenching when tapes reach the maximum permissible voltage. • The relationship between maximum permissible voltage, resistance and temperature is examined. - Abstract: Superconducting fault current limiters (SFCLs) can reduce short-circuit currents in electrical power systems. One of the most important tasks in developing an SFCL is to find the maximum permissible voltage of each limiting element. The maximum permissible voltage is defined as the maximum voltage per unit length at which the YBCO coated conductors (CC) do not suffer from critical current (I_c) degradation or burnout. In this research, the duration of the quenching process is varied and the voltage is raised until I_c degradation or burnout occurs. The YBCO coated conductors tested in the experiment are from American Superconductor (AMSC) and Shanghai Jiao Tong University (SJTU). As the quenching duration increases, the maximum permissible voltage of the CC decreases. When the quenching duration is 100 ms, the maximum permissible voltages of the SJTU CC, the 12 mm AMSC CC and the 4 mm AMSC CC are 0.72 V/cm, 0.52 V/cm and 1.2 V/cm, respectively. Based on these results, the total length of CC needed in the design of an SFCL can be determined.

  2. The background in the experiment Gerda

    Science.gov (United States)

    Agostini, M.; Allardt, M.; Andreotti, E.; Bakalyarov, A. M.; Balata, M.; Barabanov, I.; Barnabé Heider, M.; Barros, N.; Baudis, L.; Bauer, C.; Becerici-Schmidt, N.; Bellotti, E.; Belogurov, S.; Belyaev, S. T.; Benato, G.; Bettini, A.; Bezrukov, L.; Bode, T.; Brudanin, V.; Brugnera, R.; Budjáš, D.; Caldwell, A.; Cattadori, C.; Chernogorov, A.; Cossavella, F.; Demidova, E. V.; Domula, A.; Egorov, V.; Falkenstein, R.; Ferella, A.; Freund, K.; Frodyma, N.; Gangapshev, A.; Garfagnini, A.; Gotti, C.; Grabmayr, P.; Gurentsov, V.; Gusev, K.; Guthikonda, K. K.; Hampel, W.; Hegai, A.; Heisel, M.; Hemmer, S.; Heusser, G.; Hofmann, W.; Hult, M.; Inzhechik, L. V.; Ioannucci, L.; Csáthy, J. Janicskó; Jochum, J.; Junker, M.; Kihm, T.; Kirpichnikov, I. V.; Kirsch, A.; Klimenko, A.; Knöpfle, K. T.; Kochetov, O.; Kornoukhov, V. N.; Kuzminov, V. V.; Laubenstein, M.; Lazzaro, A.; Lebedev, V. I.; Lehnert, B.; Liao, H. Y.; Lindner, M.; Lippi, I.; Liu, X.; Lubashevskiy, A.; Lubsandorzhiev, B.; Lutter, G.; Macolino, C.; Machado, A. A.; Majorovits, B.; Maneschg, W.; Nemchenok, I.; Nisi, S.; O'Shaughnessy, C.; Palioselitis, D.; Pandola, L.; Pelczar, K.; Pessina, G.; Pullia, A.; Riboldi, S.; Sada, C.; Salathe, M.; Schmitt, C.; Schreiner, J.; Schulz, O.; Schwingenheuer, B.; Schönert, S.; Shevchik, E.; Shirchenko, M.; Simgen, H.; Smolnikov, A.; Stanco, L.; Strecker, H.; Tarka, M.; Ur, C. A.; Vasenko, A. A.; Volynets, O.; von Sturm, K.; Wagner, V.; Walter, M.; Wegmann, A.; Wester, T.; Wojcik, M.; Yanovich, E.; Zavarise, P.; Zhitnikov, I.; Zhukov, S. V.; Zinatulina, D.; Zuber, K.; Zuzel, G.

    2014-04-01

    The GERmanium Detector Array (Gerda) experiment at the Gran Sasso underground laboratory (LNGS) of INFN is searching for neutrinoless double beta decay of Ge. The signature of the signal is a monoenergetic peak at 2039 keV, the Q-value of the decay. To avoid bias in the signal search, the present analysis does not consider events that fall in a 40 keV wide region centered around the Q-value. The main parameters needed for the analysis are described. A background model was developed to describe the observed energy spectrum. The model contains several contributions that are expected on the basis of material screening or that are established by the observation of characteristic structures in the energy spectrum. The model predicts a flat energy spectrum for the blinding window around the Q-value, with a background index ranging from 17.6 to 23.8 cts/(keV kg yr). A part of the data not considered before has been used to test whether the predictions of the background model are consistent. The observed number of events in this energy region is consistent with the background model. The background at the Q-value is dominated by close sources, mainly due to K, Bi, Th, Co and alpha-emitting isotopes from the Ra decay chain. The individual fractions depend on the assumed locations of the contaminants. It is shown that, after removal of the known peaks, the energy spectrum can be fitted in an energy range of 200 keV around the Q-value with a constant background. This gives a background index consistent with the full model, with uncertainties of the same size.
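
As a side note on the units used above, a background index in cts/(keV kg yr) is simply the number of counts divided by the product of the energy window width and the detector exposure. The snippet below illustrates the arithmetic with placeholder numbers, not Gerda data.

```python
# Minimal illustration of how a background index in cts/(keV kg yr) is formed:
# observed counts divided by (energy window width x detector exposure).
# The numbers are placeholders, not Gerda data.
def background_index(counts, window_kev, exposure_kg_yr):
    return counts / (window_kev * exposure_kg_yr)

bi = background_index(counts=40, window_kev=200.0, exposure_kg_yr=10.0)
print(f"background index: {bi:.3f} cts/(keV kg yr)")   # 0.020
```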

  3. Consumer's inflation expectations in Brazil

    Directory of Open Access Journals (Sweden)

    Fernando Ormonde Teixeira

    Full Text Available Abstract This paper investigates the main components of consumers' inflation expectations. We combine the FGV's Consumer Survey with the indices of inflation (IPCA and government regulated prices), professional forecasts disclosed in the Focus report, and media data that we crawl from one of the biggest and most important Brazilian newspapers, Folha de São Paulo, to determine which factors are responsible for, and improve, consumers' forecast accuracy. We found gender, age and city of residence to be major elements when analyzing the micro-data. Aggregate data show past inflation as an important trigger in the formation of consumers' expectations and professional forecasts as negligible. Moreover, the media plays a significant role, accounting not only for the formation of expectations but for a better understanding of actual inflation as well.

  4. Test expectancy affects metacomprehension accuracy.

    Science.gov (United States)

    Thiede, Keith W; Wiley, Jennifer; Griffin, Thomas D

    2011-06-01

    Theory suggests that the accuracy of metacognitive monitoring is affected by the cues used to judge learning. Researchers have improved monitoring accuracy by directing attention to more appropriate cues; however, this is the first study to more directly point students to more appropriate cues using instructions regarding tests and practice tests. The purpose of the present study was to examine whether the accuracy of metacognitive monitoring was affected by the nature of the test expected. Students (N = 59) were randomly assigned to one of two test expectancy groups (memory vs. inference). Then, after reading texts and judging their learning, they completed both memory and inference tests. Test performance and monitoring accuracy were superior when students received the kind of test they had been led to expect rather than the unexpected test. Tests influence students' perceptions of what constitutes learning. Our findings suggest that this could affect how students prepare for tests and how they monitor their own learning. ©2010 The British Psychological Society.

  5. Expectations for a scientific collaboratory

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    2003-01-01

    In the past decade, a number of scientific collaboratories have emerged, yet adoption of scientific collaboratories remains limited. Meeting expectations is one factor that influences adoption of innovations, including scientific collaboratories. This paper investigates the expectations scientists have with respect to scientific collaboratories. Interviews were conducted with 17 scientists who work in a variety of settings and have a range of experience conducting and managing scientific research. Results indicate that scientists expect a collaboratory to: support their strategic plans; facilitate management of the scientific process; have a positive or neutral impact on scientific outcomes; provide advantages and disadvantages for scientific task execution; and provide personal conveniences when collaborating across distances. These results both confirm existing knowledge and raise new issues for the design...

  6. Computing the Maximum Detour of a Plane Graph in Subquadratic Time

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    Let G be a plane graph where each edge is a line segment. We consider the problem of computing the maximum detour of G, defined as the maximum over all pairs of distinct points p and q of G of the ratio between the distance between p and q in G and the distance |pq|. The fastest known algorithm for this problem has O(n^2) running time. We show how to obtain O(n^{3/2}*(log n)^3) expected running time. We also show that if G has bounded treewidth, its maximum detour can be computed in O(n*(log n)^3) expected time.
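
For orientation, the sketch below computes the detour by brute force over vertex pairs (Dijkstra from every vertex), i.e. a baseline of the quadratic flavour rather than the subquadratic algorithm of the paper; it also restricts p and q to vertices, whereas the true maximum detour ranges over all points of the graph, including edge interiors.

```python
# Brute-force sketch of the detour (not the subquadratic algorithm of the paper):
# for every pair of vertices of a connected plane graph, compare the graph
# distance with the Euclidean distance.  Illustration only.
import heapq
import math

def dijkstra(adj, src):
    dist = {v: math.inf for v in adj}
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

def max_vertex_detour(points, edges):
    adj = {v: [] for v in points}
    for u, v in edges:
        w = math.dist(points[u], points[v])
        adj[u].append((v, w))
        adj[v].append((u, w))
    best = 1.0
    for u in points:                      # single-source shortest paths from each vertex
        dist = dijkstra(adj, u)
        for v in points:
            if v != u:
                best = max(best, dist[v] / math.dist(points[u], points[v]))
    return best

points = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1)}
edges = [(0, 1), (1, 2), (2, 3)]          # a path around three sides of a square
print(max_vertex_detour(points, edges))   # 3.0: corners 0 and 3
```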

  7. Natural background approach to setting radiation standards

    International Nuclear Information System (INIS)

    Adler, H.I.; Federow, H.; Weinberg, A.M.

    1979-01-01

    The suggestion has often been made that an additional radiation exposure imposed on humanity as a result of some important activity such as electricity generation would be acceptable if the exposure was small compared to the natural background. In order to make this concept quantitative and objective, we propose that 'small compared with the natural background' be interpreted as the standard deviation (weighted by the exposed population) of the natural background. This use of the variation in natural background radiation is less arbitrary and requires fewer unfounded assumptions than some current approaches to standard-setting. The standard deviation is an easily calculated statistic that is small compared with the mean value for natural exposures of populations. It is an objectively determined quantity and its significance is generally understood. Its determination does not omit any of the pertinent data. When this method is applied to the population of the United States, it suggests that a dose of 20 mrem/year would be an acceptable standard. This is comparable to the 25 mrem/year suggested as the maximum allowable exposure to an individual from the complete uranium fuel cycle.
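
The proposed statistic is straightforward to compute. The sketch below evaluates a population-weighted mean and standard deviation of natural background dose; the regional doses and populations are hypothetical.

```python
# Sketch of the statistic proposed above: the population-weighted standard
# deviation of natural background dose.  Doses and populations are hypothetical.
import numpy as np

dose_mrem = np.array([80.0, 95.0, 110.0, 140.0])   # regional natural background
population = np.array([30e6, 90e6, 60e6, 20e6])    # people exposed in each region

mean = np.average(dose_mrem, weights=population)
std = np.sqrt(np.average((dose_mrem - mean) ** 2, weights=population))
print(f"weighted mean {mean:.0f} mrem/yr, weighted SD {std:.0f} mrem/yr")
```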

  8. Backgrounder

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    Center for Mountain Ecosystem Studies, Kunming Institute of Botany of the Chinese Academy of Sciences, China: $1,526,000 to inform effective water governance in the Asian highlands of China, Nepal, and Pakistan. • Ashoka Trust for Research in Ecology and the Environment (ATREE), India: $1,499,300 for research on ...

  9. BACKGROUNDER

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    demographic trends, socio-economic development pathways, and strong ... knowledge and experience, and encourage innovation. ... choices, and will work with stakeholders in government, business, civil society, and regional economic.

  10. Backgrounder

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    Safe and Inclusive Cities: ... improving urban environments and public spaces might have on reducing the city's high ... violence against women among urban youth of working class neighbourhoods of Islamabad, Rawalpindi, and Karachi,.

  11. BACKGROUNDER

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    CARIAA's research agenda addresses gaps and priorities highlighted in the ... Research focuses on climate risk, institutional and regulatory frameworks, markets, and ... The researchers will identify relevant drivers and trends and use develop ...

  12. BACKGROUNDER

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    achieving long-term food security in Africa, with a focus on post-harvest loss, ... nutrition and health, and the socio-economic factors that affect food supply ... Water use. Agricultural productivity in sub-Saharan Africa is the lowest in the world.

  13. Epidemiological studies in high background radiation areas

    International Nuclear Information System (INIS)

    Akiba, Suminori

    2012-01-01

    Below doses of 100-200 mSv of radiation exposure, no acute health effect is observed, and late health effects such as cancer remain unclear. What makes the risk evaluation of low-dose radiation exposure difficult is that the magnitude of the expected health effects is small even if the risk is assumed to increase in proportion to radiation dose. As a result, studies need to be large, particularly when dealing with rare diseases such as cancer. In addition, the expected health effects are so small that they can easily be masked by lifestyle and environmental factors, including smoking. This paper discusses the cancer risk possibly associated with low-dose and low-dose-rate radiation exposure, describing epidemiological studies on residents of high-background radiation areas. (author)

  14. Revealing the Maximum Strength in Nanotwinned Copper

    DEFF Research Database (Denmark)

    Lu, L.; Chen, X.; Huang, Xiaoxu

    2009-01-01

    boundary–related processes. We investigated the maximum strength of nanotwinned copper samples with different twin thicknesses. We found that the strength increases with decreasing twin thickness, reaching a maximum at 15 nanometers, followed by a softening at smaller values that is accompanied by enhanced...

  15. Modelling maximum canopy conductance and transpiration in ...

    African Journals Online (AJOL)

    There is much current interest in predicting the maximum amount of water that can be transpired by Eucalyptus trees. It is possible that industrial waste water may be applied as irrigation water to eucalypts and it is important to predict the maximum transpiration rates of these plantations in an attempt to dispose of this ...

  16. Estimating the maximum potential revenue for grid connected electricity storage :

    Energy Technology Data Exchange (ETDEWEB)

    Byrne, Raymond Harry; Silva Monroy, Cesar Augusto.

    2012-12-01

    The valuation of an electricity storage device is based on the expected future cash flow generated by the device. Two potential sources of income for an electricity storage system are energy arbitrage and participation in the frequency regulation market. Energy arbitrage refers to purchasing (storing) energy when electricity prices are low, and selling (discharging) energy when electricity prices are high. Frequency regulation is an ancillary service geared towards maintaining system frequency, and is typically procured by the independent system operator in some type of market. This paper outlines the calculations required to estimate the maximum potential revenue from participating in these two activities. First, a mathematical model is presented for the state of charge as a function of the storage device parameters and the quantities of electricity purchased/sold as well as the quantities offered into the regulation market. Using this mathematical model, we present a linear programming optimization approach to calculating the maximum potential revenue from an electricity storage device. The calculation of the maximum potential revenue is critical in developing an upper bound on the value of storage, as a benchmark for evaluating potential trading strategies, and a tool for capital finance risk assessment. Then, we use historical California Independent System Operator (CAISO) data from 2010-2011 to evaluate the maximum potential revenue from the Tehachapi wind energy storage project, an American Recovery and Reinvestment Act of 2009 (ARRA) energy storage demonstration project. We investigate the maximum potential revenue from two different scenarios: arbitrage only and arbitrage combined with the regulation market. Our analysis shows that participation in the regulation market produces four times the revenue compared to arbitrage in the CAISO market using 2010 and 2011 data. Then we evaluate several trading strategies to illustrate how they compare to the
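
A stripped-down version of the arbitrage-only optimization can be written as a small linear program. The sketch below assumes perfect price foresight, 100% round-trip efficiency and hourly steps, and omits the regulation market; prices and device parameters are placeholders, not CAISO or Tehachapi data.

```python
# Sketch of the arbitrage-only linear program described above, under simplifying
# assumptions (perfect price foresight, no efficiency losses, hourly steps);
# the report's full model also includes the frequency-regulation market.
import numpy as np
from scipy.optimize import linprog

prices = np.array([20, 15, 12, 18, 30, 45, 40, 25], dtype=float)  # $/MWh, hypothetical
T = len(prices)
e_max, p_max, s0 = 4.0, 1.0, 0.0       # MWh capacity, MW power limit, initial charge

# Decision variables x = [charge_0..charge_{T-1}, discharge_0..discharge_{T-1}]
c = np.concatenate([prices, -prices])  # minimize charging cost minus discharge revenue

# State of charge after hour t: s0 + sum_{k<=t} (charge_k - discharge_k) in [0, e_max]
lower_tri = np.tril(np.ones((T, T)))
A_soc = np.hstack([lower_tri, -lower_tri])
A_ub = np.vstack([A_soc, -A_soc])
b_ub = np.concatenate([np.full(T, e_max - s0), np.full(T, s0)])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, p_max)] * (2 * T))
discharge = res.x[T:]
print("discharge schedule (MWh):", np.round(discharge, 2))
print(f"maximum arbitrage revenue: ${-res.fun:.2f}")
```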

  17. Career Expectations of Accounting Students

    Science.gov (United States)

    Elam, Dennis; Mendez, Francis

    2010-01-01

    The demographic make-up of accounting students is dramatically changing. This study sets out to measure how well the profession is ready to accommodate what may be very different needs and expectations of this new generation of students. Non-traditional students are becoming more and more of a tradition in the current college classroom.…

  18. Primary expectations of secondary metabolites

    Science.gov (United States)

    My program examines the plant secondary metabolites (i.e. phenolics) important for human health, and which impart the organoleptic properties that are quality indicators for fresh and processed foods. Consumer expectations such as appearance, taste, or texture influence their purchasing decisions; a...

  19. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  20. Privacy Expectations in Online Contexts

    Science.gov (United States)

    Pure, Rebekah Abigail

    2013-01-01

    Advances in digital networked communication technology over the last two decades have brought the issue of personal privacy into sharper focus within contemporary public discourse. In this dissertation, I explain the Fourth Amendment and the role that privacy expectations play in the constitutional protection of personal privacy generally, and…

  1. Effect of Ovality on Maximum External Pressure of Helically Coiled Steam Generator Tubes with a Rectangular Wear

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Dong In; Lim, Eun Mo; Huh, Nam Su [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of); Choi, Shin Beom; Yu, Je Yong; Kim, Ji Ho; Choi, Suhn [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    The structural integrity of the steam generator tubes of nuclear power plants is one of the crucial parameters for safe operation of nuclear power plants. Thus, many studies have been made to provide engineering methods to assess the integrity of defective tubes of commercial nuclear power plants, considering their operating environments and defect characteristics. As described above, the geometric and operating conditions of steam generator tubes in an integral reactor are significantly different from those of a commercial reactor. Therefore, the structural integrity assessment of defective tubes of an integral reactor, taking into account its own operating conditions and geometric characteristics, i.e., external pressure and helically coiled shape, should be made to demonstrate compliance with the current design criteria. Also, ovality is a very specific characteristic of the helically coiled tube because it occurs during the coiling process. Wear, arising from flow-induced vibration (FIV) and other sources, is the main degradation mechanism of steam generator tubes. In the present study, the maximum external pressure of a helically coiled steam generator tube with wear is predicted based on detailed 3-dimensional finite element analysis. As for the shape of the wear defect, a rectangular shape is considered. In particular, the effect of ovality on the maximum external pressure of helically coiled tubes with rectangular-shaped wear is investigated. In the present work, the maximum external pressure of a helically coiled steam generator tube with rectangular-shaped wear is investigated via detailed 3-D FE analyses. In order to cover a practical range of geometries for the defective tube, the variables affecting the maximum external pressure were systematically varied. In particular, the effect of tube ovality on the maximum external pressure is evaluated. It is expected that the present results can be used as a technical background for establishing a practical structural integrity assessment guideline of

  2. Detecting the Stochastic Gravitational-Wave Background

    Science.gov (United States)

    Colacino, Carlo Nicola

    2017-12-01

    The stochastic gravitational-wave background (SGWB) is by far the most difficult source of gravitational radiation to detect. At the same time, it is the most interesting and intriguing one. This book describes the initial detection of the SGWB and the underlying mathematics behind one of the most amazing discoveries of the 21st century. On the experimental side, it would mean that interferometric gravitational wave detectors work even better than expected. On the observational side, such a detection could give us information about the very early Universe, information that could not be obtained otherwise. Even negative results and improved upper bounds could put constraints on many cosmological and particle physics models.

  3. Fluctuations in the cosmic microwave background

    International Nuclear Information System (INIS)

    Banday, A.J.; Wolfendale, A.W.

    1990-01-01

    In view of the importance to contemporary cosmology, and to our understanding of the Universe, of the precise nature of the Cosmic Microwave Background (CMB) spectrum, we consider the effects on this spectrum of contamination by other radiation fields of both galactic and extragalactic origin. Particular attention is given to the significance of measurements of the fluctuations in the 'background' radiation detected at 10.46 GHz and we conclude that these fluctuations are of the same magnitude as those expected from galactic cosmic-ray effects. A more detailed study of the cosmic-ray induced fluctuations and measurements at higher frequencies will be needed before genuine CMB fluctuations can be claimed. (author)

  4. A Detector for Cosmic Microwave Background Polarimetry

    Science.gov (United States)

    Wollack, E.; Cao, N.; Chuss, D.; Hsieh, W.-T.; Moseley, S. Harvey; Stevenson, T.; U-yen, K.

    2008-01-01

    We present preliminary design and development work on polarized detectors intended to enable Cosmic Microwave Background polarization measurements that will probe the first moments of the universe. The ultimate measurement will be challenging, requiring background-limited detectors and good control of systematic errors. Toward this end, we are integrating the beam control of HE-11 feedhorns with the sensitivity of transition-edge sensors. The coupling between these two devices is achieved via waveguide probe antennas and superconducting microstrip lines. This implementation allows band-pass filters to be incorporated on the detector chip. We believe that a large collection of single-mode polarized detectors will eventually be required for the reliable detection of the weak polarized signature that is expected to result from gravitational waves produced by cosmic inflation. This focal plane prototype is an important step along the path to this detection, resulting in a capability that will enable various future high performance instrument concepts.

  5. Diffuse Cosmic Infrared Background Radiation

    Science.gov (United States)

    Dwek, Eli

    2002-01-01

    The diffuse cosmic infrared background (CIB) consists of the cumulative radiant energy released in the processes of structure formation that have occurred since the decoupling of matter and radiation following the Big Bang. In this lecture I will review the observational data that provided the first detections and limits on the CIB, and the theoretical studies explaining the origin of this background. Finally, I will also discuss the relevance of this background to the universe as seen in high energy gamma-rays.

  6. Background current of radioisotope manometer

    International Nuclear Information System (INIS)

    Vydrik, A.A.

    1987-01-01

    The technique for calculating the main component of the background current of radioisotope manometers, namely the current from direct collisions of ionizing particles with the collector, is described. The reasons for the appearance of the background photoelectron current are clarified. The most effective way of eliminating the background current components is to shield the collector from the source with a screen made of a material with a high gamma-quanta absorption coefficient, such as lead.

  7. Correlations between coping styles and symptom expectation for whiplash injury.

    Science.gov (United States)

    Ferrari, Robert; Russell, Anthony S

    2010-11-01

    In pain conditions, active coping has been found to be associated with less severe depression, increased activity level, and less functional impairment. Studies indicate that Canadians have a high expectation of chronic pain following whiplash injury. Expectation of recovery has been shown to predict recovery in whiplash victims. The objective of this study was to compare both the expectations and the coping style for whiplash injury in injury-naive subjects. The Vanderbilt Pain Management Inventory was administered to university students. Subjects who had not yet experienced whiplash injury were given a vignette concerning a neck sprain (whiplash injury) in a motor vehicle collision and were asked to indicate how likely they were to have the thoughts or behaviors indicated in the coping style questionnaire. Subjects also completed expectation questionnaires regarding whiplash injury. Fifty-seven percent of subjects held an expectation of chronic pain after whiplash injury. The mean active coping style score was 28.5±6.6 (40 is the maximum score for active coping). The mean passive coping style score was 28.5±6.6 (50 is the maximum score for passive coping). Those with high passive coping styles had a higher mean expectation score. The correlation between passive coping style score and expectation score was 0.62, while the correlation between active coping style score and expectation was -0.48. Expectations and coping styles may interact or be co-modifiers of the outcomes of whiplash injury. Further studies of coping style as an etiologic factor in the chronic whiplash syndrome are needed.

  8. Background subtraction theory and practice

    CERN Document Server

    Elgammal, Ahmed

    2014-01-01

    Background subtraction is a widely used concept for detection of moving objects in videos. In the last two decades there has been a lot of development in designing algorithms for background subtraction, as well as wide use of these algorithms in various important applications, such as visual surveillance, sports video analysis, motion capture, etc. Various statistical approaches have been proposed to model scene backgrounds. The concept of background subtraction also has been extended to detect objects from videos captured from moving cameras. This book reviews the concept and practice of back
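
One of the simplest schemes in this literature is a running-average background model with a threshold test. The sketch below applies it to synthetic frames; it is an illustration of the general idea, not code from the book.

```python
# Minimal running-average background subtraction on synthetic frames (one of the
# simplest schemes in this literature; not code from the book).
import numpy as np

rng = np.random.default_rng(0)
H, W, alpha, threshold = 48, 64, 0.05, 4.0

def make_frame(t):
    frame = 10.0 + rng.normal(0, 1.0, (H, W))        # static scene + noise
    if t >= 150:
        frame[20:28, 30:38] += 20.0                  # a "moving object" appears
    return frame

background = make_frame(0)                           # initialize from the first frame
for t in range(1, 200):
    frame = make_frame(t)
    mask = np.abs(frame - background) > threshold    # foreground detection
    # update the background model only where no foreground was detected
    background = np.where(mask, background, (1 - alpha) * background + alpha * frame)

print("foreground pixels in last frame:", int(mask.sum()))   # ~64 (the 8x8 object)
```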

  9. Status of the Simbol-X Background Simulation Activities

    Science.gov (United States)

    Tenzer, C.; Briel, U.; Bulgarelli, A.; Chipaux, R.; Claret, A.; Cusumano, G.; Dell'Orto, E.; Fioretti, V.; Foschini, L.; Hauf, S.; Kendziorra, E.; Kuster, M.; Laurent, P.; Tiengo, A.

    2009-05-01

    The Simbol-X background simulation group is working towards a simulation based background and mass model which can be used before and during the mission. Using the Geant4 toolkit, a Monte-Carlo code to simulate the detector background of the Simbol-X focal plane instrument has been developed with the aim to optimize the design of the instrument. Achieving an overall low instrument background has direct impact on the sensitivity of Simbol-X and thus will be crucial for the success of the mission. We present results of recent simulation studies concerning the shielding of the detectors with respect to the diffuse cosmic hard X-ray background and to the cosmic-ray proton induced background. Besides estimates of the level and spectral shape of the remaining background expected in the low and high energy detector, also anti-coincidence rates and resulting detector dead time predictions are discussed.

  10. Test Expectancy Affects Metacomprehension Accuracy

    Science.gov (United States)

    Thiede, Keith W.; Wiley, Jennifer; Griffin, Thomas D.

    2011-01-01

    Background: Theory suggests that the accuracy of metacognitive monitoring is affected by the cues used to judge learning. Researchers have improved monitoring accuracy by directing attention to more appropriate cues; however, this is the first study to more directly point students to more appropriate cues using instructions regarding tests and…

  11. Ethical issues and societal expectations

    International Nuclear Information System (INIS)

    Metlay, D.

    2010-01-01

    Daniel Metlay (NWTRB) declared that institutions had always recognised an ethical obligation to manage high-level radioactive waste in unprecedented ways. This obligation has not only endured, but has become more explicit and multidimensional, and it is now subsumed under the more general rubric of 'societal expectations'. D. Metlay directed attention toward the proceedings of a previous RWMC-RF workshop, which contain five essays, authored by Kjell Andersson, Andrew Blowers, Carl-Reinhold Braakenhielm, Francois Dermange, and Patricia Fleming, that are relevant to the question of ethical issues and societal expectations. D. Metlay observed that 'societal expectations' are hard to define and thus very hard to measure. They may vary considerably with time and from country to country. As an illustration he referred to an inquiry performed by a task group 30 years ago in a document entitled 'Proposed Goals for Radioactive Waste Management' (NUREG-0300) on behalf of the U.S. Nuclear Regulatory Commission. D. Metlay concludes that, for the most part, societal expectations in the United States appear to have been quite stable over a period of more than 30 years. In two areas, however, there are clear differences in emphasis between expectations articulated in the last few years and those recorded in 1978. (1) The emphasis then was on the operational reliability of organisations and institutions; in particular, much care was taken to discuss the inherent limitations on bureaucratic error-correction in the future. The focus nowadays is more on bureaucratic behaviours associated with carrying out decision-making processes in the present. (2) While there is current emphasis on the importance of trust, transparency, and accountability, the NRC document may cast some doubt on the reliability of a stepwise decision-making process. In the domain of radioactive waste management, error signals are notoriously unclear, and strong disagreements over objectives and value trade

  12. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

    Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on the Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Also, Bayesian credible and highest posterior density (HPD) credible intervals are obtained for the parameters. Expected time on test and reliability characteristics are also analyzed in this article. To compare the various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purposes, a randomly censored real data set is discussed.
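
For the maximum likelihood part only, a generic censored-data likelihood can be written down directly: uncensored observations contribute the pmf p(1-p)^(y-1), censored ones the survival function (1-p)^y, and maximizing gives p_hat = (number of uncensored observations) / (sum of all observed times). The sketch below checks this on simulated data; it does not reproduce the article's exact formulation or its Bayes estimators.

```python
# Sketch of a maximum likelihood estimator for a geometric lifetime under random
# censoring (generic censored-data likelihood; not the article's exact setup).
# Maximizing sum_uncens[log p + (y-1)log(1-p)] + sum_cens[y log(1-p)] gives
# p_hat = (#uncensored) / (sum of all observed times).
import numpy as np

rng = np.random.default_rng(2)
p_true, n = 0.3, 5000
lifetimes = rng.geometric(p_true, n)          # X ~ Geometric(p), support 1, 2, ...
censor_times = rng.geometric(0.1, n)          # independent random censoring
observed = np.minimum(lifetimes, censor_times)
uncensored = lifetimes <= censor_times

p_hat = uncensored.sum() / observed.sum()
print(f"true p = {p_true}, MLE p_hat = {p_hat:.3f}")
```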

  13. Effect of the Great Attractor on the cosmic microwave background radiation

    Energy Technology Data Exchange (ETDEWEB)

    Bertschinger, E [Massachusetts Inst. of Tech., Cambridge, MA (USA). Dept. of Physics; Gorski, K M [Los Alamos National Lab., NM (USA); Dekel, A [Hebrew Univ., Jerusalem (Israel). Racah Inst. of Physics

    1990-06-07

    Anisotropy in the cosmic microwave background radiation (CMB) is expected as a result of fluctuations in gravitational potential caused by large-scale structure in the Universe. The background radiation is redshifted as it climbs out of gravitational wells. Here we present a map of the anisotropy in CMB temperature ΔT/T of our region of the Universe as viewed by a distant observer, predicted on the basis of the gravitational potential field. We calculate this field in the vicinity of the Local Group of galaxies from the observed peculiar (non-Hubble) velocities of galaxies, under the assumption that the peculiar motions are induced by gravity. If the cosmological density parameter Ω is 1, the gravitational potential field of the Great Attractor and surrounding regions produces a maximum Sachs-Wolfe anisotropy of ΔT/T = (1.7 ± 0.3) × 10^-5 on an angular scale of 1°. Doppler and adiabatic contributions to this anisotropy are expected to be somewhat larger. If similar fluctuations in the gravitational potential are present elsewhere in the Universe, the anisotropy present when the CMB was last scattered should be visible from the Earth, and should be detectable in current experiments. A fundamental test of whether gravity is responsible for the generation of structure in the Universe can be made by looking for the imprint in the CMB of deep potential wells similar to those found in our neighbourhood. (author)

  14. Effects of bruxism on the maximum bite force

    Directory of Open Access Journals (Sweden)

    Todić Jelena T.

    2017-01-01

    Full Text Available Background/Aim. Bruxism is a parafunctional activity of the masticatory system, which is characterized by clenching or grinding of teeth. The purpose of this study was to determine whether the presence of bruxism has an impact on maximum bite force, with particular reference to the potential impact of gender on bite force values. Methods. This study included two groups of subjects: without and with bruxism. The presence of bruxism in the subjects was registered using a specific clinical questionnaire on bruxism and physical examination. The subjects from both groups underwent measurement of maximum bite pressure and occlusal contact area using single-sheet pressure-sensitive films (Fuji Prescale MS and HS Film). Maximal bite force was obtained by multiplying the maximal bite pressure and occlusal contact area values. Results. The average values of maximal bite force were significantly higher in the subjects with bruxism compared to those without bruxism (p < 0.01). Maximal bite force was significantly higher in the males compared to the females in all segments of the research. Conclusion. The presence of bruxism increases the maximum bite force, as shown in this study. Gender is a significant determinant of bite force. Registration of maximum bite force can be used in diagnosing and analysing pathophysiological events during bruxism.

  15. Stochastic samples versus vacuum expectation values in cosmology

    International Nuclear Information System (INIS)

    Tsamis, N.C.; Tzetzias, Aggelos; Woodard, R.P.

    2010-01-01

    Particle theorists typically use expectation values to study the quantum back-reaction on inflation, whereas many cosmologists stress the stochastic nature of the process. While expectation values certainly give misleading results for some things, such as the stress tensor, we argue that operators exist for which there is no essential problem. We quantify this by examining the stochastic properties of a noninteracting, massless, minimally coupled scalar on a locally de Sitter background. The square of the stochastic realization of this field seems to provide an example of great relevance for which expectation values are not misleading. We also examine the frequently expressed concern that significant back-reaction from expectation values necessarily implies large stochastic fluctuations between nearby spatial points. Rather than viewing the stochastic formalism in opposition to expectation values, we argue that it provides a marvelously simple way of capturing the leading infrared logarithm corrections to the latter, as advocated by Starobinsky

  16. Forecasting Spanish natural life expectancy.

    Science.gov (United States)

    Guillen, Montserrat; Vidiella-i-Anguera, Antoni

    2005-10-01

    Knowledge of trends in life expectancy is of major importance for policy planning. It is also a key indicator for assessing the future development of life insurance products, the sustainability of existing retirement schemes, and long-term care for the elderly. This article examines the feasibility of decomposing age- and gender-specific mortality rates into accidental and natural components. We study this decomposition using the Lee and Carter model. In particular, we fit the Poisson log-bilinear version of this model, proposed by Wilmoth and by Brouhns et al., to historical (1975-1998) Spanish mortality rates. In addition, using the model introduced by Wilmoth and Valkonen, we analyze mortality-gender differentials for accidental and natural rates. We present aggregated life expectancy forecasts compared with those constructed using non-decomposed mortality rates.
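
To make the modelling step concrete, the sketch below fits the classical Lee-Carter decomposition log m(x,t) = a_x + b_x k_t by SVD on synthetic rates and extrapolates k_t with a random walk with drift. The article itself uses the Poisson log-bilinear refinement of Brouhns et al., which is not reproduced here.

```python
# Sketch of the classical Lee-Carter decomposition log m(x,t) = a_x + b_x * k_t,
# fitted by SVD on synthetic mortality rates; the article's Poisson log-bilinear
# refinement is not reproduced here.
import numpy as np

rng = np.random.default_rng(3)
ages, years = np.arange(40, 90), np.arange(1975, 1999)

# Synthetic data with a known structure plus noise
a_true = -9.0 + 0.09 * (ages - 40)
k_true = np.linspace(2.0, -2.0, len(years))                 # declining mortality index
b_true = np.full(len(ages), 1.0 / len(ages))
log_m = a_true[:, None] + b_true[:, None] * k_true[None, :] + rng.normal(0, 0.02, (len(ages), len(years)))

a_hat = log_m.mean(axis=1)                                  # a_x: average age profile
U, s, Vt = np.linalg.svd(log_m - a_hat[:, None], full_matrices=False)
b_hat = U[:, 0] / U[:, 0].sum()                             # usual normalization: sum(b) = 1
k_hat = s[0] * Vt[0] * U[:, 0].sum()                        # keeps b_x * k_t unchanged

drift = (k_hat[-1] - k_hat[0]) / (len(years) - 1)           # random-walk-with-drift forecast
k_forecast = k_hat[-1] + drift * np.arange(1, 11)
log_m_forecast = a_hat[:, None] + b_hat[:, None] * k_forecast[None, :]
print(log_m_forecast.shape)                                 # (50, 10): ages x 10 future years
```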

  17. The construction of normal expectations

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Røpke, Inge

    2008-01-01

    The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problem-oriented and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among...

  18. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
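
A minimal analogue of what such an identifier does (not MXLKID itself) is sketched below: a scalar nonlinear system is simulated with a known input, Gaussian measurement noise is added, and the unknown parameter is recovered by minimizing the negative log-likelihood of the measurements.

```python
# Minimal analogue (not MXLKID itself) of maximum likelihood parameter
# identification: a scalar nonlinear system x_{t+1} = a*x_t/(1+x_t^2) + u_t is
# simulated, Gaussian measurement noise is added, and the unknown parameter `a`
# is recovered by minimizing the negative log-likelihood of the measurements.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
a_true, sigma, T = 1.8, 0.05, 200
u = rng.normal(0, 0.3, T)                       # known input sequence

def simulate(a):
    x, traj = 0.5, np.empty(T)
    for t in range(T):
        x = a * x / (1 + x**2) + u[t]
        traj[t] = x
    return traj

y = simulate(a_true) + rng.normal(0, sigma, T)  # noisy measurements

def neg_log_likelihood(a):
    r = y - simulate(a)                         # Gaussian errors => least squares
    return 0.5 * np.sum(r**2) / sigma**2

res = minimize_scalar(neg_log_likelihood, bounds=(0.5, 3.0), method="bounded")
print(f"true a = {a_true}, ML estimate = {res.x:.3f}")
```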

  19. FRANCHISE EXPECTATIONS: CASE OF KAZAKHSTAN

    OpenAIRE

    Raissa Kaziyeva

    2014-01-01

    The purpose of the article is to provide a critical review of franchising development in Kazakhstan by focusing on the relationship between the franchisor and the franchisee. We have conducted extensive research and communicated with lots of potential and existing Kazakhstani franchisors and franchisees, operating since 2003. Our findings show that the process of signing franchising agreements is quite challenging in Kazakhstan.  Thorough investigation of the differences between expectations ...

  20. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

    The determination of soil background is one of the most important activities supporting environmental restoration and waste management on the Hanford Site. Background compositions serve as the basis for identifying soil contamination, and also as a baseline in risk assessment processes used to determine soil cleanup and treatment levels. These uses of soil background require an understanding of the extent to which analytes of concern occur naturally in the soils. This report documents the results of sampling and analysis activities designed to characterize the composition of soil background at the Hanford Site, and to evaluate the feasibility of its use as Sitewide background. The compositions of naturally occurring soils in the vadose zone have been determined for nonradioactive inorganic and organic analytes and related physical properties. These results confirm that a Sitewide approach to the characterization of soil background is technically sound and is a viable alternative to the determination and use of numerous local or area backgrounds that yield inconsistent definitions of contamination. Sitewide soil background consists of several types of data and is appropriate for use in identifying contamination in all soils in the vadose zone on the Hanford Site. The natural concentrations of nearly every inorganic analyte extend to levels that exceed calculated health-based cleanup limits. The levels of most inorganic analytes, however, are well below these health-based limits. The highest measured background concentrations occur in three volumetrically minor soil types, the most important of which are topsoils adjacent to the Columbia River that are rich in organic carbon. No organic analyte levels above detection were found in any of the soil samples.

  1. Annual radiation background in Isfahan city

    International Nuclear Information System (INIS)

    Tavakoli, Mohammad B.

    2002-01-01

    Measurement of environmental exposure is very important from different points of view, especially for human health. It has been measured accurately in many countries. In Iran it has also been measured in some cities, especially in high-background areas such as Ramsar, but no measurements existed for Isfahan. In this study, background radiation was measured using the TLD method. The TLDs used are made from CaSO4:Dy, which is very sensitive. The locations under investigation were 52 health centers distributed all around Isfahan city. Each TLD badge was put in a special plastic bag and left on the roof of the selected health center for a month. The procedure was repeated for all 12 months of the year 1379 (21st March 2000 to 20th March 2001). The results were used to obtain the mean and SD for each month and at the different places. The maximum and minimum dose equivalents obtained in different months and locations were 15.9x10^-2 and 6.5x10^-2 mSv. The maximum and minimum of the means over all locations were 10.5x10^-2 and 8.6x10^-2 mSv for the whole year. The monthly mean and SD for Isfahan city over the whole year were 9.7x10^-2 and 1.5x10^-2 mSv, respectively; the mean annual dose equivalent in Isfahan city is therefore 1.16 mSv. The results do not show any high-background radiation area.

  2. Grade Expectations: Rationality and Overconfidence

    Science.gov (United States)

    Magnus, Jan R.; Peresetsky, Anatoly A.

    2018-01-01

    Confidence and overconfidence are essential aspects of human nature, but measuring (over)confidence is not easy. Our approach is to consider students' forecasts of their exam grades. Part of a student's grade expectation is based on the student's previous academic achievements; what remains can be interpreted as (over)confidence. Our results are based on a sample of about 500 second-year undergraduate students enrolled in a statistics course in Moscow. The course contains three exams and each student produces a forecast for each of the three exams. Our models allow us to estimate overconfidence quantitatively. Using these models we find that students' expectations are not rational and that most students are overconfident, in agreement with the general literature. Less obvious is that overconfidence helps: given the same academic achievement students with larger confidence obtain higher exam grades. Female students are less overconfident than male students, their forecasts are more rational, and they are also faster learners in the sense that they adjust their expectations more rapidly. PMID:29375449

  3. Life Expectancy of Brazilian Neurosurgeons.

    Science.gov (United States)

    Botelho, Ricardo Vieira; Jardim Miranda, Bárbara Cristina; Nishikuni, Koshiro; Waisberg, Jaques

    2018-06-01

    Life expectancy (LE) refers to the number of years that an individual is expected to survive. Emphasis is frequently placed on the relationship between LE and the conditions under which a population lives, but fewer studies have investigated the relationship between stress factors associated with specific professions and their effects on LE. The aim of this study is to evaluate Brazilian neurosurgeons' life expectancies (BNLEs) and compare them with those of physicians (both Brazilian and foreign) from other fields, as well as with Brazilian nondoctors. The Brazilian Society of Neurosurgery death registry was used to obtain data that compared LEs from non-neurosurgeon physicians, as described in the national and international literature. BNLEs were also compared with the LEs of Brazilian citizens. Fifty-one neurosurgeons died between 2009 and 2016. All were males. The mean age at death was 68.31 ± 17.71 years. Among all-cause mortality, the breakdown was 20% cardiovascular diseases, 39% malignancies, 10% external factors, 6% gastrointestinal disorders, 12% neurologic illnesses, and 14% unknown causes. BNLE was shorter than LE of male Brazilian citizens. LE was similar among neurosurgeons and other doctors but shorter compared with Brazilian citizens. Further research is needed to provide data that can add to and confirm these results. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Grade Expectations: Rationality and Overconfidence

    Directory of Open Access Journals (Sweden)

    Jan R. Magnus

    2018-01-01

    Full Text Available Confidence and overconfidence are essential aspects of human nature, but measuring (over)confidence is not easy. Our approach is to consider students' forecasts of their exam grades. Part of a student's grade expectation is based on the student's previous academic achievements; what remains can be interpreted as (over)confidence. Our results are based on a sample of about 500 second-year undergraduate students enrolled in a statistics course in Moscow. The course contains three exams and each student produces a forecast for each of the three exams. Our models allow us to estimate overconfidence quantitatively. Using these models we find that students' expectations are not rational and that most students are overconfident, in agreement with the general literature. Less obvious is that overconfidence helps: given the same academic achievement, students with larger confidence obtain higher exam grades. Female students are less overconfident than male students, their forecasts are more rational, and they are also faster learners in the sense that they adjust their expectations more rapidly.

  5. Price expectations and petroleum development

    International Nuclear Information System (INIS)

    Pollio, G.; Marian, W.S.

    1991-01-01

    In the first section of this paper, the authors present a highly stylized model of the world oil market that explicitly incorporates both expectative and financial effects. The model generates the extremely interesting result that actual future price outcomes are inversely related to prevailing price expectations, owing to fluctuations in the level and timing of industry investment expenditure. Given the importance of price expectations, it is surprising that the topic has received such scant attention. The authors therefore present in the second section a selective survey of the various measures that have been proposed and used in the literature, as well as an assessment of the value of potentially new indices, such as market prices for existing hydrocarbon reserves. In the final section of the paper, we discuss the extent to which financial innovation, in the form of commodity-linked products such as swaps, caps, collars, and so forth, is transforming the oil market, enabling all market segments to manage price uncertainty far more effectively than was ever possible in the past.

  6. Measurement of natural background neutron

    CERN Document Server

    Li Jain, Ping; Tang Jin Hua; Tang, E S; Xie Yan Fong

    1982-01-01

    A highly sensitive neutron monitor is described. It has an approximate counting rate of 20 cpm for natural background neutrons. The pulse amplitude resolution, sensitivity and direction dependence of the monitor were determined. The monitor has been used for natural background measurements in the Beijing area. The yearly average dose is given and compared with the results of KEK and CERN.

  7. The maximum economic depth of groundwater abstraction for irrigation

    Science.gov (United States)

    Bierkens, M. F.; Van Beek, L. P.; de Graaf, I. E. M.; Gleeson, T. P.

    2017-12-01

    Over recent decades, groundwater has become increasingly important for agriculture. Irrigation accounts for 40% of the global food production and its importance is expected to grow further in the near future. Already, about 70% of the globally abstracted water is used for irrigation, and nearly half of that is pumped groundwater. In many irrigated areas where groundwater is the primary source of irrigation water, groundwater abstraction is larger than recharge and we see massive groundwater head decline in these areas. An important question then is: to what maximum depth can groundwater be pumped for it to be still economically recoverable? The objective of this study is therefore to create a global map of the maximum depth of economically recoverable groundwater when used for irrigation. The maximum economic depth is the maximum depth at which revenues are still larger than pumping costs or the maximum depth at which initial investments become too large compared to yearly revenues. To this end we set up a simple economic model where costs of well drilling and the energy costs of pumping, which are a function of well depth and static head depth respectively, are compared with the revenues obtained for the irrigated crops. Parameters for the cost sub-model are obtained from several US-based studies and applied to other countries based on GDP/capita as an index of labour costs. The revenue sub-model is based on gross irrigation water demand calculated with a global hydrological and water resources model, areal coverage of crop types from MIRCA2000 and FAO-based statistics on crop yield and market price. We applied our method to irrigated areas in the world overlying productive aquifers. Estimated maximum economic depths range between 50 and 500 m. Most important factors explaining the maximum economic depth are the dominant crop type in the area and whether or not initial investments in well infrastructure are limiting. In subsequent research, our estimates of
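
    The cost-revenue comparison sketched in the abstract can be illustrated with a toy calculation: scan candidate depths and keep the deepest one at which yearly crop revenue still exceeds yearly pumping energy costs plus amortized drilling costs. All functional forms and numbers below (energy price, drilling cost per metre, pump efficiency, amortization period) are invented placeholders, not the parameterization used by the authors.

```python
# Toy sketch of the maximum economic depth idea: the deepest pumping depth at
# which yearly irrigation revenue still exceeds yearly energy costs plus
# amortized drilling costs.  All parameter values and the linear cost forms
# are illustrative assumptions, not the parameterization used in the study.

def yearly_pumping_cost(head_depth_m, volume_m3, energy_price_per_kwh=0.10,
                        pump_efficiency=0.6):
    # Energy to lift water: E = rho * g * h * V / efficiency, converted to kWh.
    rho, g = 1000.0, 9.81
    energy_kwh = rho * g * head_depth_m * volume_m3 / pump_efficiency / 3.6e6
    return energy_kwh * energy_price_per_kwh

def amortized_drilling_cost(well_depth_m, cost_per_m=100.0, lifetime_yr=20):
    return well_depth_m * cost_per_m / lifetime_yr

def maximum_economic_depth(revenue_per_yr, volume_m3, max_depth_m=1000):
    # Scan candidate depths and keep the deepest one that is still profitable.
    best = None
    for depth in range(10, max_depth_m + 1, 10):
        cost = yearly_pumping_cost(depth, volume_m3) + amortized_drilling_cost(depth)
        if revenue_per_yr > cost:
            best = depth
    return best

# Example: a farm irrigating 200,000 m3/yr with 50,000 USD/yr of crop revenue.
print(maximum_economic_depth(revenue_per_yr=50_000, volume_m3=200_000), "m")
```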

  8. Immigrant students’ educational expectations : The role of religious affiliation and practice

    NARCIS (Netherlands)

    Hemmerechts, K.; Kavadias, D.; Agirdag, O.

    2018-01-01

    A body of scholarly work has emerged on educational expectations. More recently, the relationship between educational expectations and immigrant background in Western Europe has been investigated. Although the results of this type of inquiry show that students with an immigrant background tend to

  9. The determination and use of radionuclide background in gamma spectrometry

    International Nuclear Information System (INIS)

    Zimmer, W.H.

    1986-01-01

    Background is the major component of the gross photon peak area. Therefore, calculations of net area, nuclide activity, counting uncertainty, and limits of detection are no better than the calculation of background. In this study, background in gamma spectrometry is explored in several of its aspects. Means are presented to reduce background. Standard practices are presented for the acquisition of valid, relevant background data. Unified standard calculations, with examples, are presented for the use of background data to determine net count and counting uncertainty. L. A. Currie's latest calculations of Lower Limits of Detection (1) (LLD) as they apply to gamma spectrometry are reviewed. Finally, Maximum Undetected Activity (MUA), LLD, and Critical Level (CL) concepts and calculations are compared in sample spectra
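
    As a minimal illustration of the net-count and detection-limit arithmetic referred to above, the sketch below applies Poisson counting statistics and Currie-style formulas (with the conventional 5% error rates, k = 1.645, and a paired-blank background); the report's exact conventions and coverage factors may differ.

```python
import math

# Minimal sketch of net peak area, counting uncertainty and Currie-style
# detection limits from gross and background counts.  The 5% error rates
# (k = 1.645) and the paired-blank form are conventional choices; the
# report's exact conventions may differ.

def net_area(gross_counts, background_counts):
    net = gross_counts - background_counts
    # Poisson statistics: the variance of a difference is the sum of variances.
    sigma = math.sqrt(gross_counts + background_counts)
    return net, sigma

def currie_limits(background_counts, k=1.645):
    lc = k * math.sqrt(2.0 * background_counts)   # critical (decision) level
    ld = k * k + 2.0 * lc                         # detection limit
    return lc, ld

gross, bkg = 1250.0, 1100.0
net, sigma = net_area(gross, bkg)
lc, ld = currie_limits(bkg)
print(f"net = {net:.0f} +/- {sigma:.0f} counts, Lc = {lc:.0f}, Ld = {ld:.0f}")
print("detected" if net > lc else "not detected")
```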

  10. Maximum neutron flux in thermal reactors

    International Nuclear Information System (INIS)

    Strugar, P.V.

    1968-12-01

    The direct approach to the problem is to calculate the spatial distribution of fuel concentration in the reactor core using the condition of maximum neutron flux while complying with thermal limitations. This paper shows that the problem can be solved by applying the calculus of variations, i.e. by using the maximum principle of Pontryagin. The mathematical model of the reactor core is based on two-group neutron diffusion theory with some simplifications that make it suitable for treatment by the maximum principle. The solution for the optimum distribution of fuel concentration in the reactor core is obtained in explicit analytical form. The reactor critical dimensions are roots of a system of nonlinear equations, and verification of the optimum conditions can be done only for specific examples

  11. Maximum allowable load on wheeled mobile manipulators

    International Nuclear Information System (INIS)

    Habibnejad Korayem, M.; Ghariblu, H.

    2003-01-01

    This paper develops a computational technique for finding the maximum allowable load of a mobile manipulator during a given trajectory. The maximum allowable loads that can be achieved by a mobile manipulator during a given trajectory are limited by a number of factors; the dynamic properties of the mobile base and the mounted manipulator, their actuator limitations, and the additional constraints applied to resolve the redundancy are probably the most important ones. To resolve the extra degrees of freedom introduced by the base mobility, additional constraint functions are proposed directly in the task space of the mobile manipulator. Finally, in two numerical examples involving a two-link planar manipulator mounted on a differentially driven mobile base, application of the method to determining the maximum allowable load is verified. The simulation results demonstrate that the maximum allowable load on a desired trajectory does not have a unique value and depends directly on the additional constraint functions applied to resolve the motion redundancy

  12. Maximum phytoplankton concentrations in the sea

    DEFF Research Database (Denmark)

    Jackson, G.A.; Kiørboe, Thomas

    2008-01-01

    A simplification of plankton dynamics using coagulation theory provides predictions of the maximum algal concentration sustainable in aquatic systems. These predictions have previously been tested successfully against results from iron fertilization experiments. We extend the test to data collect...

  13. Aluminum as a source of background in low background experiments

    Energy Technology Data Exchange (ETDEWEB)

    Majorovits, B., E-mail: bela@mppmu.mpg.de [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany); Abt, I. [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany); Laubenstein, M. [Laboratori Nazionali del Gran Sasso, INFN, S.S.17/bis, km 18 plus 910, I-67100 Assergi (Italy); Volynets, O. [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2011-08-11

    Neutrinoless double beta decay would be a key to understanding the nature of neutrino masses. The next generation of High Purity Germanium experiments will have to be operated with a background rate of better than 10⁻⁵ counts/(kg y keV) in the region of interest around the Q-value of the decay. Therefore, so far irrelevant sources of background have to be considered. The metalization of the surface of germanium detectors is in general done with aluminum. The background from the decays of ²²Na, ²⁶Al, ²²⁶Ra and ²²⁸Th introduced by this metalization is discussed. It is shown that only a special selection of aluminum can keep these background contributions acceptable.

  14. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depends only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures, complexity of which would depend on N.

  15. Patients' Preoperative Expectation and Outcome of Cataract Surgery ...

    African Journals Online (AJOL)

    BACKGROUND: Patient's satisfaction for a given treatment is an important clinical outcome because a satisfied patient is more likely to comply with treatments, attend follow-ups and advocate the service to others. Therefore, knowing patients' expectations before a planned procedure or treatment and the actual level of ...

  16. Social Capital and the Educational Expectations of Young People

    Science.gov (United States)

    Behtoui, Alireza

    2017-01-01

    The aim of this study is to explore the determinants of the educational expectations of young people in disadvantaged urban areas in three large cities in Sweden. In addition to the conventional predictors such as parental resources (economic and cultural capital) and demographic characteristics (such as age, gender, immigration background), this…

  17. Effects of physical activity on life expectancy with cardiovascular disease

    NARCIS (Netherlands)

    O.H. Franco (Oscar); C.E.D. de Laet (Chris); A. Peeters (Andrea); J. Jonker (Joost); J.P. Mackenbach (Johan); W.J. Nusselder (Wilma)

    2005-01-01

    textabstractBackground: Physical inactivity is a modifiable risk factor for cardiovascular disease. However, little is known about the effects of physical activity on life expectancy with and without cardiovascular disease. Our objective was to calculate the consequences of different physical

  18. JEM-X background models

    DEFF Research Database (Denmark)

    Huovelin, J.; Maisala, S.; Schultz, J.

    2003-01-01

    Background and determination of its components for the JEM-X X-ray telescope on INTEGRAL are discussed. A part of the first background observations by JEM-X are analysed and results are compared to predictions. The observations are based on extensive imaging of background near the Crab Nebula...... on revolution 41 of INTEGRAL. Total observing time used for the analysis was 216 502 s, with the average of 25 cps of background for each of the two JEM-X telescopes. JEM-X1 showed slightly higher average background intensity than JEM-X2. The detectors were stable during the long exposures, and weak orbital...... background was enhanced in the central area of a detector, and it decreased radially towards the edge, with a clear vignetting effect for both JEM-X units. The instrument background was weakest in the central area of a detector and showed a steep increase at the very edges of both JEM-X detectors...

  19. On heterotic vacua with fermionic expectation values

    Energy Technology Data Exchange (ETDEWEB)

    Minasian, Ruben [Institut de Physique Theorique, Universite Paris Saclay, CEA, CNRS, Gif-sur-Yvette (France); Petrini, Michela [Sorbonne Universites, CNRS, LPTHE, UPMC Paris 06, UMR 7589, Paris (France); Svanes, Eirik Eik [Sorbonne Universites, CNRS, LPTHE, UPMC Paris 06, UMR 7589, Paris (France); Sorbonne Universites, Institut Lagrange de Paris, Paris (France)

    2017-03-15

    We study heterotic backgrounds with non-trivial H-flux and non-vanishing expectation values of fermionic bilinears, often referred to as gaugino condensates. The gaugini appear in the low energy action via the gauge-invariant three-form bilinear Σ_MNP = tr(χ̄ Γ_MNP χ). For Calabi-Yau compactifications to four dimensions, the gaugino condensate corresponds to an internal three-form Σ_mnp that must be a singlet of the holonomy group. This condition does not hold anymore when an internal H-flux is turned on and O(α') effects are included. In this paper we study flux compactifications to three and four dimensions on G-structure manifolds. We derive the generic conditions for supersymmetric solutions. We use integrability conditions and Lichnerowicz-type arguments to derive a set of constraints whose solution, together with supersymmetry, is sufficient for finding backgrounds with gaugino condensate. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  20. ECONOMIC REASONING MAXIMUM SLOPE IN DESIGN HIGH-SPEED LINES

    Directory of Open Access Journals (Sweden)

    CHERNYSHOVA O. S.

    2016-04-01

    Full Text Available Problem statement. Worldwide design standards for high-speed lines differ somewhat. This is due to several reasons: different design speeds, differences in rolling stock characteristics and, in particular, the features of the design plan and longitudinal profile, which are associated primarily with the terrain conditions. The design of high-speed railways in Ukraine should take these features into account and determine what maximum slope values can be used in difficult conditions, as well as how this will affect operational and capital costs. Purpose. To determine the optimal design parameters of the longitudinal profile. Conclusion. The results are based not only on technical but also on economic indicators, and allow the assessment of the necessary capital expenditures and the expected future cost of the railway. Analytical dependences are derived to predict the expected operating costs of the railway as a function of the maximum slope, its length and the total length of the section.

  1. Factors Influencing Expectations of Physical Activity for Adolescents Residing in Appalachia

    Science.gov (United States)

    Elkins, Rebecca L.; Nabors, Laura; King, Keith; Vidourek, Rebecca

    2015-01-01

    Background: Appalachian adolescents are at an increased risk for sedentary behavior; little research has addressed this concern. Purpose: This study examined adolescents' expectations for engaging in physical activity (PA), chiefly expectations for relaxation and fitness. Independent variables were self-efficacy expectations (SEEs) to overcome…

  2. Stochastic backgrounds of gravitational waves

    International Nuclear Information System (INIS)

    Maggiore, M.

    2001-01-01

    We review the motivations for the search for stochastic backgrounds of gravitational waves and we compare the experimental sensitivities that can be reached in the near future with the existing bounds and with the theoretical predictions. (author)

  3. Berkeley Low Background Counting Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Sensitive low background assay detectors and sample analysis are available for non-destructive direct gamma-ray assay of samples. Neutron activation analysis is also...

  4. Spectral characterization of natural backgrounds

    Science.gov (United States)

    Winkelmann, Max

    2017-10-01

    As the distribution and use of hyperspectral sensors is constantly increasing, the exploitation of spectral features is a threat to camouflaged objects. To improve camouflage materials, the spectral behavior of backgrounds first has to be known so that the spectral reflectance of camouflage materials can be adjusted and optimized. In an international effort, the NATO CSO working group SCI-295 "Development of Methods for Measurements and Evaluation of Natural Background EO Signatures" is developing a method for how this characterization of backgrounds should be done. It is obvious that the spectral characterization of a background will be quite an effort. To compare and exchange data internationally, the measurements will have to be done in a similar way. To test and further improve this method, an international field trial has been performed in Storkow, Germany. In the following we present first impressions and lessons learned from this field campaign and describe the data that have been measured.

  5. Probable Maximum Earthquake Magnitudes for the Cascadia Subduction

    Science.gov (United States)

    Rong, Y.; Jackson, D. D.; Magistrale, H.; Goldfinger, C.

    2013-12-01

    The concept of maximum earthquake magnitude (mx) is widely used in seismic hazard and risk analysis. However, absolute mx lacks a precise definition and cannot be determined from a finite earthquake history. The surprising magnitudes of the 2004 Sumatra and the 2011 Tohoku earthquakes showed that most methods for estimating mx underestimate the true maximum if it exists. Thus, we introduced the alternate concept of mp(T), probable maximum magnitude within a time interval T. The mp(T) can be solved using theoretical magnitude-frequency distributions such as Tapered Gutenberg-Richter (TGR) distribution. The two TGR parameters, β-value (which equals 2/3 b-value in the GR distribution) and corner magnitude (mc), can be obtained by applying maximum likelihood method to earthquake catalogs with additional constraint from tectonic moment rate. Here, we integrate the paleoseismic data in the Cascadia subduction zone to estimate mp. The Cascadia subduction zone has been seismically quiescent since at least 1900. Fortunately, turbidite studies have unearthed a 10,000 year record of great earthquakes along the subduction zone. We thoroughly investigate the earthquake magnitude-frequency distribution of the region by combining instrumental and paleoseismic data, and using the tectonic moment rate information. To use the paleoseismic data, we first estimate event magnitudes, which we achieve by using the time interval between events, rupture extent of the events, and turbidite thickness. We estimate three sets of TGR parameters: for the first two sets, we consider a geographically large Cascadia region that includes the subduction zone, and the Explorer, Juan de Fuca, and Gorda plates; for the third set, we consider a narrow geographic region straddling the subduction zone. In the first set, the β-value is derived using the GCMT catalog. In the second and third sets, the β-value is derived using both the GCMT and paleoseismic data. Next, we calculate the corresponding mc
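
    A minimal sketch of the tapered Gutenberg-Richter (TGR) ingredients mentioned above is given below: the TGR survival function in seismic moment and a probable-maximum-magnitude search. The rate, β and corner magnitude values are purely illustrative, and the definition of mp(T) used here (the magnitude whose Poisson probability of at least one exceedance in T years falls to 50%) is only one possible convention, not necessarily the authors'.

```python
import math

# Sketch of the tapered Gutenberg-Richter (TGR) survival function and a
# probable-maximum-magnitude search.  Parameter values are illustrative, and
# the mp(T) definition used here (50% chance of at least one exceedance in T
# years under a Poisson model) is only one possible convention.

def moment_from_magnitude(m):
    # Scalar seismic moment in N*m (Hanks-Kanamori convention).
    return 10.0 ** (1.5 * m + 9.1)

def tgr_survival(m, m_threshold, beta, m_corner):
    # Fraction of events above the threshold magnitude that also exceed m.
    M, Mt, Mc = (moment_from_magnitude(x) for x in (m, m_threshold, m_corner))
    return (Mt / M) ** beta * math.exp((Mt - M) / Mc)

def probable_max_magnitude(rate_above_threshold, T_years,
                           m_threshold, beta, m_corner):
    # Smallest magnitude whose exceedance probability in T years drops to 50%.
    m = m_threshold
    while m < 10.5:
        lam = rate_above_threshold * T_years * tgr_survival(
            m, m_threshold, beta, m_corner)
        if 1.0 - math.exp(-lam) <= 0.5:
            return m
        m += 0.01
    return m

# Illustrative numbers only: 0.02 events/yr with m >= 7.5, beta = 0.65, mc = 8.8.
print(round(probable_max_magnitude(0.02, 500, 7.5, 0.65, 8.8), 2))
```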

  6. Regulatory Expectations for Safety Culture

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Su Jin; Oh, Jang Jin; Choi, Young Sung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    The oversight of licensees' safety culture has recently become an important issue attracting great public and political concern in Korea. Beginning with the intended violation of rules, a series of corruptions, document forgeries and disclosures of wrong-doing made the public think that the whole mindset of nuclear workers has been inadequate. Thus, they are demanding that safety culture be improved and that the regulatory body play more roles and take more responsibility for its improvement and oversight. This paper introduces, as an effort from the regulatory side, recent changes in the role of regulators in safety culture, regulatory expectations on the desired status of licensees' safety culture, the pilot inspection program for safety culture and research activity for the development of an oversight system. After the Fukushima accident in Japan in 2011, many critics searched for the cultural factors that caused the unacceptable negligence pervading Japanese nuclear society, and renewed emphasis has been placed on rebuilding safety culture by operators, regulators, and relevant institutions globally. Significant progress has been made in how to approach safety culture, leading to a new perspective different from the existing normative assessment method on both the operator and regulatory sides. Regulatory expectations and oversight of them are based on such a new holistic concept of human, organizational and cultural elements to maintain and strengthen the integrity of defense in depth and, consequently, nuclear safety.

  7. Expectation values in quantum gravity

    International Nuclear Information System (INIS)

    Jordan, R.D.

    1986-01-01

    The purpose of this dissertation is to develop new methods for calculating expectation values of field operators, in situations where particle creation is important. The goal is to apply these techniques to quantum gravity, to see if the initial singularity in the universe might be avoided in the quantum theory. Standard effective action theory is modified to produce effective field equations satisfied by the expectation value of the field in an in state, as opposed to the usual in-out amplitude. Diagrammatic rules are found for calculation of the new field equations, and are used to show that the equations are real and causal up to two loop order. The theory also provides a simple check of unitarity, which is carried out, again up to two loops. Just as the standard effective field equations can be derived by analytic continuation from a theory defined in Euclidean space, so can the modified equations be obtained from a modified contour rotation of the Euclidean theory. This result is used to prove a recent conjecture which yields a simple rule for finding the real, causal equations. The new formalism is applied to two gravitational systems. First, the stability of flat space time is studied by finding the equation satisfied by small perturbations of Minkowski space

  8. Regulatory Expectations for Safety Culture

    International Nuclear Information System (INIS)

    Jung, Su Jin; Oh, Jang Jin; Choi, Young Sung

    2014-01-01

    The oversight of licensees' safety culture has recently become an important issue attracting great public and political concern in Korea. Beginning with the intended violation of rules, a series of corruptions, document forgeries and disclosures of wrong-doing made the public think that the whole mindset of nuclear workers has been inadequate. Thus, they are demanding that safety culture be improved and that the regulatory body play more roles and take more responsibility for its improvement and oversight. This paper introduces, as an effort from the regulatory side, recent changes in the role of regulators in safety culture, regulatory expectations on the desired status of licensees' safety culture, the pilot inspection program for safety culture and research activity for the development of an oversight system. After the Fukushima accident in Japan in 2011, many critics searched for the cultural factors that caused the unacceptable negligence pervading Japanese nuclear society, and renewed emphasis has been placed on rebuilding safety culture by operators, regulators, and relevant institutions globally. Significant progress has been made in how to approach safety culture, leading to a new perspective different from the existing normative assessment method on both the operator and regulatory sides. Regulatory expectations and oversight of them are based on such a new holistic concept of human, organizational and cultural elements to maintain and strengthen the integrity of defense in depth and, consequently, nuclear safety

  9. Cosmic microwave background, where next?

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Ground-based, balloon-borne and space-based experiments will observe the Cosmic Microwave Background in greater detail to address open questions about the origin and the evolution of the Universe. In particular, detailed observations of the polarization pattern of the Cosmic Microwave Background radiation have the potential to directly probe physics at the GUT scale and illuminate aspects of the physics of the very early Universe.

  10. Applications of expectation maximization algorithm for coherent optical communication

    DEFF Research Database (Denmark)

    Carvalho, L.; Oliveira, J.; Zibar, Darko

    2014-01-01

    In this invited paper, we present powerful statistical signal processing methods used by the machine learning community and link them to current problems in optical communication. In particular, we will look into iterative maximum likelihood parameter estimation based on the expectation maximization...... algorithm and its application in coherent optical communication systems for linear and nonlinear impairment mitigation. Furthermore, the estimated parameters are used to build the probabilistic model of the system for synthetic impairment generation....
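
    To illustrate the expectation-maximization principle itself, the snippet below runs the textbook EM iteration for a two-component one-dimensional Gaussian mixture on synthetic data. It is not the authors' receiver or channel model; in the paper EM is applied to impairment estimation in coherent optical systems.

```python
import numpy as np

# Textbook EM iteration for a two-component 1-D Gaussian mixture, shown only
# to illustrate the expectation-maximization principle; the paper applies EM
# to its own receiver/channel model, not to this toy mixture.

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-1.0, 0.3, 500), rng.normal(1.0, 0.3, 500)])

w = np.array([0.5, 0.5])      # initial mixture weights
mu = np.array([-0.5, 0.5])    # initial means
var = np.array([1.0, 1.0])    # initial variances

for _ in range(50):
    # E-step: posterior responsibility of each component for each sample.
    pdf = np.exp(-(data[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and variances from responsibilities.
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    var = (resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", w.round(2), "means:", mu.round(2), "sigmas:", np.sqrt(var).round(2))
```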

  11. Maximum gravitational redshift of white dwarfs

    International Nuclear Information System (INIS)

    Shapiro, S.L.; Teukolsky, S.A.

    1976-01-01

    The stability of uniformly rotating, cold white dwarfs is examined in the framework of the Parametrized Post-Newtonian (PPN) formalism of Will and Nordtvedt. The maximum central density and gravitational redshift of a white dwarf are determined as functions of five of the nine PPN parameters (γ, β, ζ₂, ζ₃, and ζ₄), the total angular momentum J, and the composition of the star. General relativity predicts that the maximum redshift is 571 km s⁻¹ for nonrotating carbon and helium dwarfs, but is lower for stars composed of heavier nuclei. Uniform rotation can increase the maximum redshift to 647 km s⁻¹ for carbon stars (the neutronization limit) and to 893 km s⁻¹ for helium stars (the uniform rotation limit). The redshift distribution of a larger sample of white dwarfs may help determine the composition of their cores

  12. Reservation wages, expected wages and unemployment

    OpenAIRE

    Brown, S; Taylor, K

    2013-01-01

    We model unemployment duration, reservation and expected wages simultaneously for individuals not in work, where wage expectations are identified via an exogenous policy shock. The policy shock increased expected wages, which were found to be positively associated with reservation wages.

  13. Looking for Cosmic Neutrino Background

    Directory of Open Access Journals (Sweden)

    Chiaki eYanagisawa

    2014-06-01

    Full Text Available Since the discovery of neutrino oscillation in atmospheric neutrinos by the Super-Kamiokande experiment in 1998, the study of neutrinos has been one of the most exciting fields in high-energy physics. All the mixing angles have been measured. Quests for (1) measurements of the remaining parameters (the lightest neutrino mass, the CP-violating phase(s), and the sign of the mass splitting between the mass eigenstates m3 and m1), and (2) better measurements to determine whether the mixing angle theta23 is less than pi/4, are in progress in a well-controlled manner. Determining the nature of neutrinos, whether they are Dirac or Majorana particles, is also in progress with continuous improvement. On the other hand, although ideas for detecting the cosmic neutrino background have been discussed since the 1960s, there has not been a serious concerted effort to achieve this goal. One of the reasons is that it is extremely difficult to detect such low energy neutrinos from the Big Bang. While there has been a tremendous accumulation of information on the Cosmic Microwave Background since its discovery in 1965, there is no direct evidence for the Cosmic Neutrino Background. The importance of detecting the Cosmic Neutrino Background is that, although detailed studies of Big Bang Nucleosynthesis and the Cosmic Microwave Background give information about the early Universe at ~a few minutes old and ~300 000 years old, respectively, observation of the Cosmic Neutrino Background allows us to study the early Universe at ~1 sec old. This article reviews progress made in the past 50 years on detection methods for the Cosmic Neutrino Background.

  14. When expectation confounds iconic memory.

    Science.gov (United States)

    Bachmann, Talis; Aru, Jaan

    2016-10-01

    In response to the methodological criticism (Bachmann & Aru, 2015) of the interpretation of their earlier experimental results (Mack, Erol, & Clarke, 2015), Mack, Erol, Clarke, and Bert (2016) presented new results that they again interpret in favor of the stance that an attention-free phenomenal iconic store does not exist. Here we once more question their conclusions. When their subjects were unexpectedly asked to report the letters instead of the post-cued circles in the 101st trial, where letters were actually absent, they likely failed to see the empty display area because prior experience with letters in the preceding trials produced an expectancy-based illusory experience of letter-like objects. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. The Maximum Resource Bin Packing Problem

    DEFF Research Database (Denmark)

    Boyar, J.; Epstein, L.; Favrholdt, L.M.

    2006-01-01

    Usually, for bin packing problems, we try to minimize the number of bins used or in the case of the dual bin packing problem, maximize the number or total size of accepted items. This paper presents results for the opposite problems, where we would like to maximize the number of bins used...... algorithms, First-Fit-Increasing and First-Fit-Decreasing for the maximum resource variant of classical bin packing. For the on-line variant, we define maximum resource variants of classical and dual bin packing. For dual bin packing, no on-line algorithm is competitive. For classical bin packing, we find...
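
    The two heuristics named above can be illustrated with a plain First-Fit packer applied to items sorted in increasing or decreasing order, as in the sketch below. Item sizes are invented, and how the resulting bin counts are scored in the maximum resource variant follows the paper's own definitions, which this sketch does not reproduce.

```python
# First-Fit packing with a chosen item order, to illustrate the two heuristics
# named above (First-Fit-Increasing and First-Fit-Decreasing).  Item sizes are
# invented; how the resulting bin counts are scored in the maximum resource
# variant follows the paper's own definitions.

def first_fit(items, capacity=1.0):
    bins = []  # each bin is represented by its remaining free space
    for item in items:
        for i, free in enumerate(bins):
            if item <= free + 1e-12:
                bins[i] = free - item
                break
        else:
            bins.append(capacity - item)  # no open bin fits: open a new one
    return len(bins)

items = [0.6, 0.5, 0.4, 0.3, 0.2]
print("First-Fit-Increasing bins:", first_fit(sorted(items)))                # 3
print("First-Fit-Decreasing bins:", first_fit(sorted(items, reverse=True)))  # 2
```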

  16. Shower maximum detector for SDC calorimetry

    International Nuclear Information System (INIS)

    Ernwein, J.

    1994-01-01

    A prototype for the SDC end-cap (EM) calorimeter, complete with a pre-shower and a shower maximum detector, was tested in beams of electrons and π's at CERN by an SDC subsystem group. The prototype was manufactured from scintillator tiles and strips read out with 1 mm diameter wavelength-shifting fibers. The design and construction of the shower maximum detector are described, and results of laboratory tests on light yield and performance of the scintillator-fiber system are given. Preliminary results on energy and position measurements with the shower maximum detector in the test beam are shown. (authors). 4 refs., 5 figs

  17. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  18. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets

  19. Gamma background irradiation. Standards and reality

    International Nuclear Information System (INIS)

    Miloslavov, V.

    1998-01-01

    The systematic deviation of the results of measuring the absorbed dose rate in air from the natural gamma background radiation in Bulgaria is inadmissibly large and variable. This in turn increases the dispersion of the results, as well as their mean value relative to worldwide data, to an implausible level that is hardly attributable to the varied geographical relief of the country. Thus in practice local anthropogenic increases hardly lend themselves to detection and demonstration. In the Radiation Protection Standards (RPS-92) in effect in Bulgaria, and in other documents concerning the same radiation factors, the maximum allowable limits for the population as a whole are clearly specified on the basis of worldwide expertise. As a rule these limits are exceeded by the actually measured values, and for this reason the cited documents contain a clause stipulating that the limits do not refer to the natural radiation background, which may therefore be virtually ignored. Thus the basic risk factor for the population escapes control at levels commensurable with the officially established limits, up to and including a twofold increase. The maximum allowable limit becomes undefinable. Bearing in mind that, in compliance with the cited RPS-92, elimination of the technogenic ionizing radiation sources incorporated in the environment prior to 1992 is 'frozen', it is evident that exposure of the population to anthropogenic radiation becomes legally allowable in a much wider range than the one specified by world legislators. One may anticipate radiation-induced health effects for the population, directly or through anthropogenic radiation stress on biocenoses. A relatively large part of the population is susceptible to the effects of low radiation doses, and this contingent will presumably grow as a result of possible fluctuations. The causal relationship, which is difficult to establish, should be given due consideration in the analysis of the causes

  20. Gender difference in health expectancy trends in Greenland

    DEFF Research Database (Denmark)

    Mairey, Isabelle; Bjerregaard, Peter; Brønnum-Hansen, Henrik

    2014-01-01

    longstanding illness supports the theory of compression of morbidity, but as the trend direction differs according to which measure for health is used, a definite conclusion cannot be drawn. The different rate of development of partial life expectancy and expected lifetime in good health between men and women......Background: The population of Greenland comprises almost 31 000 Inuit Greenlanders aged 20-65. The purpose of this study was to estimate trends in expected life years between age 20 and 65 in good and poor health, and to compare changes between men and women since the mid-1990s. Methods: Partial...... life expectancy was calculated and combined with prevalence data on self-rated health, longstanding illness and musculoskeletal diseases derived from health surveys carried out in 1993-94, 1999-2001 and 2005-10. Trends for men and women were compared and changes were decomposed into contributions from...

  1. Study of forecasting maximum demand of electric power

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, B.C.; Hwang, Y.J. [Korea Energy Economics Institute, Euiwang (Korea, Republic of)

    1997-08-01

    As far as the past performance of power supply and demand in Korea is concerned, one of the striking phenomena is that there have been repeated periodic surpluses and shortages of power generation facilities. Precise estimation and prediction of power demand is the basic work in establishing a supply plan and carrying out the right policy, since facility investment in the power generation industry requires a tremendous amount of capital and a long construction period. Against this background, the purpose of this study is to develop a model for more precise inference and prediction of maximum demand. The non-parametric model considered in this study pays attention to meteorological factors such as temperature and humidity, which do not have a simple proportional relationship with maximum power demand but affect it through complicated nonlinear interactions. The non-parametric inference technique introduces the meteorological effects without imposing any a priori assumption on the interaction of temperature and humidity. According to the analysis results, the non-parametric model that introduces the number of tropical nights, which reflects the continuity of the meteorological effect, has better predictive power than the linear model. The non-parametric model that considers both the number of tropical nights and the number of cooling days at the same time is proposed for predicting maximum demand. 7 refs., 6 figs., 9 tabs.
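
    As an illustration of what a non-parametric demand model looks like, the sketch below fits a Nadaraya-Watson kernel regression of peak demand on a single meteorological feature using synthetic data. The study's actual estimator, data and covariates (for example the counts of tropical nights and cooling days) are as described above and are not reproduced here.

```python
import numpy as np

# Nadaraya-Watson kernel regression of peak demand on one meteorological
# feature, using synthetic data, to illustrate the idea of a non-parametric
# demand model.  The study's actual covariates and estimator may differ.

rng = np.random.default_rng(1)
tropical_nights = rng.uniform(0, 20, 200)                                # feature
peak_demand = 50 + 0.05 * tropical_nights ** 2 + rng.normal(0, 1, 200)   # GW

def kernel_regression(x_train, y_train, x_new, bandwidth=2.0):
    # Gaussian-kernel weighted average of the observed responses.
    w = np.exp(-0.5 * ((x_new[:, None] - x_train) / bandwidth) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

grid = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
print(kernel_regression(tropical_nights, peak_demand, grid).round(1))
```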

  2. Durability 2007. Injection grout investigations. Background description

    International Nuclear Information System (INIS)

    Orantie, K.; Kuosa, H.

    2008-12-01

    The aim of this project was to evaluate the durability risks of injection grouts. The investigations were done with respect to the application conditions, materials and service life requirements at the ONKALO underground research facility. The study encompassed injection grout mixtures made of ultrafine cement with and without silica fume. Some of the mixtures had a low pH and thus a high silica fume content. The project includes a background description of the durability literature, a laboratory testing programme, a detailed analysis of results and recommendations for selecting ideal grout mixtures. The background description was made for the experimental study of low-pH and reference rock injection grouts as regards pore and microstructure, strength, shrinkage/swelling and thus versatile durability properties. A summary of test methods is presented as well as examples, i.e. literature information or former test results, of the expected range of results from the tests. Background information about how the test results correlate with other material properties and mix designs is also presented. In addition, the report provides basic information on the pore structure of cement-based materials. The correlation between the pore structure of cement-based materials and permeability is also briefly discussed. The test methods included in the background description are compressive strength, measurement of bulk drying, autogenous and chemical shrinkage and swelling, hydraulic conductivity / permeability, capillary water uptake test, mercury intrusion porosimetry (MIP) and thin section analysis. Three main mixtures with water-binder ratios of 0.8, 1.0 and 1.4 and silica fume contents of 0, 15 and 40% were studied in the laboratory. In addition, two extra mixtures were studied to provide additional information about the effect of varying water-dry-material ratio and silica fume content on durability. The evaluation of water tightness based on water permeability coefficient and micro cracking was

  3. Neutron background estimates in GESA

    Directory of Open Access Journals (Sweden)

    Fernandes A.C.

    2014-01-01

    Full Text Available The SIMPLE project looks for nuclear recoil events generated by rare dark matter scattering interactions. Nuclear recoils are also produced by more prevalent cosmogenic neutron interactions. While the rock overburden shields against (μ,n) neutrons to below 10⁻⁸ cm⁻² s⁻¹, it itself contributes via radio-impurities. Additional shielding of these is similar, both suppressing and contributing neutrons. We report on the Monte Carlo (MCNP) estimation of the on-detector neutron backgrounds for the SIMPLE experiment located in the GESA facility of the Laboratoire Souterrain à Bas Bruit, and its use in defining additional shielding for measurements which have led to a reduction in the extrinsic neutron background to ∼5 × 10⁻³ evts/kgd. The calculated event rate induced by the neutron background is ∼0.3 evts/kgd, with a dominant contribution from the detector container.

  4. LOFT gamma densitometer background fluxes

    International Nuclear Information System (INIS)

    Grimesey, R.A.; McCracken, R.T.

    1978-01-01

    Background gamma-ray fluxes were calculated at the location of the γ densitometers without integral shielding at both the hot-leg and cold-leg primary piping locations. The principal sources for background radiation at the γ densitometers are ¹⁶N activity from the primary piping H₂O and γ radiation from reactor internal sources. The background radiation was calculated by the point-kernel codes QAD-BSA and QAD-P5A. Reasonable assumptions were required to convert the response functions calculated by point-kernel procedures into the gamma-ray spectrum from reactor internal sources. A brief summary of point-kernel equations and theory is included
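
    For orientation, the snippet below evaluates the simplest point-kernel expression for the uncollided gamma flux from a point source behind a slab, phi = S exp(-mu t) / (4 pi r^2). The numbers are invented, and buildup factors and the detailed source and shield geometry handled by QAD-type codes are deliberately omitted.

```python
import math

# Simplest point-kernel expression for the uncollided gamma flux from a point
# source behind a slab shield: phi = S * exp(-mu * t) / (4 * pi * r^2).
# Source strength, attenuation coefficient and geometry are invented numbers,
# and buildup factors (which QAD-type codes include) are omitted.

def point_kernel_flux(source_gammas_per_s, distance_cm, mu_per_cm, thickness_cm):
    attenuation = math.exp(-mu_per_cm * thickness_cm)
    return source_gammas_per_s * attenuation / (4.0 * math.pi * distance_cm ** 2)

# Example: 1e10 gammas/s, detector 200 cm away, 5 cm slab with an assumed
# attenuation coefficient of 0.3 per cm.
print(f"{point_kernel_flux(1e10, 200.0, 0.3, 5.0):.3e} gammas/(cm^2 s)")
```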

  5. A definition of background independence

    International Nuclear Information System (INIS)

    Gryb, Sean

    2010-01-01

    We propose a definition for background (in)dependence in dynamical theories of the evolution of configurations that have a continuous symmetry and test this definition on particle models and on gravity. Our definition draws from Barbour's best matching framework developed for the purpose of implementing spatial and temporal relationalism. Among other interesting theories, general relativity can be derived within this framework in novel ways. We study the detailed canonical structure of a wide range of best matching theories and show that their actions must have a local gauge symmetry. When gauge theory is derived in this way, we obtain at the same time a conceptual framework for distinguishing between background-dependent and -independent theories. Gauge invariant observables satisfying Kuchar's criterion are identified and, in simple cases, explicitly computed. We propose a procedure for inserting a global background time into temporally relational theories. Interestingly, using this procedure in general relativity leads to unimodular gravity.

  6. Nonsymmetric entropy and maximum nonsymmetric entropy principle

    International Nuclear Information System (INIS)

    Liu Chengshi

    2009-01-01

    Within the framework of a statistical model, the concept of nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is defined. The maximum nonsymmetric entropy principle is proved. Some important distribution laws, such as the power law, can be derived from this principle naturally. In particular, nonsymmetric entropy is more convenient than other entropies, such as Tsallis entropy, in deriving power laws.

  7. Maximum speed of dewetting on a fiber

    NARCIS (Netherlands)

    Chan, Tak Shing; Gueudre, Thomas; Snoeijer, Jacobus Hendrikus

    2011-01-01

    A solid object can be coated by a nonwetting liquid since a receding contact line cannot exceed a critical speed. We theoretically investigate this forced wetting transition for axisymmetric menisci on fibers of varying radii. First, we use a matched asymptotic expansion and derive the maximum speed

  8. Maximum potential preventive effect of hip protectors

    NARCIS (Netherlands)

    van Schoor, N.M.; Smit, J.H.; Bouter, L.M.; Veenings, B.; Asma, G.B.; Lips, P.T.A.M.

    2007-01-01

    OBJECTIVES: To estimate the maximum potential preventive effect of hip protectors in older persons living in the community or homes for the elderly. DESIGN: Observational cohort study. SETTING: Emergency departments in the Netherlands. PARTICIPANTS: Hip fracture patients aged 70 and older who

  9. Maximum gain of Yagi-Uda arrays

    DEFF Research Database (Denmark)

    Bojsen, J.H.; Schjær-Jacobsen, Hans; Nilsson, E.

    1971-01-01

    Numerical optimisation techniques have been used to find the maximum gain of some specific parasitic arrays. The gain of an array of infinitely thin, equispaced dipoles loaded with arbitrary reactances has been optimised. The results show that standard travelling-wave design methods are not optimum....... Yagi–Uda arrays with equal and unequal spacing have also been optimised with experimental verification....

  10. correlation between maximum dry density and cohesion

    African Journals Online (AJOL)

    HOD

    represents maximum dry density, signifies plastic limit and is liquid limit. Researchers [6, 7] estimate compaction parameters. Aside from the correlation existing between compaction parameters and other physical quantities there are some other correlations that have been investigated by other researchers. The well-known.

  11. The maximum-entropy method in superspace

    Czech Academy of Sciences Publication Activity Database

    van Smaalen, S.; Palatinus, Lukáš; Schneider, M.

    2003-01-01

    Roč. 59, - (2003), s. 459-469 ISSN 0108-7673 Grant - others:DFG(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords : maximum-entropy method, * aperiodic crystals * electron density Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.558, year: 2003

  12. Achieving maximum sustainable yield in mixed fisheries

    NARCIS (Netherlands)

    Ulrich, Clara; Vermard, Youen; Dolder, Paul J.; Brunel, Thomas; Jardim, Ernesto; Holmes, Steven J.; Kempf, Alexander; Mortensen, Lars O.; Poos, Jan Jaap; Rindorf, Anna

    2017-01-01

    Achieving single species maximum sustainable yield (MSY) in complex and dynamic fisheries targeting multiple species (mixed fisheries) is challenging because achieving the objective for one species may mean missing the objective for another. The North Sea mixed fisheries are a representative example

  13. 5 CFR 534.203 - Maximum stipends.

    Science.gov (United States)

    2010-01-01

    ... maximum stipend established under this section. (e) A trainee at a non-Federal hospital, clinic, or medical or dental laboratory who is assigned to a Federal hospital, clinic, or medical or dental... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY UNDER OTHER SYSTEMS Student...

  14. Minimal length, Friedmann equations and maximum density

    Energy Technology Data Exchange (ETDEWEB)

    Awad, Adel [Center for Theoretical Physics, British University of Egypt,Sherouk City 11837, P.O. Box 43 (Egypt); Department of Physics, Faculty of Science, Ain Shams University,Cairo, 11566 (Egypt); Ali, Ahmed Farag [Centre for Fundamental Physics, Zewail City of Science and Technology,Sheikh Zayed, 12588, Giza (Egypt); Department of Physics, Faculty of Science, Benha University,Benha, 13518 (Egypt)

    2014-06-16

    Inspired by Jacobson’s thermodynamic approach, Cai et al. have shown the emergence of Friedmann equations from the first law of thermodynamics. We extend the Akbar-Cai derivation http://dx.doi.org/10.1103/PhysRevD.75.084003 of the Friedmann equations to accommodate a general entropy-area law. Studying the resulting Friedmann equations using a specific entropy-area law, which is motivated by the generalized uncertainty principle (GUP), reveals the existence of a maximum energy density close to the Planck density. Allowing for a general continuous pressure p(ρ,a) leads to bounded curvature invariants and a general nonsingular evolution. In this case, the maximum energy density is reached in a finite time and there is no cosmological evolution beyond this point, which leaves the big bang singularity inaccessible from a spacetime perspective. The existence of a maximum energy density and a general nonsingular evolution is independent of the equation of state and the spatial curvature k. As an example we study the evolution of the equation of state p=ωρ through its phase-space diagram to show the existence of a maximum energy which is reachable in a finite time.

  15. On the spatial behavior of background plasma in different background pressure in CPS device

    International Nuclear Information System (INIS)

    Samantaray, Subrata; Paikaray, Rita; Sahoo, Gourishankar; Das, Parthasarathi; Ghosh, Joydeep; Sanyasi, Amulya Kumar

    2015-01-01

    Blob formation and transport is a major concern for investigators as it greatly reduces the efficiency of the devices. Initial results from the CPS device confirm the role of fast neutrals inside the bulk plasma in the process of blob formation and transport. 2-D simulations of curvature and velocity shear instabilities in plasma structures suggest that in the presence of background plasma, the secondary instability does not grow non-linearly to a high level and the flow is stabilized. The adiabaticity effect also creates a radial barrier for interchange modes. In the absence of background plasma the blob fragments even at modest levels of viscosity. Fast neutrals outside the bulk plasma are expected to stabilize the system. The background plasma setup is aimed at creating fast neutrals outside the main plasma column; hence, the background plasma setup has been implemented in the CPS device. The spatial behavior of the plasma column between the electrodes is different for different base pressures in the CPS device. The spatial variation of the electron temperature of the plasma column between the electrodes is presented in this communication. The electron temperature is measured from emission spectroscopy data. The maximum electron temperature (line averaged) is ∼ 1.5 eV. (author)

  16. Generative electronic background music system

    Energy Technology Data Exchange (ETDEWEB)

    Mazurowski, Lukasz [Faculty of Computer Science, West Pomeranian University of Technology in Szczecin, Zolnierska Street 49, Szczecin, PL (Poland)

    2015-03-10

    In this short paper (extended abstract) a new approach to the generation of electronic background music is presented. The Generative Electronic Background Music System (GEBMS) is located between other related approaches within the musical algorithm positioning framework proposed by Woller et al. The music composition process is performed by a number of mini-models parameterized by the properties described further on. The mini-models generate fragments of musical patterns used in the output composition. Musical pattern and output generation are controlled by a container for the mini-models, a host-model. The general mechanism is presented, including an example of the synthesized output compositions.

  17. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^(4n) in SO(n) supergravity

  18. Background music and cognitive performance.

    Science.gov (United States)

    Angel, Leslie A; Polzella, Donald J; Elvers, Greg C

    2010-06-01

    The present experiment employed standardized test batteries to assess the effects of fast-tempo music on cognitive performance among 56 male and female university students. A linguistic processing task and a spatial processing task were selected from the Criterion Task Set developed to assess verbal and nonverbal performance. Ten excerpts from Mozart's music matched for tempo were selected. Background music increased the speed of spatial processing and the accuracy of linguistic processing. The findings suggest that background music can have predictable effects on cognitive performance.

  19. Children of ethnic minority backgrounds

    DEFF Research Database (Denmark)

    Johansen, Stine Liv

    2010-01-01

    media products and toys just as they will have knowledge of different media texts, play genres, rhymes etc. This has consequences for their ability to access social settings, for instance in play. New research in this field will focus on how children themselves make sense of this balancing of cultures......Children of ethnic minority background balance their everyday life between a cultural background rooted in their ethnic origin and a daily life in day care, schools and with peers that is founded in a majority culture. This means, among other things, that they often will have access to different...

  20. Generative electronic background music system

    International Nuclear Information System (INIS)

    Mazurowski, Lukasz

    2015-01-01

    In this short paper (extended abstract) a new approach to the generation of electronic background music is presented. The Generative Electronic Background Music System (GEBMS) is located between other related approaches within the musical algorithm positioning framework proposed by Woller et al. The music composition process is performed by a number of mini-models parameterized by the properties described further on. The mini-models generate fragments of musical patterns used in the output composition. Musical pattern and output generation are controlled by a container for the mini-models, a host-model. The general mechanism is presented, including an example of the synthesized output compositions

  1. Trends in Disability-Free Life Expectancy in Japan, 1995–2004

    OpenAIRE

    Hashimoto, Shuji; Kawado, Miyuki; Seko, Rumi; Murakami, Yoshitaka; Hayashi, Masayuki; Kato, Masahiro; Noda, Tatsuya; Ojima, Toshiyuki; Nagai, Masato; Tsuji, Ichiro

    2010-01-01

    Background In Japan, life expectancy at birth is currently the highest in the world. However, recent trends in disability-free life expectancy in Japan have not been examined. Methods We used data from Japanese national surveys for the period 1995–2004. These surveys included information on activity status measured by common self-reported instruments. The numbers of expected years with and without activity limitation were estimated by using the Sullivan method. Results The numbers of expected...
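
    The Sullivan method named in the Methods section combines life-table person-years with the prevalence of activity limitation; a toy version with invented numbers is sketched below (the study itself uses Japanese survey and life-table data).

```python
# Toy version of the Sullivan method: combine life-table person-years in each
# age group with the proportion free of activity limitation to obtain
# disability-free life expectancy.  All numbers are invented for illustration.

def sullivan_dfle(person_years, disability_prevalence, survivors_at_start):
    # person_years[i]          : life-table person-years lived in age group i (Lx)
    # disability_prevalence[i] : proportion with activity limitation in group i
    # survivors_at_start       : life-table survivors at the starting age (lx)
    disability_free_years = sum(
        L * (1.0 - p) for L, p in zip(person_years, disability_prevalence)
    )
    return disability_free_years / survivors_at_start

Lx = [900_000, 600_000, 250_000]   # person-years per 100,000 births, ages 65+
prev = [0.10, 0.25, 0.50]          # prevalence of activity limitation
l65 = 95_000                       # survivors at age 65 per 100,000 births
print(round(sullivan_dfle(Lx, prev, l65), 1), "disability-free years at age 65")
```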

  2. Educational expectation trajectories and attainment in the transition to adulthood.

    Science.gov (United States)

    Johnson, Monica Kirkpatrick; Reynolds, John R

    2013-05-01

    How consequential is family socioeconomic status for maintaining plans to get a bachelor's degree during the transition to adulthood? This article examines persistence and change in educational expectations, focusing on the extent to which family socioeconomic status shapes over-time trajectories of bachelor's degree expectations, how the influence involves the timing of family formation and full-time work vs. college attendance, and how persistence in expectations is consequential for getting a 4-year degree. The findings, based on the high school senior classes of 1987-1990, demonstrate that adolescents from higher socioeconomic status families are much more likely to hold onto their expectations to earn 4-year degrees, both in the early years after high school and, for those who do not earn degrees within that period, on through their 20s. These more persistent expectations in young adulthood, more so than adolescent expectations, help explain the greater success of young people from higher socioeconomic status backgrounds in earning a 4-year degree. Persistence of expectations to earn a bachelor's degree in the years after high school is shaped by stratified pathways of school, work, and family roles in the transition to adulthood. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Educational Expectations and Media Cultures

    Directory of Open Access Journals (Sweden)

    Petra Missomelius

    2014-11-01

    Full Text Available This article investigates the media-supported educational resources that are currently under discussion, such as OERs and MOOCs. Considering the discursive connection between these formats, which is couched in terms of educational freedom and openness, the article's thesis is that these are expectations which are placed on the media technologies themselves, and then transferred to learning scenarios. To this end, the article will pursue such questions as: What are the learners, learning materials and learning scenarios allegedly free from or free for? What obstructive configurations should be omitted? To what extent are these characteristics which are of a nature to guarantee learning processes in the context of lifelong learning, or can these characteristics better be attributed to the media technologies themselves and the ways in which they are used? What advantages or new accentuations are promised by proponents of the education supplied by media technology? Which discourses provide sustenance for such implied “post-typographic educational ideals” (Giesecke 2001 and Lemke 1998)? The importance to learners, teachers and decision-makers at educational institutions of being well informed as far as media is concerned is becoming increasingly apparent.

  4. CMS: Beyond all possible expectations

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    After having retraced the entire Standard Model up to the Top, the CMS collaboration is ready to go further and continue the success of what Guido Tonelli – its spokesperson – defines as a ‘magic year’. Things evolve fast at CMS, but scientists have taken up the challenge and are ready for the future.   ‘Enthusiasm’ is the word that best describes the feeling one gets when talking to Guido Tonelli. “In just a few months we have rediscovered the Standard Model and have gone even further by producing new results for cross-sections, placing new limits on the creation of heavy masses, making studies on the excited states of quarks, and seeking new resonances. We could not have expected so much in such a short space of time. It’s fantastic”, he says. “We went through the learning phase very smoothly. Our detector was very quickly ready to do real physics and we were able to start to produce results almost ...

  5. Direct maximum parsimony phylogeny reconstruction from genotype data

    Directory of Open Access Journals (Sweden)

    Ravi R

    2007-12-01

    Full Text Available Abstract Background Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data are more commonly available in the form of genotypes, which consist of conflated combinations of pairs of haplotypes from homologous chromosomes. Currently, there are no general algorithms for the direct reconstruction of maximum parsimony phylogenies from genotype data. Hence, phylogenetic applications for autosomal data must rely on other methods for first computationally inferring haplotypes from genotypes. Results In this work, we develop the first practical method for computing maximum parsimony phylogenies directly from genotype data. We show that the standard practice of first inferring haplotypes from genotypes and then reconstructing a phylogeny on the haplotypes often substantially overestimates phylogeny size. As an immediate application, our method can be used to determine the minimum number of mutations required to explain a given set of observed genotypes. Conclusion Phylogeny reconstruction directly from unphased data is computationally feasible for moderate-sized problem instances and can lead to substantially more accurate tree size inferences than the standard practice of treating phasing and phylogeny construction as two separate analysis stages. The difference between the approaches is particularly important for downstream applications that require a lower-bound on the number of mutations that the genetic region has undergone.

  6. [The psychosocial background of sterile patients].

    Science.gov (United States)

    Pusch, H H; Urdl, W; Walcher, W

    1989-01-01

    The psychosocial background of 300 childless couples from the Infertility Clinic of the Department of Gynecology and Obstetrics, University of Graz, was evaluated by means of a questionnaire and statistical analysis of data from their files. Points of special interest were problems such as interactions within the couple, motivations for the desire for children, psychosomatics, andrological investigation within the gynecological department, sexual habits, and motivation and compliance concerning investigations and treatment. 72% of the questionnaires were returned. 50% of the sterile couples preferred to attend the infertility clinic together. 26% felt restrictions in their sexual behaviour due to the unfulfilled desire for children, and 48% expected improvements in their partnership if they could have children. Compliance of male partners with the regular intake of prescribed medication was 83%, and 63% agreed to stop smoking in cases of pathospermia.

  7. Low Background Micromegas in CAST

    DEFF Research Database (Denmark)

    Garza, J G; Aune, S.; Aznar, F.

    2014-01-01

    Solar axions could be converted into x-rays inside the strong magnetic field of an axion helioscope, triggering the detection of this elusive particle. Low background x-ray detectors are an essential component for the sensitivity of these searches. We report on the latest developments of the Micr...

  8. Teaching about Natural Background Radiation

    Science.gov (United States)

    Al-Azmi, Darwish; Karunakara, N.; Mustapha, Amidu O.

    2013-01-01

    Ambient gamma dose rates in air were measured at different locations (indoors and outdoors) to demonstrate the ubiquitous nature of natural background radiation in the environment and to show that levels vary from one location to another, depending on the underlying geology. The effect of a lead shield on a gamma radiation field was also…

  9. Educational Choice. A Background Paper.

    Science.gov (United States)

    Quality Education for Minorities Network, Washington, DC.

    This paper addresses school choice, one proposal to address parental involvement concerns, focusing on historical background, definitions, rationale for advocating choice, implementation strategies, and implications for minorities and low-income families. In the past, transfer payment programs such as tuition tax credits and vouchers were…

  10. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is determined.

  11. Low Background Micromegas in CAST

    CERN Document Server

    Garza, J.G.; Aznar, F.; Calvet, D.; Castel, J.F.; Christensen, F.E.; Dafni, T.; Davenport, M.; Decker, T.; Ferrer-Ribas, E.; Galán, J.; García, J.A.; Giomataris, I.; Hill, R.M.; Iguaz, F.J.; Irastorza, I.G.; Jakobsen, A.C.; Jourde, D.; Mirallas, H.; Ortega, I.; Papaevangelou, T.; Pivovaroff, M.J.; Ruz, J.; Tomás, A.; Vafeiadis, T.; Vogel, J.K.

    2015-11-16

    Solar axions could be converted into x-rays inside the strong magnetic field of an axion helioscope, triggering the detection of this elusive particle. Low background x-ray detectors are an essential component for the sensitivity of these searches. We report on the latest developments of the Micromegas detectors for the CERN Axion Solar Telescope (CAST), including technological pathfinder activities for the future International Axion Observatory (IAXO). The use of low background techniques and the application of discrimination algorithms based on the high granularity of the readout have led to background levels below 10$^{-6}$ counts/keV/cm$^2$/s, more than a factor 100 lower than the first generation of Micromegas detectors. The best levels achieved at the Canfranc Underground Laboratory (LSC) are as low as 10$^{-7}$ counts/keV/cm$^2$/s, showing good prospects for the application of this technology in IAXO. The current background model, based on underground and surface measurements, is presented, as well as ...

  12. 302 Historical Background, Development and Standard of Public ...

    African Journals Online (AJOL)

    User

    Abstract. It has been observed that public libraries in Nigeria have not developed as expected. Instead of moving forward, they are still very backward in terms of development. This paper examines the historical background, development and standard of public library services in Nigeria. It looks at the roles and the sources ...

  13. Response Expectancy and the Placebo Effect.

    Science.gov (United States)

    Kirsch, Irving

    2018-01-01

    In this chapter, I review basic tenets of response expectancy theory (Kirsch, 1985), beginning with the important distinction between response expectancies and stimulus expectancies. Although both can affect experience, the effects of response expectancies are stronger and more resistant to extinction than those of stimulus expectancies. Further, response expectancies are especially important to understanding placebo effects. The response expectancy framework is consistent with and has been amplified by the Bayesian model of predictive coding. Clinical implications of these phenomena are exemplified. © 2018 Elsevier Inc. All rights reserved.

  14. Probing Inflation via Cosmic Microwave Background Polarimetry

    Science.gov (United States)

    Chuss, David T.

    2008-01-01

    The Cosmic Microwave Background (CMB) has been a rich source of information about the early Universe. Detailed measurements of its spectrum and spatial distribution have helped solidify the Standard Model of Cosmology. However, many questions still remain. Standard Cosmology does not explain why the early Universe is geometrically flat, expanding, homogeneous across the horizon, and riddled with a small anisotropy that provides the seed for structure formation. Inflation has been proposed as a mechanism that naturally solves these problems. In addition to solving these problems, inflation is expected to produce a spectrum of gravitational waves that will create a particular polarization pattern on the CMB. Detection of this polarized signal is a key test of inflation and will give a direct measurement of the energy scale at which inflation takes place. This polarized signature of inflation is expected to be roughly 9 orders of magnitude below the 2.7 K monopole level of the CMB. This measurement will require good control of systematic errors, an array of many detectors having the requisite sensitivity, a reliable method for removing polarized foregrounds, and nearly complete sky coverage. Ultimately, this measurement is likely to require a space mission. To this effect, technology and mission concept development are currently underway.

  15. Macroeconomic Expectations of Households and Professional Forecasters

    OpenAIRE

    Christopher D Carroll

    2002-01-01

    Economists have long emphasized the importance of expectations in determining macroeconomic outcomes. Yet there has been almost no recent effort to model actual empirical expectations data; instead, macroeconomists usually simply assume expectations are rational. This paper shows that while empirical household expectations are not rational in the usual sense, expectational dynamics are well captured by a model in which households' views derive from news reports of the views of professional foreca...

  16. Maximum concentrations at work and maximum biologically tolerable concentration for working materials 1991

    International Nuclear Information System (INIS)

    1991-01-01

    The meaning of the term 'maximum concentration at work' with regard to various pollutants is discussed. Specifically, a number of dusts and smokes are dealt with. The valuation criteria for maximum biologically tolerable concentrations for working materials are indicated. The working materials in question are carcinogenic substances or substances liable to cause allergies or mutate the genome. (VT) [de

  17. 75 FR 43840 - Inflation Adjustment of the Ordinary Maximum and Aggravated Maximum Civil Monetary Penalties for...

    Science.gov (United States)

    2010-07-27

    ...-17530; Notice No. 2] RIN 2130-ZA03 Inflation Adjustment of the Ordinary Maximum and Aggravated Maximum... remains at $250. These adjustments are required by the Federal Civil Penalties Inflation Adjustment Act [email protected] . SUPPLEMENTARY INFORMATION: The Federal Civil Penalties Inflation Adjustment Act of 1990...

  18. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history

    OpenAIRE

    Cherry, Joshua L.

    2017-01-01

    Background Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Results Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data....

  19. Zipf's law, power laws and maximum entropy

    International Nuclear Information System (INIS)

    Visser, Matt

    2013-01-01

    Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified. (paper)
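
    A minimal numerical sketch of the idea summarized above, assuming a finite support k = 1..N and an illustrative target for the constrained mean of ln k (both are assumptions, not values from the paper): maximizing the Shannon entropy subject only to a fixed E[ln k] yields p_k ∝ k^(-λ), and the exponent λ is the Lagrange multiplier found by root-finding.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum entropy over k = 1..N with the single constraint E[ln k] = c
# gives p_k proportional to exp(-lam * ln k) = k**(-lam), i.e. a power law.

N = 10_000          # illustrative support cutoff (assumption)
c_target = 2.0      # illustrative target value of E[ln k] (assumption)
k = np.arange(1, N + 1)

def mean_log(lam):
    """E[ln k] under the power-law distribution p_k ~ k**(-lam)."""
    w = k.astype(float) ** (-lam)
    p = w / w.sum()
    return np.sum(p * np.log(k))

# Solve mean_log(lam) = c_target for the Lagrange multiplier lam.
lam = brentq(lambda x: mean_log(x) - c_target, 0.01, 10.0)
p = k.astype(float) ** (-lam)
p /= p.sum()

print(f"lambda = {lam:.3f}, E[ln k] = {np.sum(p * np.log(k)):.3f}")
```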

  20. Maximum-entropy description of animal movement.

    Science.gov (United States)

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.
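
    As a hedged illustration of one member of the model class mentioned above, the sketch below simulates a two-dimensional Ornstein-Uhlenbeck position process with an exact discrete-time update; the timescale, variance and step size are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Exact discrete-time update for a 2-D Ornstein-Uhlenbeck position process,
# one of the continuous-time movement models in the class discussed above.
tau = 5.0        # range-crossing timescale (illustrative assumption)
sigma2 = 1.0     # stationary positional variance (illustrative assumption)
dt, n_steps = 0.1, 5000

phi = np.exp(-dt / tau)                    # autocorrelation over one step
step_var = sigma2 * (1.0 - phi**2)         # variance of the innovation

x = np.zeros((n_steps, 2))
for t in range(1, n_steps):
    x[t] = phi * x[t - 1] + rng.normal(scale=np.sqrt(step_var), size=2)

# The empirical positional variance should approach sigma2 for long tracks.
print("empirical variance:", x.var(axis=0))
```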

  1. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  2. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...

  3. A Maximum Radius for Habitable Planets.

    Science.gov (United States)

    Alibert, Yann

    2015-09-01

    We compute the maximum radius a planet can have in order to fulfill two constraints that are likely necessary conditions for habitability: 1- surface temperature and pressure compatible with the existence of liquid water, and 2- no ice layer at the bottom of a putative global ocean, which would prevent the geologic carbon cycle from operating. We demonstrate that, above a given radius, these two constraints cannot be met: in the Super-Earth mass range (1-12 Mearth), the overall maximum radius that a planet can have varies between 1.8 and 2.3 Rearth. This radius is reduced when considering planets with higher Fe/Si ratios, and taking into account irradiation effects on the structure of the gas envelope.

  4. Maximum parsimony on subsets of taxa.

    Science.gov (United States)

    Fischer, Mareike; Thatte, Bhalchandra D

    2009-09-21

    In this paper we investigate mathematical questions concerning the reliability (reconstruction accuracy) of Fitch's maximum parsimony algorithm for reconstructing the ancestral state given a phylogenetic tree and a character. In particular, we consider the question whether the maximum parsimony method applied to a subset of taxa can reconstruct the ancestral state of the root more accurately than when applied to all taxa, and we give an example showing that this indeed is possible. A surprising feature of our example is that ignoring a taxon closer to the root improves the reliability of the method. On the other hand, in the case of the two-state symmetric substitution model, we answer affirmatively a conjecture of Li, Steel and Zhang which states that under a molecular clock the probability that the state at a single taxon is a correct guess of the ancestral state is a lower bound on the reconstruction accuracy of Fitch's method applied to all taxa.
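
    A minimal sketch of Fitch's bottom-up pass, the algorithm whose reconstruction accuracy is analysed above; the four-taxon tree and the character states are hypothetical examples, not data from the paper.

```python
# Fitch's bottom-up pass: at each internal node take the intersection of the
# children's state sets if it is non-empty, otherwise their union (costing
# one substitution). The root set contains the most-parsimonious root states.

def fitch(node, leaf_states):
    """node is a leaf name or a (left, right) tuple; returns (state_set, cost)."""
    if not isinstance(node, tuple):
        return {leaf_states[node]}, 0
    left_set, left_cost = fitch(node[0], leaf_states)
    right_set, right_cost = fitch(node[1], leaf_states)
    common = left_set & right_set
    if common:
        return common, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1

# Hypothetical 4-taxon tree ((A,B),(C,D)) with a two-state character.
tree = (("A", "B"), ("C", "D"))
states = {"A": 0, "B": 1, "C": 1, "D": 1}

root_set, cost = fitch(tree, states)
print("most-parsimonious root states:", root_set, "minimum changes:", cost)
```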

  5. Maximum entropy analysis of liquid diffraction data

    International Nuclear Information System (INIS)

    Root, J.H.; Egelstaff, P.A.; Nickel, B.G.

    1986-01-01

    A maximum entropy method for reducing truncation effects in the inverse Fourier transform of structure factor, S(q), to pair correlation function, g(r), is described. The advantages and limitations of the method are explored with the PY hard sphere structure factor as model input data. An example using real data on liquid chlorine, is then presented. It is seen that spurious structure is greatly reduced in comparison to traditional Fourier transform methods. (author)
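
    For contrast with the maximum entropy approach described above, the following sketch performs the conventional truncated Fourier inversion from S(q) to g(r) that suffers from the ripple artefacts the method is designed to suppress; the density and the liquid-like S(q) used here are crude placeholders, not the PY hard-sphere input of the paper.

```python
import numpy as np

# Conventional truncated Fourier inversion from structure factor S(q) to pair
# correlation function g(r); the finite q-range is what produces the spurious
# structure that the MEM approach above aims to reduce.
rho = 0.8                          # number density (illustrative assumption)
q = np.linspace(1e-3, 20.0, 4000)  # finite q-range causes truncation effects
r = np.linspace(0.05, 10.0, 500)

# Crude placeholder for a liquid-like structure factor (not the PY input used
# in the paper): a damped oscillation around 1.
S = 1.0 + 0.6 * np.sin(2 * np.pi * q / 7.0) * np.exp(-q / 8.0)

# g(r) = 1 + (1 / (2 pi^2 rho r)) * integral of q [S(q)-1] sin(qr) dq
integrand = q * (S - 1.0) * np.sin(np.outer(r, q))      # shape (len(r), len(q))
g = 1.0 + np.trapz(integrand, q, axis=1) / (2 * np.pi**2 * rho * r)

print("g(r) near r=1:", g[np.argmin(abs(r - 1.0))])
```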

  6. Radioactivity backgrounds in ZEPLIN-III

    Science.gov (United States)

    Araújo, H. M.; Akimov, D. Yu.; Barnes, E. J.; Belov, V. A.; Bewick, A.; Burenkov, A. A.; Chepel, V.; Currie, A.; Deviveiros, L.; Edwards, B.; Ghag, C.; Hollingsworth, A.; Horn, M.; Kalmus, G. E.; Kobyakin, A. S.; Kovalenko, A. G.; Lebedenko, V. N.; Lindote, A.; Lopes, M. I.; Lüscher, R.; Majewski, P.; Murphy, A. St. J.; Neves, F.; Paling, S. M.; Pinto da Cunha, J.; Preece, R.; Quenby, J. J.; Reichhart, L.; Scovell, P. R.; Silva, C.; Solovov, V. N.; Smith, N. J. T.; Smith, P. F.; Stekhanov, V. N.; Sumner, T. J.; Thorne, C.; Walker, R. J.

    2012-03-01

    We examine electron and nuclear recoil backgrounds from radioactivity in the ZEPLIN-III dark matter experiment at Boulby. The rate of low-energy electron recoils in the liquid xenon WIMP target is 0.75 ± 0.05 events/kg/day/keV, which represents a 20-fold improvement over the rate observed during the first science run. Energy and spatial distributions agree with those predicted by component-level Monte Carlo simulations propagating the effects of the radiological contamination measured for materials employed in the experiment. Neutron elastic scattering is predicted to yield 3.05 ± 0.5 nuclear recoils with energy 5-50 keV per year, which translates to an expectation of 0.4 events in a 1 yr dataset in anti-coincidence with the veto detector for realistic signal acceptance. Less obvious background sources are discussed, especially in the context of future experiments. These include contamination of scintillation pulses with Cherenkov light from Compton electrons and from β activity internal to photomultipliers, which can increase the size and lower the apparent time constant of the scintillation response. Another challenge is posed by multiple-scatter γ-rays with one or more vertices in regions that yield no ionisation. If the discrimination power achieved in the first run can be replicated, ZEPLIN-III should reach a sensitivity of ˜1 × 10-8pb · yr to the scalar WIMP-nucleon elastic cross-section, as originally conceived.

  7. Particle production in a gravitational wave background

    Science.gov (United States)

    Jones, Preston; McDougall, Patrick; Singleton, Douglas

    2017-03-01

    We study the possibility that massless particles, such as photons, are produced by a gravitational wave. That such a process should occur is implied by tree-level Feynman diagrams such as two gravitons turning into two photons, i.e., g +g →γ +γ . Here we calculate the rate at which a gravitational wave creates a massless scalar field. This is done by placing the scalar field in the background of a plane gravitational wave and calculating the 4-current of the scalar field. Even in the vacuum limit of the scalar field it has a nonzero vacuum expectation value (similar to what occurs in the Higgs mechanism) and a nonzero current. We associate this with the production of scalar field quanta by the gravitational field. This effect has potential consequences for the attenuation of gravitational waves since the massless field is being produced at the expense of the gravitational field. This is related to the time-dependent Schwinger effect, but with the electric field replaced by the gravitational wave background and the electron/positron field quanta replaced by massless scalar "photons." Since the produced scalar quanta are massless there is no exponential suppression, as occurs in the Schwinger effect due to the electron mass.

  8. Comparison of Speech Perception in Background Noise with Acceptance of Background Noise in Aided and Unaided Conditions.

    Science.gov (United States)

    Nabelek, Anna K.; Tampas, Joanna W.; Burchfield, Samuel B.

    2004-01-01

    Background noise is a significant factor influencing hearing-aid satisfaction and is a major reason for rejection of hearing aids. Attempts have been made by previous researchers to relate the use of hearing aids to speech perception in noise (SPIN), with an expectation of improved speech perception followed by an…

  9. A Maximum Resonant Set of Polyomino Graphs

    Directory of Open Access Journals (Sweden)

    Zhang Heping

    2016-05-01

    Full Text Available A polyomino graph P is a connected finite subgraph of the infinite plane grid such that each finite face is surrounded by a regular square of side length one and each edge belongs to at least one square. A dimer covering of P corresponds to a perfect matching. Different dimer coverings can interact via an alternating cycle (or square with respect to them. A set of disjoint squares of P is a resonant set if P has a perfect matching M so that each one of those squares is M-alternating. In this paper, we show that if K is a maximum resonant set of P, then P − K has a unique perfect matching. We further prove that the maximum forcing number of a polyomino graph is equal to the cardinality of a maximum resonant set. This confirms a conjecture of Xu et al. [26]. We also show that if K is a maximal alternating set of P, then P − K has a unique perfect matching.

  10. Automatic maximum entropy spectral reconstruction in NMR

    International Nuclear Information System (INIS)

    Mobli, Mehdi; Maciejewski, Mark W.; Gryk, Michael R.; Hoch, Jeffrey C.

    2007-01-01

    Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system

  11. maximum neutron flux at thermal nuclear reactors

    International Nuclear Information System (INIS)

    Strugar, P.

    1968-10-01

    Since actual research reactors are technically complicated and expensive facilities, it is important to achieve savings by appropriate reactor lattice configurations. There are a number of papers, and practical examples of reactors with a central reflector, dealing with spatial distributions of fuel elements which would result in higher neutron flux. A common disadvantage of all these solutions is that the choice of the best solution starts from anticipated spatial distributions of fuel elements. The weakness of these approaches is the lack of defined optimization criteria. The direct approach is defined as follows: determine the spatial distribution of fuel concentration starting from the condition of maximum neutron flux while fulfilling the thermal constraints. The problem of determining the maximum neutron flux thus becomes a variational problem which is beyond the possibilities of classical variational calculus. This variational problem has been successfully solved by applying the maximum principle of Pontrjagin. The optimum distribution of fuel concentration was obtained in explicit analytical form. Thus, the spatial distribution of the neutron flux and the critical dimensions of a quite complex reactor system are calculated in a relatively simple way. In addition to the fact that the results are innovative, this approach is interesting because of the optimization procedure itself [sr

  12. Parameter estimation via conditional expectation: a Bayesian inversion

    KAUST Repository

    Matthies, Hermann G.; Zander, Elmar; Rosić, Bojana V.; Litvinenko, Alexander

    2016-01-01

    When a mathematical or computational model is used to analyse some system, it is usual that some parameters resp. functions or fields in the model are not known, and hence uncertain. These parametric quantities are then identified by actual observations of the response of the real system. In a probabilistic setting, Bayes’s theory is the proper mathematical background for this identification process. The possibility of being able to compute a conditional expectation turns out to be crucial for this purpose. We show how this theoretical background can be used in an actual numerical procedure, and shortly discuss various numerical approximations.
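
    A minimal Monte Carlo sketch of the central point above, that Bayesian identification of an uncertain parameter amounts to computing a conditional expectation: prior samples are importance-weighted by the likelihood of the observation and averaged. The scalar forward model, Gaussian prior and noise level are assumptions for illustration and do not represent the functional-approximation machinery of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Identify an uncertain parameter theta from a noisy observation y = G(theta) + e
# by estimating the conditional expectation E[theta | y] with importance weights.
def G(theta):
    return theta**2           # hypothetical forward model (assumption)

theta_true = 1.3
noise_sd = 0.1
y_obs = G(theta_true) + rng.normal(scale=noise_sd)

# Prior samples and their (Gaussian) likelihood weights.
theta_prior = rng.normal(loc=1.0, scale=0.5, size=100_000)   # assumed prior
log_w = -0.5 * ((y_obs - G(theta_prior)) / noise_sd) ** 2
w = np.exp(log_w - log_w.max())
w /= w.sum()

posterior_mean = np.sum(w * theta_prior)     # conditional expectation E[theta | y]
print("E[theta | y] ≈", posterior_mean)
```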

  13. Parameter estimation via conditional expectation: a Bayesian inversion

    KAUST Repository

    Matthies, Hermann G.

    2016-08-11

    When a mathematical or computational model is used to analyse some system, it is usual that some parameters resp. functions or fields in the model are not known, and hence uncertain. These parametric quantities are then identified by actual observations of the response of the real system. In a probabilistic setting, Bayes’s theory is the proper mathematical background for this identification process. The possibility of being able to compute a conditional expectation turns out to be crucial for this purpose. We show how this theoretical background can be used in an actual numerical procedure, and shortly discuss various numerical approximations.

  14. Considerations on the establishment of maximum permissible exposure of man

    International Nuclear Information System (INIS)

    Jacobi, W.

    1974-01-01

    An attempt is made in this information lecture to give a quantitative analysis of the somatic radiation risk and to illustrate a concept for fixing dose limiting values. Of primary importance is the limiting value of the radiation exposure of the whole population. By consequential application of the risk concept, the following points are considered: 1) definition of the risk of late radiation damage (cancer, leukemia); 2) relationship between radiation dose and the radiation risk thus caused; 3) radiation risk and the current dose limiting values; 4) criteria for the maximum acceptable radiation risk; 5) the limiting value which can currently be expected. (HP/LH) [de

  15. Configuration of LWR fuel enrichment or burnup yielding maximum power

    International Nuclear Information System (INIS)

    Bartosek, V.; Zalesky, K.

    1976-01-01

    An analysis is given of the spatial distribution of fuel burnup and enrichment in a light-water lattice of given dimensions with slightly enriched uranium, at which the maximum output is achieved. It is based on the spatial solution of neutron flux using a one-group diffusion model in which linear dependence may be expected of the fission cross section and the material buckling parameter on the fuel burnup and enrichment. Two problem constraints are considered, i.e., the neutron flux value and the specific output value. For the former the optimum core configuration remains qualitatively unchanged for any reflector thickness, for the latter the cases of a reactor with and without reflector must be distinguished. (Z.M.)

  16. Evolving expectations from international organisations

    International Nuclear Information System (INIS)

    Ruiz Lopez, C.

    2008-01-01

    The author stated that implementation of the geological disposal concept requires a strategy that provides national decision makers with sufficient confidence in the level of long-term safety and protection ultimately achieved. The concept of protection against harm has a broader meaning than radiological protection in terms of risk and dose. It includes the protection of the environment and socio-economic interests of communities. She recognised that a number of countries have established regulatory criteria already, and others are now discussing what constitutes a proper regulatory test and suitable time frame for judging the safety of long-term disposal. Each regulatory programme seeks to define reasonable tests of repository performance, using protection criteria and safety approaches consistent with the culture, values and expectations of the citizens of the country concerned. This means that there are differences in how protection and safety are addressed in national approaches to regulation and in the bases used for that. However, as was recognised in the Cordoba Workshop, it would be important to reach a minimum level of consistency and be able to explain the differences. C. Ruiz-Lopez presented an overview of the development of international guidance from ICRP, IAEA and NEA from the Cordoba workshop up to now, and positions of independent National Advisory Bodies. The evolution of these guidelines over time demonstrates an evolving understanding of long-term implications, with the recognition that dose and risk constraints should not be seen as measures of detriment beyond a few hundred years, the emphasis on sound engineering practices, and the introduction of new concepts and approaches which take into account social and economic aspects (e.g. constrained optimisation, BAT, managerial principles). In its new recommendations, ICRP (draft 2006) recognises, in particular, that decision making processes may depend on other societal concerns and considers

  17. Background radioactivity in environmental materials

    International Nuclear Information System (INIS)

    Maul, P.R.; O'Hara, J.P.

    1989-01-01

    This paper presents the results of a literature search to identify information on concentrations of 'background' radioactivity in foodstuffs and other commonly available environmental materials. The review has concentrated on naturally occurring radioactivity in foods and on UK data, although results from other countries have also been considered where appropriate. The data are compared with established definitions of a 'radioactive' substance and radionuclides which do not appear to be adequately covered in the literature are noted. (author)

  18. Background paper on aquaculture research

    OpenAIRE

    Wenblad, Axel; Jokumsen, Alfred; Eskelinen, Unto; Torrissen, Ole

    2013-01-01

    The Board of MISTRA established in 2012 a Working Group (WG) on Aquaculture to provide the Board with background information for its upcoming decision on whether the foundation should invest in aquaculture research. The WG included Senior Advisor Axel Wenblad, Sweden (Chairman), Professor Ole Torrissen, Norway, Senior Advisory Scientist Unto Eskelinen, Finland and Senior Advisory Scientist Alfred Jokumsen, Denmark. The WG performed an investigation of the Swedish aquaculture sector including ...

  19. The isotropic radio background revisited

    Energy Technology Data Exchange (ETDEWEB)

    Fornengo, Nicolao; Regis, Marco [Dipartimento di Fisica Teorica, Università di Torino, via P. Giuria 1, I–10125 Torino (Italy); Lineros, Roberto A. [Instituto de Física Corpuscular – CSIC/U. Valencia, Parc Científic, calle Catedrático José Beltrán, 2, E-46980 Paterna (Spain); Taoso, Marco, E-mail: fornengo@to.infn.it, E-mail: rlineros@ific.uv.es, E-mail: regis@to.infn.it, E-mail: taoso@cea.fr [Institut de Physique Théorique, CEA/Saclay, F-91191 Gif-sur-Yvette Cédex (France)

    2014-04-01

    We present an extensive analysis on the determination of the isotropic radio background. We consider six different radio maps, ranging from 22 MHz to 2.3 GHz and covering a large fraction of the sky. The large scale emission is modeled as a linear combination of an isotropic component plus the Galactic synchrotron radiation and thermal bremsstrahlung. Point-like and extended sources are either masked or accounted for by means of a template. We find a robust estimate of the isotropic radio background, with limited scatter among different Galactic models. The level of the isotropic background lies significantly above the contribution obtained by integrating the number counts of observed extragalactic sources. Since the isotropic component dominates at high latitudes, thus making the profile of the total emission flat, a Galactic origin for such excess appears unlikely. We conclude that, unless a systematic offset is present in the maps, and provided that our current understanding of the Galactic synchrotron emission is reasonable, extragalactic sources well below the current experimental threshold seem to account for the majority of the brightness of the extragalactic radio sky.

  20. The isotropic radio background revisited

    International Nuclear Information System (INIS)

    Fornengo, Nicolao; Regis, Marco; Lineros, Roberto A.; Taoso, Marco

    2014-01-01

    We present an extensive analysis on the determination of the isotropic radio background. We consider six different radio maps, ranging from 22 MHz to 2.3 GHz and covering a large fraction of the sky. The large scale emission is modeled as a linear combination of an isotropic component plus the Galactic synchrotron radiation and thermal bremsstrahlung. Point-like and extended sources are either masked or accounted for by means of a template. We find a robust estimate of the isotropic radio background, with limited scatter among different Galactic models. The level of the isotropic background lies significantly above the contribution obtained by integrating the number counts of observed extragalactic sources. Since the isotropic component dominates at high latitudes, thus making the profile of the total emission flat, a Galactic origin for such excess appears unlikely. We conclude that, unless a systematic offset is present in the maps, and provided that our current understanding of the Galactic synchrotron emission is reasonable, extragalactic sources well below the current experimental threshold seem to account for the majority of the brightness of the extragalactic radio sky

  1. Maximum mass ratio of AM CVn-type binary systems and maximum white dwarf mass in ultra-compact X-ray binaries

    Directory of Open Access Journals (Sweden)

    Arbutina Bojan

    2011-01-01

    Full Text Available AM CVn-type stars and ultra-compact X-ray binaries are extremely interesting semi-detached close binary systems in which the Roche lobe filling component is a white dwarf transferring mass to another white dwarf, neutron star or a black hole. Earlier theoretical considerations show that there is a maximum mass ratio of AM CVn-type binary systems (qmax ≈ 2/3) below which the mass transfer is stable. In this paper we derive a slightly different value for qmax and, more interestingly, by applying the same procedure, we find the maximum expected white dwarf mass in ultra-compact X-ray binaries.
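
    A hedged sketch of the standard textbook argument behind a limit of this kind, assuming conservative mass transfer, the Roche-lobe approximation R_L ≈ 0.462 a (M2/M)^{1/3} and a fully degenerate donor with R ∝ M^{-1/3}; the paper's refined treatment yields a slightly different value.

```latex
% Sketch: maximum mass ratio for stable mass transfer from a degenerate donor.
% q = M_2/M_1 (donor over accretor); conservative transfer, J and M = M_1 + M_2 constant.
\begin{align}
  J \propto M_1 M_2 \sqrt{a/M} = \mathrm{const}
    &\;\Rightarrow\; \frac{d\ln a}{d\ln M_2} = 2(q-1), \\
  R_L \simeq 0.462\, a \left(\frac{M_2}{M}\right)^{1/3}
    &\;\Rightarrow\; \zeta_{\mathrm{RL}} \equiv \frac{d\ln R_L}{d\ln M_2} = 2q - \frac{5}{3}, \\
  R_2 \propto M_2^{-1/3}
    &\;\Rightarrow\; \zeta_{\mathrm{WD}} \equiv \frac{d\ln R_2}{d\ln M_2} = -\frac{1}{3}, \\
  \text{stability: } \zeta_{\mathrm{WD}} \ge \zeta_{\mathrm{RL}}
    &\;\Rightarrow\; q \le q_{\max} = \frac{2}{3}.
\end{align}
```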

  2. Multicultural Differences in Women's Expectations of Birth.

    Science.gov (United States)

    Moore, Marianne F

    2016-01-01

    This review surveyed qualitative and quantitative studies to explore the expectations around birth that are held by women from different cultures. These studies are grouped according to expectations of personal control; expectations of support from partner/others/family; expectations of care/behavior from providers such as nurses, doctors, and/or midwives; expectations about the health of the baby; and expectations about pain in childbirth. The findings are discussed, and the roles of Western medical culture, power, and privilege in providing care to these women are noted.

  3. Demonstration of Cosmic Microwave Background Delensing Using the Cosmic Infrared Background.

    Science.gov (United States)

    Larsen, Patricia; Challinor, Anthony; Sherwin, Blake D; Mak, Daisy

    2016-10-07

    Delensing is an increasingly important technique to reverse the gravitational lensing of the cosmic microwave background (CMB) and thus reveal primordial signals the lensing may obscure. We present a first demonstration of delensing on Planck temperature maps using the cosmic infrared background (CIB). Reversing the lensing deflections in Planck CMB temperature maps using a linear combination of the 545 and 857 GHz maps as a lensing tracer, we find that the lensing effects in the temperature power spectrum are reduced in a manner consistent with theoretical expectations. In particular, the characteristic sharpening of the acoustic peaks of the temperature power spectrum resulting from successful delensing is detected at a significance of 16σ, with an amplitude of A_{delens}=1.12±0.07 relative to the expected value of unity. This first demonstration on data of CIB delensing, and of delensing techniques in general, is significant because lensing removal will soon be essential for achieving high-precision constraints on inflationary B-mode polarization.

  4. The response of the southern Greenland ice sheet to the Holocene thermal maximum

    DEFF Research Database (Denmark)

    Larsen, Nicolaj Krog; Kjaer, Kurt H.; Lecavalier, Benoit

    2015-01-01

    contribution of 0.16 m sea-level equivalent from the entire Greenland ice sheet, with a centennial ice loss rate of as much as 100 Gt/yr for several millennia during the Holocene thermal maximum. Our results provide an estimate of the long-term rates of volume loss that can be expected in the future...

  5. IRT Item Parameter Recovery with Marginal Maximum Likelihood Estimation Using Loglinear Smoothing Models

    Science.gov (United States)

    Casabianca, Jodi M.; Lewis, Charles

    2015-01-01

    Loglinear smoothing (LLS) estimates the latent trait distribution while making fewer assumptions about its form and maintaining parsimony, thus leading to more precise item response theory (IRT) item parameter estimates than standard marginal maximum likelihood (MML). This article provides the expectation-maximization algorithm for MML estimation…

  6. Estimation of Cosmic Induced Contamination in Ultra-low Background Detector Materials

    Energy Technology Data Exchange (ETDEWEB)

    Aguayo Navarrete, Estanislao; Kouzes, Richard T.; Orrell, John L.; Berguson, Timothy J.; Greene, Austen T.

    2012-08-01

    Executive Summary This document presents the result of investigating a way to reliably determine cosmic induced backgrounds for ultra-low background materials. In particular, it focuses on those radioisotopes produced by the interactions with cosmic ray particles in the detector materials that act as a background for experiments looking for neutrinoless double beta decay. This investigation is motivated by the desire to determine background contributions from cosmic ray activation of the electroformed copper that is being used in the construction of the MAJORANA DEMONSTRATOR. The most important radioisotope produced in copper that contributes to the background budget is 60Co, which has the potential to deposit energy in the region of interest of this experiment. Cobalt-60 is produced via cosmic ray neutron collisions in the copper. This investigation aims to provide a method for determining whether or not the copper has been exposed to cosmic radiation beyond the threshold which the Majorana Project has established as the maximum exposure. This threshold is set by the Project as the expected contribution of this source of background to the overall background budget. One way to estimate cosmic ray neutron exposure of materials on the surface of the Earth is to relate it to the cosmic ray muon exposure. Muons are minimum-ionizing particles and the available technologies to detect muons are easier to implement than those to detect neutrons. We present the results of using a portable, ruggedized muon detector, the µ-Witness made by our research group, for determination of muon exposure of materials for the MAJORANA DEMONSTRATOR. From the muon flux measurement, this report presents a method to estimate equivalent sea-level exposure, and then infer the neutron exposure of the tracked material and thus the cosmogenic activation of the copper. This report combines measurements of the muon flux taken by the µ-Witness detector with Geant4 simulations in order to assure our

  7. Maximum entropy decomposition of quadrupole mass spectra

    International Nuclear Information System (INIS)

    Toussaint, U. von; Dose, V.; Golan, A.

    2004-01-01

    We present an information-theoretic method called generalized maximum entropy (GME) for decomposing mass spectra of gas mixtures from noisy measurements. In this GME approach to the noisy, underdetermined inverse problem, the joint entropies of concentration, cracking, and noise probabilities are maximized subject to the measured data. This provides a robust estimation for the unknown cracking patterns and the concentrations of the contributing molecules. The method is applied to mass spectroscopic data of hydrocarbons, and the estimates are compared with those received from a Bayesian approach. We show that the GME method is efficient and is computationally fast
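
    The sketch below is not the GME method itself but a simpler baseline for the same decomposition task: if the cracking patterns are taken as known, the concentrations can be recovered from a noisy mixture spectrum by non-negative least squares. The two-component cracking matrix and all numbers are invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

# Simpler baseline than the GME approach described above: with the cracking
# patterns assumed known, estimate concentrations from a noisy mixture
# spectrum by non-negative least squares.
rng = np.random.default_rng(2)

# Columns = cracking patterns of two hypothetical molecules over 5 m/z channels.
A = np.array([[0.60, 0.05],
              [0.25, 0.10],
              [0.10, 0.50],
              [0.05, 0.25],
              [0.00, 0.10]])
true_conc = np.array([0.7, 0.3])

spectrum = A @ true_conc + rng.normal(scale=0.01, size=5)   # noisy measurement
est_conc, residual = nnls(A, spectrum)
print("estimated concentrations:", est_conc)
```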

  8. Maximum power operation of interacting molecular motors

    DEFF Research Database (Denmark)

    Golubeva, Natalia; Imparato, Alberto

    2013-01-01

    We study the mechanical and thermodynamic properties of different traffic models for kinesin which are relevant in biological and experimental contexts. We find that motor-motor interactions play a fundamental role by enhancing the thermodynamic efficiency at maximum power of the motors, as compared to the non-interacting system, in a wide range of biologically compatible scenarios. We furthermore consider the case where the motor-motor interaction directly affects the internal chemical cycle and investigate the effect on the system dynamics and thermodynamics.

  9. Maximum entropy method in momentum density reconstruction

    International Nuclear Information System (INIS)

    Dobrzynski, L.; Holas, A.

    1997-01-01

    The Maximum Entropy Method (MEM) is applied to the reconstruction of the 3-dimensional electron momentum density distributions observed through the set of Compton profiles measured along various crystallographic directions. It is shown that the reconstruction of electron momentum density may be reliably carried out with the aid of a simple iterative algorithm suggested originally by Collins. A number of distributions have been simulated in order to check the performance of MEM. It is shown that MEM can be recommended as a model-free approach. (author). 13 refs, 1 fig

  10. On the maximum drawdown during speculative bubbles

    Science.gov (United States)

    Rotundo, Giulia; Navarra, Mauro

    2007-08-01

    A taxonomy of large financial crashes proposed in the literature locates the burst of speculative bubbles due to endogenous causes in the framework of extreme stock market crashes, defined as falls of market prices that are outliers with respect to the bulk of the drawdown price movement distribution. This paper goes deeper into the analysis, providing a further characterization of the rising part of such selected bubbles through the examination of drawdown and maximum drawdown movements of index prices. The analysis of drawdown duration is also performed and is the core of the risk measure estimated here.

  11. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics. This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence...
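
    A minimal sketch of the kind of estimator described above: for each candidate fundamental frequency, each channel is projected onto a harmonic sinusoidal basis and the explained energies are summed across channels. Equal noise variances are assumed here for simplicity, and the two-channel signal, frequency grid and parameters are invented, so this is an illustration of the idea rather than the paper's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

# Approximate multi-channel ML pitch estimation under white Gaussian noise:
# sum, over channels, the energy explained by a harmonic basis at each
# candidate f0 and pick the maximum. Parameters below are illustrative.
fs, n, n_harm = 8000, 400, 4
t = np.arange(n) / fs
f0_true = 220.0

# Two channels sharing f0 but with different amplitudes, phases and SNRs.
channels = []
for amp, phase, noise_sd in [(1.0, 0.3, 0.2), (0.5, 1.1, 0.4)]:
    x = sum(amp / h * np.cos(2 * np.pi * h * f0_true * t + h * phase)
            for h in range(1, n_harm + 1))
    channels.append(x + rng.normal(scale=noise_sd, size=n))

def harmonic_energy(x, f0):
    """Energy of x explained by a least-squares fit of n_harm harmonics of f0."""
    Z = np.column_stack([f(2 * np.pi * h * f0 * t)
                         for h in range(1, n_harm + 1) for f in (np.cos, np.sin)])
    coef, *_ = np.linalg.lstsq(Z, x, rcond=None)
    return np.sum((Z @ coef) ** 2)

f0_grid = np.arange(100.0, 400.0, 0.5)
score = [sum(harmonic_energy(x, f0) for x in channels) for f0 in f0_grid]
print("estimated f0:", f0_grid[int(np.argmax(score))])
```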

  12. Conductivity maximum in a charged colloidal suspension

    Energy Technology Data Exchange (ETDEWEB)

    Bastea, S

    2009-01-27

    Molecular dynamics simulations of a charged colloidal suspension in the salt-free regime show that the system exhibits an electrical conductivity maximum as a function of colloid charge. We attribute this behavior to two main competing effects: colloid effective charge saturation due to counterion 'condensation' and diffusion slowdown due to the relaxation effect. In agreement with previous observations, we also find that the effective transported charge is larger than the one determined by the Stern layer and suggest that it corresponds to the boundary fluid layer at the surface of the colloidal particles.

  13. Dynamical maximum entropy approach to flocking.

    Science.gov (United States)

    Cavagna, Andrea; Giardina, Irene; Ginelli, Francesco; Mora, Thierry; Piovani, Duccio; Tavarone, Raffaele; Walczak, Aleksandra M

    2014-04-01

    We derive a new method to infer from data the out-of-equilibrium alignment dynamics of collectively moving animal groups, by considering the maximum entropy model distribution consistent with temporal and spatial correlations of flight direction. When bird neighborhoods evolve rapidly, this dynamical inference correctly learns the parameters of the model, while a static one relying only on the spatial correlations fails. When neighbors change slowly and the detailed balance is satisfied, we recover the static procedure. We demonstrate the validity of the method on simulated data. The approach is applicable to other systems of active matter.

  14. Maximum Temperature Detection System for Integrated Circuits

    Science.gov (United States)

    Frankiewicz, Maciej; Kos, Andrzej

    2015-03-01

    The paper describes the structure and measurement results of a system detecting the present maximum temperature on the surface of an integrated circuit. The system consists of a set of proportional to absolute temperature sensors, a temperature processing path and a digital part designed in VHDL. Analogue parts of the circuit were designed with a full-custom technique. The system is part of a temperature-controlled oscillator circuit - a power management system based on the dynamic frequency scaling method. The oscillator cooperates with a microprocessor dedicated for thermal experiments. The whole system is implemented in UMC CMOS 0.18 μm (1.8 V) technology.

  15. Maximum entropy PDF projection: A review

    Science.gov (United States)

    Baggenstoss, Paul M.

    2017-06-01

    We review maximum entropy (MaxEnt) PDF projection, a method with wide potential applications in statistical inference. The method constructs a sampling distribution for a high-dimensional vector x based on knowing the sampling distribution p(z) of a lower-dimensional feature z = T (x). Under mild conditions, the distribution p(x) having highest possible entropy among all distributions consistent with p(z) may be readily found. Furthermore, the MaxEnt p(x) may be sampled, making the approach useful in Monte Carlo methods. We review the theorem and present a case study in model order selection and classification for handwritten character recognition.

  16. Maximum a posteriori decoder for digital communications

    Science.gov (United States)

    Altes, Richard A. (Inventor)

    1997-01-01

    A system and method for decoding by identification of the most likely phase coded signal corresponding to received data. The present invention has particular application to communication with signals that experience spurious random phase perturbations. The generalized estimator-correlator uses a maximum a posteriori (MAP) estimator to generate phase estimates for correlation with incoming data samples and for correlation with mean phases indicative of unique hypothesized signals. The result is a MAP likelihood statistic for each hypothesized transmission, wherein the highest value statistic identifies the transmitted signal.

  17. Improved Maximum Parsimony Models for Phylogenetic Networks.

    Science.gov (United States)

    Van Iersel, Leo; Jones, Mark; Scornavacca, Celine

    2018-05-01

    Phylogenetic networks are well suited to represent evolutionary histories comprising reticulate evolution. Several methods aiming at reconstructing explicit phylogenetic networks have been developed in the last two decades. In this article, we propose a new definition of maximum parsimony for phylogenetic networks that permits to model biological scenarios that cannot be modeled by the definitions currently present in the literature (namely, the "hardwired" and "softwired" parsimony). Building on this new definition, we provide several algorithmic results that lay the foundations for new parsimony-based methods for phylogenetic network reconstruction.

  18. Ancestral sequence reconstruction with Maximum Parsimony

    OpenAIRE

    Herbst, Lina; Fischer, Mareike

    2017-01-01

    One of the main aims in phylogenetics is the estimation of ancestral sequences based on present-day data like, for instance, DNA alignments. One way to estimate the data of the last common ancestor of a given set of species is to first reconstruct a phylogenetic tree with some tree inference method and then to use some method of ancestral state inference based on that tree. One of the best-known methods both for tree inference as well as for ancestral sequence inference is Maximum Parsimony (...

  19. Mortality and life expectancy in persons with severe unipolar depression

    DEFF Research Database (Denmark)

    Laursen, Thomas Munk; Musliner, Katherine L; Benros, Michael E

    2016-01-01

    BACKGROUND: Depression is a common psychiatric disorder, with a lifetime prevalence of 10-15% in the Danish population. Although depression is associated with excess mortality, it is not yet understood how this affects life expectancy. Our aim was to examine mortality rates and life expectancy in patients with unipolar depression compared to the general population, and to assess the impact of comorbid somatic illness and substance abuse. METHODS: We followed a Danish population-based cohort from 1995-2013 (N=5,103,699). The cohort included all residents in Denmark during the study period. Mortality rate ratios (MRRs) and life expectancy in persons with unipolar depression were calculated using survival analysis techniques. RESULTS: The overall MRR was 2.07 (95% Confidence Interval (CI): 2.05-2.09) in people with a previous unipolar depression diagnosis compared to the general Danish population

  20. Gender and ethnic health disparities among the elderly in rural Guangxi, China: estimating quality-adjusted life expectancy

    Directory of Open Access Journals (Sweden)

    Tai Zhang

    2016-11-01

    Full Text Available Background: Ethnic health inequalities for males and females among the elderly have not yet been verified in multicultural societies in developing countries. The aim of this study was to assess the extent of disparities in health expectancy among the elderly from different ethnic groups using quality-adjusted life expectancy. Design: A cross-sectional community-based survey was conducted. A total of 6,511 rural elderly individuals aged ≥60 years were selected from eight different ethnic groups in the Guangxi Zhuang Autonomous Region of China and assessed for health-related quality of life (HRQoL). The HRQoL utility value was combined with life expectancy at age 60 years (LE60) data by using Sullivan's method to estimate quality-adjusted life expectancy at age 60 years (QALE60) and loss in quality-adjusted life years (QALYs) for each group. Results: Overall, LE60 and QALE60 for all ethnic groups were 20.9 and 18.0 years in men, respectively, and 24.2 and 20.3 years in women. The maximum gap in QALE60 between ethnic groups was 3.3 years in males and 4.6 years in females. The average loss in QALY was 2.9 years for men and 3.8 years for women. The correlation coefficient between LE60 and QALY lost was −0.53 in males and 0.12 in females. Conclusion: Women live longer than men, but they suffer more; men have a shorter life expectancy, but those who live longer are healthier. Attempts should be made to reduce suffering in the female elderly and improve longevity for men. Certain ethnic groups had low levels of QALE, needing special attention to improve their lifestyle and access to health care.
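
    A minimal sketch of Sullivan's method as used above: person-years lived in each age interval from an abridged life table are weighted by the mean HRQoL utility for that interval and cumulated above age 60. The life-table values and utility weights below are invented for illustration and are not the Guangxi survey data.

```python
# Sullivan's method sketch: QALE at age 60 = sum over age intervals of
# (person-years lived L_x) * (mean HRQoL utility in that interval) / l_60.
# The abridged life table and utility weights below are invented examples.

age_groups = ["60-64", "65-69", "70-74", "75-79", "80-84", "85+"]
l60 = 100_000                       # survivors at exact age 60 (radix here)
L = [470_000, 440_000, 390_000, 320_000, 220_000, 180_000]   # person-years lived
utility = [0.92, 0.90, 0.87, 0.84, 0.80, 0.75]               # mean HRQoL weights

life_expectancy_60 = sum(L) / l60
qale_60 = sum(Lx * u for Lx, u in zip(L, utility)) / l60
print(f"LE60   = {life_expectancy_60:.1f} years")
print(f"QALE60 = {qale_60:.1f} years")
print(f"QALYs lost = {life_expectancy_60 - qale_60:.1f} years")
```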

  1. Upper Limits on the Stochastic Gravitational-Wave Background from Advanced LIGO’s First Observing Run

    OpenAIRE

    Abbott, B. P.; Abbott, R.; Adhikari, R. X.; Ananyeva, A.; Anderson, S. B.; Appert, S.; Arai, K.; Araya, M. C.; Barayoga, J. C.; Barish, B. C.; Berger, B. K.; Billingsley, G.; Biscans, S; Blackburn, J. K.; Bork, R.

    2017-01-01

    A wide variety of astrophysical and cosmological sources are expected to contribute to a stochastic gravitational-wave background. Following the observations of GW150914 and GW151226, the rate and mass of coalescing binary black holes appear to be greater than many previous expectations. As a result, the stochastic background from unresolved compact binary coalescences is expected to be particularly loud. We perform a search for the isotropic stochastic gravitational-wave background using dat...

  2. Family Background and Educational Choices

    DEFF Research Database (Denmark)

    McIntosh, James; D. Munk, Martin

    We examine the participation in secondary and tertiary education of five cohorts of Danish males and females who were aged twenty starting in 1982 and ending in 2002. We find that the large expansion of secondary education in this period was characterized by a phenomenal increase in gymnasium enrollments, especially for females. Not only did the educational opportunities for individuals with disadvantaged backgrounds improve absolutely, but their relative position also improved. A similarly dramatic increase in attendance at university for the period 1985-2005 was found for these cohorts when...

  3. Consumers' Attitudes and Their Inflation Expectations

    DEFF Research Database (Denmark)

    Ehrmann, Michael; Pfajfar, Damjan; Santoro, Emiliano

    2017-01-01

    situation, their purchasing attitudes, and their expectations about the macroeconomy. Respondents with current or expected financial difficulties and those with pessimistic attitudes about major purchases, income developments, or unemployment have a stronger upward bias than other households. However...

  4. Experiments on Expectations in Macroeconomics and Finance

    NARCIS (Netherlands)

    Assenza, Tiziana; Bao, Te; Hommes, Cars; Massaro, Domenico; Duffy, John

    Expectations play a crucial role in finance, macroeconomics, monetary economics, and fiscal policy. In the last decade a rapidly increasing number of laboratory experiments have been performed to study individual expectation formation, the interactions of individual forecasting rules, and the

  5. Interest rate rules with heterogeneous expectations

    NARCIS (Netherlands)

    Anufriev, M.; Assenza, T.; Hommes, C.; Massaro, D.

    2011-01-01

    The recent macroeconomic literature stresses the importance of managing heterogeneous expectations in the formulation of monetary policy. We use a simple frictionless DSGE model to investigate inflation dynamics under alternative interest rate rules when agents have heterogeneous expectations and

  6. Bootstrap-based Support of HGT Inferred by Maximum Parsimony

    Directory of Open Access Journals (Sweden)

    Nakhleh Luay

    2010-05-01

    Full Text Available Abstract Background Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. Results In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples are generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. Conclusions We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
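
    The bootstrap procedure described above can be sketched as follows in Python. Here `infer_reticulations` is a hypothetical stand-in for the maximum-parsimony network inference implemented in NEPAL; it is assumed to return hashable descriptors of the reticulation events inferred from an alignment, and alignment columns are the resampling unit.

    import random
    from collections import Counter

    def bootstrap_support(alignment, infer_reticulations, n_replicates=100):
        """alignment: list of equal-length sequences; columns are resampled
        with replacement to build each bootstrap replicate."""
        n_sites = len(alignment[0])
        counts = Counter()
        for _ in range(n_replicates):
            cols = [random.randrange(n_sites) for _ in range(n_sites)]
            replicate = ["".join(seq[c] for c in cols) for seq in alignment]
            # Count each inferred event at most once per replicate.
            counts.update(set(infer_reticulations(replicate)))
        # Support = fraction of replicates in which each event was inferred.
        return {event: k / n_replicates for event, k in counts.items()}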

  7. Exposing therapists to trauma-focused treatment in psychosis: effects on credibility, expected burden, and harm expectancies

    Directory of Open Access Journals (Sweden)

    David P. G. van den Berg

    2016-09-01

    Full Text Available Background: Despite robust empirical support for the efficacy of trauma-focused treatments, the dissemination proves difficult, especially in relation to patients with comorbid psychosis. Many therapists endorse negative beliefs about the credibility, burden, and harm of such treatment. Objective: This feasibility study explores the impact of specialized training on therapists’ beliefs about trauma-focused treatment within a randomized controlled trial. Method: Therapist-rated (n=16) credibility, expected burden, and harm expectancies of trauma-focused treatment were assessed at baseline, post-theoretical training, post-technical training, post-supervised practical training, and at 2-year follow-up. Credibility and burden beliefs of therapists concerning the treatment of every specific patient in the trial were also assessed. Results: Over time, therapist-rated credibility of trauma-focused treatment showed a significant increase, whereas therapists’ expected burden and harm expectancies decreased significantly. In treating posttraumatic stress disorder (PTSD) in patients with psychotic disorders (n=79), pre-treatment symptom severity was not associated with therapist-rated credibility or expected burden of that specific treatment. Treatment outcome had no influence on patient-specific credibility or burden expectancies of therapists. Conclusions: These findings support the notion that specialized training, including practical training with supervision, has long-term positive effects on therapists’ credibility, burden, and harm beliefs concerning trauma-focused treatment.

  8. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one with the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most of the cases, before the measurements are performed, some knowledge exists about the distribution which is investigated. It can range from simple information on the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model m(r), via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing
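
    For reference, the entropy functional relative to a prior model m(r) is commonly written in the Skilling form below (a sketch of the standard definition; normalization conventions may differ slightly from those used in the paper):

        S[\rho] = \int \left[ \rho(\mathbf{r}) - m(\mathbf{r}) - \rho(\mathbf{r}) \ln\frac{\rho(\mathbf{r})}{m(\mathbf{r})} \right] d^3r

    With this definition S is maximized, and equal to zero, when ρ(r) = m(r), so any departure of the reconstructed map from the model must be demanded by the data, as stated above.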

  9. Price Changes, Resource Adjustments and Rational Expectations

    DEFF Research Database (Denmark)

    Hoffmann, Kira

    This study investigates the relationship between the accuracy of managerial demand expectations, resource adjustment decisions and selling price changes. In line with rational expectation theory, it is argued that managers adjust resources and selling prices differently in response to expected...... that cost elasticity is higher when a demand decrease is expected among companies with similar exposure to demand uncertainty. Overall, this implies that managerial competences in predicting future demand significantly determines firms’ profitability; especially when demand uncertainty is high...

  10. Efficient heuristics for maximum common substructure search.

    Science.gov (United States)

    Englert, Péter; Kovács, Péter

    2015-05-26

    Maximum common substructure search is a computationally hard optimization problem with diverse applications in the field of cheminformatics, including similarity search, lead optimization, molecule alignment, and clustering. Most of these applications have strict constraints on running time, so heuristic methods are often preferred. However, the development of an algorithm that is both fast enough and accurate enough for most practical purposes is still a challenge. Moreover, in some applications, the quality of a common substructure depends not only on its size but also on various topological features of the one-to-one atom correspondence it defines. Two state-of-the-art heuristic algorithms for finding maximum common substructures have been implemented at ChemAxon Ltd., and effective heuristics have been developed to improve both their efficiency and the relevance of the atom mappings they provide. The implementations have been thoroughly evaluated and compared with existing solutions (KCOMBU and Indigo). The heuristics have been found to greatly improve the performance and applicability of the algorithms. The purpose of this paper is to introduce the applied methods and present the experimental results.

  11. Distorted Expectancy Coding in Problem Gambling: Is the Addictive in the Anticipation?

    NARCIS (Netherlands)

    van Holst, Ruth J.; Veltman, Dick J.; Büchel, Christian; van den Brink, Wim; Goudriaan, Anna E.

    2012-01-01

    Background: Pathologic gamblers are known to have abnormal neural responses associated with experiencing monetary wins and losses. However, neural responsiveness during reward and loss expectations in pathologic gamblers has not yet been investigated. Methods: We used a functional magnetic resonance

  12. Health Professionals' Expectations Versus Experiences of Internet-Based Telemonitoring : Survey Among Heart Failure Clinics

    NARCIS (Netherlands)

    de Vries, Arjen E.; van der Wal, Martje H. L.; Nieuwenhuis, Maurice M. W.; de Jong, Richard M.; van Dijk, Rene B.; Jaarsma, Tiny; Hillege, Hans L.

    Background: Although telemonitoring is increasingly used in heart failure care, data on expectations, experiences, and organizational implications concerning telemonitoring are rarely addressed, and the optimal profile of patients who can benefit from telemonitoring has yet to be defined. Objective:

  13. Background radiation map of Thailand

    International Nuclear Information System (INIS)

    Angsuwathana, P.; Chotikanatis, P.

    1997-01-01

    The radioelement concentration in the natural environment, as well as the radiation exposure to man in day-to-day life, is now a topic of great interest. Natural radiation is frequently referred to as a standard for comparing additional sources of man-made radiation such as atomic weapon fallout, nuclear power generation, radioactive waste disposal, etc. The Department of Mineral Resources commenced a five-year nationwide airborne geophysical survey project, awarded to Kenting Earth Sciences International Limited in 1984. The original purpose of the survey was to support mineral exploration and geological mapping. Subsequently, the data have proved suitable for providing natural radiation information. In 1993 the Department of Mineral Resources, with the assistance of the IAEA, published a Background Radiation Map of Thailand at the scale of 1:1,000,000 from the existing airborne radiometric digital data. The production of the Background Radiation Map of Thailand is the result of a data compilation and correction procedure originally developed over the Canadian Shield. This end product will be used as a base map in environmental applications, not only for Thailand but also for the Southeast Asia region. (author)

  14. Optical polarization: background and camouflage

    Science.gov (United States)

    Škerlind, Christina; Hallberg, Tomas; Eriksson, Johan; Kariis, Hans; Bergström, David

    2017-10-01

    Polarimetric imaging sensors in the electro-optical region, already available for military and commercial use in both the visible and infrared, show enhanced capabilities for advanced target detection and recognition. These capabilities arise from the ability to discriminate between man-made and natural background surfaces using the polarization information of light. In the development of materials for signature management in the visible and infrared wavelength regions, different criteria need to be met to fulfil the requirements for good camouflage against modern sensors. In conventional camouflage design, the aim is to spectrally match or adapt the surface properties of an object to a background, thereby minimizing the contrast seen by a specific threat sensor. Examples will be shown from measurements of some relevant materials and of how they affect the polarimetric signature in different ways. Dimensioning properties relevant to optical camouflage from a polarimetric perspective, such as the degree of polarization, the viewing or incidence angle, and the amount of diffuse reflection, mainly in the infrared region, will be discussed.

  15. The Cosmic Microwave Background Anisotropy

    Science.gov (United States)

    Bennett, C. L.

    1994-12-01

    The properties of the cosmic microwave background radiation provide unique constraints on the history and evolution of the universe. The first detection of anisotropy of the microwave radiation was reported by the COBE Team in 1992, based on the first year of flight data. The latest analyses of the first two years of COBE data are reviewed in this talk, including the amplitude of the microwave anisotropy as a function of angular scale and the statistical nature of the fluctuations. The two-year results are generally consistent with the earlier first year results, but the additional data allow for a better determination of the key cosmological parameters. In this talk the COBE results are compared with other observational anisotropy results and directions for future cosmic microwave anisotropy observations will be discussed. The National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC) is responsible for the design, development, and operation of the Cosmic Background Explorer (COBE). Scientific guidance is provided by the COBE Science Working Group.

  16. Expectancies as core features of mental disorders.

    Science.gov (United States)

    Rief, Winfried; Glombiewski, Julia A; Gollwitzer, Mario; Schubö, Anna; Schwarting, Rainer; Thorwart, Anna

    2015-09-01

    Expectancies are core features of mental disorders, and change in expectations is therefore one of the core mechanisms of treatment in psychiatry. We aim to improve our understanding of expectancies by summarizing factors that contribute to their development, persistence, and modification. We pay particular attention to the issue of persistence of expectancies despite experiences that contradict them. Based on recent research findings, we propose a new model for expectation persistence and expectation change. When expectations are established, effects are evident in neural and other biological systems, for example, via anticipatory reactions, different biological reactions to expected versus unexpected stimuli, etc. Psychological 'immunization' and 'assimilation', implicit self-confirming processes, and stability of biological processes help us to better understand why expectancies persist even in the presence of expectation violations. Learning theory, attentional processes, social influences, and biological determinants contribute to the development, persistence, and modification of expectancies. Psychological interventions should focus on optimizing expectation violation to achieve optimal treatment outcome and to avoid treatment failures.

  17. Expectancies as a Determinant of Interference Phenomena

    Science.gov (United States)

    Hasher, Lynn; Greenberg, Michael

    1977-01-01

    One version, by Lockhart, Craik, and Jacoby, of a levels-of-processing model of memory asserts the importance of the role of expectancies about forthcoming information in determining the elaborateness of a memory trace. Confirmed expectancies result in less-elaborated memory traces; disconfirmed expectancies result in elaborate memory traces.…

  18. Measuring Risk When Expected Losses Are Unbounded

    Directory of Open Access Journals (Sweden)

    Alejandro Balbás

    2014-09-01

    Full Text Available This paper proposes a new method to introduce coherent risk measures for risks with infinite expectation, such as those characterized by some Pareto distributions. Extensions of the conditional value at risk, the weighted conditional value at risk and other examples are given. Actuarial applications are analyzed, such as extensions of the expected value premium principle when expected losses are unbounded.

  19. Parental Expectations of Their Adolescents' Teachers.

    Science.gov (United States)

    Tatar, Moshe; Horenczyk, Gabriel

    2000-01-01

    Examines parental expectations of their children's teachers through use of the Expectations of Teachers questionnaire. Participating parents (N=765) reported greater expectations for help and assistance, followed by teaching competence and fairness on the part of the teacher. Mothers were found to hold higher fairness, help, and assistance…

  20. Role Of Expectancy Manipulation In Systematic Desensitization

    Science.gov (United States)

    Brown, H. Alan

    1973-01-01

    Expectancy, relaxation, and hierarchy content were manipulated. Findings did not support the hypothesis that expectancy was the only factor in desensitization, but did clarify the role of expectancy vis-a-vis the counterconditioning elements typically discussed in the literature. (Author)

  1. Childbirth expectations and correlates at the final stage of pregnancy in Chinese expectant parents

    Directory of Open Access Journals (Sweden)

    Xian Zhang

    2014-06-01

    Conclusion: This study adds to understanding of the childbirth expectations of Chinese expectant parents. It is suggested that maternity healthcare providers pay close attention to the childbirth expectations of expectant parents, and improve the nursing care service to promote positive childbirth experiences and satisfaction of expectant parents.

  2. Heat kernel expansion in the background field formalism

    CERN Document Server

    Barvinsky, Andrei

    2015-01-01

    Heat kernel expansion and background field formalism represent the combination of two calculational methods within the functional approach to quantum field theory. This approach implies construction of generating functionals for matrix elements and expectation values of physical observables. These are functionals of arbitrary external sources or the mean field of a generic configuration -- the background field. Exact calculation of quantum effects on a generic background is impossible. However, a special integral (proper time) representation for the Green's function of the wave operator -- the propagator of the theory -- and its expansion in the ultraviolet and infrared limits of respectively short and late proper time parameter allow one to construct approximations which are valid on generic background fields. Current progress of quantum field theory, its renormalization properties, model building in unification of fundamental physical interactions and QFT applications in high energy physics, gravitation and...

  3. Resolution and Efficiency of Monitored Drift-Tube Chambers with Final Read-out Electronics at High Background Rates

    CERN Document Server

    Dubbert, J; Kortner, O; Kroha, H; Manz, A; Mohrdieck-Möck, S; Rauscher, F; Richter, R; Staude, A; Stiller, W

    2003-01-01

    The performance of a monitored drift-tube chamber for ATLAS with the final read-out electronics was tested at the Gamma Irradiation Facility at CERN under varying photon irradiation rates of up to 990~Hz\,cm$^{-2}$, which corresponds to 10 times the highest background rate expected in ATLAS. The signal pulse-height measurement of the final read-out electronics was used to perform time-slewing corrections. The corrections improve the average single-tube resolution from 106~$\mu$m to 89~$\mu$m at the nominal discriminator threshold of 44~mV without irradiation, and from 114~$\mu$m to 89~$\mu$m at the maximum nominal irradiation rate in ATLAS of 100~Hz\,cm$^{-2}$. The reduction of the threshold from 44~mV to 34~mV and the time-slewing corrections lead to an average single-tube resolution of 82~$\mu$m without photon background and of 89~$\mu$m at 100~Hz\,cm$^{-2}$. The measured muon detection efficiency agrees with the expectation for the final read-out electronics.

  4. Hydraulic Limits on Maximum Plant Transpiration

    Science.gov (United States)

    Manzoni, S.; Vico, G.; Katul, G. G.; Palmroth, S.; Jackson, R. B.; Porporato, A. M.

    2011-12-01

    Photosynthesis occurs at the expense of water losses through transpiration. As a consequence of this basic carbon-water interaction at the leaf level, plant growth and ecosystem carbon exchanges are tightly coupled to transpiration. In this contribution, the hydraulic constraints that limit transpiration rates under well-watered conditions are examined across plant functional types and climates. The potential water flow through plants is proportional to both xylem hydraulic conductivity (which depends on plant carbon economy) and the difference in water potential between the soil and the atmosphere (the driving force that pulls water from the soil). Differently from previous works, we study how this potential flux changes with the amplitude of the driving force (i.e., we focus on xylem properties and not on stomatal regulation). Xylem hydraulic conductivity decreases as the driving force increases due to cavitation of the tissues. As a result of this negative feedback, more negative leaf (and xylem) water potentials would provide a stronger driving force for water transport, while at the same time limiting xylem hydraulic conductivity due to cavitation. Here, the leaf water potential value that allows an optimum balance between driving force and xylem conductivity is quantified, thus defining the maximum transpiration rate that can be sustained by the soil-to-leaf hydraulic system. To apply the proposed framework at the global scale, a novel database of xylem conductivity and cavitation vulnerability across plant types and biomes is developed. Conductivity and water potential at 50% cavitation are shown to be complementary (in particular between angiosperms and conifers), suggesting a tradeoff between transport efficiency and hydraulic safety. Plants from warmer and drier biomes tend to achieve larger maximum transpiration than plants growing in environments with lower atmospheric water demand. The predicted maximum transpiration and the corresponding leaf water
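
    A minimal numerical sketch of the optimum described above: the supply rate E = K(psi_leaf) * (psi_soil - psi_leaf) is maximized over leaf water potential, with conductivity declining along an assumed Weibull-type vulnerability curve. All parameter values are illustrative and are not taken from the database described in the abstract.

    import numpy as np

    K_max = 5.0      # maximum hydraulic conductivity (arbitrary units)
    psi_50 = -2.0    # water potential at 50% loss of conductivity (MPa)
    shape = 3.0      # steepness of the assumed vulnerability curve
    psi_soil = -0.5  # soil water potential (MPa)

    def conductivity(psi_leaf):
        # Conductivity halves at psi_50 and declines steeply beyond it.
        return K_max * np.exp(-np.log(2.0) * (psi_leaf / psi_50) ** shape)

    psi_leaf = np.linspace(psi_soil, -6.0, 2000)
    supply = conductivity(psi_leaf) * (psi_soil - psi_leaf)

    i_opt = np.argmax(supply)
    print(f"optimal leaf water potential: {psi_leaf[i_opt]:.2f} MPa")
    print(f"maximum sustainable transpiration: {supply[i_opt]:.2f} (arbitrary units)")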

  5. Analogue of Pontryagin's maximum principle for multiple integrals minimization problems

    OpenAIRE

    Mikhail, Zelikin

    2016-01-01

    A theorem analogous to Pontryagin's maximum principle is proved for multiple-integral minimization problems. Unlike the usual maximum principle, the maximum is taken not over all matrices but only over matrices of rank one. Examples are given.

  6. Lake Basin Fetch and Maximum Length/Width

    Data.gov (United States)

    Minnesota Department of Natural Resources — Linear features representing the Fetch, Maximum Length and Maximum Width of a lake basin. Fetch, maximum length and average width are calculated from the lake polygon...

  7. Postpartum consultation: Occurrence, requirements and expectations

    Directory of Open Access Journals (Sweden)

    Carlgren Ingrid

    2008-07-01

    Full Text Available Abstract Background As a matter of routine, midwives in Sweden have spoken with women about their experiences of labour in a so-called 'postpartum consultation'. However, the possibility of offering women this kind of consultation today is reduced due to a shortage of both time and resources. The aim of this study was to explore the occurrence, women's requirements of, and experiences of a postpartum consultation, and to identify expectations from women who wanted but did not have a consultation with the midwife assisting during labour. Methods All Swedish-speaking women who gave birth to a liveborn child at a University Hospital in western Sweden were consecutively included for a phone interview over a three-week period. An additional phone interview was conducted with the women who did not have a postpartum consultation, but who wanted to talk with the midwife assisting during labour. Data from the interviews were analysed using qualitative content analysis. Results Of the 150 interviewed women, 56% (n = 84) had a postpartum consultation, of which 61.9% (n = 52) had this with the midwife assisting during labour. Twenty of the 28 women who did not have a consultation with anyone still desired to talk with the midwife assisting during labour. Of these, 19 were interviewed. The content the women wanted to talk about was summarized in four categories: to understand the course of events during labour; to put into words feelings about undignified management; to describe own behaviour and feelings; and to describe own fear. Conclusion The survey shows that the frequency of postpartum consultation is decreasing, that the majority of women who give birth today still require it, but only about half of them receive it. It is crucial to develop a plan for these consultations that meets both the women's needs and the organization within current maternity care.

  8. In the grip of betrayed expectations

    Directory of Open Access Journals (Sweden)

    Jarić Isidora

    2005-01-01

    Full Text Available The paper discusses the position of young people in Serbia today, as can be inferred from the evidence collected in the study "Politics and everyday life - three years later". Starting from the typology she developed in her 2002 analysis of young people’s interviews (when four basic ways of self-positioning within the social context were identified: "B92 generation", "provincials", "fundamentalists", and "guests"), the author traces the changes that have occurred over the past three years in the attitudes of these same respondents concerning politics, personal engagement, views of the future and of their own selves. The fact that the expectations awakened by the events of 5 October 2000 have been betrayed has brought strong disappointment, and this is the context in which young people in Serbia are once again losing faith that they will ever find their place in their own society. Against the background of a basic tension in relation to politics - between excessive interest and disgust - three basic strategies of young people in 2005 are formed. "Withdrawal", as the most common strategy, indicates a return of the young to their narrow personal, private, imaginary world, after a short exit into reality and active participation in creating the conditions of their own social existence. The increasingly frequent strategy of "aggression and imposition of one’s own worldview" points to the rising radicalization of the young generation. Finally, it is only the "planning strategy", espoused by just a handful of respondents, that retains traces of faith in future improvement of social conditions.

  9. Expectancy-Value Theory of Achievement Motivation.

    Science.gov (United States)

    Wigfield; Eccles

    2000-01-01

    We discuss the expectancy-value theory of motivation, focusing on an expectancy-value model developed and researched by Eccles, Wigfield, and their colleagues. Definitions of crucial constructs in the model, including ability beliefs, expectancies for success, and the components of subjective task values, are provided. These definitions are compared to those of related constructs, including self-efficacy, intrinsic and extrinsic motivation, and interest. Research is reviewed dealing with two issues: (1) change in children's and adolescents' ability beliefs, expectancies for success, and subjective values, and (2) relations of children's and adolescents' ability-expectancy beliefs and subjective task values to their performance and choice of activities. Copyright 2000 Academic Press.

  10. Low background aspects of GERDA

    International Nuclear Information System (INIS)

    Simgen, Hardy

    2011-01-01

    The GERDA experiment operates bare germanium diodes enriched in 76Ge in an environment of pure liquid argon to search for neutrinoless double beta decay. A very low radioactive background is essential for the success of the experiment. We present here the research done in order to remove radio-impurities coming from the liquid argon, the stainless steel cryostat and the front-end electronics. We found that liquid argon can be purified efficiently from 222Rn. The main source of 222Rn in GERDA is the cryostat, which emanates about 55 mBq. A thin copper shroud in the center of the cryostat was implemented to prevent radon from approaching the diodes. Gamma ray screening of radio-pure components for front-end electronics resulted in the development of a pre-amplifier with a total activity of less than 1 mBq of 228Th.

  11. The cosmic microwave background radiation

    International Nuclear Information System (INIS)

    Wilson, R.W.

    1980-01-01

    The history of the discovery of the cosmic microwave background radiation using the 20-foot horn antenna at Bell Laboratories in 1965 is described. Travelling-wave ruby masers, featuring the lowest noise in the world, were used. The measurements were made at a wavelength of 7 cm. In measuring microwave radiation from regions outside the Milky Way, continuous noise was discovered whose temperature exceeded the calculated contributions of the individual elements of the detection system by 3 K. A comparison with theory showed that relic radiation from the Big Bang period was the source of the noise. The discovery was verified by measurements at the 20.1 cm wavelength, by other authors' measurements at wavelengths from 0.5 mm to 74 cm, and by optical measurements of the interstellar molecular spectrum. (Ha)

  12. Polarization of Cosmic Microwave Background

    International Nuclear Information System (INIS)

    Buzzelli, A; Cabella, P; De Gasperis, G; Vittorio, N

    2016-01-01

    In this work we present an extension of the ROMA map-making code for data analysis of Cosmic Microwave Background polarization, with particular attention given to the inflationary polarization B-modes. The new algorithm takes into account a possible cross-correlated noise component among the different detectors of a CMB experiment. We tested the code on the observational data of the BOOMERanG (2003) experiment and we show that we obtain a better estimate of the power spectra; in particular, the error bars of the BB spectrum are smaller by up to 20% for low multipoles. We point out the general validity of the new method. A possible future application is the LSPE balloon experiment, devoted to the observation of polarization at large angular scales. (paper)
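
    For context, optimal map-making codes of this kind are built on the generalized least-squares estimator sketched below (the general formalism, not necessarily the exact ROMA implementation); cross-correlated noise between detectors enters through the off-diagonal blocks of the time-domain noise covariance N:

        \hat{m} = (A^{T} N^{-1} A)^{-1} A^{T} N^{-1} d

    where d is the concatenated time-ordered data of all detectors, A is the pointing matrix relating sky pixels (I, Q, U) to samples, and N is the joint noise covariance of the detector set.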

  13. A New Measurement of the Cosmic X-ray Background

    International Nuclear Information System (INIS)

    Moretti, A.

    2009-01-01

    I present a new analytical description of the cosmic X-ray background (CXRB) spectrum in the 1.5-200 keV energy band, obtained by combining the new measurement performed by the Swift X-ray telescope (XRT) with the recently published Swift burst alert telescope (BAT) measurement. A study of the cosmic variance in the XRT band (1.5-7 keV) is also presented. I find that the cosmic variance expected from the LogN-LogS distribution scales as Ω^-0.3 (where Ω is the surveyed area), in very good agreement with XRT data.
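
    As a quick numerical illustration of the quoted scaling (using only the relation stated above), a field-to-field scatter proportional to Ω^-0.3 means that doubling the surveyed solid angle reduces the cosmic variance by a factor of about 2^0.3 ≈ 1.23, i.e. roughly 20%.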

  14. Quantum field theory in a gravitational shock wave background

    International Nuclear Information System (INIS)

    Klimcik, C.

    1988-01-01

    A scalar massless non-interacting quantum field theory on an arbitrary gravitational shock wave background is exactly solved. S-matrix and expectation values of the energy-momentum tensor are computed for an arbitrarily polarized sourceless gravitational shock wave and for a homogeneous infinite planar shell shock wave, all performed in any number of space-time dimensions. Expectation values of the energy density in scattering states exhibit a singularity which lies exactly at the location of the curvature singularity found in the infinite shell collision. (orig.)

  15. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    Science.gov (United States)

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.
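
    Schematically, the ML formulation described above amounts to fitting a parametrized signal model directly to the undersampled k-space frames; a schematic least-squares form under Gaussian noise assumptions (the notation here is illustrative rather than the paper's) is

        \{\hat{\theta}, \hat{\rho}\} = \arg\min_{\theta,\,\rho} \sum_{k} \left\| d_k - F_{\Omega_k}\,\big( \rho \odot \phi_k(\theta) \big) \right\|_2^2

    where d_k is the k-th acquired k-space frame, F_{\Omega_k} is the undersampled Fourier encoding operator for that frame, \rho is the spin density, and \phi_k(\theta) is the Bloch-simulated signal evolution for tissue parameters \theta. Because \rho enters linearly it can be eliminated by variable projection, while the remaining nonconvex dependence on \theta is handled by the variable-splitting/ADMM scheme mentioned above.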

  16. Maximum Profit Configurations of Commercial Engines

    Directory of Open Access Journals (Sweden)

    Yiran Chen

    2011-06-01

    Full Text Available An investigation of commercial engines with finite capacity low- and high-price economic subsystems and a generalized commodity transfer law [n ∝ Δ(P^m)] in commodity flow processes, in which effects of the price elasticities of supply and demand are introduced, is presented in this paper. Optimal cycle configurations of commercial engines for maximum profit are obtained by applying optimal control theory. In some special cases, the eventual state—market equilibrium—is solely determined by the initial conditions and the inherent characteristics of the two subsystems; the different ways of transfer affect the model with respect to the specific forms of the price paths and the instantaneous commodity flow, i.e., the optimal configuration.

  17. The worst case complexity of maximum parsimony.

    Science.gov (United States)

    Carmel, Amir; Musa-Lempel, Noa; Tsur, Dekel; Ziv-Ukelson, Michal

    2014-11-01

    One of the core classical problems in computational biology is that of constructing the most parsimonious phylogenetic tree interpreting an input set of sequences from the genomes of evolutionarily related organisms. We reexamine the classical maximum parsimony (MP) optimization problem for the general (asymmetric) scoring matrix case, where rooted phylogenies are implied, and analyze the worst case bounds of three approaches to MP: The approach of Cavalli-Sforza and Edwards, the approach of Hendy and Penny, and a new agglomerative, "bottom-up" approach we present in this article. We show that the second and third approaches are faster than the first one by a factor of Θ(√n) and Θ(n), respectively, where n is the number of species.

  18. Citizen Expectations and Satisfaction Over Time

    DEFF Research Database (Denmark)

    Hjortskov, Morten

    2018-01-01

    Expectations are thought to affect how citizens form their attitudes and behavior toward public services. Such attitudes may include citizen satisfaction, where expectations play a fundamental role, and relevant behaviors include choice of services and the decision to voice opinions about them. However, there are few investigations into what drives citizen expectations and even fewer that consider these relationships across time. This article tests whether prior expectations, perceived performance, and citizen satisfaction influence future expectations, using a unique dataset that follows individual citizens across two subsequent school satisfaction surveys from 2011 and 2013. The results show that prior expectations have a large and consistent influence on future expectations, as predicted by the literature, whereas the influence from prior perceived performance seems less consistent. Prior...

  19. Marital Expectations in Strong African American Marriages.

    Science.gov (United States)

    Vaterlaus, J Mitchell; Skogrand, Linda; Chaney, Cassandra; Gahagan, Kassandra

    2017-12-01

    The current exploratory study utilized a family strengths framework to identify marital expectations in 39 strong African American heterosexual marriages. Couples reflected on their marital expectations over their 10 or more years of marriage. Three themes emerged through qualitative analysis and the participants' own words were used in the presentation of the themes. African Americans indicated that there was growth in marital expectations over time, with marital expectations often beginning with unrealistic expectations that grew into more realistic expectations as their marriages progressed. Participants also indicated that core expectations in strong African American marriages included open communication, congruent values, and positive treatment of spouse. Finally, participants explained there is an "I" in marriage as they discussed the importance of autonomy within their marital relationships. Results are discussed in association with existing research and theory. © 2016 Family Process Institute.

  20. Modelling maximum likelihood estimation of availability

    International Nuclear Information System (INIS)

    Waller, R.A.; Tietjen, G.L.; Rock, G.W.

    1975-01-01

    Suppose the performance of a nuclear-powered electrical generating plant is continuously monitored to record the sequence of failures and repairs during sustained operation. The purpose of this study is to assess one method of estimating the performance of the power plant when the measure of performance is availability. That is, we determine the probability that the plant is operational at time t. To study the availability of a power plant, we first assume statistical models for the variables, X and Y, which denote the time-to-failure and the time-to-repair variables, respectively. Once those statistical models are specified, the availability, A(t), can be expressed as a function of some or all of their parameters. Usually those parameters are unknown in practice and so A(t) is unknown. This paper discusses the maximum likelihood estimator of A(t) when the time-to-failure model for X is an exponential density with parameter lambda, and the time-to-repair model for Y is an exponential density with parameter theta. Under the assumption of exponential models for X and Y, it follows that the instantaneous availability at time t is A(t) = lambda/(lambda+theta) + theta/(lambda+theta) exp[-((1/lambda)+(1/theta))t] for t > 0. Also, the steady-state availability is A(infinity) = lambda/(lambda+theta). We use the observations from n failure-repair cycles of the power plant, say X_1, X_2, ..., X_n, Y_1, Y_2, ..., Y_n, to present the maximum likelihood estimators of A(t) and A(infinity). The exact sampling distributions for those estimators and some statistical properties are discussed before a simulation model is used to determine 95% simulation intervals for A(t). The methodology is applied to two examples which approximate the operating history of two nuclear power plants. (author)
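
    A minimal sketch of the plug-in ML estimator implied above, with lambda and theta read as the mean time-to-failure and mean time-to-repair (the parameters of the exponential densities); the failure and repair samples below are hypothetical.

    import math

    def availability(t, lam, theta):
        """A(t) = lam/(lam+theta) + theta/(lam+theta) * exp(-((1/lam)+(1/theta))*t)"""
        a_inf = lam / (lam + theta)
        return a_inf + (theta / (lam + theta)) * math.exp(-(1.0 / lam + 1.0 / theta) * t)

    # n observed failure-repair cycles (hypothetical data, in hours)
    times_to_failure = [120.0, 95.0, 210.0, 160.0, 75.0]
    times_to_repair = [4.0, 6.5, 3.0, 5.0, 7.5]

    lam_hat = sum(times_to_failure) / len(times_to_failure)   # ML estimate of lambda
    theta_hat = sum(times_to_repair) / len(times_to_repair)   # ML estimate of theta

    print(f"steady-state availability A(infinity): {lam_hat / (lam_hat + theta_hat):.4f}")
    print(f"instantaneous availability A(24 h):    {availability(24.0, lam_hat, theta_hat):.4f}")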

  1. Mothers' Maximum Drinks Ever Consumed in 24 Hours Predicts Mental Health Problems in Adolescent Offspring

    Science.gov (United States)

    Malone, Stephen M.; McGue, Matt; Iacono, William G.

    2010-01-01

    Background: The maximum number of alcoholic drinks consumed in a single 24-hr period is an alcoholism-related phenotype with both face and empirical validity. It has been associated with severity of withdrawal symptoms and sensitivity to alcohol, genes implicated in alcohol metabolism, and amplitude of a measure of brain activity associated with…

  2. Background radiation and childhood cancer mortality

    International Nuclear Information System (INIS)

    Sakka, Masatoshi

    1979-01-01

    The Oxford Survey of Childhood Cancer estimated an ''extra'' risk of juvenile cancer deaths under 10 years of age of 572 per million man-rad. In Hiroshima and Nagasaki, 36.9 juvenile cancers were expected from 64,490 man-rad received by exposed mothers; however, only one cancer was observed. The discrepancy was explained partly by possible overlapping of the confidence intervals of the two samples and partly by the excessive doses received by exposed fetuses in Japan. If A-bomb radiation sterilized preleukemic cells induced in fetuses, it must also have killed those cells in irradiated adults. Leukemogenic efficiency in adults, about 2×10^-5 per rad, does not differ between A-bomb survivors and irradiated patients. We examined a dose-effect relationship in childhood cancer mortality (0-4 yrs) in Miyagi Prefecture, Japan. Ninety-two cancers were detected out of 1,214,157 children from 1968 to 1975. They were allocated to 8 districts with different background levels. The population at risk was calculated every year for every district. About 4 deaths occurred per 10,000 man-rad, which is comparable with the 572 per million man-rad in the Oxford Survey. One out of one thousand infants died from severe malformation every year when they received 9.8 rad in the embryonic stage; the doubling dose is estimated as 20 rad. The clinical and biological significance of the statistical data must be examined in the future. Fetal death decreased significantly from 110/1,000 in 1962 to 55/1,000 in 1975. Background radiation plays no role in fetal death in Miyagi Prefecture. (author)

  3. Accounting for orphaned aftershocks in the earthquake background rate

    Science.gov (United States)

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
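
    A rough back-of-the-envelope illustration (not the extended ETAS model of the paper): for a modified Omori rate n(t) = K/(t + c)^p with p > 1, the fraction of an aftershock sequence expected to occur later than time T after the main shock is ((T + c)/c)^(1 - p), so with a slowly decaying sequence a sizeable share of aftershocks falls outside any finite catalogue and can masquerade as background. The parameter values below are illustrative only.

    c = 0.01   # days (illustrative Omori c-value)
    p = 1.1    # Omori decay exponent (illustrative)

    def fraction_after(T_days):
        # Tail fraction of the modified Omori law integrated beyond time T.
        return ((T_days + c) / c) ** (1.0 - p)

    for T in (365.0, 10 * 365.0, 100 * 365.0):
        print(f"fraction of the sequence occurring after {T / 365:6.0f} yr: "
              f"{fraction_after(T):.3f}")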

  4. Improving the implementation of tailored expectant management in subfertile couples: protocol for a cluster randomized trial

    NARCIS (Netherlands)

    Boogaard, N.M. van den; Kersten, F.A.M.; Goddijn, M.; Bossuyt, P.M.; Veen, F. van der; Hompes, P.G.; Hermens, R.P.M.G.; Braat, D.D.M.; Mol, B.W.; Nelen, W.L.D.M.; et al.,

    2013-01-01

    BACKGROUND: Prognostic models in reproductive medicine can help to identify subfertile couples who would benefit from fertility treatment. Expectant management in couples with a good chance of natural conception, i.e., tailored expectant management (TEM), prevents unnecessary treatment and is

  5. Improving the implementation of tailored expectant management in subfertile couples : protocol for a cluster randomized trial

    NARCIS (Netherlands)

    van den Boogaard, Noortje M; Kersten, Fleur A M; Goddijn, Mariëtte; Bossuyt, Patrick M M; van der Veen, Fulco; Hompes, Peter G A; Hermens, Rosella P M G; Braat, Didi D M; Mol, Ben Willem J; Nelen, Willianne L D M; Hoek, Annemieke

    2013-01-01

    BACKGROUND: Prognostic models in reproductive medicine can help to identify subfertile couples who would benefit from fertility treatment. Expectant management in couples with a good chance of natural conception, i.e., tailored expectant management (TEM), prevents unnecessary treatment and is

  6. The Role of Client Expectancies in Counseling: The Research and Theory of Bandura and Tinsley.

    Science.gov (United States)

    Thiessen, Sarah H.

    Increasing evidence supports the idea that client expectancies have a large impact on counseling relationships, processes, and outcomes. Research and theories regarding expectancies are examined in this paper. Albert Bandura's theory of self-efficacy is discussed first to provide a background for understanding the significance of efficacy…

  7. Obesity in adulthood and its consequences for life expectancy: a life-table analysis

    NARCIS (Netherlands)

    A. Peeters (Anna); J.J.M. Barendregt (Jan); F. Willekens; J.P. Mackenbach (Johan); A. Al Mamun; L.G.A. Bonneux (Luc)

    2003-01-01

    BACKGROUND: Overweight and obesity in adulthood are linked to an increased risk for death and disease. Their potential effect on life expectancy and premature death has not yet been described. OBJECTIVE: To analyze reductions in life expectancy and increases in

  8. Game Day Alcohol Expectancies among College Students from a University in the Southeast

    Science.gov (United States)

    Glassman, Tavis; Miller, Jeff; Miller, E. Maureen; Wohlwend, Jennifer; Reindl, Diana

    2012-01-01

    Background: The alcohol consumption associated with college sporting events depicts a public health challenge. Purpose: The aim of this investigation involved assessing the alcohol expectancies among college students associated with home football games and which of these expectancies was most predictive of high-risk drinking. Methods: Researchers…

  9. Nonrelativistic trace and diffeomorphism anomalies in particle number background

    Science.gov (United States)

    Auzzi, Roberto; Baiguera, Stefano; Nardelli, Giuseppe

    2018-04-01

    Using the heat kernel method, we compute nonrelativistic trace anomalies for Schrödinger theories in flat spacetime, with a generic background gauge field for the particle number symmetry, both for a free scalar and a free fermion. The result is genuinely nonrelativistic, and it has no counterpart in the relativistic case. Contrary to naive expectations, the anomaly is not gauge invariant; this is similar to the nongauge covariance of the non-Abelian relativistic anomaly. We also show that, in the same background, the gravitational anomaly for a nonrelativistic scalar vanishes.

  10. Background suppression in Gerda Phase II and its study in the LArGe low background set-up

    Energy Technology Data Exchange (ETDEWEB)

    Budjas, Dusan [Physik-Department E15, Technische Universitaet Muenchen (Germany); Collaboration: GERDA-Collaboration

    2013-07-01

    In Phase II of the Gerda experiment an additional ≈20 kg of BEGe-type germanium detectors, enriched in {sup 76}Ge, will be deployed in liquid argon (LAr) to further increase the sensitivity for the half-life of neutrinoless double beta (0νββ) decay of {sup 76}Ge to > 2 × 10{sup 26} yr. To reduce background by a factor of 10 to the required level of < 10{sup -3} cts/(keV.kg.yr), it is necessary to employ active background-suppression techniques, including an anti-Compton veto using scintillation light detection from LAr and pulse shape discrimination exploiting the characteristic electric field distribution inside BEGe detectors. The latter technique can identify single-site events (typical for 0νββ) and efficiently reject multi-site events (mainly from γ-rays), as well as different types of background events from detector surfaces. The combined power of these techniques was studied for {sup 42}K and other background sources at the low background facility LArGe. The expected background level in Phase II in the region of interest at 2039 keV, the Q{sub ββ} energy of {sup 76}Ge, is estimated using extensive simulations, information from tracking the exposure of the Phase II detector material to cosmic rays, and the background contributions observed in Phase I. The preliminary analysis shows that contributions from all expected background components after all cuts are in line with the goal of Gerda Phase II.

  11. A maximum power point tracking for photovoltaic-SPE system using a maximum current controller

    Energy Technology Data Exchange (ETDEWEB)

    Muhida, Riza [Osaka Univ., Dept. of Physical Science, Toyonaka, Osaka (Japan); Osaka Univ., Dept. of Electrical Engineering, Suita, Osaka (Japan); Park, Minwon; Dakkak, Mohammed; Matsuura, Kenji [Osaka Univ., Dept. of Electrical Engineering, Suita, Osaka (Japan); Tsuyoshi, Akira; Michira, Masakazu [Kobe City College of Technology, Nishi-ku, Kobe (Japan)

    2003-02-01

    Processes to produce hydrogen from solar photovoltaic (PV)-powered water electrolysis using solid polymer electrolysis (SPE) are reported. An alternative maximum power point tracking (MPPT) control for the PV-SPE system, based on a maximum-current searching method, has been designed and implemented. Based on the voltage-current characteristics and a theoretical analysis of the SPE, it can be shown that tracking the maximum current output of the DC-DC converter on the SPE side simultaneously tracks the maximum power point of the photovoltaic panel. This method uses a proportional-integral (PI) controller to control the duty factor of the DC-DC converter via a pulse-width modulator (PWM). The MPPT performance and hydrogen production performance of this method have been evaluated and discussed based on the results of the experiment. (Author)
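
    A simplified control-loop sketch of the maximum-current search idea described above (a perturb-and-observe style stand-in, not the authors' PI/PWM hardware implementation); `measure_spe_current` is a hypothetical plant model standing in for the measured SPE-side converter current.

    def measure_spe_current(duty):
        # Toy unimodal current-vs-duty curve with its maximum near duty = 0.62.
        return max(0.0, 10.0 - 55.0 * (duty - 0.62) ** 2)

    def track_maximum_current(duty=0.40, step=0.01, iterations=60):
        current = measure_spe_current(duty)
        for _ in range(iterations):
            candidate = min(max(duty + step, 0.0), 1.0)
            new_current = measure_spe_current(candidate)
            if new_current >= current:
                duty, current = candidate, new_current  # keep moving this way
            else:
                step = -step                            # reverse the search direction
        return duty, current

    duty, current = track_maximum_current()
    print(f"duty factor ~ {duty:.2f}, SPE-side current ~ {current:.2f} (arbitrary units)")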

  12. Patient Awareness and Expectations of Pharmacist Services During Hospital Stay.

    Science.gov (United States)

    King, Philip K; Martin, Steven J; Betka, Eric M

    2017-10-01

    There are insufficient data in the United States regarding patient awareness and expectations of hospital pharmacist availability and services. The objective of this research is to assess patient awareness and expectations of hospital pharmacist services and to determine whether a marketing campaign for pharmacist services increases patient awareness and expectations. Eligible inpatients were surveyed regarding awareness of pharmacist services (12 items; Likert scale of 1 [strongly disagree] to 4 [strongly agree]; maximum total score of 48) before and after implementation of a hospital-wide pharmacist services marketing campaign. The primary outcome was the change in median total survey scores from baseline. Other outcomes included the frequency of patient requests for pharmacists. Similar numbers of patients completed the survey before and after the campaign (intervention, n = 140, vs control, n = 147). Awareness of pharmacist availability and services increased after implementation of the marketing campaign (median total score 41 [IQR 36-46] vs 37 [IQR 31-43]). Awareness among inpatients of pharmacist services is low. Marketing pharmacist availability and services to patients in the hospital improves awareness and expectations for pharmacist-provided care and increases the frequency of patient-initiated interaction between pharmacists and patients. This could improve patient outcomes as pharmacists become more integrally involved in direct patient care.

  13. Plenoptic background oriented schlieren imaging

    International Nuclear Information System (INIS)

    Klemkowsky, Jenna N; Fahringer, Timothy W; Clifford, Christopher J; Thurow, Brian S; Bathel, Brett F

    2017-01-01

    The combination of the background oriented schlieren (BOS) technique with the unique imaging capabilities of a plenoptic camera, termed plenoptic BOS, is introduced as a new addition to the family of schlieren techniques. Compared to conventional single camera BOS, plenoptic BOS is capable of sampling multiple lines-of-sight simultaneously. Displacements from each line-of-sight are collectively used to build a four-dimensional displacement field, which is a vector function structured similarly to the original light field captured in a raw plenoptic image. The displacement field is used to render focused BOS images, which qualitatively are narrow depth of field slices of the density gradient field. Unlike focused schlieren methods that require manually changing the focal plane during data collection, plenoptic BOS synthetically changes the focal plane position during post-processing, such that all focal planes are captured in a single snapshot. Through two different experiments, this work demonstrates that plenoptic BOS is capable of isolating narrow depth of field features, qualitatively inferring depth, and quantitatively estimating the location of disturbances in 3D space. Such results motivate future work to transition this single-camera technique towards quantitative reconstructions of 3D density fields. (paper)

  14. Natural background radiation in Jordan

    International Nuclear Information System (INIS)

    Daoud, M.N.S.

    1997-01-01

    An airborne gamma-ray survey of Jordan has been carried out since 1979. A complete report was submitted to the Natural Resources Authority along with the field and processed data (digital and analogue). Natural radioelement concentrations were not provided with that report. From the corrected count-rate data for each natural radioelement, concentrations and exposure rates at ground level were calculated. Contoured maps showing the exposure rates and the dose rates were created. Both maps reflect the surface geology of Jordan, where the phosphate areas are very well delineated by high-level contours. In southeastern Jordan, the Ordovician sandstone, which contains a high concentration of Th (around 2000 ppm in some places) and a moderate concentration of U (about 300 ppm), also shows high gamma radiation exposures compared with the surrounding areas. Comparing the exposure rates (in μR/h) with those obtained in other countries such as the United States, Canada and Germany, Jordan shows background radiation that is two-fold or more higher. More detailed studies should be performed in order to evaluate the radiological risk to people living in areas of high radiation, such as the phosphatic belt, which covers a vast area of the Jordanian high plateau. (author)

  15. Natural background radiation in Jordan

    Energy Technology Data Exchange (ETDEWEB)

    Daoud, M N.S. [National Resources Authority, Ministry of Energy and Mineral Resources, Amman (Jordan)

    1997-11-01

    An airborne gamma-ray survey of Jordan has been carried out since 1979. A complete report was submitted to the Natural Resources Authority along with the field and processed data ``digital and analogue``. Natural radioelement concentrations were not provided with that report. From the corrected count-rate data for each natural radioelement, concentrations and exposure rates at ground level were calculated. Contoured maps showing the exposure rates and the dose rates were created. Both maps reflect the surface geology of Jordan, where the phosphate areas are very well delineated by high-level contours. In southeastern Jordan, the Ordovician sandstone, which contains a high concentration of Th (around 2000 ppm in some places) and a moderate concentration of U (about 300 ppm), also shows high gamma radiation exposures compared with the surrounding areas. Comparing the exposure rates (in {mu}R/h) with those obtained in other countries such as the United States, Canada and Germany, Jordan shows background radiation that is two-fold or more higher. More detailed studies should be performed in order to evaluate the radiological risk to people living in areas of high radiation, such as the phosphatic belt, which covers a vast area of the Jordanian high plateau. (author). 8 refs, 10 figs, 7 tabs.

  16. The AAVSO 2011 Demographic and Background Survey

    Science.gov (United States)

    Price, A.

    2012-04-01

    In 2011, the AAVSO conducted a survey of 615 people who are or were recently active in the organization. The survey included questions about their demographic background and variable star interests. Data are descriptively analyzed and compared with prior surveys. Results show an organization of very highly educated, largely male amateur and professional astronomers distributed across 108 countries. Participants tend to be loyal, with the average time of involvement in the AAVSO reported as 14 years. Most major demographic factors have not changed much over time. However, the average age of new members is increasing. Also, a significant portion of the respondents report being strictly active in a non-observing capacity, reflecting the growing mission of the organization. Motivations of participants are more aligned with scientific contribution than with that reported by other citizen science projects. This may help explain why a third of all respondents are an author or co-author of a paper in an astronomical journal. Finally, there is some evidence that participation in the AAVSO has a greater impact on the respondents' view of their role in astronomy compared to that expected through increasing amateur astronomy experience alone.

  17. Expectations for recovery important in the prognosis of whiplash injuries.

    Directory of Open Access Journals (Sweden)

    Lena W Holm

    2008-05-01

    Full Text Available BACKGROUND: Individuals' expectations on returning to work after an injury have been shown to predict the duration of time that a person with work-related low back pain will remain on benefits; individuals with lower recovery expectations received benefits for a longer time than those with higher expectations. The role of expectations in recovery from traumatic neck pain, in particular whiplash-associated disorders (WAD), has not been assessed to date to our knowledge. The aim of this study was to investigate if expectations for recovery are a prognostic factor after experiencing a WAD. METHODS AND FINDINGS: We used a prospective cohort study composed of insurance claimants in Sweden. The participants were car occupants who filed a neck injury claim (i.e., for WAD) to one of two insurance companies between 15 January 2004 and 12 January 2005 (n = 1,032). Postal questionnaires were completed shortly (average 23 d) after the collision and then again 6 mo later. Expectations for recovery were measured with a numerical rating scale (NRS) at baseline, where 0 corresponds to "unlikely to make a full recovery" and 10 to "very likely to make a full recovery." The scale was reverse coded and trichotomised into NRS 0, 1-4, and 5-10. The main outcome measure was self-perceived disability at 6 mo postinjury, measured with the Pain Disability Index, and categorised into no/low, moderate, and high disability. Multivariable polytomous logistic regression was used for the analysis. There was a dose response relationship between recovery expectations and disability. After controlling for severity of physical and mental symptoms, individuals who stated that they were less likely to make a full recovery (NRS 5-10) were more likely to have a high disability compared to individuals who stated that they were very likely to make a full recovery (odds ratio [OR] 4.2 [95% confidence interval (CI) 2.1 to 8.5]). For the intermediate category (NRS 1-4), the OR was 2.1 (95% CI 1

  18. Expectations for Recovery Important in the Prognosis of Whiplash Injuries

    Science.gov (United States)

    Holm, Lena W; Carroll, Linda J; Cassidy, J. David; Skillgate, Eva; Ahlbom, Anders

    2008-01-01

    Background Individuals' expectations on returning to work after an injury have been shown to predict the duration of time that a person with work-related low back pain will remain on benefits; individuals with lower recovery expectations received benefits for a longer time than those with higher expectations. The role of expectations in recovery from traumatic neck pain, in particular whiplash-associated disorders (WAD), has not been assessed to date to our knowledge. The aim of this study was to investigate if expectations for recovery are a prognostic factor after experiencing a WAD. Methods and Findings We used a prospective cohort study composed of insurance claimants in Sweden. The participants were car occupants who filed a neck injury claim (i.e., for WAD) to one of two insurance companies between 15 January 2004 and 12 January 2005 (n = 1,032). Postal questionnaires were completed shortly (average 23 d) after the collision and then again 6 mo later. Expectations for recovery were measured with a numerical rating scale (NRS) at baseline, where 0 corresponds to “unlikely to make a full recovery” and 10 to “very likely to make a full recovery.” The scale was reverse coded and trichotomised into NRS 0, 1–4, and 5–10. The main outcome measure was self-perceived disability at 6 mo postinjury, measured with the Pain Disability Index, and categorised into no/low, moderate, and high disability. Multivariable polytomous logistic regression was used for the analysis. There was a dose response relationship between recovery expectations and disability. After controlling for severity of physical and mental symptoms, individuals who stated that they were less likely to make a full recovery (NRS 5–10), were more likely to have a high disability compared to individuals who stated that they were very likely to make a full recovery (odds ratio [OR] 4.2 [95% confidence interval (CI) 2.1 to 8.5]. For the intermediate category (NRS 1–4), the OR was 2.1 (95% CI 1

  19. Mental health expectancy--the European perspective

    DEFF Research Database (Denmark)

    Jagger, C; Ritchie, K; Brønnum-Hansen, Henrik

    1998-01-01

    The increase in life expectancy observed over the last decade has particular relevance for mental health conditions of old age, such as dementia. Although mental disorders have been estimated to be responsible for 60% of all disabilities, until recently population health indicators such as health...... expectancies have concentrated on calculating disability-free life expectancy based on physical functioning. In 1994, a European Network for the Calculation of Health Expectancies (Euro-REVES) was established, one of its aims being the development and promotion of mental health expectancies. Such indicators...... may have an important role in monitoring future changes in the mental health of populations and predicting service needs. This article summarizes the proceedings and recommendations of the first European Conference on Mental Health Expectancy....

  20. Stock Market Expectations of Dutch Households.

    Science.gov (United States)

    Hurd, Michael; van Rooij, Maarten; Winter, Joachim

    2011-04-01

    Despite its importance for the analysis of life-cycle behavior and, in particular, retirement planning, stock ownership by private households is poorly understood. Among other approaches to investigate this puzzle, recent research has started to elicit private households' expectations of stock market returns. This paper reports findings from a study that collected data over a two-year period both on households' stock market expectations (subjective probabilities of gains or losses) and on whether they own stocks. We document substantial heterogeneity in financial market expectations. Expectations are correlated with stock ownership. Over the two years of our data, stock market prices increased, and expectations of future stock market price changes also increased, lending support to the view that expectations are influenced by recent stock gains or losses.

  1. Stochastic Dominance under the Nonlinear Expected Utilities

    Directory of Open Access Journals (Sweden)

    Xinling Xiao

    2014-01-01

    Full Text Available In 1947, von Neumann and Morgenstern introduced the well-known expected utility and the related axiomatic system (see von Neumann and Morgenstern (1953)). It is widely used in economics, for example in financial economics. But the well-known Allais paradox (see Allais (1979)) shows that linear expected utility has limitations in some settings. Because of this, Peng proposed a concept of nonlinear expected utility (see Peng (2005)). In this paper we propose a concept of stochastic dominance under nonlinear expected utilities. We give sufficient conditions under which a random choice X stochastically dominates a random choice Y under nonlinear expected utilities. We also provide sufficient conditions under which a random choice X strictly stochastically dominates a random choice Y under sublinear expected utilities.
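    For orientation, the classical stochastic dominance relation under the ordinary linear expectation, which the paper generalises to nonlinear and sublinear expectations, can be written as follows (generic notation, not taken from the paper):

    ```latex
    % First-order stochastic dominance under the classical linear expectation E[.];
    % the nonlinear case replaces E[.] by a nonlinear (e.g. sublinear) expectation.
    X \succeq_{1} Y
    \;\Longleftrightarrow\;
    \mathbb{E}\bigl[u(X)\bigr] \ge \mathbb{E}\bigl[u(Y)\bigr]
    \ \text{for every nondecreasing utility } u
    \;\Longleftrightarrow\;
    F_X(t) \le F_Y(t) \ \text{for all } t \in \mathbb{R}.
    ```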

  2. Simultaneous maximum a posteriori longitudinal PET image reconstruction

    Science.gov (United States)

    Ellis, Sam; Reader, Andrew J.

    2017-09-01

    Positron emission tomography (PET) is frequently used to monitor functional changes that occur over extended time scales, for example in longitudinal oncology PET protocols that include routine clinical follow-up scans to assess the efficacy of a course of treatment. In these contexts PET datasets are currently reconstructed into images using single-dataset reconstruction methods. Inspired by recently proposed joint PET-MR reconstruction methods, we propose to reconstruct longitudinal datasets simultaneously by using a joint penalty term in order to exploit the high degree of similarity between longitudinal images. We achieved this by penalising voxel-wise differences between pairs of longitudinal PET images in a one-step-late maximum a posteriori (MAP) fashion, resulting in the MAP simultaneous longitudinal reconstruction (SLR) method. The proposed method reduced reconstruction errors and visually improved images relative to standard maximum likelihood expectation-maximisation (ML-EM) in simulated 2D longitudinal brain tumour scans. In reconstructions of split real 3D data with inserted simulated tumours, noise across images reconstructed with MAP-SLR was reduced to levels equivalent to doubling the number of detected counts when using ML-EM. Furthermore, quantification of tumour activities was largely preserved over a variety of longitudinal tumour changes, including changes in size and activity, with larger changes inducing larger biases relative to standard ML-EM reconstructions. Similar improvements were observed for a range of counts levels, demonstrating the robustness of the method when used with a single penalty strength. The results suggest that longitudinal regularisation is a simple but effective method of improving reconstructed PET images without using resolution degrading priors.
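    The one-step-late MAP update with a pairwise quadratic coupling between longitudinal images can be sketched as follows. The toy system matrix, Poisson data and penalty strength are placeholders, so this is an illustration of the update structure rather than the reconstruction code used in the paper.

    ```python
    # One-step-late (OSL) MAP-EM sketch for two longitudinal PET datasets with a
    # quadratic coupling penalty beta * sum_j (x1_j - x2_j)^2. The system matrix A,
    # the data y1/y2, and beta are toy placeholders, not values from the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    n_pix, n_bins = 32, 64
    A = rng.random((n_bins, n_pix))              # toy system matrix
    x_true = rng.random(n_pix) + 0.5
    y1 = rng.poisson(A @ x_true)                 # "baseline" scan
    y2 = rng.poisson(A @ (0.9 * x_true))         # "follow-up" scan with a change

    def osl_update(x_self, x_other, y, beta):
        sens = A.sum(axis=0)                     # A^T 1 (sensitivity image)
        ratio = A.T @ (y / np.clip(A @ x_self, 1e-9, None))
        penalty_grad = 2.0 * beta * (x_self - x_other)
        denom = np.clip(sens + penalty_grad, 1e-9, None)   # OSL denominator
        return x_self * ratio / denom

    x1 = np.ones(n_pix)
    x2 = np.ones(n_pix)
    for _ in range(100):
        x1, x2 = osl_update(x1, x2, y1, beta=0.1), osl_update(x2, x1, y2, beta=0.1)
    ```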

  3. Rotating strings in confining AdS/CFT backgrounds

    International Nuclear Information System (INIS)

    Armoni, Adi; Barbon, Jose L.F.; Petkou, Anastasios C.

    2002-01-01

    We study semiclassical rotating strings in AdS/CFT backgrounds that exhibit both confinement and finite-size effects. The energy versus spin dispersion relation for short strings is the expected Regge trajectory behaviour, with the same string tension as is measured by the Wilson loop. Long strings probe the interplay between confinement and finite-size effects. In particular, the dispersion relation for long strings shows a characteristic dependence on the string tension and the finite-size scale. (author)

  4. Translational invariance and the anisotropy of the cosmic microwave background

    International Nuclear Information System (INIS)

    Carroll, Sean M.; Tseng, C.-Y.; Wise, Mark B.

    2010-01-01

    Primordial quantum fluctuations produced by inflation are conventionally assumed to be statistically homogeneous, a consequence of translational invariance. In this paper we quantify the potentially observable effects of a small violation of translational invariance during inflation, as characterized by the presence of a preferred point, line, or plane. We explore the imprint such a violation would leave on the cosmic microwave background anisotropy, and provide explicit formulas for the expected amplitudes ⟨a_lm a*_l'm'⟩ of the spherical-harmonic coefficients.

  5. Translational invariance and the anisotropy of the cosmic microwave background

    Science.gov (United States)

    Carroll, Sean M.; Tseng, Chien-Yao; Wise, Mark B.

    2010-04-01

    Primordial quantum fluctuations produced by inflation are conventionally assumed to be statistically homogeneous, a consequence of translational invariance. In this paper we quantify the potentially observable effects of a small violation of translational invariance during inflation, as characterized by the presence of a preferred point, line, or plane. We explore the imprint such a violation would leave on the cosmic microwave background anisotropy, and provide explicit formulas for the expected amplitudes ⟨almal'm'*⟩ of the spherical-harmonic coefficients.
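    For context, under exact statistical homogeneity and isotropy the covariance of the spherical-harmonic coefficients takes the standard diagonal form below; the violation studied in the paper shows up as departures from this form (generic notation):

    ```latex
    % Standard statistically isotropic and homogeneous case:
    \langle a_{\ell m}\, a^{*}_{\ell' m'} \rangle \;=\; C_{\ell}\,\delta_{\ell\ell'}\,\delta_{m m'} ,
    % whereas a preferred point, line, or plane generically induces nonzero
    % correlations with \ell \neq \ell' or m \neq m'.
    ```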

  6. Maximum mass of magnetic white dwarfs

    International Nuclear Information System (INIS)

    Paret, Daryel Manreza; Horvath, Jorge Ernesto; Martínez, Aurora Perez

    2015-01-01

    We revisit the problem of the maximum masses of magnetized white dwarfs (WDs). The impact of a strong magnetic field on the structure equations is addressed. The pressures become anisotropic due to the presence of the magnetic field and split into parallel and perpendicular components. We first construct stable solutions of the Tolman-Oppenheimer-Volkoff equations for parallel pressures and find that physical solutions vanish for the perpendicular pressure when B ≳ 10^13 G. This fact establishes an upper bound for a magnetic field and the stability of the configurations in the (quasi) spherical approximation. Our findings also indicate that it is not possible to obtain stable magnetized WDs with super-Chandrasekhar masses because the values of the magnetic field needed for them are higher than this bound. To proceed into the anisotropic regime, we can apply results for structure equations appropriate for a cylindrical metric with anisotropic pressures that were derived in our previous work. From the solutions of the structure equations in cylindrical symmetry we have confirmed the same bound for B ∼ 10^13 G, since beyond this value no physical solutions are possible. Our tentative conclusion is that massive WDs with masses well beyond the Chandrasekhar limit do not constitute stable solutions and should not exist. (paper)

  7. TRENDS IN ESTIMATED MIXING DEPTH DAILY MAXIMUMS

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, R.; DuPont, A.; Kurzeja, R.; Parker, M.

    2007-11-12

    Mixing depth is an important quantity in the determination of air pollution concentrations. Fireweather forecasts depend strongly on estimates of the mixing depth as a means of determining the altitude and dilution (ventilation rates) of smoke plumes. The Savannah River United States Forest Service (USFS) routinely conducts prescribed fires at the Savannah River Site (SRS), a heavily wooded Department of Energy (DOE) facility located in southwest South Carolina. For many years, the Savannah River National Laboratory (SRNL) has provided forecasts of weather conditions in support of the fire program, including an estimated mixing depth using potential temperature and turbulence change with height at a given location. This paper examines trends in the average estimated mixing depth daily maximum at the SRS over an extended period of time (4.75 years) derived from numerical atmospheric simulations using two versions of the Regional Atmospheric Modeling System (RAMS). This allows for differences to be seen between the model versions, as well as trends on a multi-year time frame. In addition, comparisons of predicted mixing depth for individual days in which special balloon soundings were released are also discussed.

  8. Mammographic image restoration using maximum entropy deconvolution

    International Nuclear Information System (INIS)

    Jannetta, A; Jackson, J C; Kotre, C J; Birch, I P; Robson, K J; Padgett, R

    2004-01-01

    An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification settings of 1.8 magnification/fine focus and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization
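    A minimal entropy-regularised deconvolution of the kind described can be sketched with generic optimisation tools; the PSF, noise level and regularisation weight below are illustrative placeholders rather than the measured point-spread functions and calibrated values used in the paper.

    ```python
    # Minimal maximum-entropy (MEM) deconvolution sketch: minimise
    #   0.5 * ||psf * f - d||^2 / sigma^2  -  alpha * S(f),
    # where S(f) = -sum_i f_i * log(f_i / m_i) is the image entropy relative to a
    # default model m. PSF, sigma and alpha are illustrative placeholders.
    import numpy as np
    from scipy.ndimage import convolve
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    truth = np.zeros((16, 16)); truth[5:10, 5:10] = 1.0          # toy object
    psf = np.outer(np.hanning(5), np.hanning(5)); psf /= psf.sum()
    sigma, alpha = 0.02, 0.05
    data = convolve(truth, psf, mode="reflect") + sigma * rng.standard_normal(truth.shape)
    model = np.full(truth.shape, truth.mean() + 1e-3)            # default model m

    def objective(flat):
        f = flat.reshape(truth.shape)
        resid = convolve(f, psf, mode="reflect") - data
        chi2 = 0.5 * np.sum(resid ** 2) / sigma ** 2
        entropy = -np.sum(f * np.log(f / model))
        return chi2 - alpha * entropy

    def gradient(flat):
        f = flat.reshape(truth.shape)
        resid = convolve(f, psf, mode="reflect") - data
        g_chi2 = convolve(resid, psf, mode="reflect") / sigma ** 2   # psf is symmetric
        g_entropy = alpha * (np.log(f / model) + 1.0)                # d(-alpha*S)/df
        return (g_chi2 + g_entropy).ravel()

    res = minimize(objective, model.ravel(), jac=gradient, method="L-BFGS-B",
                   bounds=[(1e-6, None)] * truth.size)
    restored = res.x.reshape(truth.shape)                            # de-blurred estimate
    ```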

  9. Maximum Margin Clustering of Hyperspectral Data

    Science.gov (United States)

    Niazmardi, S.; Safari, A.; Homayouni, S.

    2013-09-01

    In recent decades, large margin methods such as Support Vector Machines (SVMs) have been regarded as the state of the art among supervised learning methods for the classification of hyperspectral data. However, the results of these algorithms depend mainly on the quality and quantity of the available training data. To tackle the problems associated with training data, researchers have put effort into extending the capability of large margin algorithms to unsupervised learning. One of the recently proposed algorithms is Maximum Margin Clustering (MMC). MMC is an unsupervised SVM algorithm that simultaneously estimates both the labels and the hyperplane parameters. Nevertheless, the optimization of the MMC algorithm is a non-convex problem. Most existing MMC methods rely on reformulating and relaxing the non-convex optimization problem as a semi-definite program (SDP), which is computationally very expensive and can only handle small data sets. Moreover, most of these algorithms address two-class classification, which cannot be used for the classification of remotely sensed data. In this paper, a new MMC algorithm is used that solves the original non-convex problem using an alternating optimization method. This algorithm is also extended to multi-class classification and its performance is evaluated. The results of the proposed algorithm show that it gives acceptable results for hyperspectral data clustering.
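    The alternating-optimisation idea can be illustrated with an off-the-shelf linear SVM: fix the labels and fit the hyperplane, then fix the hyperplane and re-assign labels under a class-balance constraint to avoid the trivial single-cluster solution. The sketch below runs on synthetic two-class data and is a generic illustration of this scheme, not the algorithm evaluated in the paper.

    ```python
    # Alternating-optimisation sketch of two-class maximum margin clustering (MMC):
    # iterate between fitting an SVM to the current labels and re-labelling samples
    # from the decision values under a class-balance constraint.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(-2, 1, (100, 5)), rng.normal(2, 1, (100, 5))])

    labels = 2 * KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X) - 1
    for _ in range(20):
        svm = LinearSVC(C=1.0, max_iter=10000).fit(X, labels)
        scores = svm.decision_function(X)
        # Balance constraint: the half with the largest scores gets +1.
        order = np.argsort(scores)
        new_labels = np.ones(len(X), dtype=int)
        new_labels[order[: len(X) // 2]] = -1
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    ```

    In practice the hard 50/50 balance would be relaxed and a kernel or multi-class extension used, as described in the abstract.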

  10. Paving the road to maximum productivity.

    Science.gov (United States)

    Holland, C

    1998-01-01

    "Job security" is an oxymoron in today's environment of downsizing, mergers, and acquisitions. Workers find themselves living by new rules in the workplace that they may not understand. How do we cope? It is the leader's charge to take advantage of this chaos and create conditions under which his or her people can understand the need for change and come together with a shared purpose to effect that change. The clinical laboratory at Arkansas Children's Hospital has taken advantage of this chaos to down-size and to redesign how the work gets done to pave the road to maximum productivity. After initial hourly cutbacks, the workers accepted the cold, hard fact that they would never get their old world back. They set goals to proactively shape their new world through reorganizing, flexing staff with workload, creating a rapid response laboratory, exploiting information technology, and outsourcing. Today the laboratory is a lean, productive machine that accepts change as a way of life. We have learned to adapt, trust, and support each other as we have journeyed together over the rough roads. We are looking forward to paving a new fork in the road to the future.

  11. Maximum power flux of auroral kilometric radiation

    International Nuclear Information System (INIS)

    Benson, R.F.; Fainberg, J.

    1991-01-01

    The maximum auroral kilometric radiation (AKR) power flux observed by distant satellites reported here is more than a factor of 10 above previously reported values. This increase has been achieved by a new data selection criterion and a new analysis of antenna spin-modulated signals received by the radio astronomy instrument on ISEE 3. The method relies on selecting AKR events containing signals in the highest-frequency channel (1980 kHz), followed by a careful analysis that effectively increased the instrumental dynamic range by more than 20 dB by making use of the spacecraft antenna gain diagram during a spacecraft rotation. This analysis has allowed the separation of real signals from those created in the receiver by overloading. Many signals having the appearance of AKR harmonic signals were shown to be of spurious origin. During one event, however, real second-harmonic AKR signals were detected even though the spacecraft was at a great distance (17 R_E) from Earth. During another event, when the spacecraft was at the orbital distance of the Moon and on the morning side of Earth, the power flux of fundamental AKR was greater than 3 × 10^-13 W m^-2 Hz^-1 at 360 kHz, normalized to a radial distance r of 25 R_E assuming the power falls off as r^-2. A comparison of these intense signal levels with the most intense source region values (obtained by ISIS 1 and Viking) suggests that multiple sources were observed by ISEE 3.

  12. Maximum likelihood window for time delay estimation

    International Nuclear Information System (INIS)

    Lee, Young Sup; Yoon, Dong Jin; Kim, Chi Yup

    2004-01-01

    Time delay estimation for the detection of leak location in underground pipelines is critically important. Because the exact leak location depends upon the precision of the time delay between sensor signals due to leak noise and on the speed of elastic waves, the estimation of time delay has been one of the key issues in leak locating with the time-of-arrival-difference method. In this study, an optimal maximum likelihood window is considered to obtain a better estimate of the time delay. Experiments have shown that this method provides much clearer and more precise peaks in the cross-correlation functions of leak signals. The leak location error has been less than 1% of the distance between sensors; for example, the error was not greater than 3 m for 300 m long underground pipelines. In addition to the experiments, an intensive theoretical analysis in terms of signal processing is presented. The improved leak locating with the suggested method is due to the windowing effect in the frequency domain, which weights the significant frequencies.
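    The underlying time-arrival-difference estimate can be sketched as a cross-correlation peak search; the paper's maximum likelihood window would enter as an additional frequency-domain weighting before the peak search. Sampling rate, wave speed and the simulated delay below are assumed values.

    ```python
    # Cross-correlation time-delay sketch for leak locating: the arrival-time
    # difference between two sensors, times the wave speed, gives the leak offset
    # from the midpoint. All numbers here are illustrative assumptions.
    import numpy as np

    fs = 10_000.0                # sampling rate [Hz] (assumed)
    c = 1_200.0                  # elastic wave speed in the pipe [m/s] (assumed)
    L = 300.0                    # distance between the two sensors [m]
    tau_true = 0.05              # true arrival-time difference [s] (leak nearer sensor 1)

    n, d = 2 ** 12, int(0.05 * fs)
    z = np.random.default_rng(3).standard_normal(n + d)
    s1 = z[d:d + n]              # leak noise reaches sensor 1 first ...
    s2 = z[:n]                   # ... and sensor 2 a delay of d samples later

    corr = np.correlate(s1 - s1.mean(), s2 - s2.mean(), mode="full")
    delay_samples = (n - 1) - np.argmax(corr)        # positive: s2 lags s1
    tau_hat = delay_samples / fs                     # estimated time delay [s]
    d1 = (L - c * tau_hat) / 2.0                     # estimated leak distance from sensor 1 [m]
    ```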

  13. Ancestral Sequence Reconstruction with Maximum Parsimony.

    Science.gov (United States)

    Herbst, Lina; Fischer, Mareike

    2017-12-01

    One of the main aims in phylogenetics is the estimation of ancestral sequences based on present-day data like, for instance, DNA alignments. One way to estimate the data of the last common ancestor of a given set of species is to first reconstruct a phylogenetic tree with some tree inference method and then to use some method of ancestral state inference based on that tree. One of the best-known methods both for tree inference and for ancestral sequence inference is Maximum Parsimony (MP). In this manuscript, we focus on this method and on ancestral state inference for fully bifurcating trees. In particular, we investigate a conjecture published by Charleston and Steel in 1995 concerning the number of species which need to have a particular state, say a, at a particular site in order for MP to unambiguously return a as an estimate for the state of the last common ancestor. We prove the conjecture for all even numbers of character states, which is the most relevant case in biology. We also show that the conjecture does not hold in general for odd numbers of character states, but also present some positive results for this case.
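    For a single site on a fully bifurcating tree, the standard bottom-up (Fitch) pass computes the MP ancestral state sets and the parsimony score; the toy tree below is purely illustrative.

    ```python
    # Bottom-up (Fitch) pass for maximum parsimony ancestral state sets at one
    # site of a fully bifurcating tree. Leaves carry observed states; an internal
    # node gets the intersection of its children's sets if non-empty, otherwise
    # the union (each union event costs one substitution).
    def fitch(node):
        """node is either ('leaf', state) or ('node', left, right)."""
        if node[0] == "leaf":
            return {node[1]}, 0
        left_set, left_cost = fitch(node[1])
        right_set, right_cost = fitch(node[2])
        common = left_set & right_set
        if common:
            return common, left_cost + right_cost
        return left_set | right_set, left_cost + right_cost + 1

    # Example: four taxa with states a, a, b, a at the site of interest.
    tree = ("node",
            ("node", ("leaf", "a"), ("leaf", "a")),
            ("node", ("leaf", "b"), ("leaf", "a")))
    root_states, parsimony_score = fitch(tree)   # ({'a'}, 1): MP root estimate is 'a'
    ```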

  14. Expectations, Bond Yields and Monetary Policy

    DEFF Research Database (Denmark)

    Chun, Albert Lee

    2011-01-01

    expectations about inflation, output growth, and the anticipated path of monetary policy actions contain important information for explaining movements in bond yields. Estimates from a forward-looking monetary policy rule suggest that the central bank exhibits a preemptive response to inflationary expectations...... of this type may provide traders and policymakers with a new set of tools for formally assessing the reaction of bond yields to shifts in market expectations...

  15. Heterogeneous inflation expectations, learning, and market outcomes

    OpenAIRE

    Madeira, Carlos; Zafar, Basit

    2012-01-01

    Using the panel component of the Michigan Survey of Consumers, we show that individuals, in particular women and ethnic minorities, are highly heterogeneous in their expectations of inflation. We estimate a model of inflation expectations based on learning from experience that also allows for heterogeneity in both private information and updating. Our model vastly outperforms existing models of inflation expectations in explaining the heterogeneity in the data. We find that women, ethnic mino...

  16. Higher Order Expectations in Asset Pricing

    OpenAIRE

    Philippe BACCHETTA; Eric VAN WINCOOP

    2004-01-01

    We examine formally Keynes' idea that higher order beliefs can drive a wedge between an asset price and its fundamental value based on expected future payoffs. Higher order expectations add an additional term to a standard asset pricing equation. We call this the higher order wedge, which depends on the difference between higher and first order expectations of future payoffs. We analyze the determinants of this wedge and its impact on the equilibrium price. In the context of a dynamic noisy r...

  17. A Study of Nuclear Recoil Backgrounds in Dark Matter Detectors

    Energy Technology Data Exchange (ETDEWEB)

    Westerdale, Shawn S. [Princeton Univ., NJ (United States)

    2016-01-01

    Despite the great success of the Standard Model of particle physics, a preponderance of astrophysical evidence suggests that it cannot explain most of the matter in the universe. This so-called dark matter has eluded direct detection, though many theoretical extensions to the Standard Model predict the existence of particles with a mass on the 1-1000 GeV scale that interact only via the weak nuclear force. Particles in this class are referred to as Weakly Interacting Massive Particles (WIMPs), and their high masses and low scattering cross sections make them viable dark matter candidates. The rarity of WIMP-nucleus interactions makes them challenging to detect: any background can mask the signal they produce. Background rejection is therefore a major problem in dark matter detection. Many experiments greatly reduce their backgrounds by employing techniques to reject electron recoils. However, nuclear recoil backgrounds, which produce signals similar to what we expect from WIMPs, remain problematic. There are two primary sources of such backgrounds: surface backgrounds and neutron recoils. Surface backgrounds result from radioactivity on the inner surfaces of the detector sending recoiling nuclei into the detector. These backgrounds can be removed with fiducial cuts, at some cost to the experiment's exposure. In this dissertation we briefly discuss a novel technique for rejecting these events based on signals they make in the wavelength shifter coating on the inner surfaces of some detectors. Neutron recoils result from neutrons scattering from nuclei in the detector. These backgrounds may produce a signal identical to what we expect from WIMPs and are extensively discussed here. We additionally present a new tool for calculating (α, n) yields in various materials. We introduce the concept of a neutron veto system designed to shield against, measure, and provide an anti-coincidence veto signal for background neutrons. We discuss the research and

  18. A study of nuclear recoil backgrounds in dark matter detectors

    Science.gov (United States)

    Westerdale, Shawn S.

    Despite the great success of the Standard Model of particle physics, a preponderance of astrophysical evidence suggests that it cannot explain most of the matter in the universe. This so-called dark matter has eluded direct detection, though many theoretical extensions to the Standard Model predict the existence of particles with a mass on the 1-1000 GeV scale that interact only via the weak nuclear force. Particles in this class are referred to as Weakly Interacting Massive Particles (WIMPs), and their high masses and low scattering cross sections make them viable dark matter candidates. The rarity of WIMP-nucleus interactions makes them challenging to detect: any background can mask the signal they produce. Background rejection is therefore a major problem in dark matter detection. Many experiments greatly reduce their backgrounds by employing techniques to reject electron recoils. However, nuclear recoil backgrounds, which produce signals similar to what we expect from WIMPs, remain problematic. There are two primary sources of such backgrounds: surface backgrounds and neutron recoils. Surface backgrounds result from radioactivity on the inner surfaces of the detector sending recoiling nuclei into the detector. These backgrounds can be removed with fiducial cuts, at some cost to the experiment's exposure. In this dissertation we briefly discuss a novel technique for rejecting these events based on signals they make in the wavelength shifter coating on the inner surfaces of some detectors. Neutron recoils result from neutrons scattering off of nuclei in the detector. These backgrounds may produce a signal identical to what we expect from WIMPs and are extensively discussed here. We additionally present a new tool for calculating (alpha, n) yields in various materials. We introduce the concept of a neutron veto system designed to shield against, measure, and provide an anti-coincidence veto signal for background neutrons. We discuss the research and development

  19. Radiogenic and muon-induced backgrounds in the LUX dark matter detector

    Science.gov (United States)

    Akerib, D. S.; Araújo, H. M.; Bai, X.; Bailey, A. J.; Balajthy, J.; Bernard, E.; Bernstein, A.; Bradley, A.; Byram, D.; Cahn, S. B.; Carmona-Benitez, M. C.; Chan, C.; Chapman, J. J.; Chiller, A. A.; Chiller, C.; Coffey, T.; Currie, A.; de Viveiros, L.; Dobi, A.; Dobson, J.; Druszkiewicz, E.; Edwards, B.; Faham, C. H.; Fiorucci, S.; Flores, C.; Gaitskell, R. J.; Gehman, V. M.; Ghag, C.; Gibson, K. R.; Gilchriese, M. G. D.; Hall, C.; Hertel, S. A.; Horn, M.; Huang, D. Q.; Ihm, M.; Jacobsen, R. G.; Kazkaz, K.; Knoche, R.; Larsen, N. A.; Lee, C.; Lindote, A.; Lopes, M. I.; Malling, D. C.; Mannino, R.; McKinsey, D. N.; Mei, D.-M.; Mock, J.; Moongweluwan, M.; Morad, J.; Murphy, A. St. J.; Nehrkorn, C.; Nelson, H.; Neves, F.; Ott, R. A.; Pangilinan, M.; Parker, P. D.; Pease, E. K.; Pech, K.; Phelps, P.; Reichhart, L.; Shutt, T.; Silva, C.; Solovov, V. N.; Sorensen, P.; O'Sullivan, K.; Sumner, T. J.; Szydagis, M.; Taylor, D.; Tennyson, B.; Tiedt, D. R.; Tripathi, M.; Uvarov, S.; Verbus, J. R.; Walsh, N.; Webb, R.; White, J. T.; Witherell, M. S.; Wolfs, F. L. H.; Woods, M.; Zhang, C.

    2015-03-01

    The Large Underground Xenon (LUX) dark matter experiment aims to detect rare low-energy interactions from Weakly Interacting Massive Particles (WIMPs). The radiogenic backgrounds in the LUX detector have been measured and compared with Monte Carlo simulation. Measurements of LUX high-energy data have provided direct constraints on all background sources contributing to the background model. The expected background rate from the background model for the 85.3 day WIMP search run is (2.6 ± 0.2 (stat) ± 0.4 (sys)) × 10^-3 events keVee^-1 kg^-1 day^-1 in a 118 kg fiducial volume. The observed background rate is (3.6 ± 0.4 (stat)) × 10^-3 events keVee^-1 kg^-1 day^-1, consistent with model projections. The expectation for the radiogenic background in a subsequent one-year run is presented.

  20. On the evaluation of marginal expected shortfall

    DEFF Research Database (Denmark)

    Caporin, Massimiliano; Santucci de Magistris, Paolo

    2012-01-01

    In the analysis of systemic risk, Marginal Expected Shortfall may be considered to evaluate the marginal impact of a single stock on the market Expected Shortfall. These quantities are generally computed using log-returns, in particular when there is also a focus on the returns' conditional distribution.... In this case, the market log-return is only approximately equal to the weighted sum of equities' log-returns. We show that the approximation error is large during turbulent market phases, with a subsequent impact on Marginal Expected Shortfall. We then suggest how to improve the evaluation of Marginal Expected...

  1. 49 CFR 230.24 - Maximum allowable stress.

    Science.gov (United States)

    2010-10-01

    49 CFR 230.24 - Maximum allowable stress. § 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
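    As a worked example of the 1/4 rule (the ultimate tensile strength used below is a hypothetical value, not one taken from the regulation):

    ```latex
    \sigma_{\text{allow}} \;=\; \tfrac{1}{4}\,\sigma_{\text{ultimate}}
                         \;=\; \tfrac{1}{4}\times 55\,000\ \text{psi}
                         \;=\; 13\,750\ \text{psi}.
    ```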

  2. 20 CFR 226.52 - Total annuity subject to maximum.

    Science.gov (United States)

    2010-04-01

    20 CFR 226.52 - Total annuity subject to maximum. Computing Employee, Spouse, and Divorced Spouse Annuities: Railroad Retirement Family Maximum. § 226.52 Total annuity subject to maximum. The total annuity amount which is compared to the maximum monthly amount to...

  3. Half-width at half-maximum, full-width at half-maximum analysis

    Indian Academy of Sciences (India)

    addition to the well-defined parameter full-width at half-maximum (FWHM). The distribution of ... optical side-lobes in the diffraction pattern resulting in steep central maxima [6], reduction of effects of ... and broad central peak. The idea of.

  4. Responsible innovation in human germline gene editing: Background document to the recommendations of ESHG and ESHRE

    NARCIS (Netherlands)

    de Wert, Guido; Heindryckx, Björn; Pennings, Guido; Clarke, Angus; Eichenlaub-Ritter, Ursula; van El, Carla G.; Forzano, Francesca; Goddijn, Mariëtte; Howard, Heidi C.; Radojkovic, Dragica; Rial-Sebbag, Emmanuelle; Dondorp, Wybo; Tarlatzis, Basil C.; Cornel, Martina C.

    2018-01-01

    Technological developments in gene editing raise high expectations for clinical applications, including editing of the germline. The European Society of Human Reproduction and Embryology (ESHRE) and the European Society of Human Genetics (ESHG) together developed a Background document and

  5. Scientific background of the project

    International Nuclear Information System (INIS)

    Christofidis, I.

    1997-01-01

    The main objective of the proposed project is the development of radioimmunometric assay(s) for the determination of free and total PSA in serum samples from normal and pathological individuals (BPH, PCa). This will be achieved by: A. Selection of appropriate antibody pairs (capture and labelled antibody) for determination of total PSA (free and complex) and for determination of free PSA. From the literature we have already identified some candidate antibody pairs. B. Radiolabelling of antibodies. Several labelling and purification procedures will be followed in order to obtain the required analytical sensitivity and dynamic range of the assays. Special attention will be given to the affinity constant as well as to the stability of the radiolabelled molecules. C. Development of protocols for immobilisation of capture antibodies. We will use several solid support formats (plastic tubes, beads and magnetizable particles). Direct adsorption or covalent binding will be used. Immunoadsorption through an immobilised second antibody will also be tested in order to decrease the preparation cost of the solid phase reagents. D. Preparation of standards of suitable purity levels. We will test different PSA-free matrices (bovine serum, buffer solutions, etc.) in order to select the most appropriate among them in terms of low background determination and low reagent cost. E. Optimisation of the immunoassay conditions for free PSA and total PSA (e.g. assay buffers, incubation time, temperature, one- or two-step procedure, washings). F. Optimisation and standardisation of assay protocols for kit production. G. Production of kits for distribution in clinical laboratories in Greece for comparison with commercial kits. H. Evaluation of the developed assays in real clinical conditions using well characterised human serum samples. This will be performed in co-operation with the Hellenic Society for Tumor Markers, and other anticancer institutions and hospital clinicians of long standing relation

  6. NON-EXPECTED UTILITY THEORIES: WEIGHTED EXPECTED, RANK DEPENDENT, AND CUMULATIVE PROSPECT THEORY UTILITY

    OpenAIRE

    Tuthill, Jonathan W.; Frechette, Darren L.

    2002-01-01

    This paper discusses some of the failings of expected utility including the Allais paradox and expected utility's inadequate one dimensional characterization of risk. Three alternatives to expected utility are discussed at length; weighted expected utility, rank dependent utility, and cumulative prospect theory. Each alternative is capable of explaining Allais paradox type problems and permits more sophisticated multi dimensional risk preferences.

  7. Predicting Problem Behaviors with Multiple Expectancies: Expanding Expectancy-Value Theory

    Science.gov (United States)

    Borders, Ashley; Earleywine, Mitchell; Huey, Stanley J.

    2004-01-01

    Expectancy-value theory emphasizes the importance of outcome expectancies for behavioral decisions, but most tests of the theory focus on a single behavior and a single expectancy. However, the matching law suggests that individuals consider expected outcomes for both the target behavior and alternative behaviors when making decisions. In this…

  8. Optimal operating conditions for maximum biogas production in anaerobic bioreactors

    International Nuclear Information System (INIS)

    Balmant, W.; Oliveira, B.H.; Mitchell, D.A.; Vargas, J.V.C.; Ordonez, J.C.

    2014-01-01

    The objective of this paper is to demonstrate the existence of an optimal residence time and substrate inlet mass flow rate for maximum methane production, through numerical simulations performed with a general transient mathematical model of an anaerobic biodigester introduced in this study. A simplified model is suggested herein, with only the most important reaction steps, which are carried out by a single type of microorganism following Monod kinetics. The mathematical model was developed for a well-mixed reactor (CSTR – Continuous Stirred-Tank Reactor), considering three main reaction steps: acidogenesis, with a μ_max of 8.64 day^-1 and a K_S of 250 mg/L; acetogenesis, with a μ_max of 2.64 day^-1 and a K_S of 32 mg/L; and methanogenesis, with a μ_max of 1.392 day^-1 and a K_S of 100 mg/L. The yield coefficients were 0.1 g dry cells/g polymeric compound for acidogenesis, 0.1 g dry cells/g propionic acid and 0.1 g dry cells/g butyric acid for acetogenesis, and 0.1 g dry cells/g acetic acid for methanogenesis. The model describes both the transient and the steady-state regime for several different biodigester designs and operating conditions. After experimental validation of the model, a parametric analysis was performed. It was found that biogas production is strongly dependent on the input polymeric substrate and fermentable monomer concentrations, but fairly independent of the input propionic, acetic and butyric acid concentrations. An optimisation study was then conducted, and an optimal residence time and substrate inlet mass flow rate were found for maximum methane production. The optima found were very sharp, showing a sudden drop of the methane mass flow rate from the observed maximum to zero within a 20% range around the optimal operating parameters, which stresses the importance of their identification, no matter how complex the actual bioreactor design may be. The model is therefore expected to be a useful tool for simulation, design, control and
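    A single-step Monod chemostat sketch using the methanogenesis parameters quoted above already reproduces the qualitative trade-off behind the sharp optimum: residence times below 1/μ_max (about 0.72 day here) wash the biomass out, while long residence times limit throughput. The inlet concentration and the proxy used for methane production are illustrative assumptions, not the paper's three-step model.

    ```python
    # Single-step Monod CSTR (chemostat) sketch using the methanogenesis
    # parameters quoted in the abstract (mu_max = 1.392 1/day, K_S = 100 mg/L,
    # Y = 0.1 g cells / g acetic acid). The inlet concentration and the use of
    # substrate consumption as a methane proxy are illustrative simplifications.
    import numpy as np
    from scipy.integrate import solve_ivp

    mu_max, K_S, Y = 1.392, 100.0, 0.1      # 1/day, mg/L, g/g
    S_in = 5000.0                           # inlet substrate concentration [mg/L] (assumed)

    def cstr(t, y, D):
        S, X = y
        mu = mu_max * S / (K_S + S)
        dS = D * (S_in - S) - mu * X / Y    # substrate balance
        dX = (mu - D) * X                   # biomass balance
        return [dS, dX]

    def methane_proxy(residence_time_days):
        D = 1.0 / residence_time_days       # dilution rate [1/day]
        sol = solve_ivp(cstr, (0, 300), [S_in, 10.0], args=(D,), rtol=1e-8)
        S_ss = sol.y[0, -1]
        return D * (S_in - S_ss)            # substrate consumed per unit time and volume

    taus = np.linspace(0.8, 20, 60)
    rates = [methane_proxy(t) for t in taus]
    best_tau = taus[int(np.argmax(rates))]  # residence time maximising the proxy
    ```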

  9. Predicting problem behaviors with multiple expectancies: expanding expectancy-value theory.

    Science.gov (United States)

    Borders, Ashley; Earleywine, Mitchell; Huey, Stanley J

    2004-01-01

    Expectancy-value theory emphasizes the importance of outcome expectancies for behavioral decisions, but most tests of the theory focus on a single behavior and a single expectancy. However, the matching law suggests that individuals consider expected outcomes for both the target behavior and alternative behaviors when making decisions. In this study, we expanded expectancy-value theory to evaluate the contributions of two competing expectancies to adolescent behavior problems. One hundred twenty-one high school students completed measures of behavior problems, expectancies for both acting out and academic effort, and perceived academic competence. Students' self-reported behavior problems covaried mostly with perceived competence and academic expectancies and only nominally with problem behavior expectancies. We suggest that behavior problems may result from students perceiving a lack of valued or feasible alternative behaviors, such as studying. We discuss implications for interventions and suggest that future research continue to investigate the contribution of alternative expectancies to behavioral decisions.

  10. Comparison of backgrounds in OSO-7 and SMM spectrometers and short-term activation in SMM

    Science.gov (United States)

    Dunphy, P. P.; Forrest, D. J.; Chupp, E. L.; Share, G. H.

    1989-01-01

    The backgrounds in the OSO-7 Gamma-Ray Monitor and the Solar Maximum Mission Gamma-Ray Spectrometer are compared. After scaling to the same volume, the background spectra agree to within 30 percent. This shows that analyses which successfully describe the background in one detector can be applied to similar detectors of different sizes and on different platforms. The background produced in the SMM spectrometer by a single trapped-radiation belt passage is also studied. This background is found to be dominated by a positron-annihilation line and a continuum spectrum with a high energy cutoff at 5 MeV.

  11. Competing expectations. The case of the hydrogen car

    Energy Technology Data Exchange (ETDEWEB)

    Bakker, S.

    2011-04-15

    Firms and governments can support only a limited number of emerging technologies. Some emerging technologies receive support for further development while others are discarded. But how do decision makers in firms and governments assess which of the options earns their support? Straightforward assessments of prices and performance levels cannot be sufficient, as emerging technologies are, by definition, in an early stage of development and have not yet reached their maximum levels of performance. It is therefore not so much of interest which of the options performs best at any point in time, but rather which of the options will eventually perform best in the future. As a consequence, this competition is based on expectations about future price and performance levels. This thesis studies how both the relevant decision makers and the technology developers deal with these expectations about the different options. The development of the hydrogen car takes up a central position in this thesis. The hydrogen car is one of the contenders in the race towards 'the car of the future'. While the hydrogen car is in competition with the other contenders, there is also competition between different configurations of the hydrogen car. Expectations with regard to the different options are measured through patents, prototype cars, and statements from scientists and car manufacturers. The research shows that technology developers, the enactors, not only try to shape positive expectations about their own option but also negative expectations about their competitors'. Technology selectors assess the credibility of the diverse expectations mainly on the basis of past progress and the possible paths forward towards higher levels of performance and lower prices. The roles of enactor and selector are interrelated, and so are the processes of enaction and selection. A main conclusion of this thesis is that selectors tend to narrow their portfolios in times of low general

  12. Losing ground--Swedish life expectancy in a comparative perspective.

    Directory of Open Access Journals (Sweden)

    Sven Drefahl

    Full Text Available BACKGROUND: At the beginning of the 1970s, Sweden was the country where both women and men enjoyed the world's longest life expectancy. While life expectancy continues to be high and increasing, Sweden has been losing ground in relation to other leading countries. METHODS: We look at life expectancy over the years 1970-2008 for men and women. To assess the relative contributions of age, causes of death, and smoking, we decompose differences in life expectancy between Sweden and two leading countries, Japan and France. This study is the first to use this decomposition method to observe how smoking-related deaths contribute to life expectancy differences between countries. RESULTS: Sweden has maintained very low mortality at young and working ages for both men and women compared to France and Japan. However, mortality at ages above 65 has become considerably higher in Sweden than in the other leading countries because the decrease has been faster in those countries. Diverging trends in circulatory disease mortality were the largest contributor to this development in both sexes, but for women cancer also played a role. Mortality from neoplasms has been notably low for Swedish men. Smoking-attributable mortality plays a modest role for women, whereas it is substantially lower in Swedish men than in French and Japanese men. CONCLUSIONS: Sweden is losing ground in relation to other leading countries with respect to life expectancy because mortality at high ages improves more slowly than in the leading countries, especially due to trends in cardiovascular disease mortality. Trends in smoking rates may provide a partial explanation for the trends in women; however, it is not possible to isolate one single explanatory factor for why Sweden is losing ground.

  13. Using Daily Horoscopes To Demonstrate Expectancy Confirmation.

    Science.gov (United States)

    Munro, Geoffrey D.; Munro, James E.

    2000-01-01

    Describes a classroom demonstration that uses daily horoscopes to show the effect that expectation can have on judgment. Addresses the preparation, procedure, and results of the demonstration, and student evaluations. States that the demonstration appears to be effective for teaching students about expectancy confirmation. (CMK)

  14. Do Students Expect Compensation for Wage Risk?

    Science.gov (United States)

    Schweri, Juerg; Hartog, Joop; Wolter, Stefan C.

    2011-01-01

    We use a unique data set about the wage distribution that Swiss students expect for themselves ex ante, deriving parametric and non-parametric measures to capture expected wage risk. These wage risk measures are unfettered by the heterogeneity which handicapped the use of actual market wage dispersion as a risk measure in earlier studies. Students in…

  15. Expectancy Theory in Media and Message Selection.

    Science.gov (United States)

    Van Leuven, Jim

    1981-01-01

    Argues for reversing emphasis on uses and gratifications research in favor of an expectancy model which holds that selection of a particular medium depends on (1) the expectation that the choice will be followed by a message of interest and (2) the importance of that message in satisfying user's values. (PD)

  16. Memory, expectation formation and scheduling choices

    NARCIS (Netherlands)

    Koster, P.R.; Peer, S.; Dekker, T.

    2015-01-01

    Limited memory capacity, retrieval constraints and anchoring are central to expectation formation processes. We develop a model of adaptive expectations where individuals are able to store only a finite number of past experiences of a stochastic state variable. Retrieval of these experiences is

  17. Socioeconomic differences in health expectancy in Denmark

    DEFF Research Database (Denmark)

    Brønnum-Hansen, Henrik

    2000-01-01

    Social differences in mortality rates reported in Denmark gave rise to the present study of health expectancy in different socioeconomic groups.

  18. Video lottery: winning expectancies and arousal.

    Science.gov (United States)

    Ladouceur, Robert; Sévigny, Serge; Blaszczynski, Alexander; O'Connor, Kieron; Lavoie, Marc E

    2003-06-01

    This study investigates the effects of video lottery players' expectancies of winning on physiological and subjective arousal. Participants were assigned randomly to one of two experimental conditions: high and low winning expectancies. Participants played 100 video lottery games in a laboratory setting while physiological measures were recorded. Level of risk-taking was controlled. Participants were 34 occasional or regular video lottery players. They were assigned randomly into two groups of 17, with nine men and eight women in each group. The low-expectancy group played for fun, therefore expecting to win worthless credits, while the high-expectancy group played for real money. Players' experience, demographic variables and subjective arousal were assessed. Severity of problem gambling was measured with the South Oaks Gambling Screen. In order to measure arousal, the average heart rate was recorded across eight periods. Participants exposed to high as compared to low expectations experienced faster heart rate prior to and during the gambling session. According to self-reports, it is the expectancy of winning money that is exciting, not playing the game. Regardless of the level of risk-taking, expectancy of winning is a cognitive factor influencing levels of arousal. When playing for fun, gambling becomes significantly less stimulating than when playing for money.

  19. Expected Business Conditions and Bond Risk Premia

    DEFF Research Database (Denmark)

    Eriksen, Jonas Nygaard

    2017-01-01

    In this article, I study the predictability of bond risk premia by means of expectations to future business conditions using survey forecasts from the Survey of Professional Forecasters. I show that expected business conditions consistently affect excess bond returns and that the inclusion of exp...

  20. Smoking expands expected lifetime with musculoskeletal disease

    DEFF Research Database (Denmark)

    Brønnum-Hansen, Henrik; Juel, Knud

    2003-01-01

    By indirect estimation of mortality from smoking and life table methods we estimated expected lifetime without musculoskeletal diseases among never smokers, ex-smokers, and smokers. We found that although life expectancy of a heavy smoker is 7 years shorter than that of a never smoker, heavy...

  1. Test Expectancy and Memory for Important Information

    Science.gov (United States)

    Middlebrooks, Catherine D.; Murayama, Kou; Castel, Alan D.

    2017-01-01

    Prior research suggests that learners study and remember information differently depending upon the type of test they expect to later receive. The current experiments investigate how testing expectations impact the study of and memory for valuable information. Participants studied lists of words ranging in value from 1 to 10 points with the goal…

  2. Expectation formation in dynamic market experiments

    NARCIS (Netherlands)

    Heemeijer, P.

    2009-01-01

    People often make mistakes when predicting economic variables such as prices. It is important to understand how these predictions are formed, since people's expectations have a large impact on the development and stability of economic systems. In this thesis the expectation formation of individuals

  3. Expectation of recovery from low back pain

    DEFF Research Database (Denmark)

    Kongsted, Alice; Vach, Werner; Axø, Marie

    2014-01-01

    Study Design. A prospective cohort study conducted in general practice (GP) and chiropractic practice (CP). Objectives. To explore which patient characteristics were associated with recovery expectations in low back pain (LBP) patients, whether expectations predicted 3-month outcome, and to what

  4. Perceptual grouping effects on cursor movement expectations.

    Science.gov (United States)

    Dorneich, Michael C; Hamblin, Christopher J; Lancaster, Jeff A; Olofinboba, Olu

    2014-05-01

    Two studies were conducted to develop an understanding of factors that drive user expectations when navigating between discrete elements on a display via a limited degree-of-freedom cursor control device. For the Orion Crew Exploration Vehicle spacecraft, a free-floating cursor with a graphical user interface (GUI) would require an unachievable level of accuracy due to expected acceleration and vibration conditions during dynamic phases of flight. Therefore, Orion program proposed using a "caged" cursor to "jump" from one controllable element (node) on the GUI to another. However, nodes are not likely to be arranged on a rectilinear grid, and so movements between nodes are not obvious. Proximity between nodes, direction of nodes relative to each other, and context features may all contribute to user cursor movement expectations. In an initial study, we examined user expectations based on the nodes themselves. In a second study, we examined the effect of context features on user expectations. The studies established that perceptual grouping effects influence expectations to varying degrees. Based on these results, a simple rule set was developed to support users in building a straightforward mental model that closely matches their natural expectations for cursor movement. The results will help designers of display formats take advantage of the natural context-driven cursor movement expectations of users to reduce navigation errors, increase usability, and decrease access time. The rules set and guidelines tie theory to practice and can be applied in environments where vibration or acceleration are significant, including spacecraft, aircraft, and automobiles.

  5. Subjective Expected Utility Theory with "Small Worlds"

    DEFF Research Database (Denmark)

    Gyntelberg, Jacob; Hansen, Frank

    which is a more general construction than a state space. We retain preference axioms similar in spirit to the Savage axioms and obtain, without abandoning linearity of expectations, a subjective expected utility theory which allows for an intuitive distinction between risk and uncertainty. We also...

  6. Expectations as a key element in trusting

    DEFF Research Database (Denmark)

    Rasmussen, Mette Apollo; Hansen, Uffe Kjærgaard; Conradsen, Maria Bosse

    Considering the need for a tangible focus for qualitative research on trusting, we propose that expectations of the behavior of others can provide that. By focusing on expectations, researchers can produce narrative descriptions that explain how trusting develops and changes. Then the key theore...

  7. Grief Experiences and Expectance of Suicide

    Science.gov (United States)

    Wojtkowiak, Joanna; Wild, Verena; Egger, Jos

    2012-01-01

    Suicide is generally viewed as an unexpected cause of death. However, some suicides might be expected to a certain extent, which needs to be further studied. The relationships between expecting suicide, feeling understanding for the suicide, and later grief experiences were explored. In total, 142 bereaved participants completed the Grief…

  8. International Variations in Measuring Customer Expectations.

    Science.gov (United States)

    Calvert, Philip J.

    2001-01-01

    Discussion of customer expectations of library service quality and SERVQUAL as a measurement tool focuses on two studies: one that compared a survey of Chinese university students' expectations of service quality to New Zealand students; and one that investigated national culture as a source of attitudes to customer service. (Author/LRW)

  9. Background sources and masks for Mark II detector at PEP

    International Nuclear Information System (INIS)

    Kadyk, J.

    1981-06-01

    The shielding masks currently in use in several of the experiments at PEP are the result of an early organized effort to understand the sources of particle background expected at PEP, followed by the evolution of the conceptual designs into actual hardware. The degree and kind of background particle loading that could be tolerated was expected to differ significantly among the experiments, and several designs emerged from the common study. Qualitatively, the types of radiation studied were synchrotron radiation (SR), beam-gas bremsstrahlung (BGB), and, to a limited extent, others such as electroproduction (EP). Calculations are given of predicted occupancies in the pipe counter and other sensitive elements at small radius, since these are the most susceptible to the SR and BGB backgrounds. The calculations presented in this note are specific to the Mark II detector. Some general statements are made first about the character of each type of background considered, followed by detailed calculations for the Mark II detector.
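
    As generic context for occupancy estimates of this kind, the probability that a counter registers at least one background hit during a readout gate follows directly from Poisson statistics. The sketch below is a minimal illustration with an assumed rate and gate width, not the Mark II or PEP values.

        import math

        def occupancy(rate_hz, gate_s):
            """Probability of at least one background hit in a readout gate,
            assuming hits arrive as a Poisson process (generic estimate,
            not the detector-specific calculation in the note)."""
            return 1.0 - math.exp(-rate_hz * gate_s)

        # Example: a 50 kHz single-element background rate and a 2 microsecond gate.
        print(f"{occupancy(5e4, 2e-6):.3%}")  # roughly 9.5% occupancy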

  10. Multilevel survival analysis of health inequalities in life expectancy

    Directory of Open Access Journals (Sweden)

    Merlo Juan

    2009-08-01

    Abstract. Background: The health status of individuals is determined by multiple factors operating at both micro and macro levels, and by their interactive effects. Measures of health inequalities should reflect such determinants explicitly through the sources of variation at each level, combining mean differences between groups with the variation between individuals, for the benefit of decision making and intervention planning. Measures derived recently from marginal models, such as beta-binomial and frailty survival models, address this issue to some extent but are limited in handling data with complex structures. Beta-binomial models are also limited in measuring inequalities of life expectancy (LE) directly. Methods: We propose a multilevel survival model analysis that estimates life expectancy based on survival time with censored data. The model explicitly disentangles total health inequalities into variance components of life expectancy attributable to individuals, households, parishes, and so on, and estimates group differences in inequalities at the same time. Adjusted distributions of life expectancy by gender and by household socioeconomic level are calculated. Relative and absolute health inequality indices are derived from the model estimates. The model-based analysis is illustrated on a large Swedish cohort of 22,680 men and 26,474 women aged 65–69 in 1970 and followed up for 30 years. Model-based inequality measures are compared with the conventional calculations. Results: Much of the variation in life expectancy is observed at the individual and household levels. Contextual effects at the parish and municipality levels are negligible. Women have longer life expectancy than men and lower inequality. There is marked inequality by level of household socioeconomic status, measured by the median life expectancy in each socioeconomic group and the variation in life expectancy within each group. Conclusion: Multilevel...
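
    The multilevel survival model itself needs specialised software, but the group-level life-expectancy comparison it feeds can be illustrated with a plain Kaplan-Meier estimate and the restricted mean survival time as a life-expectancy proxy. The sketch below uses simulated follow-up times for two hypothetical household socioeconomic groups; the data, group labels, and 30-year horizon are illustrative assumptions, not the Swedish cohort.

        import numpy as np

        def km_curve(time, event):
            """Kaplan-Meier survival estimate; time = follow-up years, event = 1 death / 0 censored."""
            time, event = np.asarray(time), np.asarray(event)
            points, s = [], 1.0
            for u in np.unique(time[event == 1]):
                at_risk = np.sum(time >= u)
                deaths = np.sum((time == u) & (event == 1))
                s *= 1.0 - deaths / at_risk
                points.append((u, s))
            return np.array(points)                      # columns: event time, S(t)

        def rmst(curve, horizon):
            """Restricted mean survival time: area under the step KM curve up to `horizon`."""
            t = np.clip(np.concatenate(([0.0], curve[:, 0], [horizon])), 0, horizon)
            s = np.concatenate(([1.0], curve[:, 1]))
            return float(np.sum(s * np.diff(t)))

        # Simulated follow-up (years) for two hypothetical household socioeconomic groups.
        rng = np.random.default_rng(0)
        low, high = rng.exponential(12, 500), rng.exponential(16, 500)
        events = np.ones(500, dtype=int)                 # treat all deaths as observed for simplicity
        le_low = rmst(km_curve(low, events), 30)
        le_high = rmst(km_curve(high, events), 30)
        print(f"absolute gap: {le_high - le_low:.1f} years, relative: {le_high / le_low:.2f}")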

  11. Development and Performance of Detectors for the Cryogenic Dark Matter Search Experiment with an Increased Sensitivity Based on a Maximum Likelihood Analysis of Beta Contamination

    Energy Technology Data Exchange (ETDEWEB)

    Driscoll, Donald D [Case Western Reserve Univ., Cleveland, OH (United States)

    2004-05-01

    The Cryogenic Dark Matter Search (CDMS) uses cryogenically cooled detectors made of germanium and silicon in an attempt to detect dark matter in the form of Weakly Interacting Massive Particles (WIMPs). The expected interaction rate of these particles is on the order of 1/kg/day, far below the 200/kg/day expected rate of background interactions after passive shielding and an active cosmic-ray muon veto. Our detectors are instrumented to make a simultaneous measurement of both the ionization energy and the thermal energy deposited by the interaction of a particle with the crystal substrate. A comparison of these two quantities allows for the rejection of the background of electromagnetically interacting particles at a level of better than 99.9%. The dominant remaining background at a depth of ~11 m below the surface comes from fast neutrons produced by cosmic-ray muons interacting in the rock surrounding the experiment. Contamination of our detectors by a beta emitter can add an unknown source of unrejected background. In the energy range of interest for a WIMP search, electrons have a short penetration depth and preferentially interact near the surface. Some of the ionization signal can be lost to the charge contacts there, and the decreased ionization signal relative to the thermal signal will cause a background event that interacts at the surface to be misidentified as a signal event. We can use information about the shape of the thermal signal pulse to discriminate against these surface events. Using a subset of our calibration set that contains a large fraction of electron events, we can characterize the expected behavior of surface events and construct a cut to remove them from our candidate signal events. This thesis describes the development of the 6 detectors (4 × 250 g Ge and 2 × 100 g Si) used in the 2001-2002 CDMS data run at the Stanford Underground Facility with a total of 119 live days of data. The preliminary results presented are based on the first use
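
    The pulse-shape cut described above can be illustrated with a toy calibration exercise: characterize a pulse-shape quantity for surface-like and bulk-like calibration events, then place a cut at a chosen bulk-event acceptance. The sketch below uses made-up Gaussian populations of a hypothetical "risetime" variable; it is not the maximum likelihood treatment or the detector data used in the thesis.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        # Hypothetical calibration samples of a pulse risetime parameter (microseconds);
        # surface (beta-like) events are assumed to rise faster than bulk events here.
        surface = rng.normal(6.0, 0.8, 5000)    # assumed surface-event population
        bulk    = rng.normal(9.0, 1.0, 20000)   # assumed bulk-event population

        # Characterize each population with a Gaussian fit (mean, sigma).
        mu_s, sig_s = norm.fit(surface)
        mu_b, sig_b = norm.fit(bulk)

        def surface_like(risetime):
            """Log-likelihood ratio: positive values favour the surface hypothesis."""
            return norm.logpdf(risetime, mu_s, sig_s) - norm.logpdf(risetime, mu_b, sig_b)

        # Place the cut so that 99% of bulk calibration events are retained.
        cut = np.quantile(surface_like(bulk), 0.99)
        rejected = np.mean(surface_like(surface) > cut)
        print(f"surface-event rejection at 99% bulk acceptance: {rejected:.1%}")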

  12. Assessment of Radiation Background Variation for Moving Detection Systems

    Energy Technology Data Exchange (ETDEWEB)

    Miller, James Christopher [Los Alamos National Laboratory; Rennie, John Alan [Los Alamos National Laboratory; Toevs, James Waldo [Los Alamos National Laboratory; Wallace, Darrin J. [Los Alamos National Laboratory; Abhold, Mark Edward [Los Alamos National Laboratory

    2015-07-13

    The introduction points out that radiation backgrounds fluctuate over very short distances: contributing factors include geology, soil composition, altitude, building structures, topography, and other manmade structures, and even asphalt and concrete can vary significantly over short distances. Brief descriptions are given of the detection system, the experimental setup, and the background variation measurements. It is concluded that positive and negative background gradients can greatly reduce the detection sensitivity of a mobile detection system (MDS): negative gradients create opportunities for false negatives (non-detection), while positive gradients create a potentially unacceptable false alarm rate (FAR, above 1%); the location where a mobile detector is used is therefore important to understand; spectroscopic systems provide more information for screening out false alarms and may be preferred for mobile use; and mobile monitor testing at LANL accounts for expected variations in the background.
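
    The effect of a background gradient on the false alarm rate can be illustrated with a simple Poisson counting sketch: set an alarm threshold from the flat-ground background, then see how shifting the true background changes the exceedance probability. The counting rate, integration time, and threshold below are illustrative assumptions, not the LANL test parameters.

        import numpy as np

        rng = np.random.default_rng(2)
        mean_bkg = 400.0                                  # counts per 1-s integration on flat ground (assumed)
        threshold = mean_bkg + 4 * np.sqrt(mean_bkg)      # alarm level set for the flat background

        def false_alarm_rate(gradient, n=100_000):
            """Fraction of integrations exceeding the alarm threshold when the true
            background is shifted by `gradient` counts (e.g., driving onto concrete)."""
            counts = rng.poisson(mean_bkg + gradient, n)
            return np.mean(counts > threshold)

        for g in (0, +40, -40):  # flat ground, positive gradient, negative gradient
            print(f"background shift {g:+4d} counts -> FAR {false_alarm_rate(g):.3%}")
        # A negative shift drives the FAR toward zero but also hides real sources (false negatives).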

  13. Memory for expectation-violating concepts

    DEFF Research Database (Denmark)

    Porubanova, Michaela; Shaw, Daniel; McKay, Ryan

    2014-01-01

    Previous research has shown that ideas which violate our expectations, such as schema-inconsistent concepts, enjoy privileged status in terms of memorability. In our study, memory for concepts that violate cultural (cultural schema-level) expectations (e.g., ‘‘illiterate teacher’’, ‘‘wooden bottle...... expectations and with intuitive concepts (e.g., ‘‘galloping pony’’, ‘‘drying orchid’’, or ‘‘convertible car’’), in both immediate recall, and delayed recognition tests. Importantly, concepts related to agents showed a memory advantage over concepts not pertaining to agents, but this was true only...... for expectation-violating concepts. Our results imply that intuitive, everyday concepts are equally attractive and memorable regardless of the presence or absence of agents. However, concepts that violate our expectations (cultural-schema or domain-level) are more memorable when pertaining to agents (humans...

  14. Whites but Not Blacks Gain Life Expectancy from Social Contacts

    Directory of Open Access Journals (Sweden)

    Shervin Assari

    2017-10-01

    Background. Recent research suggests that the health gain from economic resources and psychological assets may be systematically larger for Whites than for Blacks. Aim. This study aimed to assess whether the life expectancy gain associated with social contacts over a long follow-up differs for Blacks and Whites. Methods. Data came from the Americans' Changing Lives (ACL) Study, 1986–2011. The sample was a nationally representative sample of American adults aged 25 and older, who were followed for up to 25 years (n = 3361). The outcome was all-cause mortality. The main predictor was social contacts, defined as the number of regular visits with friends, relatives, and neighbors. Baseline demographics (age and gender), socioeconomic status (education, income, and employment), health behaviors (smoking and drinking), and health (chronic medical conditions, obesity, and depressive symptoms) were controlled. Race was the focal moderator. Cox proportional hazards models were used in the pooled sample and stratified by race. Results. More social contacts predicted higher life expectancy in the pooled sample. A significant interaction was found between race and social contacts, suggesting that the protective effect of more social contacts is smaller for Blacks than for Whites. In stratified models, more social contacts predicted increased life expectancy for Whites but not for Blacks. Conclusion. Social contacts increase life expectancy for White but not Black Americans. This study introduces social contacts as another social resource that differentially affects the health of Whites and Blacks.
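
    The moderation test described above (a race-by-social-contacts interaction in a Cox model) can be sketched with the lifelines package on toy data. The column names, simulated values, and interaction coding below are illustrative assumptions, not the ACL variables.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(3)
        n = 2000
        df = pd.DataFrame({
            "social_contacts": rng.integers(0, 10, n),        # visits with friends/relatives/neighbors
            "black": rng.integers(0, 2, n),                    # 1 = Black, 0 = White
            "age": rng.uniform(25, 80, n),
            "years": rng.exponential(20, n).clip(max=25),      # follow-up time, capped at 25 years
        })
        df["died"] = (rng.uniform(size=n) < 0.6).astype(int)   # event indicator (toy data)
        df["contacts_x_black"] = df["social_contacts"] * df["black"]  # the focal interaction term

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years", event_col="died")
        # A positive coefficient on contacts_x_black would indicate a weaker protective
        # effect of social contacts for Black respondents, mirroring the moderation test above.
        cph.print_summary()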

  15. Chemical Composition of Fine Particulate Matter and Life Expectancy

    Science.gov (United States)

    Dominici, Francesca; Wang, Yun; Correia, Andrew W.; Ezzati, Majid; Pope, C. Arden; Dockery, Douglas W.

    2016-01-01

    Background: In a previous study, we provided evidence that a decline in fine particulate matter (PM2.5) air pollution during the period between 2000 and 2007 was associated with increased life expectancy in 545 counties in the United States. In this article, we investigated which chemical constituents of PM2.5 were the main drivers of the observed association. Methods: We estimated associations between temporal changes in seven major components of PM2.5 (ammonium, sulfate, nitrate, elemental carbon matter, organic carbon matter, sodium, and silicon) and temporal changes in life expectancy in 95 counties between 2002 and 2007. We included US counties that had adequate chemical components of PM2.5 mass data across all seasons. We fitted single-pollutant and multiple-pollutant linear models, controlling for available socioeconomic, demographic, and smoking variables and stratifying by urban and nonurban counties. Results: In multiple-pollutant models, we found that: (1) a reduction in sulfate was associated with an increase in life expectancy; and (2) reductions in ammonium and sodium ion were associated with increases in life expectancy in nonurban counties only. Conclusions: Our findings suggest that recent reductions in long-term exposure to sulfate, ammonium, and sodium ion between 2002 and 2007 are associated with improved public health. PMID:25906366
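
    A multiple-pollutant model of this kind is a linear regression of county-level changes in life expectancy on changes in the PM2.5 components plus covariates, fitted separately by urban status. The sketch below uses synthetic data and placeholder variable names, not the study's 95-county file.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic county-level changes standing in for the analysis file.
        rng = np.random.default_rng(4)
        n = 95
        df = pd.DataFrame({
            "d_sulfate": rng.normal(-0.5, 0.3, n),
            "d_ammonium": rng.normal(-0.2, 0.2, n),
            "d_sodium": rng.normal(0.0, 0.1, n),
            "d_income": rng.normal(1.0, 0.5, n),
            "urban": rng.integers(0, 2, n),
        })
        # Toy outcome: life-expectancy change improves as sulfate falls.
        df["d_life_expectancy"] = 0.3 - 0.5 * df["d_sulfate"] + rng.normal(0, 0.2, n)

        # One multiple-pollutant model per stratum (urban vs nonurban).
        for label, grp in df.groupby("urban"):
            fit = smf.ols("d_life_expectancy ~ d_sulfate + d_ammonium + d_sodium + d_income",
                          data=grp).fit()
            print("urban" if label else "nonurban", fit.params.round(2).to_dict())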

  16. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d²], where t is time and d is dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
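
    For the simplest (acute, single-exposure) case the model reduces to a linear-quadratic mean yield per cell, y(d) = αd + βd², and the Poisson maximum likelihood fit can be sketched directly. The dose points, cell counts, and dicentric counts below are made-up numbers for illustration; the split-dose and continuous-exposure models add the time-dependent g(t, τ) factor and extra parameters but are fitted the same way.

        import numpy as np
        from scipy.optimize import minimize

        # Acute-exposure special case: expected dicentrics per cell y(d) = alpha*d + beta*d^2.
        dose   = np.array([0.5, 1.0, 2.0, 3.0, 4.0])     # Gy
        cells  = np.array([2000, 1500, 1000, 800, 600])   # cells scored (made-up numbers)
        dicent = np.array([52, 118, 285, 473, 630])       # dicentrics observed (made-up numbers)

        def neg_log_lik(params):
            alpha, beta = params
            mu = cells * (alpha * dose + beta * dose**2)  # expected counts per dose point
            if np.any(mu <= 0):
                return np.inf                             # keep the optimiser in the admissible region
            return np.sum(mu - dicent * np.log(mu))       # Poisson NLL up to a constant

        fit = minimize(neg_log_lik, x0=[0.01, 0.01], method="Nelder-Mead")
        alpha_hat, beta_hat = fit.x
        print(f"alpha ≈ {alpha_hat:.4f} /Gy, beta ≈ {beta_hat:.4f} /Gy^2")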

  17. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L.; DuFrain, R.J.

    1986-01-01

    In vitro dose-response curves are used to describe the relation between chromosome aberrations and radiation dose for human lymphocytes. The lymphocytes are exposed to low-LET radiation, and the resulting dicentric chromosome aberrations follow the Poisson distribution. The expected yield depends on both the magnitude and the temporal distribution of the dose. A general dose-response model that describes this relation has been presented by Kellerer and Rossi (1972, Current Topics on Radiation Research Quarterly 8, 85-158; 1978, Radiation Research 75, 471-488) using the theory of dual radiation action. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting dose-time-response models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described, and estimation for the nonlinear models is illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure

  18. Installation of the MAXIMUM microscope at the ALS

    International Nuclear Information System (INIS)

    Ng, W.; Perera, R.C.C.; Underwood, J.H.; Singh, S.; Solak, H.; Cerrina, F.

    1995-10-01

    The MAXIMUM scanning x-ray microscope, developed at the Synchrotron Radiation Center (SRC) at the University of Wisconsin-Madison, was installed on the Advanced Light Source (ALS) in August 1995. The microscope's initial operation at SRC successfully demonstrated the use of a multilayer-coated Schwarzschild objective for focusing 130 eV x-rays to a spot size of better than 0.1 micron with an electron energy resolution of 250 meV. The performance of the microscope was severely limited, however, by the relatively low brightness of SRC, which limits the available flux at the focus of the microscope. The high brightness of the ALS is expected to increase the usable flux at the sample by a factor of 1,000. The authors report on the installation of the microscope on bending-magnet beamline 6.3.2 at the ALS and the initial measurement of its optical performance on the new source; preliminary experiments on the surface chemistry of HF-etched Si are also described.

  19. Maximum likelihood estimation for cytogenetic dose-response curves

    Energy Technology Data Exchange (ETDEWEB)

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ(γd + g(t, τ)d²), where t is time and d is dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.

  20. Feedback Limits to Maximum Seed Masses of Black Holes

    International Nuclear Information System (INIS)

    Pacucci, Fabio; Natarajan, Priyamvada; Ferrara, Andrea

    2017-01-01

    The most massive black holes observed in the universe weigh up to ∼10^10 M_⊙, nearly independent of redshift. Reaching these final masses likely required copious accretion and several major mergers. Employing a dynamical approach that rests on the role played by a new, relevant physical scale (the transition radius), we provide a theoretical calculation of the maximum mass achievable by a black hole seed that forms in an isolated halo, one that scarcely merged. Incorporating effects at the transition radius and their impact on the evolution of accretion in isolated halos, we are able to obtain new limits for permitted growth. We find that large black hole seeds (M_• ≳ 10^4 M_⊙) hosted in small isolated halos (M_h ≲ 10^9 M_⊙) accreting with relatively small radiative efficiencies (ϵ ≲ 0.1) grow optimally in these circumstances. Moreover, we show that the standard M_•–σ relation observed at z ∼ 0 cannot be established in isolated halos at high z, but requires the occurrence of mergers. Since the average limiting mass of black holes formed at z ≳ 10 is in the range 10^4–10^6 M_⊙, we expect to observe them in local galaxies as intermediate-mass black holes, when hosted in the rare halos that experienced only minor or no merging events. Such ancient black holes, formed in isolation with subsequent scant growth, could survive, almost unchanged, until present.
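
    For scale, a generic Eddington-limited growth estimate (not the transition-radius feedback limit derived in the paper) shows why the radiative efficiency ϵ controls how fast a seed can grow, and why unrestricted Eddington accretion would far overshoot the quoted 10^4–10^6 M_⊙ limiting range; the seed mass, time span, and efficiencies below are illustrative choices.

        import numpy as np

        T_EDD_GYR = 0.45  # Eddington timescale sigma_T*c / (4*pi*G*m_p), roughly 450 Myr

        def eddington_growth(m_seed, t_gyr, eps):
            """Mass after continuous Eddington-limited accretion at radiative efficiency eps:
            M(t) = M_seed * exp[(1 - eps) / eps * t / t_Edd]  (generic textbook estimate)."""
            return m_seed * np.exp((1.0 - eps) / eps * t_gyr / T_EDD_GYR)

        # A 10^4 M_sun seed accreting continuously for 0.5 Gyr at two efficiencies:
        for eps in (0.06, 0.1):
            print(f"eps = {eps:>4}: M ≈ {eddington_growth(1e4, 0.5, eps):.1e} M_sun")
        # The huge outputs illustrate that sustained Eddington accretion cannot persist;
        # feedback limits of the kind computed in the paper cap the growth much earlier.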