WorldWideScience

Sample records for correctly predicts basic

  1. Publisher Correction: Predicting unpredictability

    Davis, Steven J.

    2018-06-01

    In this News & Views article originally published, the wrong graph was used for panel b of Fig. 1, and the numbers on the y axes of panels a and c were incorrect; the original and corrected Fig. 1 is shown below. This has now been corrected in all versions of the News & Views.

  2. A program in BASIC for calculation of cavity theory corrections

    Bugge Christensen, E.; Miller, A.

    1982-05-01

    A program in BASIC for a desk-top calculator HP 9830A is described. The program allows calculation of cavity theory corrections according to Burlin's general cavity theory. The calculations are made by using tabulated values for stopping powers and energy absorption coefficients, stored either as coefficients to a fitted polynomial or as the actual table data. (author)
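
    For orientation, the Burlin weighting such a program evaluates can be written in a few lines of modern code. This is a hedged sketch, not the HP 9830A BASIC program itself: the form f = d·s̄ + (1−d)·(μen/ρ) with d = (1−e^(−βL))/(βL) follows standard presentations of Burlin's general cavity theory, and all numerical inputs below are illustrative rather than tabulated data.

    ```python
    import math

    def burlin_factor(s_bar, mu_en_ratio, beta, L):
        """Cavity-to-medium dose ratio after Burlin's general cavity theory.
        s_bar       -- mean mass collision stopping-power ratio, cavity/medium
        mu_en_ratio -- mass energy-absorption coefficient ratio, cavity/medium
        beta        -- effective electron attenuation coefficient (1/cm)
        L           -- mean chord length of the cavity, 4*V/S (cm)
        """
        d = (1.0 - math.exp(-beta * L)) / (beta * L)  # ~1 for small cavities, ~0 for large ones
        return d * s_bar + (1.0 - d) * mu_en_ratio

    # Illustrative values only; the original program read tabulated stopping powers
    # and energy-absorption coefficients instead:
    print(burlin_factor(s_bar=1.02, mu_en_ratio=0.98, beta=12.0, L=0.05))
    ```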

  3. Neural networks to predict exosphere temperature corrections

    Choury, Anna; Bruinsma, Sean; Schaeffer, Philippe

    2013-10-01

    Precise orbit prediction requires a forecast of the atmospheric drag force with a high degree of accuracy. Artificial neural networks are universal approximators derived from artificial intelligence and are widely used for prediction. This paper presents an artificial neural network method for predicting thermosphere density by forecasting exospheric temperature, which will be used by the semiempirical thermosphere Drag Temperature Model (DTM) currently under development. Artificial neural networks have been shown to be effective and robust forecasting models for temperature prediction. The proposed model can be used for any mission from which temperature can be deduced accurately, i.e., it does not require mission-specific training. Although the primary goal of the study was to create a model for 1-day-ahead forecasts, the proposed architecture has been generalized to 2- and 3-day predictions as well. The impact of the artificial neural network predictions was quantified for the low-orbiting satellite Gravity Field and Steady-State Ocean Circulation Explorer in 2011, and an order of magnitude smaller orbit errors were found when compared with orbits propagated using the thermosphere model DTM2009.
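
    A minimal sketch of this kind of forecasting setup, using scikit-learn's MLPRegressor on synthetic data; the choice of inputs (solar flux F10.7, geomagnetic index Kp, current temperature) and all numbers are assumptions for illustration, not the authors' DTM-based configuration.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    # Toy inputs: F10.7 solar flux, Kp index, and today's exospheric temperature (K)
    X = rng.uniform([70, 0, 700], [250, 9, 1400], size=(1000, 3))
    # Toy target: exospheric temperature one day ahead
    y = 0.8 * X[:, 2] + 3.0 * X[:, 0] + 20.0 * X[:, 1] + rng.normal(0, 15, 1000)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
    model.fit(X[:800], y[:800])
    print("held-out R^2:", model.score(X[800:], y[800:]))
    ```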

  4. Hydrogen Bond Basicity Prediction for Medicinal Chemistry Design.

    Kenny, Peter W; Montanari, Carlos A; Prokopczyk, Igor M; Ribeiro, Jean F R; Sartori, Geraldo Rodrigues

    2016-05-12

    Hydrogen bonding is discussed in the context of medicinal chemistry design. Minimized molecular electrostatic potential (Vmin) is shown to be an effective predictor of hydrogen bond basicity (pKBHX), and predictive models are presented for a number of hydrogen bond acceptor types relevant to medicinal chemistry. The problems posed by the presence of nonequivalent hydrogen bond acceptor sites in molecular structures are addressed by using nonlinear regression to fit measured pKBHX to calculated Vmin. Predictions are made for hydrogen bond basicity of fluorine in situations where relevant experimental measurements are not available. It is shown how predicted pKBHX can be used to provide insight into the nature of bioisosterism and to profile heterocycles. Examples of pKBHX prediction for molecular structures with multiple, nonequivalent hydrogen bond acceptors are presented.
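
    To make the regression step concrete, here is a hedged sketch of fitting pKBHX against calculated Vmin with scipy; the linear form and the site-summation rule for nonequivalent acceptors are illustrative assumptions, and all data points are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    vmin = np.array([-45.0, -52.3, -38.1, -60.2, -49.5])   # kcal/mol, illustrative
    pkbhx = np.array([1.1, 1.8, 0.5, 2.5, 1.5])            # measured, illustrative

    def linear(v, a, b):
        return a * v + b

    (a, b), _ = curve_fit(linear, vmin, pkbhx)
    print(f"pKBHX = {a:.3f} * Vmin + {b:.2f}")

    # One plausible way to combine nonequivalent acceptor sites in a molecule:
    # treat the observed constant as the sum of per-site constants.
    def pkbhx_molecule(vmins, a, b):
        return np.log10(np.sum(10.0 ** linear(np.asarray(vmins), a, b)))

    print(pkbhx_molecule([-45.0, -52.3], a, b))
    ```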

  5. Innovation in prediction planning for anterior open bite correction.

    Almuzian, Mohammed; Almukhtar, Anas; O'Neil, Michael; Benington, Philip; Al Anezi, Thamer; Ayoub, Ashraf

    2015-05-01

    This study applies recent advances in 3D virtual imaging for application in the prediction planning of dentofacial deformities. Stereo-photogrammetry has been used to create virtual and physical models, which are creatively combined in planning the surgical correction of anterior open bite. The application of these novel methods is demonstrated through the surgical correction of a case.

  6. The Effect of Corrective Feedback on Performance in Basic Cognitive Tasks: An Analysis of RT Components

    Carmen Moret-Tatay

    2016-12-01

    The current work examines the effect of trial-by-trial feedback about correct and error responding on performance in two basic cognitive tasks: a classic Stroop task (n = 40) and a color-word matching task (n = 30). Standard measures of both RT and accuracy were examined in addition to measures obtained from fitting the ex-Gaussian distributional model to the correct RTs. For both tasks, RTs were faster in blocks of trials with feedback than in blocks without feedback, but this difference was not significant. On the other hand, with respect to the distributional analyses, providing feedback served to significantly reduce the size of the tails of the RT distributions. Such results suggest that, for conditions in which accuracy is fairly high, the effect of corrective feedback might either be to reduce the tendency to double-check before responding or to decrease the amount of attentional lapsing.
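
    The ex-Gaussian analysis referred to above can be reproduced with scipy's exponnorm distribution (parameterized by K = τ/σ); the simulated reaction times below are placeholders for real data.

    ```python
    import numpy as np
    from scipy.stats import exponnorm

    rng = np.random.default_rng(1)
    # Synthetic correct RTs (ms): Gaussian component plus exponential tail
    rts = rng.normal(450, 40, 500) + rng.exponential(120, 500)

    K, mu, sigma = exponnorm.fit(rts)   # maximum-likelihood fit
    tau = K * sigma                     # exponential (tail) parameter
    print(f"mu={mu:.0f} ms, sigma={sigma:.0f} ms, tau={tau:.0f} ms")
    # A feedback-related reduction of the distribution tail would show up as smaller tau.
    ```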

  7. Hypothesis, Prediction, and Conclusion: Using Nature of Science Terminology Correctly

    Eastwell, Peter

    2012-01-01

    This paper defines the terms "hypothesis," "prediction," and "conclusion" and shows how to use the terms correctly in scientific investigations in both the school and science education research contexts. The scientific method, or hypothetico-deductive (HD) approach, is described and it is argued that an understanding of the scientific method,…

  8. Basic prediction techniques in modern video coding standards

    Kim, Byung-Gyu

    2016-01-01

    This book discusses in detail the basic algorithms of video compression that are widely used in modern video codec. The authors dissect complicated specifications and present material in a way that gets readers quickly up to speed by describing video compression algorithms succinctly, without going to the mathematical details and technical specifications. For accelerated learning, hybrid codec structure, inter- and intra- prediction techniques in MPEG-4, H.264/AVC, and HEVC are discussed together. In addition, the latest research in the fast encoder design for the HEVC and H.264/AVC is also included.

  9. Basic disturbances of information processing in psychosis prediction.

    Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan

    2013-01-01

    The basic symptoms (BS) approach provides a valid instrument in predicting psychosis onset and represents moreover a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims at exploring disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography (EEG), as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, thus warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction in allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigations. Thereby, a possible time sequence in the prodromal phase might be of particular interest.

  10. Incorporation of QCD effects in basic corrections of the electroweak theory

    Fanchiotti, Sergio; Kniehl, Bernd; Sirlin, Alberto

    1993-01-01

    We study the incorporation of QCD effects in the basic electroweak corrections $\Delta\hat{r}$, $\Delta\hat{r}_W$, and $\Delta r$. They include perturbative $O(\alpha\alpha_s)$ contributions and $t\bar{t}$ threshold effects. The latter are studied in the resonance and Green-function approaches, in the framework of dispersion relations that automatically satisfy relevant Ward identities. Refinements in the treatment of the electroweak corrections, in both the $\overline{\rm MS}$ and the on-shell schemes of renormalization, are introduced, including the decoupling of the top quark in certain amplitudes, its effect on $\hat{e}^2(M_Z)$ and $\sin^2\hat{\theta}_W(M_Z)$, the incorporation of recent results on the leading irreducible $O(\alpha^2)$ corrections, and simple expressions for the residual, i.e. "non-electromagnetic", parts of $\Delta\hat{r}$, $\Delta\hat{r}_W$, and $\Delta r$. The results are used to obtain accurate values for $M_W$ and $\sin^2\hat{\theta}_W(M_Z)$, as functions of $m_t$ and $M_H$. The higher-order effects induce shifts in these parameters comparable to the expected experimental accuracy, a...

  11. Measurement Error Correction for Predicted Spatiotemporal Air Pollution Exposures.

    Keller, Joshua P; Chang, Howard H; Strickland, Matthew J; Szpiro, Adam A

    2017-05-01

    Air pollution cohort studies are frequently analyzed in two stages, first modeling exposure then using predicted exposures to estimate health effects in a second regression model. The difference between predicted and unobserved true exposures introduces a form of measurement error in the second stage health model. Recent methods for spatial data correct for measurement error with a bootstrap and by requiring that the study design ensure spatial compatibility, that is, monitor and subject locations are drawn from the same spatial distribution. These methods have not previously been applied to spatiotemporal exposure data. We analyzed the association between fine particulate matter (PM2.5) and birth weight in the US state of Georgia using records with estimated date of conception during 2002-2005 (n = 403,881). We predicted trimester-specific PM2.5 exposure using a complex spatiotemporal exposure model. To improve spatial compatibility, we restricted to mothers residing in counties with a PM2.5 monitor (n = 180,440). We accounted for additional measurement error via a nonparametric bootstrap. Third trimester PM2.5 exposure was associated with lower birth weight in the uncorrected (-2.4 g per 1 μg/m³ difference in exposure; 95% confidence interval [CI]: -3.9, -0.8) and bootstrap-corrected (-2.5 g, 95% CI: -4.2, -0.8) analyses. Results for the unrestricted analysis were attenuated (-0.66 g, 95% CI: -1.7, 0.35). This study presents a novel application of measurement error correction for spatiotemporal air pollution exposures. Our results demonstrate the importance of spatial compatibility between monitor and subject locations and provide evidence of the association between air pollution exposure and birth weight.
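
    A schematic version of such a two-stage bootstrap (details differ from the authors' spatiotemporal implementation): both the exposure model and the health model are re-fit on each resample so that exposure-prediction uncertainty propagates into the confidence interval for the health effect. All models and data here are toy stand-ins.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    beta_true = np.array([1.0, 2.0, 0.5])
    monitor_X = rng.normal(size=(60, 3))                       # covariates at monitor sites
    monitor_y = monitor_X @ beta_true + rng.normal(0, 0.3, 60) # measured exposure
    subject_X = rng.normal(size=(500, 3))                      # covariates at subject homes
    outcome = -2.5 * (subject_X @ beta_true) + rng.normal(0, 5, 500)

    def fit_exposure_model(X, y):
        return np.linalg.lstsq(X, y, rcond=None)[0]            # toy linear exposure surface

    def fit_health_model(exposure, outcome):
        X = np.column_stack([np.ones_like(exposure), exposure])
        return np.linalg.lstsq(X, outcome, rcond=None)[0][1]   # health-effect slope

    def bootstrap_ci(B=500):
        est = []
        for _ in range(B):
            m = rng.integers(0, 60, 60)      # resample monitors -> exposure-model error
            s = rng.integers(0, 500, 500)    # resample subjects
            beta = fit_exposure_model(monitor_X[m], monitor_y[m])
            est.append(fit_health_model(subject_X[s] @ beta, outcome[s]))
        return np.percentile(est, [2.5, 97.5])

    print("bootstrap 95% CI for the exposure effect:", bootstrap_ci())
    ```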

  12. "When does making detailed predictions make predictions worse?": Correction to Kelly and Simmons (2016).

    2016-10-01

    Reports an error in "When Does Making Detailed Predictions Make Predictions Worse" by Theresa F. Kelly and Joseph P. Simmons (Journal of Experimental Psychology: General, Advance Online Publication, Aug 8, 2016, np). In the article, the symbols in Figure 2 were inadvertently altered in production. All versions of this article have been corrected. (The following abstract of the original article appeared in record 2016-37952-001.) In this article, we investigate whether making detailed predictions about an event worsens other predictions of the event. Across 19 experiments, 10,896 participants, and 407,045 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes useless or redundant information more accessible and thus more likely to be incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of events will and will not be susceptible to the negative effect of making detailed predictions. PsycINFO Database Record (c) 2016 APA, all rights reserved

  13. Correction

    Pinkevych, Mykola; Cromer, Deborah; Tolstrup, Martin

    2016-01-01

    [This corrects the article DOI: 10.1371/journal.ppat.1005000.] [This corrects the article DOI: 10.1371/journal.ppat.1005740.] [This corrects the article DOI: 10.1371/journal.ppat.1005679.]

  14. Basic Modelling principles and Validation of Software for Prediction of Collision Damage

    Simonsen, Bo Cerup

    2000-01-01

    This report describes basic modelling principles, the theoretical background and validation examples for the collision damage prediction module in the ISESO stand-alone software.

  15. BASIC

    Hansen, Pelle Guldborg; Schmidt, Karsten

    2017-01-01

    Over the past 10 years we have witnessed the emergence of a new evidence-based policy paradigm, Behavioural Public Policy (BPP), which seeks to integrate theoretical and methodological insights from the behavioural sciences into public policy development. Work on BPP has, however, been characterized by being unsystematic... BPP. The approach consists partly of the overarching process model BASIC and partly of an embedded framework, ABCD, which is a model for systematic behavioural analysis, development, testing and implementation of behaviourally based solution concepts. The combined model makes it possible for researchers as well as public servants...

  16. Neither Basic Life Support knowledge nor self-efficacy are predictive of skills among dental students.

    Mac Giolla Phadraig, C; Ho, J D; Guerin, S; Yeoh, Y L; Mohamed Medhat, M; Doody, K; Hwang, S; Hania, M; Boggs, S; Nolan, A; Nunn, J

    2017-08-01

    Basic life support (BLS) is considered a core competence for the graduating dentist. This study aimed to measure BLS knowledge, self-efficacy and skills of undergraduate dental students in Dublin. This study consisted of a cross-sectional survey measuring BLS knowledge and self-efficacy, accompanied by a directly observed BLS skills assessment in a subsample of respondents. Data were collected in January 2014. Bivariate correlations between descriptive and outcome variables (knowledge, self-efficacy and skills) were tested using Pearson's chi-square. We included knowledge and self-efficacy as predictor variables, along with other variables showing association, into a binary logistic regression model with BLS skills as the outcome measure. One hundred and thirty-five students participated. Almost all (n = 133, 98.5%) participants had BLS training within the last 2 years. One hundred and four (77%) felt that they were capable of providing effective BLS (self-efficacy), whilst only 46 (34.1%) scored >80% of knowledge items correct. Amongst the skills (n = 85) subsample, 38.8% (n = 33) were found to pass the BLS skills assessment. Controlling for gender, age and skills assessor, the regression model did not identify a predictive relationship between knowledge or self-efficacy and BLS skills. Neither knowledge nor self-efficacy was predictive of BLS skills. Dental students had low levels of knowledge and skills in BLS. Despite this, their confidence in their ability to perform BLS was high and did not predict actual competence. There is a need for additional hands-on training, focusing on self-efficacy and BLS skills, particularly the use of AEDs. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Genomic Signal Processing: Predicting Basic Molecular Biological Principles

    Alter, Orly

    2005-03-01

    Advances in high-throughput technologies enable acquisition of different types of molecular biological data, monitoring the flow of biological information as DNA is transcribed to RNA, and RNA is translated to proteins, on a genomic scale. Future discovery in biology and medicine will come from the mathematical modeling of these data, which hold the key to fundamental understanding of life on the molecular level, as well as answers to questions regarding diagnosis, treatment and drug development. Recently we described data-driven models for genome-scale molecular biological data, which use singular value decomposition (SVD) and the comparative generalized SVD (GSVD). Now we describe an integrative data-driven model, which uses pseudoinverse projection (1). We also demonstrate the predictive power of these matrix algebra models (2). The integrative pseudoinverse projection model formulates any number of genome-scale molecular biological data sets in terms of one chosen set of data samples, or of profiles extracted mathematically from data samples, designated the "basis" set. The mathematical variables of this integrative model, the pseudoinverse correlation patterns that are uncovered in the data, represent independent processes and corresponding cellular states (such as observed genome-wide effects of known regulators or transcription factors, the biological components of the cellular machinery that generate the genomic signals, and measured samples in which these regulators or transcription factors are over- or underactive). Reconstruction of the data in the basis simulates experimental observation of only the cellular states manifest in the data that correspond to those of the basis. Classification of the data samples according to their reconstruction in the basis, rather than their overall measured profiles, maps the cellular states of the data onto those of the basis, and gives a global picture of the correlations and possibly also causal coordination of
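
    The pseudoinverse projection itself is compact; a minimal sketch with random stand-ins for genome-scale data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    basis = rng.normal(size=(5000, 6))   # genes x basis samples (e.g., known-regulator profiles)
    data  = rng.normal(size=(5000, 20))  # genes x measured samples

    coeff = np.linalg.pinv(basis) @ data   # correlation patterns of each sample in the basis
    reconstruction = basis @ coeff         # data reconstructed in the basis
    residual = np.linalg.norm(data - reconstruction) / np.linalg.norm(data)
    print(f"fraction of data not captured by the basis: {residual:.2f}")
    ```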

  18. Assessing the implementation of bias correction in the climate prediction

    Nadrah Aqilah Tukimat, Nurul

    2018-04-01

    Climate change has become an increasingly pressing and erratic issue. The steadily growing emission of greenhouse gases (GHGs) into the atmosphere strongly affects weather variability and global warming, so it is important to analyse changes in climate parameters over the long term. However, the accuracy of climate simulations is always questioned, as it controls the reliability of the projection results. Thus, Linear Scaling (LS) was applied as a bias correction (BC) method to treat the gaps between observed and simulated results. Two rainfall stations in Pahang state were selected: Station Lubuk Paku and Station Temerloh. The Statistical Downscaling Model (SDSM) was used to relate local weather to atmospheric parameters in projecting the long-term rainfall trend. The results revealed that LS successfully reduced the error by up to 3% and produced better simulated climate results.
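
    The standard monthly form of Linear Scaling is easy to state in code: simulated rainfall is multiplied by the ratio of observed to simulated long-term monthly means (an additive version is typically used for temperature). A sketch under these assumptions, not necessarily the authors' exact implementation.

    ```python
    import numpy as np

    def linear_scaling(obs, sim, sim_future, months, months_future):
        """Multiplicative LS for rainfall; months arrays hold the month (1-12) of each value."""
        corrected = np.asarray(sim_future, dtype=float).copy()
        for m in range(1, 13):
            factor = obs[months == m].mean() / sim[months == m].mean()
            corrected[months_future == m] *= factor
        return corrected

    rng = np.random.default_rng(6)
    months = np.tile(np.arange(1, 13), 30)            # 30 years of monthly rainfall
    obs = rng.gamma(2.0, 5.0, months.size)
    sim = 0.7 * rng.gamma(2.0, 5.0, months.size)      # model biased low
    print(linear_scaling(obs, sim, sim, months, months)[:6])
    ```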

  19. A hybrid approach to advancing quantitative prediction of tissue distribution of basic drugs in human

    Poulin, Patrick; Ekins, Sean; Theil, Frank-Peter

    2011-01-01

    A general toxicity of basic drugs is related to phospholipidosis in tissues. Therefore, it is essential to predict the tissue distribution of basic drugs to facilitate an initial estimate of that toxicity. The objective of the present study was to further assess the original prediction method that consisted of using the binding to red blood cells measured in vitro for the unbound drug (RBCu) as a surrogate for tissue distribution, by correlating it to unbound tissue:plasma partition coefficients (Kpu) of several tissues, and finally to predict volume of distribution at steady-state (Vss) in humans under in vivo conditions. This correlation method demonstrated inaccurate predictions of Vss for particular basic drugs that did not follow the original correlation principle. Therefore, the novelty of this study is to provide clarity on the actual hypotheses to identify i) the impact of pharmacological mode of action on the generic correlation of RBCu-Kpu, ii) additional mechanisms of tissue distribution for the outlier drugs, iii) molecular features and properties that differentiate compounds as outliers in the original correlation analysis in order to facilitate its applicability domain alongside the properties already used so far, and finally iv) to present a novel and refined correlation method that is superior to what has been previously published for the prediction of human Vss of basic drugs. Applying a refined correlation method after identifying outliers would facilitate the prediction of more accurate distribution parameters as key inputs used in physiologically based pharmacokinetic (PBPK) and phospholipidosis models.

  20. A two-dimensional matrix correction for off-axis portal dose prediction errors

    Bailey, Daniel W.; Kumaraswamy, Lalith; Bakhtiari, Mohammad; Podgorsak, Matthew B.

    2013-01-01

    Purpose: This study presents a follow-up to a modified calibration procedure for portal dosimetry published by Bailey et al. [“An effective correction algorithm for off-axis portal dosimetry errors,” Med. Phys. 36, 4089–4094 (2009)]. A commercial portal dose prediction system exhibits disagreement of up to 15% (calibrated units) between measured and predicted images as off-axis distance increases. The previous modified calibration procedure accounts for these off-axis effects in most regions of the detecting surface, but is limited by the simplistic assumption of radial symmetry. Methods: We find that a two-dimensional (2D) matrix correction, applied to each calibrated image, accounts for off-axis prediction errors in all regions of the detecting surface, including those still problematic after the radial correction is performed. The correction matrix is calculated by quantitative comparison of predicted and measured images that span the entire detecting surface. The correction matrix was verified for dose-linearity, and its effectiveness was verified on a number of test fields. The 2D correction was employed to retrospectively examine 22 off-axis, asymmetric electronic-compensation breast fields, five intensity-modulated brain fields (moderate-high modulation) manipulated for far off-axis delivery, and 29 intensity-modulated clinical fields of varying complexity in the central portion of the detecting surface. Results: Employing the matrix correction to the off-axis test fields and clinical fields, predicted vs measured portal dose agreement improves by up to 15%, producing up to 10% better agreement than the radial correction in some areas of the detecting surface. Gamma evaluation analyses (3 mm, 3% global, 10% dose threshold) of predicted vs measured portal dose images demonstrate pass rate improvement of up to 75% with the matrix correction, producing pass rates that are up to 30% higher than those resulting from the radial correction technique alone. As
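
    Conceptually, the matrix correction reduces to an element-wise ratio of measured to predicted calibration images, applied per pixel; a toy sketch below (synthetic arrays stand in for real portal-dose images, and a clinical implementation would average over many calibration fields spanning the detector).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    predicted_cal = rng.uniform(0.9, 1.1, (384, 512))                  # predicted calibration image
    measured_cal = predicted_cal * (1 + 0.1 * np.linspace(0, 1, 512))  # off-axis disagreement

    correction = measured_cal / predicted_cal   # 2D matrix; no radial-symmetry assumption

    def correct(predicted_image):
        """Apply the per-pixel correction to a predicted portal dose image."""
        return predicted_image * correction
    ```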

  1. Predictive validity of the comprehensive basic science examination mean score for assessment of medical students' performance

    Firouz Behboudi

    2002-04-01

    Background: Medical education curriculum improvements can be achieved by evaluating students' performance. Medical students in Iran have to pass two undergraduate comprehensive examinations, basic science and preinternship. Purpose: To measure the validity of the students' mean score in the comprehensive basic science exam (CBSE) for predicting their performance in later curriculum phases. Methods: This descriptive cross-sectional study was conducted on 95 (38 women and 55 men) Guilan medical university students. Their admission to the university was 81% by regional quota and 12% by shaheed and other organizations' share. They first enrolled in 1994 and were able to pass the CBSE at first try. Data on gender, regional quota, and average grades of CBSE, PC, and CPIE were collected by a questionnaire. The calculations were done with the SPSS package. Results: The correlation coefficient between CBSE and CPIE mean scores (0.65) was higher than the correlation coefficient between CBSE and PC mean scores (0.49). The predictive validity of the CBSE average grade was significant for students' performance in CPIE; however, the predictive validity of CBSE mean scores for students' performance in PC was lower. Conclusion: The students' mean score in CBSE can be a good denominator for their further admission. We recommend further research to assess the predictive validity for each one of the basic courses. Keywords: predictive validity, comprehensive basic exam

  2. Correction

    2002-01-01

    The photo on the first page of the Bulletin n°26/2002, from 24 July 2002, illustrating the article «The ATLAS Tile Calorimeter gets into shape» was published with a wrong caption. We would like to apologise for this mistake and so publish it again with the correct caption: Tile Calorimeter modules stored at CERN. The larger modules belong to the Barrel, whereas the smaller ones are for the two Extended Barrels. (The article was about the completion of the 64 modules for one of the latter.)

  3. The Role of Basic Needs Fulfillment in Prediction of Subjective Well-Being among University Students

    Turkdogan, Turgut; Duru, Erdinc

    2012-01-01

    The aim of this study is to examine the role of fulfillment level of university students' basic needs in predicting the level of their subjective well being. The participants were 627 students (56% female, 44% male) attending different faculties of Pamukkale University. In this study, subjective well being was measured with Life Satisfaction Scale…

  4. A First-order Prediction-Correction Algorithm for Time-varying (Constrained) Optimization: Preprint

    Dall-Anese, Emiliano [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Simonetto, Andrea [Université catholique de Louvain]

    2017-07-25

    This paper focuses on the design of online algorithms based on prediction-correction steps to track the optimal solution of a time-varying constrained problem. Existing prediction-correction methods have been shown to work well for unconstrained convex problems and for settings where obtaining the inverse of the Hessian of the cost function can be computationally affordable. The prediction-correction algorithm proposed in this paper addresses the limitations of existing methods by tackling constrained problems and by designing a first-order prediction step that relies on the Hessian of the cost function (and does not require the computation of its inverse). Analytical results are established to quantify the tracking error. Numerical simulations corroborate the analytical results and showcase performance and benefits of the algorithms.
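
    A toy scalar version of the prediction-correction idea for f(x; t) = 0.5(x − sin t)²: the prediction step uses the cross-derivative of the gradient and the Hessian, and the correction is a single gradient step at the new sample time. This illustrates the general scheme, not the paper's specific first-order design.

    ```python
    import math

    h, alpha = 0.1, 0.8   # sampling interval and correction step size
    x = 0.0               # tracked estimate of the time-varying optimizer sin(t)

    for k in range(50):
        t_next = (k + 1) * h
        # Prediction: x - h * Hess^{-1} * d(grad f)/dt; here Hess = 1, d(grad)/dt = -cos t
        x = x + h * math.cos(k * h)
        # Correction: one gradient step on f(.; t_next), grad f = x - sin(t_next)
        x = x - alpha * (x - math.sin(t_next))

    print(f"tracking error at t={t_next:.1f}: {abs(x - math.sin(t_next)):.4f}")
    ```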

  5. Correction

    2012-01-01

    Regarding Gorelik, G., & Shackelford, T.K. (2011). Human sexual conflict from molecules to culture. Evolutionary Psychology, 9, 564–587: The authors wish to correct an omission in citation to the existing literature. In the final paragraph on p. 570, we neglected to cite Burch and Gallup (2006) [Burch, R. L., & Gallup, G. G., Jr. (2006). The psychobiology of human semen. In S. M. Platek & T. K. Shackelford (Eds.), Female infidelity and paternal uncertainty (pp. 141–172). New York: Cambridge University Press.]. Burch and Gallup (2006) reviewed the relevant literature on FSH and LH discussed in this paragraph, and should have been cited accordingly. In addition, Burch and Gallup (2006) should have been cited as the originators of the hypothesis regarding the role of FSH and LH in the semen of rapists. The authors apologize for this oversight.

  6. Correction

    2002-01-01

    The photo on the second page of the Bulletin n°48/2002, from 25 November 2002, illustrating the article «Spanish Visit to CERN» was published with a wrong caption. We would like to apologise for this mistake and so publish it again with the correct caption.   The Spanish delegation, accompanied by Spanish scientists at CERN, also visited the LHC superconducting magnet test hall (photo). From left to right: Felix Rodriguez Mateos of CERN LHC Division, Josep Piqué i Camps, Spanish Minister of Science and Technology, César Dopazo, Director-General of CIEMAT (Spanish Research Centre for Energy, Environment and Technology), Juan Antonio Rubio, ETT Division Leader at CERN, Manuel Aguilar-Benitez, Spanish Delegate to Council, Manuel Delfino, IT Division Leader at CERN, and Gonzalo León, Secretary-General of Scientific Policy to the Minister.

  7. Correction

    2014-01-01

    Regarding Tagler, M. J., and Jeffers, H. M. (2013). Sex differences in attitudes toward partner infidelity. Evolutionary Psychology, 11, 821–832: The authors wish to correct values in the originally published manuscript. Specifically, incorrect 95% confidence intervals around the Cohen's d values were reported on page 826 of the manuscript where we reported the within-sex simple effects for the significant Participant Sex × Infidelity Type interaction (first paragraph), and for attitudes toward partner infidelity (second paragraph). Corrected values are presented in bold below. The authors would like to thank Dr. Bernard Beins at Ithaca College for bringing these errors to our attention. Men rated sexual infidelity significantly more distressing (M = 4.69, SD = 0.74) than they rated emotional infidelity (M = 4.32, SD = 0.92), F(1, 322) = 23.96, p < .001, d = 0.44, 95% CI [0.23, 0.65], but there was little difference between women's ratings of sexual (M = 4.80, SD = 0.48) and emotional infidelity (M = 4.76, SD = 0.57), F(1, 322) = 0.48, p = .29, d = 0.08, 95% CI [−0.10, 0.26]. As expected, men rated sexual infidelity (M = 1.44, SD = 0.70) more negatively than they rated emotional infidelity (M = 2.66, SD = 1.37), F(1, 322) = 120.00, p < .001, d = 1.12, 95% CI [0.85, 1.39]. Although women also rated sexual infidelity (M = 1.40, SD = 0.62) more negatively than they rated emotional infidelity (M = 2.09, SD = 1.10), this difference was not as large and thus in the evolutionary theory supportive direction, F(1, 322) = 72.03, p < .001, d = 0.77, 95% CI [0.60, 0.94].

  8. Haptic Data Processing for Teleoperation Systems: Prediction, Compression and Error Correction

    Lee, Jae-young

    2013-01-01

    This thesis explores haptic data processing methods for teleoperation systems, including prediction, compression, and error correction. In the proposed haptic data prediction method, unreliable network conditions, such as time-varying delay and packet loss, are detected by a transport layer protocol. Given the information from the transport layer, a Bayesian approach is introduced to predict position and force data in haptic teleoperation systems. Stability of the proposed method within stoch...

  9. Correction.

    2015-06-01

    Gillon R. Defending the four principles approach as a good basis for good medical practice and therefore for good medical ethics. J Med Ethics 2015;41:111–6. The author misrepresented Beauchamp and Childress when he wrote: ‘My own view (unlike Beauchamp and Childress who explicitly state that they make no such claim (p. 421)1) is that all moral agents whether or not they are doctors or otherwise involved in healthcare have these prima facie moral obligations; but in the context of answering the question ‘what is it to do good medical ethics?' my claim is limited to the ethical obligations of doctors'. The author intended and should have written the following: ‘My own view, unlike Beauchamp and Childress who explicitly state that they make no such claim (p. 421)1, is that these four prima facie principles can provide a basic moral framework not only for medical ethics but for ethics in general'.

  10. Intraindividual Variability in Basic Reaction Time Predicts Middle-Aged and Older Pilots’ Flight Simulator Performance

    2013-01-01

    Objectives. Intraindividual variability (IIV) is negatively associated with cognitive test performance and is positively associated with age and some neurological disorders. We aimed to extend these findings to a real-world task, flight simulator performance. We hypothesized that IIV predicts poorer initial flight performance and increased rate of decline in performance among middle-aged and older pilots. Method. Two-hundred and thirty-six pilots (40–69 years) completed annual assessments comprising a cognitive battery and two 75-min simulated flights in a flight simulator. Basic and complex IIV composite variables were created from measures of basic reaction time and shifting and divided attention tasks. Flight simulator performance was characterized by an overall summary score and scores on communication, emergencies, approach, and traffic avoidance components. Results. Although basic IIV did not predict rate of decline in flight performance, it had a negative association with initial performance for most flight measures. After taking into account processing speed, basic IIV explained an additional 8%–12% of the negative age effect on initial flight performance. Discussion. IIV plays an important role in real-world tasks and is another aspect of cognition that underlies age-related differences in cognitive performance. PMID:23052365

  11. Intraindividual variability in basic reaction time predicts middle-aged and older pilots' flight simulator performance.

    Kennedy, Quinn; Taylor, Joy; Heraldez, Daniel; Noda, Art; Lazzeroni, Laura C; Yesavage, Jerome

    2013-07-01

    Intraindividual variability (IIV) is negatively associated with cognitive test performance and is positively associated with age and some neurological disorders. We aimed to extend these findings to a real-world task, flight simulator performance. We hypothesized that IIV predicts poorer initial flight performance and increased rate of decline in performance among middle-aged and older pilots. Two-hundred and thirty-six pilots (40-69 years) completed annual assessments comprising a cognitive battery and two 75-min simulated flights in a flight simulator. Basic and complex IIV composite variables were created from measures of basic reaction time and shifting and divided attention tasks. Flight simulator performance was characterized by an overall summary score and scores on communication, emergencies, approach, and traffic avoidance components. Although basic IIV did not predict rate of decline in flight performance, it had a negative association with initial performance for most flight measures. After taking into account processing speed, basic IIV explained an additional 8%-12% of the negative age effect on initial flight performance. IIV plays an important role in real-world tasks and is another aspect of cognition that underlies age-related differences in cognitive performance.
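
    One common way to build such a "basic IIV" composite, sketched here under the assumption that IIV is the coefficient of variation of each subject's reaction times, z-scored across subjects and averaged across tasks (the study's exact composite may differ).

    ```python
    import numpy as np

    def iiv_composite(rt_by_task):
        """rt_by_task: list of (n_subjects, n_trials) RT arrays, one per basic RT task."""
        per_task = []
        for rts in rt_by_task:
            cv = rts.std(axis=1, ddof=1) / rts.mean(axis=1)      # intraindividual variability
            per_task.append((cv - cv.mean()) / cv.std(ddof=1))   # z-score across subjects
        return np.mean(per_task, axis=0)

    rng = np.random.default_rng(4)
    # Synthetic data: 120 subjects x 60 trials for two tasks, per-subject RT spread varies
    tasks = [rng.normal(500, rng.uniform(20, 80, (120, 1)), (120, 60)) for _ in range(2)]
    print(iiv_composite(tasks)[:5])
    ```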

  12. Basic traits predict the prevalence of personality disorder across the life span: the example of psychopathy.

    Vachon, David D; Lynam, Donald R; Widiger, Thomas A; Miller, Joshua D; McCrae, Robert R; Costa, Paul T

    2013-05-01

    Personality disorders (PDs) may be better understood in terms of dimensions of general personality functioning rather than as discrete categorical conditions. Personality-trait descriptions of PDs are robust across methods and settings, and PD assessments based on trait measures show good construct validity. The study reported here extends research showing that basic traits (e.g., impulsiveness, warmth, straightforwardness, modesty, and deliberation) can re-create the epidemiological characteristics associated with PDs. Specifically, we used normative changes in absolute trait levels to simulate age-related differences in the prevalence of psychopathy in a forensic setting. Results demonstrated that trait information predicts the rate of decline for psychopathy over the life span; discriminates the decline of psychopathy from that of a similar disorder, antisocial PD; and accurately predicts the differential decline of subfactors of psychopathy. These findings suggest that basic traits provide a parsimonious account of PD prevalence across the life span.

  13. Evaluation of multiple protein docking structures using correctly predicted pairwise subunits

    Esquivel-Rodríguez, Juan

    2012-03-01

    Background: Many functionally important proteins in a cell form complexes with multiple chains. Therefore, computational prediction of multiple protein complexes is an important task in bioinformatics. In the development of multiple protein docking methods, it is important to establish a metric for evaluating prediction results in a reasonable and practical fashion. However, since there are only a few works on developing methods for multiple protein docking, there is no study that investigates how accurate structural models of multiple protein complexes should be to allow scientists to gain biological insights. Methods: We generated a series of predicted models (decoys) of various accuracies by our multiple protein docking pipeline, Multi-LZerD, for three multi-chain complexes with 3, 4, and 6 chains. We analyzed the decoys in terms of the number of correctly predicted pair conformations in the decoys. Results and conclusion: We found that pairs of chains with the correct mutual orientation exist even in the decoys with a large overall root mean square deviation (RMSD) to the native. Therefore, in addition to a global structure similarity measure, such as the global RMSD, the quality of models for multiple chain complexes can be better evaluated by using a local measurement, the number of chain pairs with correct mutual orientation. We termed the fraction of correctly predicted pairs (RMSD at the interface of less than 4.0 Å) as fpair and propose to use it for evaluation of the accuracy of multiple protein docking.
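
    A sketch of the fpair computation as described: count chain pairs whose interface RMSD to the native is below 4.0 Å. The RMSD routine here is a placeholder a real implementation would supply.

    ```python
    import random
    from itertools import combinations

    def fpair(model, native, interface_rmsd, cutoff=4.0):
        """model/native: dicts chain id -> coordinates; interface_rmsd: user-supplied callable."""
        pairs = list(combinations(sorted(model), 2))
        hits = sum(1 for a, b in pairs
                   if interface_rmsd(model[a], model[b], native[a], native[b]) < cutoff)
        return hits / len(pairs)

    # Toy demo with a fake RMSD routine standing in for a real structural computation:
    random.seed(0)
    fake_rmsd = lambda *chains: random.uniform(0.0, 8.0)
    decoy = {c: None for c in "ABC"}
    print(fpair(decoy, decoy, fake_rmsd))
    ```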

  14. Basic model for the prediction of 137Cs concentration in the organisms of detritus food chain

    Tateda, Yuzuru

    1997-01-01

    In order to predict 137Cs concentrations in marine organisms for monitoring, a basic model for the prediction of nuclide levels in marine organisms of the detritus food chain was studied. The equilibrated values of (137Cs level in organism)/(137Cs level in seawater) derived from calculation agreed with the observed data, indicating the validity of the modeling conditions. The results of simulation by this basic model led to the following conclusions. 1) The "ecological half-lives" of 137Cs in organisms of the food chain were 35 and 130 days for detritus feeders and benthic teleosts, respectively, indicating no difference in ecological half-lives between organisms of the detritus food chain and those of other food chains. 2) The 137Cs concentration in organisms showed a peak at 18 and 100 days in detritus and detritus feeders, respectively, after the introduction of 137Cs into environmental seawater. Their concentration ratios to the 137Cs peak level in seawater were within a range of 2.7-3.8, indicating an insignificant difference in the response to 137Cs change in seawater between organisms of the detritus food chain and those of other food chains. 3) The basic model makes it possible to simulate the prediction of 137Cs levels in organisms of the food chain in a coastal ecosystem. (author)
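
    The kind of first-order uptake/elimination chain underlying such a model can be sketched as follows (seawater → detritus → detritus feeder); the rate constants are illustrative, not the paper's fitted values.

    ```python
    import numpy as np

    dt = 0.1
    t = np.arange(0, 400, dt)
    Cw = np.exp(-np.log(2) / 30 * t)        # seawater 137Cs clearing, assumed 30 d half-life
    Cd = np.zeros_like(t)                   # detritus compartment
    Cf = np.zeros_like(t)                   # detritus-feeder compartment
    ku_d, ke_d = 2.0, np.log(2) / 25        # illustrative uptake/elimination rates (1/day)
    ku_f, ke_f = 0.1, np.log(2) / 35        # feeder elimination ~ 35 d half-life

    for i in range(1, len(t)):              # forward-Euler integration of dC/dt = ku*Cin - ke*C
        Cd[i] = Cd[i-1] + dt * (ku_d * Cw[i-1] - ke_d * Cd[i-1])
        Cf[i] = Cf[i-1] + dt * (ku_f * Cd[i-1] - ke_f * Cf[i-1])

    print(f"detritus peak: day {t[np.argmax(Cd)]:.0f}, feeder peak: day {t[np.argmax(Cf)]:.0f}")
    ```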

  15. Incorporation of lysosomal sequestration in the mechanistic model for prediction of tissue distribution of basic drugs.

    Assmus, Frauke; Houston, J Brian; Galetin, Aleksandra

    2017-11-15

    The prediction of tissue-to-plasma water partition coefficients (Kpu) from in vitro and in silico data using the tissue-composition based model (Rodgers & Rowland, J Pharm Sci. 2005, 94(6):1237-48.) is well established. However, distribution of basic drugs, in particular into lysosome-rich lung tissue, tends to be under-predicted by this approach. The aim of this study was to develop an extended mechanistic model for the prediction of Kpu which accounts for lysosomal sequestration and the contribution of different cell types in the tissue of interest. The extended model is based on compound-specific physicochemical properties and tissue composition data to describe drug ionization, distribution into tissue water and drug binding to neutral lipids, neutral phospholipids and acidic phospholipids in tissues, including lysosomes. Physiological data on the types of cells contributing to lung, kidney and liver, their lysosomal content and lysosomal pH were collated from the literature. The predictive power of the extended mechanistic model was evaluated using a dataset of 28 basic drugs (pKa ≥ 7.8; 17 β-blockers, 11 structurally diverse drugs) for which experimentally determined Kpu data in rat tissue have been reported. Accounting for the lysosomal sequestration in the extended mechanistic model improved the accuracy of Kpu predictions in lung compared to the original Rodgers model (56% drugs within 2-fold or 88% within 3-fold of observed values). Reduction in the extent of Kpu under-prediction was also evident in liver and kidney. However, consideration of lysosomal sequestration increased the occurrence of over-predictions, yielding overall comparable model performances for kidney and liver, with 68% and 54% of Kpu values within 2-fold error, respectively. High lysosomal concentration ratios relative to cytosol (>1000-fold) were predicted for the drugs investigated; the extent differed depending on the lysosomal pH and concentration of acidic phospholipids among
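
    The pH-partition ("ion trapping") mechanism behind lysosomal sequestration can be made concrete for a monoprotic base, assuming only the neutral species crosses membranes; the >1000-fold ratios mentioned above additionally involve multiprotic ionization and binding terms not included in this sketch.

    ```python
    def trapping_ratio(pKa, pH_inside, pH_outside):
        """Total-drug concentration ratio between compartments at steady state,
        from the Henderson-Hasselbalch equation for a monoprotic base."""
        return (1 + 10 ** (pKa - pH_inside)) / (1 + 10 ** (pKa - pH_outside))

    # Lysosome (pH ~4.8) vs cytosol (pH ~7.2) for a strong base; values are illustrative:
    print(trapping_ratio(pKa=9.4, pH_inside=4.8, pH_outside=7.2))
    ```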

  16. A Combination of Terrain Prediction and Correction for Search and Rescue Robot Autonomous Navigation

    Yan Guo

    2009-09-01

    This paper presents a novel two-step autonomous navigation method for a search and rescue robot. A vision-based algorithm is proposed for terrain identification, giving a prediction of the safest path with a support vector regression machine (SVRM) trained off-line on texture and color features. A correction algorithm for the prediction, based on vibration information collected while the robot travels, is developed using the judgment function given in the paper. Regions with faulty predictions are corrected with the real traversability value and used to update the SVRM. The experiment demonstrates that this method can help the robot find the optimal path and protect it from traps arising from the discrepancy between prediction and the real environment.

  17. Robust recurrent neural network modeling for software fault detection and correction prediction

    Hu, Q.P.; Xie, M.; Ng, S.H.; Levitin, G.

    2007-01-01

    Software fault detection and correction processes are related although different, and they should be studied together. A practical approach is to apply software reliability growth models to model fault detection, with the fault correction process assumed to be a delayed process. On the other hand, the artificial neural networks model, as a data-driven approach, tries to model these two processes together with no assumptions. Specifically, feedforward backpropagation networks have shown their advantages over analytical models in fault number predictions. In this paper, the following approach is explored. First, recurrent neural networks are applied to model these two processes together. Within this framework, a systematic network configuration approach is developed with a genetic algorithm according to the prediction performance. In order to provide robust predictions, an extra factor characterizing the dispersion of prediction repetitions is incorporated into the performance function. Comparisons with feedforward neural networks and analytical models are developed with respect to a real data set.

  18. Using individual differences to predict job performance: correcting for direct and indirect restriction of range.

    Sjöberg, Sofia; Sjöberg, Anders; Näswall, Katharina; Sverke, Magnus

    2012-08-01

    The present study investigates the relationship between individual differences, indicated by personality (FFM) and general mental ability (GMA), and job performance, applying two different methods of correction for range restriction. The results, derived by analyzing meta-analytic correlations, show that the more accurate method of correcting for indirect range restriction increased the operational validity of individual differences in predicting job performance and that this increase was primarily due to general mental ability being a stronger predictor than any of the personality traits. The estimates for single traits can be applied in practice to maximize prediction of job performance. Further, differences in the relative importance of general mental ability in relation to overall personality assessment methods were substantive, and the estimates provided enable practitioners to perform a correct utility analysis of their overall selection procedure. © 2012 The Authors. Scandinavian Journal of Psychology © 2012 The Scandinavian Psychological Associations.
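
    For reference, the simpler direct-restriction correction (Thorndike's Case II) is a one-line formula; the indirect-restriction case the authors emphasize requires additional information about the selection variable, so only the direct case is sketched here.

    ```python
    def correct_direct_range_restriction(r_restricted, u):
        """Thorndike Case II. u = SD(predictor, applicant pool) / SD(predictor, selected group)."""
        r = r_restricted
        return r * u / (1.0 + r**2 * (u**2 - 1.0)) ** 0.5

    # e.g., an observed r of .33 with the predictor SD shrunk by a third corrects to about .46:
    print(round(correct_direct_range_restriction(0.33, 1.5), 2))
    ```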

  19. Correction for Measurement Error from Genotyping-by-Sequencing in Genomic Variance and Genomic Prediction Models

    Ashraf, Bilal; Janss, Luc; Jensen, Just

    sample). The GBSeq data can be used directly in genomic models in the form of individual SNP allele-frequency estimates (e.g., reference reads/total reads per polymorphic site per individual), but is subject to measurement error due to the low sequencing depth per individual. Due to technical reasons... In the current work we show how the correction for measurement error in GBSeq can also be applied in whole genome genomic variance and genomic prediction models. Bayesian whole-genome random regression models are proposed to allow implementation of large-scale SNP-based models with a per-SNP correction for measurement error. We show correct retrieval of genomic explained variance, and improved genomic prediction when accounting for the measurement error in GBSeq data...

  20. Heisenberg coupling constant predicted for molecular magnets with pairwise spin-contamination correction

    Masunov, Artëm E., E-mail: amasunov@ucf.edu [NanoScience Technology Center, Department of Chemistry, and Department of Physics, University of Central Florida, Orlando, FL 32826 (United States); Photochemistry Center RAS, ul. Novatorov 7a, Moscow 119421 (Russian Federation); Gangopadhyay, Shruba [Department of Physics, University of California, Davis, CA 95616 (United States); IBM Almaden Research Center, 650 Harry Road, San Jose, CA 95120 (United States)

    2015-12-15

    New method to eliminate the spin-contamination in broken symmetry density functional theory (BS DFT) calculations is introduced. Unlike conventional spin-purification correction, this method is based on canonical Natural Orbitals (NO) for each high/low spin coupled electron pair. We derive an expression to extract the energy of the pure singlet state given in terms of energy of BS DFT solution, the occupation number of the bonding NO, and the energy of the higher spin state built on these bonding and antibonding NOs (not self-consistent Kohn–Sham orbitals of the high spin state). Compared to the other spin-contamination correction schemes, spin-correction is applied to each correlated electron pair individually. We investigate two binuclear Mn(IV) molecular magnets using this pairwise correction. While one of the molecules is described by magnetic orbitals strongly localized on the metal centers, and spin gap is accurately predicted by Noodleman and Yamaguchi schemes, for the other one the gap is predicted poorly by these schemes due to strong delocalization of the magnetic orbitals onto the ligands. We show our new correction to yield more accurate results in both cases. - Highlights: • Magnetic orbitals obtained for high and low spin states are not related. • Spin-purification correction becomes inaccurate for delocalized magnetic orbitals. • We use the natural orbitals of the broken symmetry state to build high spin state. • This new correction is made separately for each electron pair. • Our spin-purification correction is more accurate for delocalised magnetic orbitals.
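
    The conventional spin-projection schemes the new pairwise correction is compared against can be written compactly (Heisenberg H = -2J S1·S2; sign conventions vary between papers, and all numbers below are illustrative).

    ```python
    def j_yamaguchi(E_bs, E_hs, S2_bs, S2_hs):
        """Yamaguchi: uses computed <S^2> of the broken-symmetry and high-spin solutions."""
        return (E_bs - E_hs) / (S2_hs - S2_bs)

    def j_noodleman(E_bs, E_hs, S1, S2):
        """Noodleman weak-coupling limit for site spins S1, S2."""
        return (E_bs - E_hs) / (4 * S1 * S2)

    # Illustrative numbers for a Mn(IV)-Mn(IV) pair (site spins 3/2), energies in hartree:
    E_bs, E_hs = -2500.01234, -2500.01101
    HARTREE_TO_CM1 = 219474.6
    print(j_yamaguchi(E_bs, E_hs, S2_bs=3.1, S2_hs=12.05) * HARTREE_TO_CM1, "cm^-1")
    print(j_noodleman(E_bs, E_hs, 1.5, 1.5) * HARTREE_TO_CM1, "cm^-1")
    ```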

  1. A Class of Prediction-Correction Methods for Time-Varying Convex Optimization

    Simonetto, Andrea; Mokhtari, Aryan; Koppel, Alec; Leus, Geert; Ribeiro, Alejandro

    2016-09-01

    This paper considers unconstrained convex optimization problems with time-varying objective functions. We propose algorithms with a discrete time-sampling scheme to find and track the solution trajectory based on prediction and correction steps, while sampling the problem data at a constant rate of $1/h$, where $h$ is the length of the sampling interval. The prediction step is derived by analyzing the iso-residual dynamics of the optimality conditions. The correction step adjusts for the distance between the current prediction and the optimizer at each time step, and consists either of one or multiple gradient steps or Newton steps, which respectively correspond to the gradient trajectory tracking (GTT) or Newton trajectory tracking (NTT) algorithms. Under suitable conditions, we establish that the asymptotic error incurred by both proposed methods behaves as $O(h^2)$, and in some cases as $O(h^4)$, which outperforms the state-of-the-art error bound of $O(h)$ for correction-only methods in the gradient-correction step. Moreover, when the characteristics of the objective function variation are not available, we propose approximate gradient and Newton tracking algorithms (AGT and ANT, respectively) that still attain these asymptotical error bounds. Numerical simulations demonstrate the practical utility of the proposed methods and that they improve upon existing techniques by several orders of magnitude.

  2. Color Fringe Correction by the Color Difference Prediction Using the Logistic Function.

    Jang, Dong-Won; Park, Rae-Hong

    2017-05-01

    This paper proposes a new color fringe correction method that preserves the object color well by the color difference prediction using the logistic function. We observe two characteristics between normal edge (NE) and degraded edge (DE) due to color fringe: 1) the DE has relatively smaller R-G and B-G correlations than the NE and 2) the color difference in the NE can be fitted by the logistic function. The proposed method adjusts the color difference of the DE to the logistic function by maximizing the R-G and B-G correlations in the corrected color fringe image. The generalized logistic function with four parameters requires a high computational load to select the optimal parameters. In experiments, a one-parameter optimization can correct color fringe gracefully with a reduced computational load. Experimental results show that the proposed method restores well the original object color in the DE, whereas existing methods give monochromatic or distorted color.
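
    A sketch of the core fitting step: model the color difference across an edge with a logistic function fitted on non-degraded pixels, then re-impose the fitted profile on the degraded edge. The 4-parameter logistic below is a common form, not necessarily the paper's exact parameterization, and the fringe data are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(x, lo, hi, k, x0):
        return lo + (hi - lo) / (1 + np.exp(-k * (x - x0)))

    x = np.arange(31, dtype=float)                          # pixel positions across an edge
    rg = logistic(x, -20, 35, 0.8, 15)                      # true R-G difference profile
    rg_fringe = rg + np.where(np.abs(x - 15) < 4, 25, 0)    # purple fringe near the edge

    mask = np.abs(x - 15) >= 4                              # fit away from the degraded pixels
    params, _ = curve_fit(logistic, x[mask], rg_fringe[mask], p0=[-20, 35, 1.0, 15])
    rg_corrected = logistic(x, *params)                     # logistic profile re-imposed on the DE
    ```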

  3. Dispersion corrected hartree-fock and density functional theory for organic crystal structure prediction.

    Brandenburg, Jan Gerit; Grimme, Stefan

    2014-01-01

    We present and evaluate dispersion corrected Hartree-Fock (HF) and Density Functional Theory (DFT) based quantum chemical methods for organic crystal structure prediction. The necessity of correcting for missing long-range electron correlation, also known as van der Waals (vdW) interaction, is pointed out and some methodological issues such as inclusion of three-body dispersion terms are discussed. One of the most efficient and widely used methods is the semi-classical dispersion correction D3. Its applicability for the calculation of sublimation energies is investigated for the benchmark set X23 consisting of 23 small organic crystals. For PBE-D3 the mean absolute deviation (MAD) is below the estimated experimental uncertainty of 1.3 kcal/mol. For two larger π-systems, the equilibrium crystal geometry is investigated and very good agreement with experimental data is found. Since these calculations are carried out with huge plane-wave basis sets they are rather time consuming and routinely applicable only to systems with less than about 200 atoms in the unit cell. Aiming at crystal structure prediction, which involves screening of many structures, a pre-sorting with faster methods is mandatory. Small, atom-centered basis sets can speed up the computation significantly but they suffer greatly from basis set errors. We present the recently developed geometrical counterpoise correction gCP. It is a fast semi-empirical method which corrects for most of the inter- and intramolecular basis set superposition error. For HF calculations with nearly minimal basis sets, we additionally correct for short-range basis incompleteness. We combine all three terms in the HF-3c denoted scheme which performs very well for the X23 sublimation energies with an MAD of only 1.5 kcal/mol, which is close to the huge basis set DFT-D3 result.

  4. Critical Test of Some Computational Chemistry Methods for Prediction of Gas-Phase Acidities and Basicities.

    Toomsalu, Eve; Koppel, Ilmar A; Burk, Peeter

    2013-09-10

    Gas-phase acidities and basicities were calculated for 64 neutral bases (covering the scale from 139.9 kcal/mol to 251.9 kcal/mol) and 53 neutral acids (covering the scale from 299.5 kcal/mol to 411.7 kcal/mol). The following methods were used: AM1, PM3, PM6, PDDG, G2, G2MP2, G3, G3MP2, G4, G4MP2, CBS-QB3, B1B95, B2PLYP, B2PLYPD, B3LYP, B3PW91, B97D, B98, BLYP, BMK, BP86, CAM-B3LYP, HSEh1PBE, M06, M062X, M06HF, M06L, mPW2PLYP, mPW2PLYPD, O3LYP, OLYP, PBE1PBE, PBEPBE, tHCTHhyb, TPSSh, VSXC, X3LYP. The addition of Grimme's empirical dispersion correction (D) to B2PLYP and mPW2PLYP was evaluated, and it was found that adding this correction gave more-accurate results when considering acidities. Calculations with B3LYP, B97D, BLYP, B2PLYPD, and PBE1PBE methods were carried out with five basis sets (6-311G**, 6-311+G**, TZVP, cc-pVTZ, and aug-cc-pVTZ) to evaluate the effect of basis sets on the accuracy of calculations. It was found that the best basis sets when considering accuracy of results and needed time were 6-311+G** and TZVP. Among semiempirical methods AM1 had the best ability to reproduce experimental acidities and basicities (the mean absolute error (mae) was 7.3 kcal/mol). Among DFT methods the best method considering accuracy, robustness, and computation time was PBE1PBE/6-311+G** (mae = 2.7 kcal/mol). Four Gaussian-type methods (G2, G2MP2, G4, and G4MP2) gave similar results to each other (mae = 2.3 kcal/mol). Gaussian-type methods are quite accurate, but their downside is the relatively long computational time.

  5. [Study on correction of data bias caused by different missing mechanisms in survey of medical expenditure among students enrolling in Urban Resident Basic Medical Insurance].

    Zhang, Haixia; Zhao, Junkang; Gu, Caijiao; Cui, Yan; Rong, Huiying; Meng, Fanlong; Wang, Tong

    2015-05-01

    A study of medical expenditure and its influencing factors among students enrolled in Urban Resident Basic Medical Insurance (URBMI) in Taiyuan indicated that nonresponse bias and selection bias coexist in the dependent variable of the survey data. Unlike previous studies that focused on only one missing mechanism, this study suggests a two-stage method to deal with two missing mechanisms simultaneously, combining multiple imputation with a sample selection model. A total of 1 190 questionnaires were returned by the students (or their parents) selected in child care settings, schools and universities in Taiyuan by stratified cluster random sampling in 2012. In the returned questionnaires, 2.52% showed not-missing-at-random (NMAR) values of the dependent variable and 7.14% showed missing-at-random (MAR) values of the dependent variable. First, multiple imputation was conducted for the MAR values by using the completed data; then a sample selection model was used to correct for NMAR in the multiple imputation, and a multi-factor analysis model was established. Based on 1 000 resamplings, the best scheme for filling the randomly missing values was the predictive mean matching (PMM) method under the observed missing proportion. With this optimal scheme, the two-stage analysis was conducted. Finally, it was found that the influencing factors on annual medical expenditure among the students enrolled in URBMI in Taiyuan included population group, annual household gross income, affordability of medical insurance expenditure, chronic disease, seeking medical care in hospital, seeking medical care in a community health center or private clinic, hospitalization, hospitalization canceled for certain reasons, self-medication and acceptable proportion of self-paid medical expenditure. The two-stage method combining multiple imputation with a sample selection model can deal effectively with nonresponse bias and selection bias in the dependent variable of survey data.

  6. Predictive modeling for corrective maintenance of imaging devices from machine logs.

    Patil, Ravindra B; Patil, Meru A; Ravi, Vidya; Naik, Sarif

    2017-07-01

    In the cost-sensitive healthcare industry, unplanned downtime of diagnostic and therapy imaging devices can be a burden on the finances of both hospitals and original equipment manufacturers (OEMs). In the current era of connectivity, it is easier to get these devices connected to a standard monitoring station. Once a system is connected, OEMs can monitor its health remotely and take corrective action through preventive maintenance, thereby avoiding major unplanned downtime. In this article, we present an overall methodology for predicting failure of these devices well before the customer experiences it. We use a data-driven approach based on machine learning to predict failures, in turn resulting in reduced machine downtime, improved customer satisfaction and cost savings for the OEMs. A use case predicting component failure of the PHILIPS iXR system is explained in this article.
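
    As a rough sketch of the data-driven failure prediction described above, the snippet below trains a classifier on features that might be extracted from machine logs; the feature names and data are illustrative assumptions, not from the article.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    # Hypothetical per-device log features: [error_count_7d, restarts_7d, temp_mean]
    X = rng.normal(size=(500, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))   # failure within horizon?
    ```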

  7. An effective drift correction for dynamical downscaling of decadal global climate predictions

    Paeth, Heiko; Li, Jingmin; Pollinger, Felix; Müller, Wolfgang A.; Pohlmann, Holger; Feldmann, Hendrik; Panitz, Hans-Jürgen

    2018-04-01

    Initialized decadal climate predictions with coupled climate models are often marked by substantial climate drifts that emanate from a mismatch between the climatology of the coupled model system and the data set used for initialization. While such drifts may be easily removed from the prediction system when analyzing individual variables, a major problem prevails for multivariate issues and, especially, when the output of the global prediction system shall be used for dynamical downscaling. In this study, we present a statistical approach to remove climate drifts in a multivariate context and demonstrate the effect of this drift correction on regional climate model simulations over the Euro-Atlantic sector. The statistical approach is based on an empirical orthogonal function (EOF) analysis adapted to a very large data matrix. The climate drift emerges as a dramatic cooling trend in North Atlantic sea surface temperatures (SSTs) and is captured by the leading EOF of the multivariate output from the global prediction system, accounting for 7.7% of total variability. The SST cooling pattern also imposes drifts in various atmospheric variables and levels. The removal of the first EOF effectuates the drift correction while retaining other components of intra-annual, inter-annual and decadal variability. In the regional climate model, the multivariate drift correction of the input data removes the cooling trends in most western European land regions and systematically reduces the discrepancy between the output of the regional climate model and observational data. In contrast, removing the drift only in the SST field from the global model has hardly any positive effect on the regional climate model.
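
    A minimal sketch of the EOF-based drift removal described above: compute the leading EOF of a (time x space) anomaly matrix via SVD and subtract that single mode, leaving the remaining variability intact; array shapes and data are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    data = rng.normal(size=(240, 5000))        # time x grid points (hypothetical)
    mean = data.mean(axis=0)
    anom = data - mean                         # anomalies about the time mean

    # Leading EOF = first right singular vector; its principal component = U[:, 0].
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    mode1 = s[0] * np.outer(U[:, 0], Vt[0])    # reconstruction of EOF-1 only

    corrected = (anom - mode1) + mean          # drift-corrected field
    explained = s[0] ** 2 / np.sum(s ** 2)     # variance fraction carried by EOF-1
    print(f"EOF-1 explains {explained:.1%} of total variability")
    ```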

  8. Do abundance distributions and species aggregation correctly predict macroecological biodiversity patterns in tropical forests?

    Wiegand, Thorsten; Lehmann, Sebastian; Huth, Andreas; Fortin, Marie‐Josée

    2016-01-01

    Aim: It has been recently suggested that different ‘unified theories of biodiversity and biogeography’ can be characterized by three common ‘minimal sufficient rules’: (1) species abundance distributions follow a hollow curve, (2) species show intraspecific aggregation, and (3) species are independently placed with respect to other species. Here, we translate these qualitative rules into a quantitative framework and assess if these minimal rules are indeed sufficient to predict multiple macroecological biodiversity patterns simultaneously. Location: Tropical forest plots in Barro Colorado Island (BCI), Panama, and in Sinharaja, Sri Lanka. Methods: We assess the predictive power of the three rules using dynamic and spatial simulation models in combination with census data from the two forest plots. We use two different versions of the model: (1) a neutral model and (2) an extended model that allowed for species differences in dispersal distances. In a first step we derive model parameterizations that correctly represent the three minimal rules (i.e. the model quantitatively matches the observed species abundance distribution and the distribution of intraspecific aggregation). In a second step we applied the parameterized models to predict four additional spatial biodiversity patterns. Results: Species-specific dispersal was needed to quantitatively fulfil the three minimal rules. The model with species-specific dispersal correctly predicted the species-area relationship, but failed to predict the distance decay, the relationship between species abundances and aggregations, and the distribution of a spatial co-occurrence index of all abundant species pairs. These results were consistent over the two forest plots. Main conclusions: The three ‘minimal sufficient’ rules only provide an incomplete approximation of the stochastic spatial geometry of biodiversity in tropical forests. The assumption of independent interspecific placements is most

  9. Fundamental studies of aluminum corrosion in acidic and basic environments: Theoretical predictions and experimental observations

    Lashgari, Mohsen; Malek, Ali M.

    2010-01-01

    Using quantum electrochemical approaches based on density functional theory and a cluster/polarized continuum model, we investigated the corrosion behavior of aluminum in HCl and NaOH media containing a phenol inhibitor. In this regard, we determined the geometry and electronic structure of the species at the metal/solution interface. The investigations revealed that the interaction energies of hydroxide corrosive agents with the aluminum surface should be more negative than those of chloride ones. The inhibitor adsorption in acid is more likely to have a physical nature, while it appears to be chemical in basic media. To verify these predictions, we studied the phenomena experimentally using Tafel plots. The studies confirmed that the rate of corrosion in alkaline solution is substantially greater than in HCl media. Moreover, phenol is an inhibitor molecule with a mixed-type inhibition mechanism. The relationship between inhibitory action and molecular parameters was discussed, and the activity in alkaline media was also theoretically anticipated. This prediction was in accord with experiment.

  10. Basic tuning of hydrogen powered car and artificial intelligent prediction of hydrogen engine characteristics

    Ho, Tien [School of Engineering, University of Tasmania, GPO Box 252-65, Hobart, Tasmania, 7001 (Australia); Karri, Vishy [Australian College of Kuwait, P.O. Box 1411, Safat 13015 (Kuwait)

    2010-09-15

    Many studies of renewable energy have shown that hydrogen is one of the major green energy sources of the future. This has led to the development of many automotive applications using hydrogen as a fuel, especially in internal combustion engines. Nonetheless, growth has been slow and detailed knowledge about building prototypes and control methodologies for hydrogen internal combustion engines remains limited. In this work, a Toyota Corolla 4-cylinder, 1.8 L engine running on petrol was systematically modified so that it could be operated on either gasoline or hydrogen at the choice of the driver. Within the scope of this project, several ancillary systems such as a new inlet manifold, hydrogen fuel injection, a storage system and a leak-detection safety system were implemented. Attention is directed towards special characteristics related to the basic tuning of a hydrogen engine, such as air-to-fuel ratio operating conditions, ignition timing and injection timing as functions of engine speed and throttle position. Based on the experimental data, a suite of neural network models was tested to accurately predict the effect of different engine operating conditions (speed and throttle position) on the hydrogen-powered car engine characteristics. Predictions were found to be within ±3% of the experimental values for all case studies. This work provided a better understanding of the effect of hydrogen engine characteristic parameters under different engine operating conditions. (author)
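
    A minimal sketch of the kind of neural-network modelling described above: predicting an engine characteristic from speed and throttle position; the data and the target quantity are simulated assumptions, not the study's measurements.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    speed = rng.uniform(1000, 5000, 300)       # rpm (hypothetical)
    throttle = rng.uniform(10, 90, 300)        # percent open (hypothetical)
    X = np.column_stack([speed, throttle])
    y = 0.01 * speed + 0.5 * throttle + rng.normal(scale=5, size=300)

    net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                       random_state=0).fit(X, y)
    print(net.predict([[3000.0, 50.0]]))       # predicted engine characteristic
    ```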

  11. Search performance is better predicted by tileability than presence of a unique basic feature

    Chang, Honghua; Rosenholtz, Ruth

    2016-01-01

    Traditional models of visual search such as feature integration theory (FIT; Treisman & Gelade, 1980), have suggested that a key factor determining task difficulty consists of whether or not the search target contains a “basic feature” not found in the other display items (distractors). Here we discriminate between such traditional models and our recent texture tiling model (TTM) of search (Rosenholtz, Huang, Raj, Balas, & Ilie, 2012b), by designing new experiments that directly pit these models against each other. Doing so is nontrivial, for two reasons. First, the visual representation in TTM is fully specified, and makes clear testable predictions, but its complexity makes getting intuitions difficult. Here we elucidate a rule of thumb for TTM, which enables us to easily design new and interesting search experiments. FIT, on the other hand, is somewhat ill-defined and hard to pin down. To get around this, rather than designing totally new search experiments, we start with five classic experiments that FIT already claims to explain: T among Ls, 2 among 5s, Q among Os, O among Qs, and an orientation/luminance-contrast conjunction search. We find that fairly subtle changes in these search tasks lead to significant changes in performance, in a direction predicted by TTM, providing definitive evidence in favor of the texture tiling model as opposed to traditional views of search. PMID:27548090

  12. Predicting the sparticle spectrum from GUTs via SUSY threshold corrections with SusyTC

    Antusch, Stefan [Department of Physics, University of Basel,Klingelbergstr. 82, CH-4056 Basel (Switzerland); Max-Planck-Institut für Physik (Werner-Heisenberg-Institut),Föhringer Ring 6, D-80805 München (Germany); Sluka, Constantin [Department of Physics, University of Basel,Klingelbergstr. 82, CH-4056 Basel (Switzerland)

    2016-07-21

    Grand Unified Theories (GUTs) can feature predictions for the ratios of quark and lepton Yukawa couplings at high energy, which can be tested with the increasingly precise results for the fermion masses, given at low energies. To perform such tests, the renormalization group (RG) running has to be performed with sufficient accuracy. In supersymmetric (SUSY) theories, the one-loop threshold corrections (TC) are of particular importance and, since they affect the quark-lepton mass relations, link a given GUT flavour model to the sparticle spectrum. To accurately study such predictions, we extend and generalize various formulas in the literature which are needed for a precision analysis of SUSY flavour GUT models. We introduce the new software tool SusyTC, a major extension to the Mathematica package REAP http://dx.doi.org/10.1088/1126-6708/2005/03/024, where these formulas are implemented. SusyTC extends the functionality of REAP by a full inclusion of the (complex) MSSM SUSY sector and a careful calculation of the one-loop SUSY threshold corrections for the full down-type quark, up-type quark and charged lepton Yukawa coupling matrices in the electroweak-unbroken phase. Among other useful features, SusyTC calculates the one-loop corrected pole mass of the charged (or the CP-odd) Higgs boson as well as provides output in SLHA conventions, i.e. the necessary input for external software, e.g. for performing a two-loop Higgs mass calculation. We apply SusyTC to study the predictions for the parameters of the CMSSM (mSUGRA) SUSY scenario from the set of GUT scale Yukawa relations y_e/y_d = −1/2, y_μ/y_s = 6, and y_τ/y_b = −3/2, which has been proposed recently in the context of SUSY GUT flavour models.

  13. Basic fibroblast growth factor predicts cardiovascular disease occurrence in participants from the Veterans Affairs Diabetes Trial

    Mark B Zimering

    2013-11-01

    Full Text Available Aim: Cardiovascular disease is a leading cause of morbidity and mortality in adults with type 2 diabetes mellitus. The aim of the present study was to test whether plasma basic fibroblast growth factor (bFGF) levels predict future cardiovascular disease (CVD) occurrence in adults from the Veterans Affairs Diabetes Trial. Methods: Nearly four hundred veterans, 40 years of age or older, having a mean baseline diabetes duration of 11.4 years, were recruited from outpatient clinics at six geographically distributed sites in the Veterans Affairs Diabetes Trial (VADT). Within the VADT, they were randomly assigned to intensive or standard glycemic treatment, with follow-up of as much as seven and one-half years. Cardiovascular disease occurrence was examined at baseline in the patient population and during randomized treatment. Plasma bFGF was determined with a sensitive, specific two-site enzyme-linked immunoassay at the baseline study visit in all 399 subjects. Results: One hundred five first cardiovascular events occurred in these 399 subjects. The best-fit model of risk factors associated with the time to first cardiovascular disease occurrence (in the study over a seven and one-half year period) had as significant predictors: prior cardiovascular event (hazard ratio [HR] 3.378; 95% confidence intervals [CI] 3.079-3.807; P < .0001), baseline plasma bFGF (HR 1.008; 95% CI 1.002-1.014; P = .01), age (HR 1.027; 95% CI 1.004-1.051; P = .019), baseline plasma triglycerides (HR 1.001; 95% CI 1.000-1.002; P = .02) and diabetes duration-treatment interaction (P = .03). Intensive glucose-lowering was associated with significantly decreased hazard ratios for CVD occurrence (0.38-0.63) in patients with known diabetes duration of 0-10 years, and non-significantly increased hazard ratios for CVD occurrence (0.82-1.78) in patients with longer diabetes duration. Conclusion: A high level of plasma basic fibroblast growth factor is a predictive biomarker of future cardiovascular
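
    A minimal sketch of the time-to-event analysis behind the hazard ratios reported above, using a Cox proportional-hazards fit from the lifelines package; the data frame and its columns are simulated placeholders, not the trial's data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(4)
    df = pd.DataFrame({
        "bFGF": rng.normal(20.0, 8.0, 399),    # pg/mL (hypothetical)
        "age": rng.normal(60.0, 9.0, 399),
        "time": rng.exponential(5.0, 399),     # years to event or censoring
        "event": rng.integers(0, 2, 399),      # 1 = CVD event occurred
    })

    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    cph.print_summary()                        # hazard ratios = exp(coef)
    ```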

  14. A New Global Regression Analysis Method for the Prediction of Wind Tunnel Model Weight Corrections

    Ulbrich, Norbert Manfred; Bridge, Thomas M.; Amaya, Max A.

    2014-01-01

    A new global regression analysis method is discussed that predicts wind tunnel model weight corrections for strain-gage balance loads during a wind tunnel test. The method determines corrections by combining "wind-on" model attitude measurements with least squares estimates of the model weight and center of gravity coordinates that are obtained from "wind-off" data points. The method treats the least squares fit of the model weight separately from the fit of the center of gravity coordinates. Therefore, it performs two fits of "wind-off" data points and uses the least squares estimator of the model weight as an input for the fit of the center of gravity coordinates. Explicit equations for the least squares estimators of the weight and center of gravity coordinates are derived that simplify the implementation of the method in the data system software of a wind tunnel. In addition, recommendations for sets of "wind-off" data points are made that take typical model support system constraints into account. Explicit equations of the confidence intervals on the model weight and center of gravity coordinates and two different error analyses of the model weight prediction are also discussed in the appendices of the paper.
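
    A minimal sketch of the least-squares idea described above, reduced to one parameter: estimating the model weight from "wind-off" points, where the measured gravity-induced load components at known attitudes are proportional to the weight; the geometry and data are simplified illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    pitch = np.deg2rad(np.linspace(-10.0, 10.0, 9))   # wind-off attitudes
    W_true = 150.0                                    # model weight, lbf (hypothetical)

    # Stack axial and normal load components; each is W times a known geometry factor.
    A = np.concatenate([np.sin(pitch), np.cos(pitch)])
    b = W_true * A + rng.normal(scale=0.2, size=A.size)  # measured loads + noise

    W_hat = (A @ b) / (A @ A)                 # closed-form least-squares estimator
    print(f"estimated weight: {W_hat:.2f} lbf")
    ```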

  15. Characterization, prediction, and correction of geometric distortion in 3 T MR images

    Baldwin, Lesley N.; Wachowicz, Keith; Thomas, Steven D.; Rivest, Ryan; Gino Fallone, B.

    2007-01-01

    The work presented herein describes our methods and results for predicting, measuring and correcting geometric distortions in a 3 T clinical magnetic resonance (MR) scanner for the purpose of image guidance in radiation treatment planning. Geometric inaccuracies due to both inhomogeneities in the background field and nonlinearities in the applied gradients were easily visualized on the MR images of a regularly structured three-dimensional (3D) grid phantom. From a computed tomography scan, the locations of just under 10 000 control points within the phantom were accurately determined in three dimensions using a MATLAB-based computer program. MR distortion was then determined by measuring the corresponding locations of the control points when the phantom was imaged using the MR scanner. Using a reversed gradient method, distortions due to gradient nonlinearities were separated from distortions due to inhomogeneities in the background B0 field. Because the various sources of machine-related distortions can be individually characterized, distortions present in other imaging sequences (for which 3D distortion cannot accurately be measured using phantom methods) can be predicted, negating the need for individual distortion calculation for a variety of other imaging sequences. Distortions were found to be primarily caused by gradient nonlinearities, and maximum image distortions were reported to be less than those previously found by other researchers at 1.5 T. Finally, the image slices were corrected for distortion in order to provide geometrically accurate phantom images.

  16. Impacts of Earth rotation parameters on GNSS ultra-rapid orbit prediction: Derivation and real-time correction

    Wang, Qianxin; Hu, Chao; Xu, Tianhe; Chang, Guobin; Hernández Moraleda, Alberto

    2017-12-01

    Analysis centers (ACs) for global navigation satellite systems (GNSSs) cannot accurately obtain real-time Earth rotation parameters (ERPs). Thus, the prediction of ultra-rapid orbits in the international terrestrial reference system (ITRS) has to utilize the predicted ERPs issued by the International Earth Rotation and Reference Systems Service (IERS) or the International GNSS Service (IGS). In this study, the accuracy of ERPs predicted by IERS and IGS is analyzed. The error of the ERPs predicted for one day can reach 0.15 mas in polar motion and 0.053 ms in UT1-UTC. Then, the impact of ERP errors on GNSS ultra-rapid orbit prediction is studied. The methods used for orbit integration and frame transformation in orbit prediction with the introduced ERP errors dominate the accuracy of the predicted orbit. Experimental results show that the transformation from the geocentric celestial reference system (GCRS) to ITRS exerts the strongest effect on the accuracy of the predicted ultra-rapid orbit. To obtain the most accurate predicted ultra-rapid orbit, a corresponding real-time orbit correction method is developed. First, orbits without ERP-related errors are predicted on the basis of the observed part of the ultra-rapid orbit in ITRS for use as a reference. Then, the corresponding predicted orbit is transformed from GCRS to ITRS to adjust for the predicted ERPs. Finally, the corrected ERPs with error slopes are re-introduced to correct the predicted orbit in ITRS. To validate the proposed method, three experimental schemes are designed: function extrapolation, simulation experiments, and experiments with predicted ultra-rapid orbits and international GNSS Monitoring and Assessment System (iGMAS) products. Experimental results show that using the proposed correction method with IERS products considerably improved the accuracy of ultra-rapid orbit prediction (except for the geosynchronous BeiDou orbits). The accuracy of orbit prediction is enhanced by at least 50

  17. Tax revenue and inflation rate predictions in Banda Aceh using Vector Error Correction Model (VECM)

    Maulia, Eva; Miftahuddin; Sofyan, Hizir

    2018-05-01

    Tax revenue and inflation are among the important parameters a country uses to achieve economic welfare. One of the largest sources of state budget revenue in Indonesia is the tax sector, while the inflation rate is one measure of the economic problems a country is facing. Given the importance of tax revenue and inflation rate control in achieving economic prosperity, it is necessary to analyze the relationship between, and to forecast, tax revenue and the inflation rate. VECM (Vector Error Correction Model) was chosen as the method used in this research because the data take the form of multivariate time series. This study aims to produce a VECM model with optimal lag and to predict tax revenue and the inflation rate with it. The results show that the best model for the tax revenue and inflation rate data in Banda Aceh City is the VECM with an optimal lag of 3, or VECM(3). Of the seven models formed, the income tax revenue model is significant. The predictions of tax revenue and the inflation rate in Banda Aceh City for the next 6, 12 and 24 periods (months) obtained using VECM(3) are considered valid, since they have a minimum error value compared to the other models.
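
    A minimal sketch of VECM fitting and forecasting as described above, using the VECM class from statsmodels; the two series are simulated placeholders, not the Banda Aceh tax-revenue and inflation data.

    ```python
    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import VECM

    rng = np.random.default_rng(6)
    n = 120                                        # monthly observations
    common = np.cumsum(rng.normal(size=n))         # shared stochastic trend
    data = np.column_stack([
        common + rng.normal(scale=0.5, size=n),    # "tax revenue" (hypothetical)
        common + rng.normal(scale=0.5, size=n),    # "inflation rate" (hypothetical)
    ])

    model = VECM(data, k_ar_diff=3, coint_rank=1)  # lag order 3, as in the study
    res = model.fit()
    print(res.predict(steps=6))                    # forecasts for the next 6 months
    ```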

  18. IMPACT OF DIFFERENT TOPOGRAPHIC CORRECTIONS ON PREDICTION ACCURACY OF FOLIAGE PROJECTIVE COVER (FPC) IN A TOPOGRAPHICALLY COMPLEX TERRAIN

    S. Ediriweera

    2012-07-01

    Full Text Available Quantitative retrieval of land surface biological parameters (e.g. foliage projective cover [FPC] and Leaf Area Index) is crucial for forest management, ecosystem modelling, and global change monitoring applications. Currently, remote sensing is a widely adopted method for rapid estimation of surface biological parameters at a landscape scale. Topographic correction is a necessary pre-processing step in remote sensing applications for topographically complex terrain. The selection of a suitable topographic correction method for remotely sensed spectral information is still an unresolved problem. The purpose of this study is to assess the impact of topographic corrections on the prediction of FPC in hilly terrain using an established regression model. Five established topographic corrections [C, Minnaert, SCS, SCS+C and the processing scheme for standardised surface reflectance (PSSSR)] were evaluated on Landsat-5 TM imagery acquired under low and high sun angles in closed-canopy subtropical rainforest and eucalyptus-dominated open-canopy forest in north-eastern Australia. The effectiveness of the methods at normalizing topographic influence, preserving biophysical spectral information, and internal data variability was assessed by statistical analysis and by comparison with field-collected FPC data. The results of the statistical analyses show that SCS+C and PSSSR perform significantly better than the other corrections, with less overcorrection in faintly illuminated slope areas. However, the best relationship between FPC and Landsat spectral responses was obtained with PSSSR, which produced the least residual error. The SCS correction method performed poorly at correcting the topographic effect when predicting FPC in topographically complex terrain.
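
    A minimal sketch of the C correction evaluated above: regress band radiance on the illumination cosine to get c = b/m, then rescale each pixel; the arrays and the solar zenith angle are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    cos_i = rng.uniform(0.2, 1.0, 1000)            # local illumination cosine per pixel
    L = 0.30 * cos_i + 0.05 + rng.normal(scale=0.01, size=1000)  # band radiance

    m, b = np.polyfit(cos_i, L, 1)                 # regression L = m * cos(i) + b
    c = b / m
    cos_theta_z = np.cos(np.deg2rad(35.0))         # solar zenith cosine (hypothetical)

    # C correction: L_corr = L * (cos(theta_z) + c) / (cos(i) + c)
    L_corr = L * (cos_theta_z + c) / (cos_i + c)
    ```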

  19. Evaluation of Different Topographic Corrections for Landsat TM Data by Prediction of Foliage Projective Cover (FPC) in Topographically Complex Landscapes

    Sisira Ediriweera

    2013-12-01

    Full Text Available The reflected radiance in topographically complex areas is severely affected by variations in topography; thus, topographic correction is considered a necessary pre-processing step when retrieving biophysical variables from these images. We assessed the performance of five topographic corrections: (i) C correction (C), (ii) Minnaert, (iii) Sun Canopy Sensor (SCS), (iv) SCS + C and (v) the Processing Scheme for Standardised Surface Reflectance (PSSSR) on the Landsat-5 Thematic Mapper (TM) reflectance in the context of prediction of Foliage Projective Cover (FPC) in hilly landscapes in north-eastern Australia. The performance of the topographic corrections on the TM reflectance was assessed by (i) visual comparison and (ii) statistically comparing TM-predicted FPC with ground-measured FPC and LiDAR (Light Detection and Ranging)-derived FPC estimates. In the majority of cases, the PSSSR method performed best in terms of eliminating topographic effects, providing the best relationship and lowest residual error when comparing ground-measured FPC and LiDAR FPC with TM-predicted FPC. The Minnaert, C and SCS + C showed the poorest performance. Finally, the use of TM surface reflectance, which includes atmospheric correction and broad Bidirectional Reflectance Distribution Function (BRDF) effects, seemed to account for most topographic variation when predicting biophysical variables, such as FPC.

  20. Safety prediction for basic components of safety critical software based on static testing

    Son, H.S.; Seong, P.H.

    2001-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, both of which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  1. Safety prediction for basic components of safety-critical software based on static testing

    Son, H.S.; Seong, P.H.

    2000-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  2. Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model

    Arumugam, S.; Libera, D.

    2017-12-01

    Water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period, given the labor requirements, which makes calibrating and validating mechanistic models difficult. Further, any physical model's predictions inherently have bias (i.e., under/over-estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study suggests a multivariate bias-correction technique and compares it to a common technique in improving the performance of the SWAT model in predicting daily streamflow and TN loads across the southeast based on split-sample validation. The proposed approach is a dimension-reduction technique, canonical correlation analysis (CCA), that regresses the observed multivariate attributes against the SWAT-simulated values. The common approach is a regression-based technique that uses ordinary least squares to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation while simultaneously reducing individual biases. Additionally, canonical correlation analysis does a better job of preserving the observed joint likelihood of streamflow and loadings. These procedures were applied to 3 watersheds chosen from the Water Quality Network in the Southeast Region; specifically, watersheds with sufficiently large drainage areas and numbers of observed data points. The performance of the two approaches is compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month-ahead forecasts of P & T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model for forecasting loadings and streamflow at the monthly and seasonal timescale is also discussed.
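
    A minimal sketch of a CCA-based bias correction in the spirit described above: relate simulated and observed (streamflow, load) pairs through canonical directions, then map model output toward the observed space; the data are simulated placeholders, not SWAT or USGS values.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(8)
    obs = rng.normal(size=(200, 2))                  # [streamflow, TN load], observed
    sim = 0.8 * obs + 0.5 + rng.normal(scale=0.3, size=(200, 2))  # biased model output

    cca = CCA(n_components=2).fit(sim, obs)
    corrected = cca.predict(sim)                     # bias-corrected simulations
    print(np.abs(corrected - obs).mean(axis=0))      # residual bias per variable
    ```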

  3. Prediction of d^0 magnetism in self-interaction corrected density functional theory

    Das Pemmaraju, Chaitanya

    2010-03-01

    Over the past couple of years, the phenomenon of "d^0 magnetism" has greatly intrigued the magnetism community [1]. Unlike conventional magnetic materials, "d^0 magnets" lack any magnetic ions with open d or f shells but, surprisingly, exhibit signatures of ferromagnetism, often with a Curie temperature exceeding 300 K. Current research in the field is geared towards understanding the mechanism underlying this observed ferromagnetism, which is difficult to explain within the conventional m-J paradigm [1]. The most widely studied class of d^0 materials are un-doped and light-element-doped wide-gap oxides such as HfO2, MgO, ZnO and TiO2, all of which have been put forward as possible d^0 ferromagnets. General experimental trends suggest that the magnetism is a feature of highly defective samples, leading to the expectation that the phenomenon must be defect related. In particular, based on density functional theory (DFT) calculations, acceptor defects formed from the O-2p states in these oxides have been proposed as being responsible for the ferromagnetism [2,3]. However, predicting magnetism originating from 2p orbitals is a delicate problem, which depends on the subtle interplay between covalency and Hund's coupling. DFT calculations based on semi-local functionals such as the local spin-density approximation (LSDA) can lead to qualitative failures on several fronts. On one hand, the excessive delocalization of spin-polarized holes leads to half-metallic ground states and the expectation of room-temperature ferromagnetism. On the other hand, in some cases a magnetic ground state may not be predicted at all, as the Hund's coupling might be underestimated. Furthermore, polaronic distortions, which are often a feature of acceptor defects in oxides, are not predicted [4,5]. In this presentation, we argue that the self-interaction error (SIE) inherent to semi-local functionals is responsible for the failures of LSDA and demonstrate through various examples that beyond

  4. Can Mission Predict School Performance? The Case of Basic Education in Oman

    Al-Ani, Wajeha Thabit; Ismail, Omer Hashim

    2015-01-01

    This article reports on a study that examined the relationship between the mission statements and performance of Basic Education Schools in Oman. The process of mission statement framing was also investigated. A sample of 161 school mission statements was randomly collected from the Ministry of Education school mission portal database representing…

  5. The Utility of the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) in Predicting Reading Achievement

    Echols, Julie M. Young

    2010-01-01

    Reading proficiency is the goal of many local and national reading initiatives. A key component of these initiatives is accurate and reliable reading assessment. In this high-stakes testing arena, the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) has emerged as a preferred measure for identification of students at risk for reading…

  6. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
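
    As a worked illustration of the two accuracy measures named above, the sketch below computes the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD, hit rate minus false-alarm rate) from a dichotomous contingency table; the counts are placeholders, not the study's results.

    ```python
    # Contingency counts for predicted vs. observed contrail occurrence (hypothetical).
    hits, misses, false_alarms, correct_negatives = 420, 80, 60, 440
    total = hits + misses + false_alarms + correct_negatives

    pc = (hits + correct_negatives) / total
    hkd = hits / (hits + misses) - false_alarms / (false_alarms + correct_negatives)
    print(f"PC = {pc:.3f}, HKD = {hkd:.3f}")
    ```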

  7. Predictive factors for perioperative blood transfusion in surgeries for correction of idiopathic, neuromuscular or congenital scoliosis

    Alexandre Fogaça Cristante

    2014-12-01

    Full Text Available OBJECTIVE: To evaluate the association of clinical and demographic variables in patients requiring blood transfusion during elective surgery to treat scoliosis, with the aim of identifying markers predictive of the need for blood transfusion. METHODS: Based on the review of medical charts at a public university hospital, this retrospective study evaluated whether the following variables were associated with the need for red blood cell transfusion (measured by the number of packs used during scoliosis surgery): scoliotic angle, extent of arthrodesis (number of fused levels), sex of the patient, surgery duration and type of scoliosis (neuromuscular, congenital or idiopathic). RESULTS: Of the 94 patients evaluated in a 55-month period, none required a massive blood transfusion (most patients needed less than two red blood cell packs). The number of packs was not significantly associated with sex or type of scoliosis. The extent of arthrodesis (r = 0.103), surgery duration (r = 0.144) and scoliotic angle (r = 0.004) were weakly correlated with the need for blood transfusion. Linear regression analysis showed an association between the number of spine levels submitted to arthrodesis and the volume of blood used in transfusions (p = 0.001). CONCLUSION: This study did not reveal any evidence of a significant association between the need for red blood cell transfusion and scoliotic angle, sex or surgery duration in scoliosis correction surgery. Submission of more spinal levels to arthrodesis was associated with the use of a greater number of blood packs.

  8. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI: Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    Andrea Nechtelberger

    2017-11-01

    Full Text Available The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was tested with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals.

  9. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

    The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was tested with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977

  11. Prediction of Basic Math Course Failure Rate in the Physics, Meteorology, Mathematics, Actuarial Sciences and Pharmacy Degree Programs

    Luis Rojas-Torres

    2014-09-01

    Full Text Available This paper summarizes a study conducted in 2013 with the purpose of predicting the failure rate of math courses taken by Pharmacy, Mathematics, Actuarial Science, Physics and Meteorology students at Universidad de Costa Rica (UCR). Using logistic regression applied to the 2010 cohort, failure rates were predicted for students in the aforementioned programs in one of their introductory math courses (Calculus 101 for Physics and Meteorology, Math Principles for Mathematics and Actuarial Science, and Applied Differential Equations for Pharmacy). For these models, the UCR admission average, the student's gender, and the average of correct answers in the Quantitative Skills Test were used as predictor variables. The most important variable for all models was the Quantitative Skills Test, and the model with the highest correct classification rate was the logistic regression. For the estimated Physics-Meteorology, Pharmacy and Mathematics-Actuarial Science models, correct classifications were 89.8%, 73.6%, and 93.9%, respectively.

  12. Megavoltage photon beam attenuation by carbon fiber couch tops and its prediction using correction factors

    Hayashi, Naoki; Shibamoto, Yuta; Obata, Yasunori; Kimura, Takashi; Nakazawa, Hisato; Hagiwara, Masahiro; Hashizume, Chisa I.; Mori, Yoshimasa; Kobayashi, Tatsuya

    2010-01-01

    The purpose of this study was to evaluate the effect of megavoltage photon beam attenuation (PBA) by couch tops and to propose a method for correcting for PBA. Four series of phantom measurements were carried out. First, PBA by the Exact Couch Top (ECT, Varian) and the Imaging Couch Top (ICT, BrainLAB) was evaluated using a water-equivalent phantom. Second, PBA by the Type-S system (Med-Tec), ECT and ICT was compared with a spherical phantom. Third, the percentage depth dose (PDD) after passing through the ICT was measured for comparison with control PDD data. Fourth, the gantry angle dependency of PBA by the ICT was evaluated. Then, an equation for PBA correction was derived and correction factors for PBA at the isocenter were obtained. Finally, this method was applied to a patient with hepatoma. PBA of perpendicular beams by the ICT was 4.7% on average. The measured values became higher with increasing field size. PBA by the ICT was greater than that by the Type-S system and the ECT. PBA increased significantly as the angle of incidence increased, ranging from 4.3% at 180 deg to 11.2% at 120 deg. Calculated doses obtained with the equation and correction factors agreed quite well with the measured doses between 120 deg and 180 deg of angles of incidence. Also in the patient, PBA by the ICT was corrected quite well by the equation and correction factors. In conclusion, PBA by the ICT and its gantry angle dependency were observed. This simple method using the equation and correction factors appears useful for correcting the isocenter dose when the PBA effect cannot be corrected by a treatment planning system. (author)
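
    A minimal sketch of applying such an angle-dependent attenuation correction: interpolate a measured PBA fraction for the beam's angle of incidence and scale the dose accordingly; the end-point factors match the percentages quoted above, while the intermediate values are illustrative placeholders.

    ```python
    import numpy as np

    angles = np.array([120.0, 135.0, 150.0, 165.0, 180.0])  # angle of incidence, deg
    pba = np.array([0.112, 0.090, 0.070, 0.055, 0.043])     # attenuated fraction

    def transmitted_dose(planned_dose_gy: float, angle_deg: float) -> float:
        """Dose remaining after couch attenuation at the given incidence angle."""
        frac = np.interp(angle_deg, angles, pba)
        return planned_dose_gy * (1.0 - frac)

    print(transmitted_dose(2.0, 140.0))   # Gy after passing through the couch top
    ```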

  13. Precise predictions of higgs boson decays including the full one-loop corrections in supersymmetry

    Frisch, W.

    2011-01-01

    The Standard Model of elementary particle physics is a highly successful theory, describing the electromagnetic, strong and weak interactions of matter particles up to energy scales of a few hundred giga-electronvolts. Despite its great success in explaining experimental results correctly, there is hardly any doubt that the SM is an effective theory, which means that the theory loses its predictability at higher energies. Therefore, the Standard Model has to be extended in a proper way to describe physics at higher energies. One of the most promising concepts for extending the SM is supersymmetry, where for each particle of the SM one or more superpartner particles are introduced. The simplest and most attractive supersymmetric extension of the SM is called the Minimal Supersymmetric Standard Model (MSSM). Minimal refers to the additional field content, which is kept as low as possible. In fact the MSSM consists of the fields of the SM and their corresponding supersymmetric partner fields, as well as one additional Higgs doublet. The presence of this additional Higgs doublet leads to the existence of five physical Higgs bosons in the MSSM. The search for supersymmetric particles and Higgs bosons is one of the primary goals of the Large Hadron Collider (LHC) at the CERN laboratory, which produces collisions at sufficiently high energies to detect these particles. For the discovery of these new particles, precise predictions of the corresponding decay widths and branching ratios are mandatory. To match the precision of the LHC and the future ILC, Feynman amplitudes should be calculated at least to one-loop order. Since these calculations lead to so-called UV and IR divergences, it is essential to perform a renormalization procedure, in which the divergences are subtracted by a proper definition of counterterms. The goal of this work was to develop a program package which calculates all MSSM two-body Higgs decay widths and corresponding branching ratios at full one-loop order.

  14. A predictive tool for selective oxidation of hydrocarbons: optical basicity of catalysts

    Moriceau, P.; Lebouteiller, A.; Bordes, E.; Courtine, P. [Universite de Technologie de Compiegne, 60 (France). Dept. de Genie Chimique

    1998-12-31

    Whatever the composition of the catalyst (promoted, supported, multicomponent, etc.), it is possible to calculate its electron donor capacity Λ. However, one important question remains: how are the surface and the bulk values of Λ related? Most oxidation catalysts exhibit either a layered structure, as V2O5, for which approximately Λ_th ∝ Λ_surf, or a molecular structure, as polyoxometallates, for which no correction seems to be needed. Work is in progress on that point. Of great importance are also the actual oxidation and coordination states of the cations at the steady state: Λ values have been calculated from the compositions determined by XANES and XPS. Finally, the model is able to discriminate between 'paraffins' and olefins as reactants. These calibration curves should help to find new catalysts. (orig.)

  15. BANKRUPTCY PREDICTION MODEL WITH ZETAc OPTIMAL CUT-OFF SCORE TO CORRECT TYPE I ERRORS

    Mohamad Iwan

    2005-06-01

    This research attained the following results: (1) type I error is in fact 59.83 times more costly than type II error, (2) 22 ratios distinguish between bankrupt and non-bankrupt groups, (3) 2 financial ratios proved to be effective in predicting bankruptcy, (4) prediction using the ZETAc optimal cut-off score identifies more companies filing for bankruptcy within one year than prediction using the Hair et al. optimum cutting score, and (5) although prediction using the Hair et al. optimum cutting score is more accurate, prediction using the ZETAc optimal cut-off score proved able to minimize the cost incurred from classification errors.
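
    A minimal sketch of choosing a cut-off that minimizes expected misclassification cost when type I errors are far costlier than type II errors, using the cost ratio reported above; the score distributions are simulated placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    scores_bankrupt = rng.normal(-1.0, 1.0, 200)   # model scores (hypothetical)
    scores_healthy = rng.normal(1.0, 1.0, 800)
    COST_I, COST_II = 59.83, 1.0                   # relative costs of type I / II errors

    cutoffs = np.linspace(-4.0, 4.0, 401)
    costs = [COST_I * np.mean(scores_bankrupt > c)      # type I: bankrupt passed as healthy
             + COST_II * np.mean(scores_healthy <= c)   # type II: healthy flagged bankrupt
             for c in cutoffs]
    print("cost-optimal cut-off:", cutoffs[int(np.argmin(costs))])
    ```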

  16. Basic study on dynamic reactive-power control method with PV output prediction for solar inverter

    Ryunosuke Miyoshi

    2016-01-01

    Full Text Available To effectively utilize a photovoltaic (PV) system, reactive-power control methods for solar inverters have been considered. Among the various methods, constant-voltage control outputs less reactive power than the other methods. We have developed a constant-voltage control to reduce the reactive-power output. However, the developed constant-voltage control still outputs unnecessary reactive power because its control parameter is constant for every waveform of the PV output. To reduce the reactive-power output further, we propose a dynamic reactive-power control method with PV output prediction. In the proposed method, the control parameter is varied according to the properties of the predicted PV waveform. In this study, we performed numerical simulations using a distribution system model and confirmed that the proposed method reduces the reactive-power output while satisfying the voltage constraint.

  17. Precise predictions of H2O line shapes over a wide pressure range using simulations corrected by a single measurement

    Ngo, N. H.; Nguyen, H. T.; Tran, H.

    2018-03-01

    In this work, we show that precise predictions of the shapes of H2O rovibrational lines broadened by N2, over a wide pressure range, can be made using simulations corrected by a single measurement. For that, we use the partially-correlated speed-dependent Keilson-Storer (pcsdKS) model, whose parameters are deduced from molecular dynamics simulations and semi-classical calculations. This model takes into account the collision-induced velocity-change effects, the speed dependences of the collisional line width and shift, as well as the correlation between velocity and internal-state changes. For each considered transition, the model is corrected by using a parameter deduced from its broadening coefficient measured at a single pressure. The corrected-pcsdKS model is then used to simulate spectra for a wide pressure range. Direct comparisons of the corrected-pcsdKS calculated and measured spectra of 5 rovibrational lines of H2O for various pressures, from 0.1 to 1.2 atm, show very good agreement. Their maximum differences are in most cases well below 1%, much smaller than the residuals obtained when fitting the measurements with the Voigt line shape. This shows that the present procedure can be used to predict H2O line shapes for various pressure conditions, and thus the simulated spectra can be used to deduce refined line-shape parameters to complete spectroscopic databases in the absence of relevant experimental values.

  18. Prediction of e± elastic scattering cross-section ratio based on phenomenological two-photon exchange corrections

    Qattan, I. A.

    2017-06-01

    I present a prediction of the e± elastic scattering cross-section ratio, R_{e+e-}, as determined using a new parametrization of the two-photon exchange (TPE) corrections to the electron-proton elastic scattering cross section σ_R. The extracted ratio is compared to several previous phenomenological extractions, TPE hadronic calculations, and direct measurements from the comparison of electron and positron scattering. The TPE corrections and the ratio R_{e+e-} show a clear change of sign at low Q^2, which is necessary to explain the high-Q^2 form factor discrepancy while being consistent with the known Q^2 → 0 limit. While my predictions are generally in good agreement with previous extractions, TPE hadronic calculations, and existing world data, including the recent two measurements from the CLAS and VEPP-3 Novosibirsk experiments, they are larger than the new OLYMPUS measurements at larger Q^2 values.

  19. Exploring viewing behavior data from whole slide images to predict correctness of students' answers during practical exams in oral pathology.

    Walkowski, Slawomir; Lundin, Mikael; Szymas, Janusz; Lundin, Johan

    2015-01-01

    The way of viewing whole slide images (WSI) can be tracked and analyzed. In particular, it can be useful to learn how medical students view WSIs during exams and how their viewing behavior is correlated with the correctness of the answers they give. We used a software-based view path tracking method that enabled gathering data about the viewing behavior of multiple simultaneous WSI users. This approach was implemented and applied during two practical exams in oral pathology in 2012 (88 students) and 2013 (91 students), which were based on questions with attached WSIs. Gathered data were visualized and analyzed in multiple ways. As part of an extended analysis, we tried to use machine learning approaches to predict the correctness of students' answers based on how they viewed WSIs. We compared the results of the analyses for 2012 and 2013 - done for a single question, for student groups, and for a set of questions. The overall patterns were generally consistent across the two years. Moreover, viewing behavior data appeared to have a certain potential for predicting answers' correctness, and some outcomes of the machine learning approaches were in the right direction. However, the general prediction results were not satisfactory in terms of precision and recall. Our work confirmed that the view path tracking method is useful for discovering the viewing behavior of students analyzing WSIs. It provided multiple useful insights in this area, and the general results of our analyses were consistent across the two exams. On the other hand, predicting answers' correctness appeared to be a difficult task - students' answers seem to be often unpredictable.

  20. Exploring viewing behavior data from whole slide images to predict correctness of students' answers during practical exams in oral pathology

    Slawomir Walkowski

    2015-01-01

    Full Text Available The way of viewing whole slide images (WSI) can be tracked and analyzed. In particular, it can be useful to learn how medical students view WSIs during exams and how their viewing behavior is correlated with the correctness of the answers they give. We used a software-based view path tracking method that enabled gathering data about the viewing behavior of multiple simultaneous WSI users. This approach was implemented and applied during two practical exams in oral pathology in 2012 (88 students) and 2013 (91 students), which were based on questions with attached WSIs. Gathered data were visualized and analyzed in multiple ways. As part of an extended analysis, we tried to use machine learning approaches to predict the correctness of students' answers based on how they viewed WSIs. We compared the results of the analyses for 2012 and 2013 - done for a single question, for student groups, and for a set of questions. The overall patterns were generally consistent across the two years. Moreover, viewing behavior data appeared to have a certain potential for predicting answers' correctness, and some outcomes of the machine learning approaches were in the right direction. However, the general prediction results were not satisfactory in terms of precision and recall. Our work confirmed that the view path tracking method is useful for discovering the viewing behavior of students analyzing WSIs. It provided multiple useful insights in this area, and the general results of our analyses were consistent across the two exams. On the other hand, predicting answers' correctness appeared to be a difficult task - students' answers seem to be often unpredictable.

  1. Reducing overlay sampling for APC-based correction per exposure by replacing measured data with computational prediction

    Noyes, Ben F.; Mokaberi, Babak; Oh, Jong Hun; Kim, Hyun Sik; Sung, Jun Ha; Kea, Marc

    2016-03-01

    One of the keys to successful mass production of sub-20nm nodes in the semiconductor industry is the development of an overlay correction strategy that can meet specifications, reduce the number of layers that require dedicated chuck overlay, and minimize measurement time. Three important aspects of this strategy are: correction per exposure (CPE), integrated metrology (IM), and the prioritization of automated correction over manual subrecipes. The first and third aspects are accomplished through an APC system that uses measurements from production lots to generate CPE corrections that are dynamically applied to future lots. The drawback of this method is that production overlay sampling must be extremely high in order to provide the system with enough data to generate CPE. That drawback makes IM particularly difficult because of the throughput impact that can be created on expensive bottleneck photolithography process tools. The goal is to realize the cycle time and feedback benefits of IM coupled with the enhanced overlay correction capability of automated CPE without impacting process tool throughput. This paper will discuss the development of a system that sends measured data with reduced sampling via an optimized layout to the exposure tool's computational modelling platform to predict and create "upsampled" overlay data in a customizable output layout that is compatible with the fab user CPE APC system. The result is dynamic CPE without the burden of extensive measurement time, which leads to increased utilization of IM.

  2. Plateletpheresis efficiency and mathematical correction of software-derived platelet yield prediction: A linear regression and ROC modeling approach.

    Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David

    2017-10-01

    Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed cell processor accuracy of predicted platelet (PLT) yields with the goal of better predicting DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield and actual PLT yield were statistically evaluated. Software prediction was optimized by linear regression analysis, and its optimal cut-off for obtaining a DP was assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; donors were men on 271 (89.7%) occasions and women on 31 (10.3%). Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486). A simple correction derived from linear regression analysis accurately corrected this underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP.
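
    The record above describes a generic recipe: fit a linear correction to the device's predicted yield, then choose a decision threshold by ROC analysis. A minimal sketch of that recipe in Python; all data, the Youden criterion, and the DP threshold of 6.0 x 10^11 PLT are illustrative assumptions, not the study's values.

```python
# Hypothetical sketch: correct a device-predicted platelet yield with linear
# regression, then pick a ROC-optimal cutoff for double-product (DP) collection.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
predicted = rng.normal(6.0, 1.0, 302)                      # software-predicted yield
actual = 0.9 * predicted + 0.8 + rng.normal(0, 0.3, 302)   # toy "actual" yield

# Step 1: regress actual on predicted yield to obtain a correction equation.
reg = LinearRegression().fit(predicted.reshape(-1, 1), actual)
corrected = reg.predict(predicted.reshape(-1, 1))

# Step 2: ROC analysis -- find the corrected-yield cutoff that best predicts
# a double product (actual yield >= 6.0 x 10^11 in this toy example).
is_dp = (actual >= 6.0).astype(int)
fpr, tpr, thr = roc_curve(is_dp, corrected)
best = np.argmax(tpr - fpr)                                # Youden's J statistic
print(f"correction: yield = {reg.coef_[0]:.3f}*pred + {reg.intercept_:.3f}")
print(f"DP cutoff on corrected yield: {thr[best]:.2f}")
```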

  3. Simple prediction method of lumbar lordosis for planning of lumbar corrective surgery: radiological analysis in a Korean population.

    Lee, Chong Suh; Chung, Sung Soo; Park, Se Jun; Kim, Dong Min; Shin, Seong Kee

    2014-01-01

    This study aimed to derive a lordosis predictive equation using the pelvic incidence and to establish a simple method of predicting lumbar lordosis for planning lumbar corrective surgery in Asians. Eighty-six asymptomatic volunteers were enrolled in the study. The maximal lumbar lordosis (MLL), lower lumbar lordosis (LLL), pelvic incidence (PI), and sacral slope (SS) were measured. The correlations between the parameters were analyzed using Pearson correlation analysis. Predictive equations of lumbar lordosis through simple regression analysis of the parameters and simple predictive values of lumbar lordosis using PI were derived. The PI strongly correlated with the SS (r = 0.78), and a strong correlation was found between the SS and LLL (r = 0.89), and between the SS and MLL (r = 0.83). Based on these correlations, the following predictive equations of lumbar lordosis were found: SS = 0.80 + 0.74 PI (r = 0.78, R² = 0.61), LLL = 5.20 + 0.87 SS (r = 0.89, R² = 0.80), and MLL = 17.41 + 0.96 SS (r = 0.83, R² = 0.68). When PI was between 30° and 35°, 40° and 50°, and 55° and 60°, the equations predicted that MLL would be PI + 10°, PI + 5° and PI, and that LLL would be PI - 5°, PI - 10° and PI - 15°, respectively. This simple calculation method can provide a more appropriate and simpler prediction of lumbar lordosis for Asian populations. The prediction of lumbar lordosis should be used as a reference for surgeons planning to restore the lumbar lordosis in lumbar corrective surgery.
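
    The reported equations chain directly; a small helper using only the coefficients quoted in the abstract (function and variable names are ours):

```python
# Minimal helper implementing the regression equations reported above
# (SS = 0.80 + 0.74*PI, LLL = 5.20 + 0.87*SS, MLL = 17.41 + 0.96*SS).
def predict_lordosis(pi_deg: float) -> dict:
    ss = 0.80 + 0.74 * pi_deg          # sacral slope from pelvic incidence
    return {
        "SS": ss,
        "LLL": 5.20 + 0.87 * ss,       # lower lumbar lordosis
        "MLL": 17.41 + 0.96 * ss,      # maximal lumbar lordosis
    }

# Example: PI = 50 deg falls in the 40-50 deg band, where the paper's rule of
# thumb gives MLL ~ PI + 5 and LLL ~ PI - 10.
print(predict_lordosis(50.0))  # MLL ~ 53.7, LLL ~ 38.1 -- close to 55 and 40
```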

  4. Dielectric properties for SF6 and SF6 mixtures predicted from basic data

    Kline, L.E.; Davies, D.K.; Chen, C.L.; Chantry, P.J.

    1979-01-01

    We have calculated α and η, the ionization and attachment coefficients, and (E/N)*, the limiting breakdown electric-field-to-gas-density ratio, in SF6 and SF6 mixtures by numerically solving the Boltzmann equation for the electron energy distribution. The calculations require a knowledge of several electron collision cross sections. Published momentum transfer and ionization cross sections for SF6 were used. We measured various attachment cross sections for SF6 using electron-beam techniques with mass spectrometric ion detection. We determined a total cross section for electronic excitation of SF6 by comparing the predicted values of α, η, and (E/N)* with our measured values obtained from spatial current growth experiments in SF6 in uniform fields over an extended range of E/N. With this self-consistent set of SF6 cross sections, together with published He and N2 cross sections, it was then possible to predict the dielectric properties of SF6-He and SF6-N2 mixtures. Published experimental values of α for the SF6-He mixtures lie between the values of α calculated with and without ionization of SF6 by excited He atoms. Published experimental values of (E/N)* agree with our calculations to within 5% in both the SF6-He and the SF6-N2 mixtures.

  5. Dielectric properties for SF6 and SF6 mixtures predicted from basic data

    Kline, L.E.; Davies, D.K.; Chen, C.L.; Chantry, P.J.

    1978-01-01

    α and η, the ionization and attachment coefficients, and (E/N)*, the limiting breakdown electric-field-to-gas-density ratio, in SF6 and SF6 mixtures were calculated by numerically solving the Boltzmann equation for the electron energy distribution. The calculations require a knowledge of several electron collision cross sections. Published momentum transfer and ionization cross sections for SF6 were used. Various attachment cross sections for SF6 were measured by using electron-beam techniques with mass spectrometric ion detection. A total cross section for electronic excitation of SF6 was determined by comparing the predicted values of α, η, and (E/N)* with measured values obtained from spatial current growth experiments in SF6 in uniform fields over an extended range of E/N. With this self-consistent set of SF6 cross sections, together with published He cross sections, it was then possible to predict the dielectric properties of SF6-He mixtures. Published experimental values of α for these mixtures lie between the values of α calculated with and without ionization of SF6 by excited He atoms. Published experimental values of (E/N)* agree with the calculations to within 5%.

  6. Predicting consumer liking and preference based on emotional responses and sensory perception: A study with basic taste solutions.

    Samant, Shilpa S; Chapko, Matthew J; Seo, Han-Seok

    2017-10-01

    Traditional methods of sensory testing focus on capturing information about multisensory perceptions, but do not necessarily measure the emotions elicited by foods and beverages. The objective of this study was to develop an optimal model for predicting overall liking (rating) and preference (choice) based on taste intensity and evoked emotions. One hundred and two participants (51 females) were asked to taste water, sucrose, citric acid, salt, and caffeine solutions. Their emotional responses toward each sample were measured by a combination of a self-reported emotion questionnaire (EsSense25), facial expressions, and autonomic nervous system (ANS) responses. In addition, their perceived intensity and overall liking were measured. After a break, participants re-tasted the samples and ranked them according to their preference. The results showed that emotional responses measured using the self-reported emotion questionnaire and facial expression analysis, along with perceived taste intensity, performed best in predicting overall liking as well as preference, while ANS measures showed limited contribution. Contrary to some previous research, this study demonstrated that not only negative emotions but also positive ones could help predict consumer liking and preference. In addition, since there were subtle differences between the prediction models for overall liking and preference, both aspects should be taken into account to understand consumer behavior. In conclusion, combining evoked emotions with sensory perception could help better understand consumer acceptance as well as preference toward basic taste solutions.

  7. Do Basic Skills Predict Youth Unemployment (16- to 24-Year-Olds) Also when Controlled for Accomplished Upper-Secondary School? A Cross-Country Comparison

    Lundetrae, Kjersti; Gabrielsen, Egil; Mykletun, Reidar

    2010-01-01

    Basic skills and educational level are closely related, and both might affect employment. Data from the Adult Literacy and Life Skills Survey were used to examine whether basic skills in terms of literacy and numeracy predicted youth unemployment (16-24 years) while controlling for educational level. Stepwise logistic regression showed that in…

  8. Basic geriatric assessment does not predict in-hospital mortality after PEG placement

    Smoliner Christine

    2012-09-01

    …95% CI 1.224-2.839) were identified as statistical risk factors for in-hospital death. Cognitive status did not have an influence on mortality (OR 0.447, 95% CI 0.248-1.650). Conclusion: In a nationwide geriatric database, no component of the basic geriatric assessment emerged as a significant risk factor for mortality after PEG placement, emphasizing individual decision-making.

  9. Real time prediction and correction of ADCS problems in LEO satellites using fuzzy logic

    Yassin Mounir Yassin

    2017-06-01

    This approach is concerned with adapting the operations of the attitude determination and control subsystem (ADCS) of low Earth orbit (LEO) satellites by analyzing the telemetry readings received by the mission control center and then responding to ADCS off-nominal situations. This can be achieved by sending corrective operational tele-commands in real time. Our approach maps off-nominal telemetry readings to corrective actions through a set of fuzzy rules based on understanding the ADCS modes derived from the satellite telemetry readings. Responding in real time gives us a chance to avoid risky situations. The approach is tested on the EgyptSat-1 engineering model, which we used to simulate the results.

  10. A Bayesian network based framework for real-time crash prediction on the basic freeway segments of urban expressways.

    Hossain, Moinul; Muromachi, Yasunori

    2012-03-01

    The concept of measuring the crash risk for a very short time window in the near future is gaining more practicality due to recent advancements in the fields of information systems and traffic sensor technology. Although some real-time crash prediction models have already been proposed, they are still primitive in nature and require substantial improvements to be implemented in real life. This manuscript investigates the major shortcomings of the existing models and offers solutions to overcome them with an improved framework and modeling method. It employs a random multinomial logit model to identify the most important predictors as well as the most suitable detector locations from which to acquire data to build such a model. Afterwards, it applies a Bayesian belief net (BBN) to build the real-time crash prediction model. The model has been constructed using high-resolution detector data collected from the Shibuya 3 and Shinjuku 4 expressways under the jurisdiction of Tokyo Metropolitan Expressway Company Limited, Japan. It has been built specifically for basic freeway segments, and it predicts the chance of formation of a hazardous traffic condition within the next 4-9 min for a particular 250-meter-long road section. The performance evaluation results reflect that, at an average threshold value, the model is able to successfully classify 66% of the future crashes with a false alarm rate of less than 20%.
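
    A toy stand-in for the described pipeline, showing the predictor-selection-then-probabilistic-classification flow: here a random forest substitutes for the random multinomial logit, a naive Bayes classifier for the learned BBN, and the detector data are synthetic; none of this is the paper's implementation.

```python
# Simplified sketch: rank detector features, then score the crash risk of the
# next time window and trade detection rate against false alarms.
import numpy as np
from sklearn.ensemble import RandomForestClassifier   # stand-in for multinomial logit
from sklearn.naive_bayes import GaussianNB            # stand-in for the BBN
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)
n = 5000
X = rng.normal(size=(n, 6))                # speed/occupancy stats, 6 detector channels
risk = X[:, 0] - 0.8 * X[:, 2] + rng.normal(0, 1, n)
y = (risk > np.quantile(risk, 0.95)).astype(int)      # rare hazardous condition

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# Step 1: importance ranking selects the most informative detector features.
rank = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
top = np.argsort(rank.feature_importances_)[::-1][:3]

# Step 2: probabilistic classifier on the selected features; the threshold sets
# the operating point, as in the paper's 66% detection / <20% false alarm figure.
clf = GaussianNB().fit(Xtr[:, top], ytr)
p = clf.predict_proba(Xte[:, top])[:, 1]
tn, fp, fn, tp = confusion_matrix(yte, p > 0.1).ravel()
print(f"detection rate: {tp/(tp+fn):.2f}, false alarm rate: {fp/(fp+tn):.2f}")
```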

  11. Accurate density functional prediction of molecular electron affinity with the scaling corrected Kohn–Sham frontier orbital energies

    Zhang, DaDi; Yang, Xiaolong; Zheng, Xiao; Yang, Weitao

    2018-04-01

    Electron affinity (EA) is the energy released when an additional electron is attached to an atom or a molecule. EA is a fundamental thermochemical property, and it is closely pertinent to other important properties such as electronegativity and hardness. However, accurate prediction of EA is difficult with density functional theory methods. The somewhat large error of the calculated EAs originates mainly from the intrinsic delocalisation error associated with the approximate exchange-correlation functional. In this work, we employ a previously developed non-empirical global scaling correction approach, which explicitly imposes the Perdew-Parr-Levy-Balduz condition on the approximate functional, and achieve substantially improved accuracy for the calculated EAs. In our approach, the EA is given by the scaling-corrected Kohn-Sham lowest unoccupied molecular orbital energy of the neutral molecule, without the need to carry out a self-consistent-field calculation for the anion.
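
    The working relations behind such scaling corrections can be written compactly; a hedged LaTeX sketch with assumed notation, not the paper's exact equations:

```latex
% EA read off the scaling-corrected LUMO energy of the neutral (N-electron)
% system; the correction restores the Perdew-Parr-Levy-Balduz linearity
% condition on the energy for fractional electron numbers.
\begin{align}
  \mathrm{EA} &\approx -\,\varepsilon^{\mathrm{sc}}_{\mathrm{LUMO}}(N), \\
  E(N+\delta) &= (1-\delta)\,E(N) + \delta\,E(N+1), \qquad 0 \le \delta \le 1.
\end{align}
```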

  12. A national prediction model for PM2.5 component exposures and measurement error-corrected health effect inference.

    Bergen, Silas; Sheppard, Lianne; Sampson, Paul D; Kim, Sun-Young; Richards, Mark; Vedal, Sverre; Kaufman, Joel D; Szpiro, Adam A

    2013-09-01

    Studies estimating health effects of long-term air pollution exposure often use a two-stage approach: building exposure models to assign individual-level exposures, which are then used in regression analyses. This requires accurate exposure modeling and careful treatment of exposure measurement error. To illustrate the importance of accounting for exposure model characteristics in two-stage air pollution studies, we considered a case study based on data from the Multi-Ethnic Study of Atherosclerosis (MESA). We built national spatial exposure models that used partial least squares and universal kriging to estimate annual average concentrations of four PM2.5 components: elemental carbon (EC), organic carbon (OC), silicon (Si), and sulfur (S). We predicted PM2.5 component exposures for the MESA cohort and estimated cross-sectional associations with carotid intima-media thickness (CIMT), adjusting for subject-specific covariates. We corrected for measurement error using recently developed methods that account for the spatial structure of predicted exposures. Our models performed well, with cross-validated R² values ranging from 0.62 to 0.95. Naïve analyses that did not account for measurement error indicated statistically significant associations between CIMT and exposure to OC, Si, and S. EC and OC exhibited little spatial correlation, and the corrected inference was unchanged from the naïve analysis. The Si and S exposure surfaces displayed notable spatial correlation, resulting in corrected confidence intervals (CIs) that were 50% wider than the naïve CIs, but that were still statistically significant. The impact of correcting for measurement error on health effect inference is concordant with the degree of spatial correlation in the exposure surfaces. Exposure model characteristics must be considered when performing two-stage air pollution epidemiologic analyses because naïve health effect inference may be inappropriate.

  13. Real-time axial motion detection and correction for single photon emission computed tomography using a linear prediction filter

    Saba, V.; Setayeshi, S.; Ghannadi-Maragheh, M.

    2011-01-01

    We have developed an algorithm for real-time detection and complete correction of patient motion effects during single photon emission computed tomography. The algorithm is based on a linear prediction filter (LPC). The new prediction of projection data algorithm (PPDA) detects most motions - such as those of the head, legs, and hands - by comparing the predicted and measured frame data. When the data acquisition for a specific frame is completed, the accuracy of the acquired data is evaluated by the PPDA. If patient motion is detected, the scanning procedure is stopped. After the patient rests in his or her true position, data acquisition is repeated only for the corrupted frame and the scanning procedure is continued. Various experimental data were used to validate the motion detection algorithm; on the whole, the proposed method was tested with approximately 100 test cases. The PPDA shows promising results. Using the PPDA enables us to prevent the scanner from collecting disturbed data during the scan and to replace them with motion-free data by real-time rescanning of the corrupted frames. As a result, the effects of patient motion are corrected in real time.
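
    A toy illustration of the prediction-of-projection-data idea (function names, the AR order, and the threshold are our assumptions): fit a linear prediction model to the per-frame statistics acquired so far and flag a new frame whose prediction error is anomalously large.

```python
# Sketch: least-squares linear prediction of frame data, with a residual-based
# motion flag. Data here are synthetic summed counts per frame.
import numpy as np

def lpc_coeffs(x: np.ndarray, order: int) -> np.ndarray:
    """Least-squares AR(order) coefficients for one-step-ahead prediction."""
    rows = [x[i:i + order] for i in range(len(x) - order)]
    A, b = np.array(rows), x[order:]
    return np.linalg.lstsq(A, b, rcond=None)[0]

def motion_detected(history: np.ndarray, new_frame: float,
                    order: int = 4, k: float = 3.0) -> bool:
    a = lpc_coeffs(history, order)
    preds = np.array([history[i:i + order] @ a
                      for i in range(len(history) - order)])
    resid_std = (history[order:] - preds).std()
    next_pred = history[-order:] @ a
    return abs(new_frame - next_pred) > k * resid_std   # large prediction error

rng = np.random.default_rng(2)
frames = rng.normal(1000.0, 5.0, 60)      # summed counts of previously acquired frames
print(motion_detected(frames, 1004.0))    # consistent frame -> likely False
print(motion_detected(frames, 1200.0))    # abrupt jump -> True
```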

  14. Prediction of pKa Values for Neutral and Basic Drugs based on Hybrid Artificial Intelligence Methods.

    Li, Mengshan; Zhang, Huaijing; Chen, Bingsheng; Wu, Yan; Guan, Lixin

    2018-03-05

    The pKa value of drugs is an important parameter in drug design and pharmacology. In this paper, an improved particle swarm optimization (PSO) algorithm is proposed based on population entropy diversity. In the improved algorithm, when the population entropy was higher than the set maximum threshold, the convergence strategy was adopted; when the population entropy was lower than the set minimum threshold, the divergence strategy was adopted; and when the population entropy was between the maximum and minimum thresholds, the self-adaptive adjustment strategy was maintained. The improved PSO algorithm was applied to the training of a radial basis function artificial neural network (RBF ANN) model and the selection of molecular descriptors. A quantitative structure-activity relationship model based on an RBF ANN trained by the improved PSO algorithm was proposed to predict the pKa values of 74 neutral and basic drugs, and was then validated on another database containing 20 molecules. The validation results showed that the model had good prediction performance. The absolute average relative error, root mean square error, and squared correlation coefficient were 0.3105, 0.0411, and 0.9685, respectively. The model can be used as a reference for exploring other quantitative structure-activity relationships.
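
    A compact sketch of the entropy-guided switching idea; the histogram-based entropy estimator and the threshold values are assumptions, and the actual velocity-update rules are in the paper:

```python
# Estimate swarm diversity via the entropy of a position histogram, then pick
# a convergence, divergence, or self-adaptive update strategy accordingly.
import numpy as np

def population_entropy(positions: np.ndarray, bins: int = 10) -> float:
    hist, _ = np.histogram(positions, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def choose_strategy(h: float, h_min: float = 0.8, h_max: float = 1.8) -> str:
    if h > h_max:
        return "converge"       # diversity too high: contract toward the best
    if h < h_min:
        return "diverge"        # diversity too low: re-scatter the particles
    return "self-adaptive"      # otherwise keep the adaptive adjustment

swarm = np.random.default_rng(3).uniform(-5, 5, 40)   # 1D particle positions
h = population_entropy(swarm)
print(f"entropy = {h:.2f} -> strategy: {choose_strategy(h)}")
```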

  15. Antibody modeling using the prediction of immunoglobulin structure (PIGS) web server [corrected].

    Marcatili, Paolo; Olimpieri, Pier Paolo; Chailyan, Anna; Tramontano, Anna

    2014-12-01

    Antibodies (or immunoglobulins) are crucial for defending organisms from pathogens, but they are also key players in many medical, diagnostic and biotechnological applications. The ability to predict their structure and the specific residues involved in antigen recognition has several useful applications in all of these areas. Over the years, we have developed or collaborated in developing a strategy that enables researchers to predict the 3D structure of antibodies with a very satisfactory accuracy. The strategy is completely automated and extremely fast, requiring only a few minutes (∼10 min on average) to build a structural model of an antibody. It is based on the concept of canonical structures of antibody loops and on our understanding of the way light and heavy chains pack together.

  16. Opportunities to Learn in School and at Home: How can they predict students' understanding of basic science concepts and principles?

    Wang, Su; Liu, Xiufeng; Zhao, Yandong

    2012-09-01

    As the breadth and depth of economic reforms increase in China, growing attention is being paid to equality in the opportunities to learn science by students of various backgrounds. In early 2009, the Chinese Ministry of Education and Ministry of Science and Technology jointly sponsored a national survey of urban eighth-grade students' science literacy along with their family and school backgrounds. The present study focused on students' understanding of basic science concepts and principles (BSCP), a subset of science literacy. The sample analyzed included 3,031 students from 109 randomly selected classes/schools. Correlation analysis, one-way analysis of variance, and two-level linear regression were conducted. The results showed that having a refrigerator, internet, more books, parents purchasing books and magazines related to school work, a higher father's education level, and parents' higher expectation of the education level of their child significantly predicted higher BSCP scores; having siblings at home, owning an apartment, and frequently contacting teachers about the child significantly predicted lower BSCP scores. At the school level, the results showed that being in first-tier or key schools, having school libraries, science popularization galleries, computer labs, adequate equipment for teaching, a special budget for teacher training, a special budget for science equipment, and mutual trust between teachers and students significantly predicted higher BSCP scores; and having science and technology rooms, offering science and technology interest clubs, a special budget for science curriculum development, and a special budget for science social practice activities significantly predicted lower BSCP scores. The implications of the above findings are discussed.
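
    The "two-level linear regression" named above is a standard mixed model with students nested in schools; an illustrative sketch with synthetic data and assumed variable names:

```python
# Random-intercept model: BSCP score regressed on home-background predictors,
# with a school-level random effect. All data and names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_schools, n_per = 30, 25
school = np.repeat(np.arange(n_schools), n_per)
school_effect = rng.normal(0, 2, n_schools)[school]      # level-2 variation
books = rng.integers(0, 200, n_schools * n_per)
father_edu = rng.integers(1, 6, n_schools * n_per)
bscp = (50 + 0.02 * books + 1.5 * father_edu
        + school_effect + rng.normal(0, 5, len(school)))
df = pd.DataFrame(dict(bscp=bscp, books=books, father_edu=father_edu, school=school))

model = smf.mixedlm("bscp ~ books + father_edu", df, groups=df["school"]).fit()
print(model.summary())   # fixed effects estimate the home-background predictors
```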

  17. Immediate postoperative outcome of orthognathic surgical planning, and prediction of positional changes in hard and soft tissue, independently of the extent and direction of the surgical corrections required

    Donatsky, Ole; Bjørn-Jørgensen, Jens; Hermund, Niels Ulrich

    2011-01-01

    …orthognathic correction using the computerised, cephalometric, orthognathic, surgical planning system (TIOPS). Preoperative cephalograms were analysed and treatment plans and prediction tracings produced by computerised interactive simulation. The planned changes were transferred to models and finally … with the presently included soft tissue algorithms, the current study shows relatively high mean predictability of the immediate postoperative hard and soft tissue outcome, independent of the extent and direction of the required orthognathic correction. Because of the relatively high individual variability, caution …

  18. Prediction of CT Substitutes from MR Images Based on Local Diffeomorphic Mapping for Brain PET Attenuation Correction.

    Wu, Yao; Yang, Wei; Lu, Lijun; Lu, Zhentai; Zhong, Liming; Huang, Meiyan; Feng, Yanqiu; Feng, Qianjin; Chen, Wufan

    2016-10-01

    Attenuation correction is important for PET reconstruction. In PET/MR, MR intensities are not directly related to the attenuation coefficients that are needed in PET imaging. The attenuation coefficient map can be derived from CT images. Therefore, prediction of CT substitutes from MR images is desired for attenuation correction in PET/MR. This study presents a patch-based method for CT prediction from MR images, generating attenuation maps for PET reconstruction. Because no global relation exists between MR and CT intensities, we propose local diffeomorphic mapping (LDM) for CT prediction. In LDM, we assume that MR and CT patches are located on two nonlinear manifolds and that the mapping from the MR manifold to the CT manifold approximates a diffeomorphism under a local constraint. Locality is important in LDM and is enforced by the following techniques. The first is local dictionary construction, wherein, for each patch in the testing MR image, a local search window is used to extract patches from training MR/CT pairs to construct MR and CT dictionaries. The k-nearest neighbors and an outlier detection strategy are then used to constrain the locality in the MR and CT dictionaries. The second is local linear representation, wherein local anchor embedding is used to solve the MR dictionary coefficients when representing the MR testing sample. Under these local constraints, dictionary coefficients are linearly transferred from the MR manifold to the CT manifold and used to combine CT training samples to generate CT predictions. Our dataset contains 13 healthy subjects, each with T1- and T2-weighted MR and CT brain images. This method provides CT predictions with a mean absolute error of 110.1 Hounsfield units, Pearson linear correlation of 0.82, peak signal-to-noise ratio of 24.81 dB, and Dice coefficient in bone regions of 0.84 as compared with real CTs. CT substitute-based PET reconstruction has a regression slope of 1.0084 and R² of 0.9903 compared with real CT-based PET. In this method, no …

  19. Iris-fixated phakic intraocular lens implantation to correct myopia and a predictive model of endothelial cell loss.

    Bouheraoua, Nacim; Bonnet, Clemence; Labbé, Antoine; Sandali, Otman; Lecuen, Nicolas; Ameline, Barbara; Borderie, Vincent; Laroche, Laurent

    2015-11-01

    To report long-term results of the Artisan phakic intraocular lens (pIOL) to correct myopia and to propose a model predicting endothelial cell loss after pIOL implantation. Quinze-Vingts National Ophthalmology Hospital, Paris, France. Retrospective, interventional case series. Uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), and central endothelial cell count (ECC) were determined before and at yearly intervals up to 5 years after pIOL implantation. Linear model analysis was performed to present a model that describes endothelial cell loss as a linear decrease and an additional decrease depending on postoperative loss. A total of 49 patients (68 eyes) implanted with pIOLs from January 2000 to January 2009 were evaluated. The mean preoperative and final spherical equivalent (SE) were -13 ± 4.10 and -0.75 ± 0.74 diopters (D), respectively. The mean preoperative and final central ECC were 2629 ± 366 and 2250 ± 454 cells/mm², respectively. There were no intraoperative complications for any of the eyes. One eye required surgery for repositioning the pIOL, and 1 eye required pIOL exchange for postoperative refractive error. The model predicted that for patients with preoperative ECC of 3000, 2500, and 2000 cells/mm², a critical ECC of 1500 cells/mm² will be reached at 39, 28, and 15 years after implantation, respectively. Implantation of the pIOL was an effective and stable procedure after 5 years of follow-up. The presented model predicted EC loss after pIOL implantation, which can assist ophthalmologists in patient selection and follow-up.

  20. Statistical model for predicting correct amount of deoxidizer of Al-killed grade casted at slab continuous caster of Pakistan steel

    Siddiqui, A.R.; Khan, M.M.A.; Ismail, B.M.

    1999-01-01

    Oxygen is blown in the converter process to oxidize hot metal. This introduces dissolved oxygen into the metal, which may cause embrittlement, voids, inclusions and other undesirable properties in steel. The steel bath at the time of tapping contains 400 to 800 ppm oxygen. Deoxidation is carried out during tapping by adding appropriate amounts of ferromanganese, ferrosilicon and/or aluminum or other special deoxidizers to the tap ladle. In this research, aluminum-killed steel grades cast at the slab caster of Pakistan Steel were investigated. The amount of aluminum added is critical: if less than the required quantity is added, the killing of oxygen is incomplete, which leaves the steel unclean. Adding more aluminum than necessary not only increases the cost of production but also produces a higher amount of alumina, which causes nozzle clogging and increases losses. The purpose of the research is to develop a statistical model that predicts the correct amount of aluminum addition for complete deoxidation of aluminum-killed grades cast at the slab continuous caster of Pakistan Steel. In the model, the aluminum added is taken as the dependent variable, while tapping temperature, turndown carbon composition, turndown manganese composition and oxygen content in the steel are the independent variables. This work is based on operational practice on a 130-ton basic oxygen furnace.
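
    A hedged sketch of such a model; the data and coefficients below are synthetic, whereas the real ones would come from the 130-ton BOF operating records:

```python
# Multiple linear regression: aluminum addition as the dependent variable,
# tap temperature, turndown C/Mn and dissolved oxygen as independent variables.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 130
temp_c = rng.uniform(1620, 1700, n)        # tapping temperature (deg C)
carbon = rng.uniform(0.03, 0.08, n)        # turndown carbon (wt%)
manganese = rng.uniform(0.10, 0.25, n)     # turndown manganese (wt%)
oxygen_ppm = rng.uniform(400, 800, n)      # dissolved oxygen at tap (ppm)
al_kg = (0.25 * oxygen_ppm - 800 * carbon
         + 0.05 * (temp_c - 1600) + rng.normal(0, 5, n))   # toy target

X = np.column_stack([temp_c, carbon, manganese, oxygen_ppm])
model = LinearRegression().fit(X, al_kg)

heat = np.array([[1660, 0.05, 0.18, 600]])   # a hypothetical heat
print(f"suggested Al addition: {model.predict(heat)[0]:.1f} kg")
```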

  1. SU-F-J-219: Predicting Ventilation Change Due to Radiation Therapy: Dependency On Pre-RT Ventilation and Effort Correction

    Patton, T; Du, K; Bayouth, J [University of Wisconsin, Madison, WI (United States); Christensen, G; Reinhardt, J [University of Iowa, Iowa City, IA (United States)

    2016-06-15

    Purpose: Ventilation change caused by radiation therapy (RT) can be predicted using four-dimensional computed tomography (4DCT) and image registration. This study tested the dependency of predicted post-RT ventilation on effort correction and pre-RT lung function. Methods: Pre-RT and 3-month post-RT 4DCT images were obtained for 13 patients. The 4DCT images were used to create ventilation maps using a deformable image registration based Jacobian expansion calculation. The post-RT ventilation maps were predicted in four different ways using the dose delivered, pre-RT ventilation, and effort correction. The pre-RT ventilation and effort correction were toggled to determine dependency. The four different predicted ventilation maps were compared to the post-RT ventilation map calculated from image registration to establish the best prediction method. Gamma pass rates were used to compare the different maps with criteria of 2 mm distance-to-agreement and 6% ventilation difference. Paired t-tests of gamma pass rates were used to determine significant differences between the maps. Additional gamma pass rates were calculated using only voxels receiving over 20 Gy. Results: The predicted post-RT ventilation maps were in agreement with the actual post-RT maps in the following percentages of voxels averaged over all subjects: 71% with pre-RT ventilation and effort correction, 69% with no pre-RT ventilation and effort correction, 60% with pre-RT ventilation and no effort correction, and 58% with no pre-RT ventilation and no effort correction. When analyzing only voxels receiving over 20 Gy, the gamma pass rates were, respectively, 74%, 69%, 65%, and 55%. The prediction including both pre-RT ventilation and effort correction was the only prediction with significant improvement over using no prediction (p<0.02). Conclusion: Post-RT ventilation is best predicted using both pre-RT ventilation and effort correction; this was the only prediction that provided a significant improvement.

  2. Seasonal predictions of equatorial Atlantic SST in a low-resolution CGCM with surface heat flux correction

    Dippe, Tina; Greatbatch, Richard; Ding, Hui

    2016-04-01

    The dominant mode of interannual variability in tropical Atlantic sea surface temperatures (SSTs) is the Atlantic Niño or Zonal Mode. Akin to the El Niño-Southern Oscillation in the Pacific sector, it is able to impact the climate both of the adjacent equatorial African continent and of remote regions. Due to heavy biases in the mean-state climate of the equatorial-to-subtropical Atlantic, however, most state-of-the-art coupled global climate models (CGCMs) are unable to realistically simulate equatorial Atlantic variability. In this study, the Kiel Climate Model (KCM) is used to investigate the impact of a simple bias alleviation technique on the predictability of equatorial Atlantic SSTs. Two sets of seasonal forecasting experiments are performed: an experiment using the standard KCM (STD), and an experiment with additional surface heat flux correction (FLX) that efficiently removes the SST bias from simulations. Initial conditions for both experiments are generated by the KCM run in partially coupled mode, a simple assimilation technique that forces the KCM with observed wind stress anomalies and preserves SST as a fully prognostic variable. Seasonal predictions for both sets of experiments are run four times yearly for 1981-2012. Results: Heat flux correction substantially improves the simulated variability in the initialization runs for boreal summer and fall (June-October). In boreal spring (March-May), however, neither the initialization runs of the STD experiment nor those of the FLX experiment are able to capture the observed variability. FLX predictions show no consistent enhancement of skill relative to the predictions of the STD experiment over the course of the year. The skill of persistence forecasts is hardly beaten by either of the two experiments in any season, limiting the usefulness of the few forecasts that show significant skill. However, FLX forecasts initialized in May recover skill in July and August, the peak season of the Atlantic Niño (anomaly correlation …

  3. Assessment of a non-uniform heat flux correction model to predicting CHF in PWR rod bundles

    Dae-Hyun, Hwang; Sung-Quun, Zee

    2001-01-01

    The prediction of the critical heat flux (CHF) has in most cases been based on empirical correlations. For PWR fuel assemblies, a local parameter correlation requires the local thermal-hydraulic conditions, usually calculated by a subchannel analysis code. The cross-sectional averaged fluid conditions of the subchannel, however, are not sufficient for determining the CHF, especially in cases of non-uniform axial heat flux distributions. Many investigators have studied the effect of the upstream heat flux on the CHF. In terms of the upstream memory effect, two different approaches have been considered as the limiting cases. The 'local conditions' hypothesis assumes that there is a unique relationship between the CHF and the local thermal-hydraulic conditions, and consequently that there is no memory effect. In the 'overall power' hypothesis, on the other hand, it is assumed that the total power which can be fed into a tube with non-uniform heating is the same as that for a uniformly heated tube of the same heated length with the same inlet conditions; thus the CHF is totally influenced by the upstream heat flux distribution. In view of some experimental investigations, such as DeBortoli's tests, it emerged that both approaches are inadequate in general. This means that the local critical heat flux may be affected to some extent by the heat flux distribution upstream of the CHF location. Some correction-factor models have been suggested to take the upstream memory effect into account. Typically, Tong devised a correction factor on the basis of the heat balance of the superheated liquid layer that spreads underneath a highly viscous bubbly layer along the heated surface. His physical model suggested that the fluid enthalpy obtained from an energy balance of the superheated liquid layer is a representative quantity for the onset of DNB (departure from nucleate boiling). A theoretically based correction factor model has been proposed by the …

  4. The predictive value of demonstrable stress incontinence during basic office evaluation and urodynamics in women without symptomatic urinary incontinence undergoing vaginal prolapse surgery

    van der Ploeg, J. Marinus; Zwolsman, Sandra E.; Posthuma, Selina; Wiarda, Hylco S.; van der Vaart, C. Huub; Roovers, Jan-Paul W. R.

    2017-01-01

    Women with pelvic organ prolapse without symptoms of urinary incontinence (UI) might demonstrate stress urinary incontinence (SUI) with or without prolapse reduction. We aimed to determine the value of demonstrable SUI during basic office evaluation or urodynamics in predicting SUI after vaginal prolapse surgery.

  5. Spatial measurement error and correction by spatial SIMEX in linear regression models when using predicted air pollution exposures.

    Alexeeff, Stacey E; Carroll, Raymond J; Coull, Brent

    2016-04-01

    Spatial modeling of air pollution exposures is widespread in air pollution epidemiology research as a way to improve exposure assessment. However, there are key sources of exposure model uncertainty when air pollution is modeled, including estimation error and model misspecification. We examine the use of predicted air pollution levels in linear health effect models under a measurement error framework. For the prediction of air pollution exposures, we consider a universal Kriging framework, which may include land-use regression terms in the mean function and a spatial covariance structure for the residuals. We derive the bias induced by estimation error and by model misspecification in the exposure model, and we find that a misspecified exposure model can induce asymptotic bias in the effect estimate of air pollution on health. We propose a new spatial simulation extrapolation (SIMEX) procedure, and we demonstrate that the procedure has good performance in correcting this asymptotic bias. We illustrate spatial SIMEX in a study of air pollution and birthweight in Massachusetts.
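
    SIMEX itself is easy to demonstrate in the classical, non-spatial setting that the paper's spatial variant generalizes: add extra measurement error at increasing multiples λ, then extrapolate the fitted coefficient back to λ = -1 (no error). A minimal sketch with synthetic data:

```python
# Classical SIMEX for an attenuated regression slope (all data synthetic).
import numpy as np

rng = np.random.default_rng(6)
n, sigma_u = 2000, 0.8
x = rng.normal(0, 1, n)                     # true exposure
w = x + rng.normal(0, sigma_u, n)           # error-prone predicted exposure
y = 2.0 * x + rng.normal(0, 1, n)           # health outcome

def slope(w, y):
    return np.polyfit(w, y, 1)[0]

# Simulation step: add extra error at multiples lambda, average over replicates.
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
betas = [np.mean([slope(w + rng.normal(0, np.sqrt(lam) * sigma_u, n), y)
                  for _ in range(50)]) for lam in lambdas]

# Extrapolation step: fit a quadratic trend and evaluate it at lambda = -1.
coef = np.polyfit(lambdas, betas, 2)
print(f"naive slope: {slope(w, y):.2f}, SIMEX slope: {np.polyval(coef, -1.0):.2f}")
```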

  6. Lifting scheme-based method for joint coding 3D stereo digital cinema with luminace correction and optimized prediction

    Darazi, R.; Gouze, A.; Macq, B.

    2009-01-01

    Reproducing natural, real scenes as we see them in the everyday world is becoming more and more popular, and stereoscopic and multi-view techniques are used to this end. However, because more information must be displayed, supporting technologies such as digital compression are required to ensure the storage and transmission of the sequences. In this paper, a new scheme for stereo image coding is proposed. The original left and right images are jointly coded. The main idea is to optimally exploit the existing correlation between the two images. This is done by designing an efficient transform that reduces the existing redundancy in the stereo image pair. The approach was inspired by the lifting scheme (LS). The novelty in our work is that the prediction step is replaced by a hybrid step consisting of disparity compensation followed by luminance correction and an optimized prediction. The proposed scheme can be used for lossless and for lossy coding. Experimental results show improvement in terms of performance and complexity compared to recently proposed methods.
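
    For readers unfamiliar with the lifting scheme, a minimal 1D predict/update pair (plain Haar-style lifting, not the paper's disparity-compensated hybrid step) shows where a smarter prediction operator would slot in:

```python
# Haar-style lifting: predict odd samples from even neighbors, then update the
# even samples so the approximation preserves the running average.
import numpy as np

def lifting_forward(x: np.ndarray):
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    detail = odd - even            # predict step (replaceable by a smarter operator)
    approx = even + detail / 2     # update step
    return approx, detail

def lifting_inverse(approx: np.ndarray, detail: np.ndarray) -> np.ndarray:
    even = approx - detail / 2
    odd = even + detail
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

signal = np.array([2.0, 3.0, 4.0, 4.0, 8.0, 9.0, 5.0, 4.0])
a, d = lifting_forward(signal)
assert np.allclose(lifting_inverse(a, d), signal)   # perfect reconstruction
print("approx:", a, "detail:", d)
```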

  7. Robust Inference of Population Structure for Ancestry Prediction and Correction of Stratification in the Presence of Relatedness

    Conomos, Matthew P.; Miller, Mike; Thornton, Timothy

    2016-01-01

    Population structure inference with genetic data has been motivated by a variety of applications in population genetics and genetic association studies. Several approaches have been proposed for the identification of genetic ancestry differences in samples where study participants are assumed to be unrelated, including principal components analysis (PCA), multi-dimensional scaling (MDS), and model-based methods for proportional ancestry estimation. Many genetic studies, however, include individuals with some degree of relatedness, and existing methods for inferring genetic ancestry fail in related samples. We present a method, PC-AiR, for robust population structure inference in the presence of known or cryptic relatedness. PC-AiR utilizes genome-screen data and an efficient algorithm to identify a diverse subset of unrelated individuals that is representative of all ancestries in the sample. The PC-AiR method directly performs PCA on the identified ancestry representative subset and then predicts components of variation for all remaining individuals based on genetic similarities. In simulation studies and in applications to real data from Phase III of the HapMap Project, we demonstrate that PC-AiR provides a substantial improvement over existing approaches for population structure inference in related samples. We also demonstrate significant efficiency gains, where a single axis of variation from PC-AiR provides better prediction of ancestry in a variety of structure settings than using ten (or more) components of variation from widely used PCA and MDS approaches. Finally, we illustrate that PC-AiR can provide improved population stratification correction over existing methods in genetic association studies with population structure and relatedness.

  8. Using soft computing techniques to predict corrected air permeability using Thomeer parameters, air porosity and grain density

    Nooruddin, Hasan A.; Anifowose, Fatai; Abdulraheem, Abdulazeez

    2014-03-01

    Soft computing techniques are recently becoming very popular in the oil industry. A number of computational intelligence-based predictive methods have been widely applied in the industry with high prediction capabilities. Some of the popular methods include feed-forward neural networks, radial basis function networks, generalized regression neural networks, functional networks, support vector regression and adaptive network fuzzy inference systems. A comparative study among the most popular soft computing techniques is presented using a large dataset published in the literature describing multimodal pore systems in the Arab D formation. The inputs to the models are air porosity, grain density, and Thomeer parameters obtained using mercury injection capillary pressure profiles. Corrected air permeability is the target variable. Applying the developed permeability models in a recent reservoir characterization workflow ensures consistency between micro- and macro-scale information, represented mainly by the Thomeer parameters and absolute permeability. The dataset was divided into two parts, with 80% of the data used for training and 20% for testing. The target permeability variable was transformed to the logarithmic scale as a pre-processing step and to show better correlations with the input variables. Statistical and graphical analyses of the results, including permeability cross-plots and detailed error measures, were produced. In general, the comparative study showed very close results among the developed models. The feed-forward neural network permeability model showed the lowest average relative error, average absolute relative error, standard deviation of error and root mean square error, making it the best model for such problems. The adaptive network fuzzy inference system also showed very good results.
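
    A sketch of the evaluation protocol described above: log-transform of permeability, an 80/20 split, one representative network model, and the quoted style of error measures. The features are synthetic stand-ins for the Arab D dataset:

```python
# Train/test protocol for a permeability model on log10(k), synthetic inputs.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
n = 500
porosity = rng.uniform(0.05, 0.30, n)
grain_density = rng.uniform(2.6, 2.9, n)
thomeer_g = rng.uniform(0.1, 2.0, n)               # Thomeer pore-geometrical factor
log_k = 4 + 8 * porosity - 1.5 * thomeer_g + rng.normal(0, 0.3, n)

X = np.column_stack([porosity, grain_density, thomeer_g])
Xtr, Xte, ytr, yte = train_test_split(X, log_k, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                                   random_state=0)).fit(Xtr, ytr)
err = model.predict(Xte) - yte
print(f"AAE = {np.abs(err).mean():.3f}, RMSE = {np.sqrt((err**2).mean()):.3f} (log10 k)")
```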

  9. Three-dimensional transport coefficient model and prediction-correction numerical method for thermal margin analysis of PWR cores

    Chiu, C.

    1981-01-01

    Combustion Engineering Inc. designs its modern PWR reactor cores using open-core thermal-hydraulic methods in which the mass, momentum and energy equations are solved in three dimensions (one axial and two lateral directions). The resultant fluid properties are used to compute the minimum departure from nucleate boiling ratio (DNBR), which ultimately sets the power capability of the core. The on-line digital monitoring and protection systems require a small, fast-running algorithm of the design code. This paper presents two techniques used in the development of the on-line DNB algorithm. First, a three-dimensional transport coefficient model is introduced to radially group the flow subchannels into channels for the calculation of the thermal-hydraulic fluid properties. Conservation equations of mass, momentum and energy for these channels are derived using transport coefficients to modify the calculation of the radial transport of enthalpy and momentum. Second, a simplified, non-iterative numerical method, called the prediction-correction method, is applied together with the transport coefficient model to reduce the computer execution time in the determination of fluid properties. Comparison of the algorithm and the design thermal-hydraulic code shows agreement to within 0.65% equivalent power at a 95/95 confidence/probability level for all normal operating conditions of the PWR core. This algorithm accuracy is achieved with 1/800th of the computer processing time of its parent design code.

  10. Z-LASIK and Trans-PRK for correction of high-grade myopia: safety, efficacy, predictability and clinical outcomes.

    Gershoni, Assaf; Mimouni, Michael; Livny, Eitan; Bahar, Irit

    2018-03-12

    The aim of the study was to examine the outcomes of transepithelial photorefractive keratectomy (Trans-PRK) and femtosecond laser-assisted in situ keratomileusis (Z-LASIK) for the correction of high myopia. A retrospective cohort study design was used. The study group included 792 eyes with high-grade myopia (-6.0 diopters or higher) or high-grade myopia with astigmatism that were treated with Z-LASIK or Trans-PRK in 2013 through 2014 in an optical outpatient clinic of a large private medical service. The Trans-PRK group comprised 674 eyes with a spherical equivalent (SE) of -7.87 ± 1.46, and the Z-LASIK group comprised 118 eyes with an SE of -7.19 ± 0.81 (P < …). The postoperative SE was -0.06 in the Trans-PRK group and -0.02 in the Z-LASIK group (P = 0.545). Efficacy index values were 0.92 in the Trans-PRK group and 0.95 in the Z-LASIK group (P = 0.083), and corresponding safety index values were 0.95 and 0.97 (P = 0.056). An UCVA of 20/40 or better was achieved in 94.20% of eyes in the Trans-PRK group and 98.31% in the Z-LASIK group (P = 0.063). The majority of eyes in both the Trans-PRK and Z-LASIK groups were within ± 0.5 D of the attempted correction: 59.35 and 64.71%, respectively (P = 0.271). Both Trans-PRK and Z-LASIK demonstrated excellent efficacy, safety and predictability profiles, with results comparable and in some cases superior to the current literature. Results of Z-LASIK were slightly better than those of Trans-PRK, though the preoperative SE of the latter was higher.

  11. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing the impacts of potential future climate change scenarios for precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods for generating future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested to four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution-derived transformation, and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios to be employed in studying potential impacts. In this work we propose a non-equifeasible combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic and drought statistics.
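
    Two of the listed transformation techniques are easy to make concrete; a sketch with synthetic precipitation series (delta change on the mean, and quantile mapping using empirical quantiles):

```python
# Delta change shifts the observed series by the model's climate-change signal;
# empirical quantile mapping corrects the future series quantile by quantile.
import numpy as np

rng = np.random.default_rng(8)
obs_hist = rng.gamma(2.0, 2.0, 10000)     # observed control-period precipitation
mod_hist = rng.gamma(2.0, 2.5, 10000)     # biased RCM control run
mod_fut = rng.gamma(2.0, 3.0, 10000)      # RCM future run (2071-2100)

# Delta change: perturb observations by the modeled relative change.
delta_fut = obs_hist * (mod_fut.mean() / mod_hist.mean())

# Empirical quantile mapping: map each future value through the
# model-historical -> observed-historical quantile relation.
def quantile_map(x, mod_ref, obs_ref):
    q = np.searchsorted(np.sort(mod_ref), x) / len(mod_ref)
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))

qm_fut = quantile_map(mod_fut, mod_hist, obs_hist)
print(f"means -- obs: {obs_hist.mean():.2f}, delta: {delta_fut.mean():.2f}, "
      f"QM: {qm_fut.mean():.2f}")
```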

  12. Predicting erectile dysfunction following surgical correction of Peyronie's disease without inflatable penile prosthesis placement: vascular assessment and preoperative risk factors.

    Taylor, Frederick L; Abern, Michael R; Levine, Laurence A

    2012-01-01

    Surgical therapy remains the gold standard treatment for Peyronie's disease (PD). Surgical options include plication, grafting, and placement of an inflatable penile prosthesis (IPP). Postoperative erectile dysfunction (ED) is a potential complication of PD surgery without IPP. We present our large-series follow-up to evaluate preoperative risk factors for postoperative ED. The aim of this study is to evaluate preoperative risk factors for the development of ED following surgical correction of PD, taking into account the degree of curvature, graft size, surgical approach, hypertension, hyperlipidemia, diabetes, smoking history, preoperative use of phosphodiesterase 5 inhibitors (PDE5), and preoperative duplex ultrasound findings, including peak systolic and end diastolic velocities and resistive index. We identified 218 men undergoing either tunica albuginea plication (TAP) or partial plaque excision with pericardial grafting for PD following a previously published algorithm between November 1992 and April 2007. Preoperative and postoperative erectile function, curvature characteristics, presence of vascular risk factors, and duplex ultrasound findings were available for 109 patients. Our primary outcome measure is the development of ED after surgery for PD. Ten percent of TAP and 21% of plaque excision with grafting patients developed postoperative ED. Neither curve direction (P = 0.76), graft area (P = 0.78), surgical approach (P = 0.12), chronic hypertension (P = 0.51), hyperlipidemia (P = 0.87), diabetes (P = 0.69), nor smoking history (P = 0.99) was a significant predictor of postoperative ED. No combination of risk factors was found to be predictive of postoperative ED. Preoperative use of PDE5 was not a significant predictor of postoperative ED (P = 0.33). Neither peak systolic, end diastolic, nor resistive index was a significant predictor of ED (P = 0.28, 0.28, and 0.25, respectively). This long-term follow-up of a large published series suggests that neither …

  13. Effect of heart rate correction on pre- and post-exercise heart rate variability to predict risk of mortality – an experimental study on the FINCAVAS cohort

    Paruthi ePradhapan

    2014-06-01

    The non-linear inverse relationship between RR intervals and heart rate (HR) contributes significantly to heart rate variability (HRV) parameters and their performance in mortality prediction. To determine the level of influence HR exerts over the prognostic power of HRV parameters, we studied the predictive performance at different HR levels by applying eight correction procedures, multiplying or dividing HRV parameters by the mean RR interval (RRavg) raised to the powers 0.5-16. Data collected from 1288 patients in The Finnish Cardiovascular Study (FINCAVAS) who satisfied the inclusion criteria were used for the analyses. HRV parameters (RMSSD, VLF power and LF power) were calculated from a 2-minute segment in the rest phase before exercise and a 2-minute recovery period immediately after peak exercise. The area under the receiver operating characteristic curve (AUC) was used to determine the predictive performance of each parameter with and without HR corrections in the rest and recovery phases. Division of HRV parameters by the segment's RRavg to the power 2 (HRVDIV-2) showed the highest predictive performance in the rest phase (RMSSD: 0.67/0.66; VLF power: 0.70/0.62; LF power: 0.79/0.65; cardiac/non-cardiac mortality) with minimum correlation to HR (r = -0.15 to 0.15). In the recovery phase, Kaplan-Meier (KM) survival analysis revealed good risk stratification capacity at HRVDIV-2 in both groups (cardiac and non-cardiac mortality). Although higher powers of correction (HRVDIV-4 and HRVDIV-8) improved predictive performance during recovery, they induced an increased positive correlation to HR. Thus, we inferred that the predictive capacity of HRV during rest and recovery is augmented when its dependence on HR is weakened by applying appropriate correction procedures.
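
    The correction family itself is a one-liner; a sketch with synthetic data (the HR dependence and mortality labels below are invented for illustration, not FINCAVAS values):

```python
# HRV_DIV-k correction: divide an HRV parameter by RRavg to a chosen power;
# the study found division by RRavg^2 best decoupled HRV from heart rate.
import numpy as np
from sklearn.metrics import roc_auc_score

def hr_corrected(hrv: np.ndarray, rr_avg_s: np.ndarray, power: float = 2.0) -> np.ndarray:
    return hrv / rr_avg_s ** power

rng = np.random.default_rng(9)
rr = rng.normal(0.85, 0.12, 300)                        # mean RR interval (s)
rmssd = 40 * rr**2 * np.exp(rng.normal(0, 0.3, 300))    # HR-dependent raw RMSSD (ms)
died = rng.random(300) < 0.3 * np.exp(-rmssd / 40.0)    # toy mortality labels

print("AUC raw:      ", round(roc_auc_score(died, -rmssd), 2))
print("AUC corrected:", round(roc_auc_score(died, -hr_corrected(rmssd, rr)), 2))
```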

  14. A 3D correction method for predicting the readings of a PinPoint chamber on the CyberKnife® M6™ machine

    Zhang, Yongqian; Brandner, Edward; Ozhasoglu, Cihat; Lalonde, Ron; Heron, Dwight E.; Saiful Huq, M.

    2018-02-01

    The use of small fields in radiation therapy techniques has increased substantially, in particular in stereotactic radiosurgery (SRS) and stereotactic body radiation therapy (SBRT). However, as the field size is reduced further, the response of the detector changes more rapidly with field size, and the effects of measurement uncertainties become increasingly significant due to the lack of lateral charged particle equilibrium, spectral changes as a function of field size, detector choice, and the resulting perturbations of the charged particle fluence. This work presents a novel 3D dose volume-to-point correction method to predict the readings of a 0.015 cc PinPoint chamber (PTW 31014) for both small static-field and composite-field dosimetry formed by fixed cones on the CyberKnife® M6™ machine. A 3D correction matrix is introduced to link the 3D dose distribution to the response of the PinPoint chamber in water. The parameters of the correction matrix are determined by modeling its 3D dose response in circular fields created using the 12 fixed cones (5 mm-60 mm) on a CyberKnife® M6™ machine. A penalized least-squares optimization problem is defined by fitting the calculated detector reading to the experimental measurement data to generate the optimal correction matrix; the simulated annealing algorithm is used to solve the inverse optimization problem. All the experimental measurements are acquired for every 2 mm chamber shift in the horizontal planes for each field size. The 3D dose distributions for the measurements are calculated by Monte Carlo calculation with the MultiPlan® treatment planning system (Accuray Inc., Sunnyvale, CA, USA). The performance of the 3D correction matrix is evaluated by comparing the predicted output factors (OFs), off-axis ratios (OARs) and percentage depth dose (PDD) data to the experimental measurement data. The discrepancy between the measurement and the prediction data for composite fields is also evaluated.
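
    The fitting step, penalized least squares solved by simulated annealing, can be sketched with scipy's dual_annealing as a stand-in. Here the full 3D volume-to-point matrix is reduced to a two-parameter radial kernel for brevity; the dose profile and parameter values are invented:

```python
# Fit a dose-to-reading response kernel by minimizing a penalized squared error
# with a simulated-annealing global optimizer.
import numpy as np
from scipy.optimize import dual_annealing

rng = np.random.default_rng(10)
r = np.linspace(0, 5, 40)                       # distance from chamber center (mm)
dose = np.exp(-r**2 / 4.0)                      # toy local dose profile

def chamber_reading(params, dose, r):
    amp, sigma = params
    w = amp * np.exp(-r**2 / (2 * sigma**2))    # radial response kernel
    return (w * dose).sum()                     # volume-weighted reading

measured = chamber_reading([1.0, 1.6], dose, r) + rng.normal(0, 0.002)

def objective(params):
    penalty = 1e-6 * np.sum(np.square(params))  # penalized least squares
    return (chamber_reading(params, dose, r) - measured) ** 2 + penalty

res = dual_annealing(objective, bounds=[(0.1, 2.0), (0.5, 3.0)], seed=0)
print("fitted kernel parameters:", np.round(res.x, 3))
```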

  15. Projectile-z3 and -z4 corrections to basic Bethe-Bloch stopping power theory and mean excitation energies of Al, Si, Ni, Ge, Se, Y, Ag and Au

    Porter, L.E.; Bryan, S.R.

    1980-01-01

    Three independent sets of measurements of the stopping power of solid elemental targets for alpha particles were previously analyzed in terms of basic Bethe-Bloch theory with the low-velocity projectile-z3 correction term included. These data for Al, Si, Ni, Ge, Se, Y, Ag and Au have now been analyzed with the Bloch projectile-z4 term and a revised projectile-z3 term incorporated in the Bethe-Bloch formula, the projectile-z3 revision having been effected by variation of the single free parameter of the projectile-z3 effect formalism. The value of this parameter, fixed at 1.8 in previous studies, which counteracts inclusion of the projectile-z4 term, is 1.3 ± 0.1 for all target elements except Si.
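
    For context, the expansion referred to above can be written in the standard form below, where zL₁ is the projectile-z3 (Barkas) term, z²L₂ the projectile-z4 (Bloch) term, and I the mean excitation energy (a textbook sketch, not this paper's exact notation):

```latex
\begin{equation}
  -\frac{dE}{dx} \;=\; \frac{4\pi e^{4} z^{2} N Z}{m_{e} v^{2}}
  \Bigl[\, L_{0}(\beta) + z\,L_{1}(\beta) + z^{2} L_{2}(\beta) \Bigr],
  \qquad
  L_{0}(\beta) \;=\; \ln\!\frac{2 m_{e} v^{2}}{I} \;-\; \ln(1-\beta^{2}) \;-\; \beta^{2}.
\end{equation}
```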

  16. Basic features of the predictive tools of early warning systems for water-related natural hazards: examples for shallow landslides

    R. Greco

    2017-12-01

    To manage natural risks, an increasing effort is being put into the development of early warning systems (EWS), namely, approaches facing catastrophic phenomena by timely forecasting and alarm spreading throughout the exposed population. Research efforts aimed at the development and implementation of effective EWS should especially concern the definition and calibration of the interpretative model. This paper analyses the main features characterizing predictive models working in EWS by discussing their aims and their features in terms of model accuracy, evolutionary stage of the phenomenon at which the prediction is carried out, and model architecture. Original classification criteria based on these features are developed throughout the paper and shown in their practical implementation through examples of flow-like landslides and earth flows, both of which are characterized by rapid evolution and are quite representative of many applications of EWS.

  17. Basic features of the predictive tools of early warning systems for water-related natural hazards: examples for shallow landslides

    Greco, Roberto; Pagano, Luca

    2017-12-01

    To manage natural risks, an increasing effort is being put into the development of early warning systems (EWS), namely, approaches facing catastrophic phenomena by timely forecasting and alarm spreading throughout the exposed population. Research efforts aimed at the development and implementation of effective EWS should especially concern the definition and calibration of the interpretative model. This paper analyses the main features characterizing predictive models working in EWS by discussing their aims and their features in terms of model accuracy, evolutionary stage of the phenomenon at which the prediction is carried out, and model architecture. Original classification criteria based on these features are developed throughout the paper and shown in their practical implementation through examples of flow-like landslides and earth flows, both of which are characterized by rapid evolution and are quite representative of many applications of EWS.

  18. Predictive Models to Estimate Probabilities of Injuries and Adverse Performance Outcomes in U.S. Army Basic Combat Training

    2014-03-01

    orofacial injuries. These and other efforts have been associated with reduced BCT injuries over time, as shown in Figure 1, but injury incidence... to predict first episode of low back pain in Soldiers undergoing combat medic training. Moran et al. reported an AUC of .765 for a pragmatic 5... Dugan JL, Robinson ME. Predictors of occurrence and severity of first time low back pain episodes: Findings from a military inception cohort. PLoS...

  19. Predicting Stress Related to Basic Needs and Safety in Darfur Refugee Camps: A Structural and Social Ecological Analysis.

    Rasmussen, Andrew; Annan, Jeannie

    2010-03-01

    The research on the determinants of mental health among refugees has been largely limited to traumatic events, but recent work has indicated that the daily hassles of living in refugee camps also play a large role. Using hierarchical linear modelling to account for refugees nested within camp blocks, this exploratory study attempted to model stress surrounding safety, stress surrounding the acquisition of basic needs, and functional impairment among refugees from Darfur living in Chad, using individual-level demographics (e.g., gender, age, presence of a debilitating injury), structural factors (e.g., distance from block to distribution centre), and social ecological variables (e.g., percentage of single women within a block). We found that stress about safety, stress about daily hassles, and functional impairment were associated with several individual-level demographic factors (e.g., gender), but also with interactions between block-level and individual-level factors (e.g., injury and distance to distribution centre). Findings are discussed in terms of monitoring and evaluation of refugee services.

  20. A Geometrical-based Vertical Gain Correction for Signal Strength Prediction of Downtilted Base Station Antennas in Urban Areas

    Rodriguez, Ignacio; Nguyen, Huan Cong; Sørensen, Troels Bundgaard

    2012-01-01

    This paper shows that a geometrical-based extension to standard empirical path loss prediction models can give quite reasonable accuracy in predicting the signal strength from tilted base station antennas in small urban macro-cells. Our evaluation is based on measurements on several sectors in a 2.6 GHz Long Term Evolution (LTE) cellular network, with electrical antenna downtilt in the range from 0 to 10 degrees, as well as predictions based on ray-tracing and 3D building databases covering the measurement area. Although the calibrated ray-tracing predictions are highly accurate compared with the measured data, the combined LOS/NLOS COST-WI model with the proposed vertical gain correction still gives reasonable accuracy.
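
    For orientation, one plausible form of such a geometrical vertical gain correction is sketched below, assuming the widely used 3GPP-style parabolic vertical antenna pattern; the geometry (elevation angle from the base station antenna to the receiver) and all parameter values are illustrative, not the authors' exact model.

        import math

        def vertical_gain_correction(d_m, h_bs=30.0, h_ue=1.5, tilt_deg=6.0,
                                     hpbw_deg=6.2, sla_db=18.0):
            """Gain term (dB) to add to an empirical path loss prediction.

            d_m: horizontal distance (m); h_bs/h_ue: antenna/receiver heights (m);
            tilt_deg: electrical downtilt; hpbw_deg: vertical half-power beamwidth;
            sla_db: side-lobe attenuation floor (3GPP-style pattern).
            """
            # Elevation angle from the antenna down to the receiver.
            theta = math.degrees(math.atan2(h_bs - h_ue, d_m))
            # Parabolic main-lobe approximation, clipped at the side-lobe level.
            return -min(12.0 * ((theta - tilt_deg) / hpbw_deg) ** 2, sla_db)

        # Example: corrected prediction = empirical model + vertical gain term.
        for d in (50, 200, 1000):
            print(d, "m:", round(vertical_gain_correction(d), 1), "dB")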

  1. Intermittently-visual Tracking Experiments Reveal the Roles of Error-correction and Predictive Mechanisms in the Human Visual-motor Control System

    Hayashi, Yoshikatsu; Tamura, Yurie; Sase, Kazuya; Sugawara, Ken; Sawada, Yasuji

    A prediction mechanism is necessary in human visuomotor control to compensate for the delays of the sensory-motor system. In a previous study, “proactive control” was discussed as one example of the predictive function of human beings, in which the motion of the hands preceded the virtual moving target in visual tracking experiments. To study the roles of the positional-error correction mechanism and the prediction mechanism, we carried out an intermittently-visual tracking experiment in which a circular orbit is segmented into target-visible regions and target-invisible regions. The main results were as follows. A rhythmic component appeared in the tracer velocity when the target velocity was relatively high. The period of the rhythm in the brain obtained from environmental stimuli was shortened by more than 10%. This shortening of the period accelerates the hand motion as soon as the visual information is cut off, and causes the hand motion to precede the target motion. Although the precedence of the hand in the blind region is reset by the environmental information when the target enters the visible region, the hand motion precedes the target on average when the predictive mechanism dominates the error-corrective mechanism.

  2. Response of guava plants to basic slag application as corrective of soil acidity

    Renato de Mello Prado

    2003-04-01

    Basic slag was applied at rates of up to two and a half times the amount required to raise base saturation (V) to 70%. After 90 days of incubation of the basic slag in the soil, plants obtained by vegetative propagation were planted in an Ultisol (pots of 2.8 dm³) and cultivated for 105 days. The basic slag application positively affected the values of pH, SB and V%, and the concentrations of Ca, Mg and P and H+Al in the soil. The guava plants showed significant increases in height, number of leaves and leaf area, in the concentrations of Ca, Mg and P of the aerial part and the root and, consequently, in the dry matter of the aerial part and the root. Therefore, basic slag proved viable in the production of young guava plants as a corrective of soil acidity and a source of nutrients (Ca and Mg).

  3. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

    Reducing overlay error via an accurate APC feedback system is one of the main challenges in high volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former number and reducing the latter number is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for the throughput of said system as new lots have to wait until the previous lot is measured. One solution is using a less dense overlay sampling scheme and employing computationally up-sampled data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper will discuss a hybrid system shown in Fig. 1 that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals while determining the fingerprint, and better on-product overlay performance.
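
    As a toy illustration of such a hybrid combination (not the authors' implementation), the sketch below fits a smooth global polynomial fingerprint to sparse, hypothetical overlay measurements and then interpolates the local residuals back onto a dense grid, so that measured local errors survive in the up-sampled output.

        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(3)
        xy = rng.uniform(-1, 1, (200, 2))          # sparse measurement sites
        meas = 0.3 * xy[:, 0] ** 2 - 0.2 * xy[:, 1] + rng.normal(0, 0.02, 200)

        # Global model: low-order 2D polynomial fitted by least squares.
        A = np.column_stack([np.ones(200), xy[:, 0], xy[:, 1],
                             xy[:, 0] ** 2, xy[:, 0] * xy[:, 1], xy[:, 1] ** 2])
        coef, *_ = np.linalg.lstsq(A, meas, rcond=None)
        resid = meas - A @ coef                    # measured local overlay errors

        # Dense hybrid fingerprint: global model + interpolated local residuals.
        gx, gy = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
        G = np.column_stack([np.ones(gx.size), gx.ravel(), gy.ravel(),
                             gx.ravel() ** 2, (gx * gy).ravel(), gy.ravel() ** 2])
        dense = (G @ coef).reshape(gx.shape)
        dense += np.nan_to_num(griddata(xy, resid, (gx, gy), method="linear"))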

  4. “Booster” training: Evaluation of instructor-led bedside cardiopulmonary resuscitation skill training and automated corrective feedback to improve cardiopulmonary resuscitation compliance of Pediatric Basic Life Support providers during simulated cardiac arrest

    Sutton, Robert M.; Niles, Dana; Meaney, Peter A.; Aplenc, Richard; French, Benjamin; Abella, Benjamin S.; Lengetti, Evelyn L.; Berg, Robert A.; Helfaer, Mark A.; Nadkarni, Vinay

    2013-01-01

    Objective: To investigate the effectiveness of brief bedside “booster” cardiopulmonary resuscitation (CPR) training to improve CPR guideline compliance of hospital-based pediatric providers. Design: Prospective, randomized trial. Setting: General pediatric wards at Children’s Hospital of Philadelphia. Subjects: Sixty-nine Basic Life Support–certified hospital-based providers. Intervention: CPR recording/feedback defibrillators were used to evaluate CPR quality during simulated pediatric arrest. After a 60-sec pretraining CPR evaluation, subjects were randomly assigned to one of three instructional/feedback methods to be used during CPR booster training sessions. All sessions (training/CPR manikin practice) were of equal duration (2 mins) and differed only in the method of corrective feedback given to participants during the session. The study arms were as follows: 1) instructor-only training; 2) automated defibrillator feedback only; and 3) instructor training combined with automated feedback. Measurements and Main Results: Before instruction, 57% of the care providers performed compressions within guideline rate recommendations (rate >90 min⁻¹ and depth >38 mm), and 36% met overall CPR compliance (rate and depth within targets). After instruction, guideline compliance improved in all study arms (instructor-only training: rate 52% to 87% and overall CPR compliance 43% to 78%; automated feedback only: overall CPR compliance 35% to 96%; instructor training combined with automated feedback: rate 48% to 100% and overall CPR compliance 30% to 100%). Conclusions: Before CPR instruction, most certified Pediatric Basic Life Support providers did not perform guideline-compliant CPR. After a brief bedside training, CPR quality improved irrespective of training content (instructor vs. automated feedback). Future studies should investigate bedside training to improve CPR quality during actual pediatric cardiac arrests. PMID:20625336

  5. "Booster" training: evaluation of instructor-led bedside cardiopulmonary resuscitation skill training and automated corrective feedback to improve cardiopulmonary resuscitation compliance of Pediatric Basic Life Support providers during simulated cardiac arrest.

    Sutton, Robert M; Niles, Dana; Meaney, Peter A; Aplenc, Richard; French, Benjamin; Abella, Benjamin S; Lengetti, Evelyn L; Berg, Robert A; Helfaer, Mark A; Nadkarni, Vinay

    2011-05-01

    To investigate the effectiveness of brief bedside "booster" cardiopulmonary resuscitation (CPR) training to improve CPR guideline compliance of hospital-based pediatric providers. Prospective, randomized trial. General pediatric wards at Children's Hospital of Philadelphia. Sixty-nine Basic Life Support-certified hospital-based providers. CPR recording/feedback defibrillators were used to evaluate CPR quality during simulated pediatric arrest. After a 60-sec pretraining CPR evaluation, subjects were randomly assigned to one of three instructional/feedback methods to be used during CPR booster training sessions. All sessions (training/CPR manikin practice) were of equal duration (2 mins) and differed only in the method of corrective feedback given to participants during the session. The study arms were as follows: 1) instructor-only training; 2) automated defibrillator feedback only; and 3) instructor training combined with automated feedback. Before instruction, 57% of the care providers performed compressions within guideline rate recommendations (rate >90 min(-1) and 38 mm); and 36% met overall CPR compliance (rate and depth within targets). After instruction, guideline compliance improved (instructor-only training: rate 52% to 87% [p .01], and overall CPR compliance, 43% to 78% [p CPR compliance, 35% to 96% [p training combined with automated feedback: rate 48% to 100% [p CPR compliance, 30% to 100% [p CPR instruction, most certified Pediatric Basic Life Support providers did not perform guideline-compliant CPR. After a brief bedside training, CPR quality improved irrespective of training content (instructor vs. automated feedback). Future studies should investigate bedside training to improve CPR quality during actual pediatric cardiac arrests.

  6. Hygiene Basics

    A KidsHealth for Teens article covering hygiene basics, including oily hair, sweat and body smell, and how to deal with them.

  7. Correction to: CASPer, an online pre-interview screen for personal/professional characteristics: prediction of national licensure scores.

    Dore, Kelly L; Reiter, Harold I; Kreuger, Sharyn; Norman, Geoffrey R

    2017-12-01

    In re-examining the paper "CASPer, an online pre-interview screen for personal/professional characteristics: prediction of national licensure scores" published in AHSE (22(2), 327-336), we recognized two errors of interpretation.

  8. Determination of Multiphase Flow Meter Reliability and Development of Correction Charts for the Prediction of Oilfield Fluid Flow Rates

    Samuel S. MOFUNLEWI

    2008-06-01

    The aim of field testing of a Multiphase Flow Meter (MPFM) is to show whether its accuracy compares favourably with that of the Test Separator in measuring the three production phases (oil, gas and water), as well as to determine meter reliability in a field environment. This study evaluates field test results of the MPFM as compared to reference conventional test separators. Generally, results show that the MPFM compares favourably with the Test Separator within the specified range of accuracy. At the moment, there is no legislated meter-proving technique for the MPFM. However, this study has developed calibration charts that can be used to correct and improve meter accuracy.

  9. Balancing of a rigid rotor using artificial neural network to predict the correction masses - DOI: 10.4025/actascitechnol.v31i2.3912

    Fábio Lúcio Santos

    2009-06-01

    This paper deals with an analytical model of a rigid rotor supported by hydrodynamic journal bearings, in which the plane separation technique together with an Artificial Neural Network (ANN) is used to predict the location and magnitude of the correction masses for balancing the rotor-bearing system. The rotating system is modeled by applying the rigid-shaft Stodola-Green model, in which the shaft gyroscopic moments and rotatory inertia are accounted for, in conjunction with the hydrodynamic cylindrical journal bearing model based on the classical Reynolds equation. A linearized perturbation procedure is employed to render the lubrication equations from the Reynolds equation, which allows predicting the eight linear force coefficients associated with the bearing direct and cross-coupled stiffness and damping coefficients. The results show that the methodology presented is efficient for balancing rotor systems. This paper goes a step further in the monitoring process, since an Artificial Neural Network is normally used to predict, not to correct, the mass unbalance. The procedure presented can be used in the turbomachinery industry to balance rotating machinery that requires continuous inspection. Some simulated results are used in order to clarify the methodology presented.
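
    A minimal sketch of the prediction step, assuming synthetic training data: vibration responses at the two bearing planes are mapped by a small feed-forward network to the two correction masses (magnitude, plus angle encoded as cosine/sine). The data generation and network size are illustrative assumptions, not the paper's rotor-bearing model.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        n = 2000
        # Targets: correction mass (g) and angular location (rad) per plane.
        masses = rng.uniform(0.5, 20.0, (n, 2))
        angles = rng.uniform(0.0, 2 * np.pi, (n, 2))
        y = np.column_stack([masses,
                             np.cos(angles[:, 0]), np.sin(angles[:, 0]),
                             np.cos(angles[:, 1]), np.sin(angles[:, 1])])

        # Features: synthetic bearing vibration responses (a linear "influence
        # coefficient" map plus noise stands in for the rotor-bearing model).
        T = rng.normal(size=(y.shape[1], 8))
        X = y @ T + rng.normal(0, 0.05, (n, 8))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                           random_state=0)
        net.fit(X_tr, y_tr)
        print("R^2 on held-out responses:", round(net.score(X_te, y_te), 3))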

  10. How to Make Correct Predictions in False Belief Tasks without Attributing False Beliefs: An Analysis of Alternative Inferences and How to Avoid Them

    Ricardo Augusto Perera

    2018-04-01

    The use of new paradigms of false belief tasks (FBT) has allowed the age at which children pass the test to be reduced from 4 years in the standard version to only 15 months, or even a striking 6 months, in the nonverbal modification. These results are often taken as evidence that infants already possess at least an implicit theory of mind (ToM). We criticize this inferential leap on the grounds that inferring a ToM from predictive success on a false belief task requires assuming as a premise that belief reasoning is a necessary condition for correct action prediction. It is argued that the FBT does not satisfactorily constrain the predictive means, leaving room for the use of belief-independent inferences (which can rely on the attribution of non-representational mental states or on the consideration of behavioral patterns that dispense with any reference to other minds). These heuristics, when applied to the FBT, can achieve the same predictive success as a belief-based inference because the information provided by the test stimulus allows the recognition of particular situations that can be subsumed under their ‘laws’. Instead of solving this issue by designing a single experimentum crucis that would render the use of non-representational inferences unfeasible, we suggest the application of a set of tests in which, although individually they can support inferences dissociated from a ToM, only an inference that makes use of false beliefs is able to correctly predict all the outcomes.

  11. A predictive model of suitability for minimally invasive parathyroid surgery in the treatment of primary hyperparathyroidism [corrected].

    Kavanagh, Dara O

    2012-05-01

    Improved preoperative localizing studies have facilitated minimally invasive approaches in the treatment of primary hyperparathyroidism (PHPT). Success depends on the ability to reliably select patients who have PHPT due to single-gland disease. We propose a model encompassing preoperative clinical, biochemical, and imaging studies to predict a patient's suitability for minimally invasive surgery.

  12. Inflation Basics

    Green, Dan [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2014-03-01

    The last few years have yielded remarkable discoveries in physics. In particle physics it appears that a fundamental scalar field exists. The Higgs boson is measured to have a mass of about 126 GeV and to have spin zero and positive parity. The Higgs field is the first fundamental scalar to be discovered in physics. The Cosmic Microwave Background, CMB, is known to have a uniform temperature to parts per 10⁵, but has well measured fluctuations, which are thought to evolve gravitationally to provide the seeds of the current structure of the Universe. In addition, the Universe appears to contain, at present, an unknown “dark energy”, which is presently the majority energy density of the Universe, larger than either matter or radiation. This may, indeed, be a fundamental scalar field like the Higgs. “Big Bang” (BB) cosmology is a very successful “standard model” in cosmology. However, it cannot explain the uniformity of the CMB because the CMB consists of many regions not causally connected in the context of the BB model. In addition, the Universe appears to be spatially flat. However, in BB cosmology the present spatial curvature is not stable, so that the initial conditions for BB cosmology would need to be fantastically fine-tuned in order to successfully predict the presently small value of the observed curvature. These issues for BB cosmology have led to the hypothesis of “inflation”, which postulates an unknown scalar field, not presumably the Higgs field or the dark energy, which causes an exponential expansion of the Universe at very early times. This attractive hypothesis can account for the problems in BB cosmology of flatness and causal CMB connectivity. In addition, the quantum fluctuations of this postulated field provide a natural explanation of the CMB fluctuations which are the seeds of the structure of galaxies. Researchers are now searching for gravitational waves imprinted on the CMB. These would be a “smoking gun” for inflation.

  14. Predictive Method for Correct Identification of Archaeological Charred Grape Seeds: Support for Advances in Knowledge of Grape Domestication Process

    Ucchesu, Mariano; Orrù, Martino; Grillo, Oscar; Venora, Gianfranco; Paglietti, Giacomo; Ardu, Andrea; Bacchetta, Gianluigi

    2016-01-01

    The identification of archaeological charred grape seeds is a difficult task due to the alteration of the morphological shape of the seeds. In archaeobotanical studies, the correct discrimination between Vitis vinifera subsp. sylvestris and Vitis vinifera subsp. vinifera grape seeds is very important for understanding the history and origin of the domesticated grapevine. In this work, different carbonisation experiments were carried out using a hearth to reproduce the same burning conditions that occurred in archaeological contexts. In addition, several carbonisation trials on modern wild and cultivated grape seeds were performed using a muffle furnace. For comparison with archaeological materials, modern grape seed samples were obtained using seven different carbonisation temperatures ranging between 180 and 340 °C for 120 min. Analysing the grape seed size and shape by computer vision techniques, and applying the stepwise linear discriminant analysis (LDA) method, discrimination of the wild from the cultivated charred grape seeds was possible. An overall correct classification of 93.3% was achieved. Applying the same statistical procedure to compare modern charred with archaeological grape seeds, found in Sardinia and dating back to the Early Bronze Age (2017–1751 2σ cal. BC), allowed 75.0% of the cases to be identified as wild grape. The proposed method proved to be a useful and effective procedure in identifying, with high accuracy, the charred grape seeds found in archaeological sites. Moreover, it may be considered valid support for advances in the knowledge and comprehension of viticulture adoption and the grape domestication process. The same methodology may also be successful when applied to other plant remains, and provide important information about the history of domesticated plants. PMID:26901361
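
    In outline, the discrimination step can be reproduced with scikit-learn, assuming a table of morphometric descriptors per seed with a wild/cultivated label; the synthetic features below are placeholders for the paper's image-derived size and shape measurements, and the forward selection is a stand-in for the stepwise LDA used in the paper.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n, p = 300, 20                    # seeds x morphometric descriptors
        X = rng.normal(size=(n, p))       # placeholder seed measurements
        y = (X[:, :3].sum(axis=1) + rng.normal(0, 1.0, n) > 0).astype(int)

        lda = LinearDiscriminantAnalysis()
        # Stepwise-style (forward) variable selection followed by LDA.
        sel = SequentialFeatureSelector(lda, n_features_to_select=5,
                                        direction="forward")
        X_sel = sel.fit_transform(X, y)
        acc = cross_val_score(lda, X_sel, y, cv=10).mean()
        print(f"cross-validated correct classification: {acc:.1%}")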

  15. Basic electrotechnology

    Ashen, R A

    2013-01-01

    BASIC Electrotechnology discusses the applications of Beginner's All-purpose Symbolic Instruction Code (BASIC) in engineering, particularly in solving electrotechnology-related problems. The book comprises six chapters that cover several topics relevant to BASIC and electrotechnology. Chapter 1 provides an introduction to BASIC, and Chapter 2 talks about the use of complex numbers in a.c. circuit analysis. Chapter 3 covers linear circuit analysis with d.c. and sinusoidal a.c. supplies. The book also discusses elementary magnetic circuit theory and the theory and performance of two-winding transformers.

  16. Safety, Efficacy, Predictability and Stability Indices of Photorefractive Keratectomy for Correction of Myopic Astigmatism with Plano-Scan and Tissue-Saving Algorithms

    Mehrdad Mohammadpour

    2013-10-01

    Purpose: To assess the safety, efficacy and predictability of photorefractive keratectomy (PRK) [Tissue-Saving (TS) versus Plano-Scan (PS) ablation algorithms] with the Technolas 217z excimer laser for correction of myopic astigmatism. Methods: In this retrospective study, one hundred and seventy eyes of 85 patients (107 eyes (62.9%) with the PS and 63 eyes (37.1%) with the TS algorithm) were included. The TS algorithm was applied for those with a central corneal thickness less than 500 µm or an estimated residual stromal thickness less than 420 µm. Mitomycin C (MMC) was applied for 120 eyes (70.6%), in case of an ablation depth more than 60 μm and/or astigmatic correction more than one diopter (D). Mean sphere, cylinder, spherical equivalent (SE) refraction, uncorrected visual acuity (UCVA) and best corrected visual acuity (BCVA) were measured preoperatively and 4, 12 and 24 weeks postoperatively. Results: One, three and six months postoperatively, 60%, 92.9% and 97.5% of eyes had UCVA of 20/20 or better, respectively. Mean preoperative and 1-, 3- and 6-month postoperative SE were -3.48±1.28 D (-1.00 to -8.75), -0.08±0.62 D, -0.02±0.57 D and -0.004±0.29 D, respectively. Also, 87.6%, 94.1% and 100% of eyes were within ±1.0 D of emmetropia, and 68.2%, 75.3% and 95% were within ±0.5 D. The safety and efficacy indices were 0.99 and 0.99 at 12 weeks and 1.009 and 0.99 at 24 weeks, respectively. There was no clinically or statistically significant difference between the outcomes of the PS and TS algorithms, or between those with or without MMC in either group, in terms of safety, efficacy, predictability or stability. Dividing the eyes into those with subjective SE≤4 D and SE≥4 D postoperatively, there was no significant difference between the predictability of the two groups. There were no intra- or postoperative complications. Conclusion: Outcomes of PRK for correction of myopic astigmatism showed great promise with both PS and TS algorithms.

  17. Anesthesia Basics

    A KidsHealth for Teens article explaining what anesthesia is and what to expect when getting an operation.

  18. BASIC Programming.

    Jennings, Carol Ann

    Designed for use by both secondary- and postsecondary-level business teachers, this curriculum guide consists of 10 units of instructional materials dealing with Beginners All-Purpose Symbol Instruction Code (BASIC) programing. Topics of the individual lessons are numbering BASIC programs and using the PRINT, END, and REM statements; system…

  19. Performance assessment of the commercial CFD software for the prediction of the PWR internal flow - Corrected version

    Lee, Gong Hee; Bang, Young Seok; Woo, Sweng Woong; Cheong, Ae Ju; Kim, Do Hyeong; Kang, Min Ku

    2013-01-01

    As computer hardware technology develops, license applicants for nuclear power plants use commercial CFD software with the aim of reducing the excessive conservatism associated with simplified and conservative analysis tools. Even if some CFD software developers and users think that state-of-the-art CFD software can be used to solve reasonably at least single-phase nuclear reactor safety problems, there are still limitations and uncertainties in the calculation results. From a regulatory perspective, the Korea Institute of Nuclear Safety (KINS) is presently conducting a performance assessment of commercial CFD software for nuclear reactor safety problems. In this study, in order to examine the prediction performance of commercial CFD software with a porous model in the analysis of the scale-down APR+ (Advanced Power Reactor Plus) internal flow, simulations were conducted with the on-board numerical models in ANSYS CFX R.14 and FLUENT R.14. It was concluded that, depending on the CFD software, the internal flow distribution of the scale-down APR+ was locally somewhat different. Although there was a limitation in estimating the prediction performance of the commercial CFD software due to the limited number of measured data, CFX R.14 showed more reasonable predicted results in comparison with FLUENT R.14. Meanwhile, due to the difference in discretization methodology, FLUENT R.14 required more computational memory than CFX R.14 for the same grid system. Therefore, the CFD software suitable to the available computational resources should be selected for massive parallel computations. (authors)

  20. Prolonged corrected QT interval is predictive of future stroke events even in subjects without ECG-diagnosed left ventricular hypertrophy.

    Ishikawa, Joji; Ishikawa, Shizukiyo; Kario, Kazuomi

    2015-03-01

    We attempted to evaluate whether subjects who exhibit prolonged corrected QT (QTc) interval (≥440 ms in men and ≥460 ms in women) on ECG, with and without ECG-diagnosed left ventricular hypertrophy (ECG-LVH; Cornell product, ≥244 mV×ms), are at increased risk of stroke. Among the 10 643 subjects, there were a total of 375 stroke events during the follow-up period (128.7±28.1 months; 114 142 person-years). The subjects with prolonged QTc interval (hazard ratio, 2.13; 95% confidence interval, 1.22-3.73) had an increased risk of stroke even after adjustment for ECG-LVH (hazard ratio, 1.71; 95% confidence interval, 1.22-2.40). When we stratified the subjects into those with neither a prolonged QTc interval nor ECG-LVH, those with a prolonged QTc interval but without ECG-LVH, and those with ECG-LVH, multivariate-adjusted Cox proportional hazards analysis demonstrated that the subjects with prolonged QTc intervals but not ECG-LVH (1.2% of all subjects; incidence, 10.7%; hazard ratio, 2.70, 95% confidence interval, 1.48-4.94) and those with ECG-LVH (incidence, 7.9%; hazard ratio, 1.83; 95% confidence interval, 1.31-2.57) had an increased risk of stroke events, compared with those with neither a prolonged QTc interval nor ECG-LVH. In conclusion, prolonged QTc interval was associated with stroke risk even among patients without ECG-LVH in the general population.

  1. Cation-exchanged SAPO-34 for adsorption-based hydrocarbon separations: predictions from dispersion-corrected DFT calculations.

    Fischer, Michael; Bell, Robert G

    2014-10-21

    The influence of the nature of the cation on the interaction of the silicoaluminophosphate SAPO-34 with small hydrocarbons (ethane, ethylene, acetylene, propane, propylene) is investigated using periodic density-functional theory calculations including a semi-empirical dispersion correction (DFT-D). Initial calculations are used to evaluate which of the guest-accessible cation sites in the chabazite-type structure is energetically preferred for a set of ten cations, which comprises four alkali metals (Li⁺, Na⁺, K⁺, Rb⁺), three alkaline earth metals (Mg²⁺, Ca²⁺, Sr²⁺), and three transition metals (Cu⁺, Ag⁺, Fe²⁺). All eight cations that are likely to be found at the SII site (centre of a six-ring) are then included in the following investigation, which studies the interaction with the hydrocarbon guest molecules. In addition to the interaction energies, some trends and peculiarities regarding the adsorption geometries are analysed, and electron density difference plots obtained from the calculations are used to gain insights into the dominant interaction types. In addition to dispersion interactions, electrostatic and polarisation effects dominate for the main group cations, whereas significant orbital interactions are observed for unsaturated hydrocarbons interacting with transition metal (TM) cations. The differences between the interaction energies obtained for pairs of hydrocarbons of interest (such as ethylene-ethane and propylene-propane) deliver some qualitative insights: if this energy difference is large, it can be expected that the material will exhibit a high selectivity in the adsorption-based separation of alkene-alkane mixtures, which constitutes a problem of considerable industrial relevance. While the calculations show that TM-exchanged SAPO-34 materials are likely to exhibit a very high preference for alkenes over alkanes, the strong interaction may render an application in industrial processes impractical due to the large amount of energy required for regeneration.

  2. Ensemble Kalman filter assimilation of temperature and altimeter data with bias correction and application to seasonal prediction

    C. L. Keppenne

    2005-01-01

    To compensate for a poorly known geoid, satellite altimeter data are usually analyzed in terms of anomalies from the time-mean record. When such anomalies are assimilated into an ocean model, the bias between the climatologies of the model and the data is problematic. An ensemble Kalman filter (EnKF) is modified to account for the presence of a forecast-model bias and applied to the assimilation of TOPEX/Poseidon (T/P) altimeter data. The online bias correction (OBC) algorithm uses the same ensemble of model state vectors to estimate biased-error and unbiased-error covariance matrices. Covariance localization is used, but the bias covariances have different localization scales from the unbiased-error covariances, thereby accounting for the fact that the bias in a global ocean model could have much larger spatial scales than the random error. The method is applied to a 27-layer version of the Poseidon global ocean general circulation model with about 30 million state variables. Experiments in which T/P altimeter anomalies are assimilated show that the OBC reduces the RMS observation-minus-forecast difference for sea-surface height (SSH) over a similar EnKF run in which OBC is not used. Independent in situ temperature observations show that the temperature field is also improved. When the T/P data and in situ temperature data are assimilated in the same run and the configuration of the ensemble at the end of the run is used to initialize the ocean component of the GMAO coupled forecast model, seasonal SSH hindcasts made with the coupled model are generally better than those initialized with optimal interpolation of temperature observations without altimeter data. The analysis of the corresponding sea-surface temperature hindcasts is not as conclusive.
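
    A schematic of the bias-aware update, under strong simplifying assumptions (linear observation operator, observation-space bias, no localization, toy dimensions): the same ensemble furnishes the unbiased-error statistics for the state update, while a separate, slowly updated bias estimate is removed from the innovation. This is a generic sketch of the idea, not the paper's OBC algorithm.

        import numpy as np

        rng = np.random.default_rng(5)
        nx, nobs, ne, r = 12, 4, 30, 0.1
        H = np.eye(nobs, nx)              # observe the first 4 state variables
        R = r * np.eye(nobs)
        gamma = 0.2                       # bias learning rate (assumed)

        def enkf_bias_step(Xf, y, b):
            """One perturbed-observation EnKF update with online bias removal."""
            Hx = H @ Xf
            xm, hm = Xf.mean(1, keepdims=True), Hx.mean(1, keepdims=True)
            A, Ha = Xf - xm, Hx - hm
            Pxy = A @ Ha.T / (ne - 1)     # ensemble cross covariance
            Pyy = Ha @ Ha.T / (ne - 1) + R
            K = Pxy @ np.linalg.inv(Pyy)
            Y = y[:, None] + rng.multivariate_normal(np.zeros(nobs), R, ne).T
            Xa = Xf + K @ (Y - Hx - b[:, None])   # bias removed from innovation
            b_new = b + gamma * (y - hm.ravel() - b)  # slow online bias estimate
            return Xa, b_new

        Xf = rng.normal(size=(nx, ne))    # forecast ensemble (placeholder)
        y = rng.normal(size=nobs)         # observations (placeholder)
        Xa, b = enkf_bias_step(Xf, y, np.zeros(nobs))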

  3. The Abdominal Aortic Aneurysm Statistically Corrected Operative Risk Evaluation (AAA SCORE) for predicting mortality after open and endovascular interventions.

    Ambler, Graeme K; Gohel, Manjit S; Mitchell, David C; Loftus, Ian M; Boyle, Jonathan R

    2015-01-01

    Accurate adjustment of surgical outcome data for risk is vital in an era of surgeon-level reporting. Current risk prediction models for abdominal aortic aneurysm (AAA) repair are suboptimal. We aimed to develop a reliable risk model for in-hospital mortality after intervention for AAA, using rigorous contemporary statistical techniques to handle missing data. Using data collected during a 15-month period in the United Kingdom National Vascular Database, we applied multiple imputation methodology together with stepwise model selection to generate preoperative and perioperative models of in-hospital mortality after AAA repair, using two thirds of the available data. Model performance was then assessed on the remaining third of the data by receiver operating characteristic curve analysis and compared with existing risk prediction models. Model calibration was assessed by Hosmer-Lemeshow analysis. A total of 8088 AAA repair operations were recorded in the National Vascular Database during the study period, of which 5870 (72.6%) were elective procedures. Both preoperative and perioperative models showed excellent discrimination, with areas under the receiver operating characteristic curve of .89 and .92, respectively. This was significantly better than any of the existing models (areas under the receiver operating characteristic curve for the best comparator model, .84 and .88, respectively). These models were carefully developed with rigorous statistical methodology and significantly outperform existing methods for both elective cases and overall AAA mortality. These models will be invaluable for both preoperative patient counseling and accurate risk adjustment of published outcome data.
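
    In outline, the model-development recipe (imputation of missing predictors, model fitting on a two-thirds development split, discrimination checked by AUC on the held-out third) can be sketched with scikit-learn; the variables below are synthetic, and the single-model imputation is a shortcut for the paper's full multiple-imputation-with-stepwise-selection procedure.

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(2)
        n = 3000
        X = rng.normal(size=(n, 8))            # stand-in preoperative variables
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 2).astype(int)
        X[rng.random((n, 8)) < 0.1] = np.nan   # ~10% missing values

        # Two-thirds development, one-third validation, as in the paper.
        X_dev, X_val, y_dev, y_val = train_test_split(
            X, y, test_size=1 / 3, random_state=0, stratify=y)
        model = make_pipeline(IterativeImputer(random_state=0),
                              LogisticRegression(max_iter=1000))
        model.fit(X_dev, y_dev)
        auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
        print(f"validation AUC: {auc:.2f}")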

  4. Basic hydraulics

    Smith, P D

    1982-01-01

    BASIC Hydraulics aims to help students both to become proficient in the BASIC programming language by actually using the language in an important field of engineering and to use computing as a means of mastering the subject of hydraulics. The book begins with a summary of the technique of computing in BASIC together with comments and listing of the main commands and statements. Subsequent chapters introduce the fundamental concepts and appropriate governing equations. Topics covered include principles of fluid mechanics; flow in pipes, pipe networks and open channels; hydraulic machinery;

  5. Basic Finance

    Vittek, J. F.

    1972-01-01

    The basic measures of corporate financial strength, and the sources of this information, are discussed. Considered are: the balance sheet, the income statement, funds and cash flow, and financial ratios.

  6. Diagnostic value of thallium-201 myocardial perfusion IQ-SPECT without and with computed tomography-based attenuation correction to predict clinically significant and insignificant fractional flow reserve

    Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki

    2017-01-01

    The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels using visual semiquantitative interpretations of corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images both without and with CT-AC (r = −0.584 and r = −0.568, respectively). These results suggest that this system can predict FFR at an optimal cut-off of <0.80, and we propose a novel application of CT-AC to MPI-IQ-SPECT for predicting clinically significant and insignificant FFR even in nonobese patients. PMID:29390486

  7. Corrections to primordial nucleosynthesis

    Dicus, D.A.; Kolb, E.W.; Gleeson, A.M.; Sudarshan, E.C.G.; Teplitz, V.L.; Turner, M.S.

    1982-01-01

    The changes in primordial nucleosynthesis resulting from small corrections to rates for weak processes that connect neutrons and protons are discussed. The weak rates are corrected by improved treatment of Coulomb and radiative corrections, and by inclusion of plasma effects. The calculations lead to a systematic decrease in the predicted ⁴He abundance of about ΔY = 0.0025. The relative changes in other primordial abundances are also 1 to 2%.

  8. Safety, efficacy, and predictability of laser in situ keratomileusis to correct myopia or myopic astigmatism with a 750 Hz scanning-spot laser system.

    Tomita, Minoru; Watabe, Miyuki; Yukawa, Satoshi; Nakamura, Nobuo; Nakamura, Tadayuki; Magnago, Thomas

    2014-02-01

    To evaluate the clinical outcomes of laser in situ keratomileusis (LASIK) to correct myopia or myopic astigmatism using the Amaris 750S 750 Hz excimer laser. Private LASIK center, Tokyo, Japan. Case series. Patients with myopia or myopic astigmatism (spherical equivalent -0.50 to -11.63 diopters [D]), a corrected distance visual acuity (CDVA) of 20/20 or better, and an estimated residual bed thickness of 300 μm or more had LASIK using the aspheric aberration-free ablation profile of the 750 Hz scanning-spot laser and the Femto LDV Crystal Line femtosecond laser for flap creation. Study parameters included uncorrected distance visual acuity (UDVA), CDVA, manifest refraction, astigmatism, and higher-order aberrations (HOAs). The study included 1280 eyes (685 patients). At 3 months, 96.6% of eyes had a UDVA of 20/20 or better and 99.1% had 20/32 or better; 94.1% of eyes were within ±0.50 D of the intended correction and 98.9% were within ±1.00 D; 89.7% of eyes had no residual cylinder and 96.0% had a postoperative astigmatism of less than 0.50 D. All eyes had a postoperative CDVA of 20/20 or better. The HOAs increased postoperatively. Laser in situ keratomileusis with the 750 Hz scanning-spot laser was safe, effective, and predictable. No specific clinical side effects that might be associated with a high repetition rate occurred. Mr. Magnago is an employee of Schwind eye-tech-solutions GmbH. No other author has a financial or proprietary interest in any material or method mentioned.

  9. TU-G-BRA-05: Predicting Volume Change of the Tumor and Critical Structures Throughout Radiation Therapy by CT-CBCT Registration with Local Intensity Correction

    Park, S; Robinson, A; Kiess, A; Quon, H; Wong, J; Lee, J [Johns Hopkins University, Baltimore, MD (United States); Plishker, W [IGI Technologies Inc., College Park, MD (United States); Shekhar, R [IGI Technologies Inc., College Park, MD (United States); Children’s National Medical Center, Washington, D.C. (United States)

    2015-06-15

    Purpose: The purpose of this study is to develop an accurate and effective technique to predict and monitor volume changes of the tumor and organs at risk (OARs) from daily cone-beam CTs (CBCTs). Methods: While CBCT is typically used to minimize the patient setup error, its poor image quality impedes accurate monitoring of daily anatomical changes in radiotherapy. Reconstruction artifacts in CBCT often cause undesirable errors in registration-based contour propagation from the planning CT, a conventional way to estimate anatomical changes. To improve the registration and segmentation accuracy, we developed a new deformable image registration (DIR) method that iteratively corrects CBCT intensities using slice-based histogram matching during the registration process. Three popular DIR algorithms (hierarchical B-spline, demons, optical flow) augmented by the intensity correction were implemented on a graphics processing unit for efficient computation, and their performances were evaluated on six head and neck (HN) cancer cases. Four trained scientists manually contoured the nodal gross tumor volume (GTV) on the planning CT and every other fraction CBCT for each case, and the GTV contours propagated by DIR were compared to these. The performance was also compared with commercial software, VelocityAI (Varian Medical Systems Inc.). Results: Manual contouring showed significant variations, [-76, +141]% from the mean of all four sets of contours. The volume differences (mean±std in cc) between the average manual segmentation and the four automatic segmentations were 3.70±2.30 (B-spline), 1.25±1.78 (demons), 0.93±1.14 (optical flow), and 4.39±3.86 (VelocityAI). In comparison to the average volume of the manual segmentations, the proposed approach significantly reduced the estimation error by 9% (B-spline), 38% (demons), and 51% (optical flow) over the conventional mutual-information-based method (VelocityAI). Conclusion: The proposed CT-CBCT registration with local CBCT intensity correction thus enables more accurate monitoring of volume changes of the tumor and OARs from daily CBCTs.
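
    The intensity-correction idea, matching each CBCT slice's histogram to the corresponding planning-CT slice, can be sketched with scikit-image; the volumes below are random placeholders, and the paper interleaves this step inside the iterative DIR rather than applying it once beforehand.

        import numpy as np
        from skimage.exposure import match_histograms

        def correct_cbct_intensities(cbct, ct):
            """Slice-by-slice histogram matching of a CBCT volume to the CT.

            cbct, ct: 3D arrays (slices, rows, cols), assumed rigidly aligned.
            """
            out = np.empty_like(cbct, dtype=float)
            for k in range(cbct.shape[0]):
                # Map each CBCT slice's intensity distribution onto the CT
                # slice's, suppressing reconstruction-artifact offsets.
                out[k] = match_histograms(cbct[k].astype(float),
                                          ct[k].astype(float))
            return out

        cbct = np.random.default_rng(4).normal(0.0, 1.0, (40, 64, 64))
        ct = np.random.default_rng(8).normal(0.2, 1.1, (40, 64, 64))
        corrected = correct_cbct_intensities(cbct, ct)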

  10. Basic electronics

    Holbrook, Harold D

    1971-01-01

    Basic Electronics is an elementary text designed for basic instruction in electricity and electronics. It emphasizes electronic emission and the vacuum tube, and shows transistor circuits in parallel with electron tube circuits. This book also demonstrates how the transistor merely replaces the tube, with proper changes of circuit constants as required. Many problems are presented at the end of each chapter. The book comprises 17 chapters and opens with an overview of electron theory, followed by a discussion of resistance, inductance, and capacitance, along with their effects on the circuit.

  11. A Big Data Approach for Situation-Aware estimation, correction and prediction of aerosol effects, based on MODIS Joint Atmosphere product (collection 6) time series data

    Singh, A. K.; Toshniwal, D.

    2017-12-01

    The MODIS Joint Atmosphere products MODATML2 and MYDATML2 (L2/3), provided by the LAADS DAAC (Level-1 and Atmosphere Archive & Distribution System Distributed Active Archive Center) and re-sampled from medium-resolution MODIS Terra/Aqua satellite data at a 5 km scale, contain Cloud Reflectance, Cloud Top Temperature, Water Vapor, Aerosol Optical Depth/Thickness and Humidity data. These re-sampled data, when used for deriving climatic effects of aerosols (particularly in the case of the cooling effect), still expose limitations in the presence of uncertainty in atmospheric artifacts such as aerosol, cloud, and cirrus cloud. The effect of uncertainty in these artifacts poses an important challenge for the estimation of aerosol effects, ultimately affecting precise regional weather modeling and prediction: forecasting and recommendation applications largely depend on these short-term local conditions (e.g., city- or locality-based recommendations to citizens and farmers based on local weather models). Our approach incorporates artificial intelligence techniques for representing heterogeneous data (satellite data along with air quality data from local weather stations, i.e., in situ data) to learn, correct and predict aerosol effects in the presence of cloud and other atmospheric artifacts, using spatio-temporal correlations and regressions. The Big Data processing pipeline, consisting of correlation and regression techniques developed on the Apache Spark platform, can easily scale to large data sets including many tiles (scenes) and a widened time scale. Keywords: Climatic Effects of Aerosols, Situation-Aware, Big Data, Apache Spark, MODIS Terra/Aqua, Time Series
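
    A minimal PySpark fragment of the kind of scalable correlation step such a pipeline would contain; the input file and column names are hypothetical, not the authors' actual schema.

        from pyspark.sql import SparkSession
        from pyspark.ml.feature import VectorAssembler
        from pyspark.ml.stat import Correlation

        spark = SparkSession.builder.appName("aerosol-correlation").getOrCreate()

        # Hypothetical table: one row per tile/time step with re-sampled MODIS
        # variables joined to in situ air-quality readings.
        df = spark.read.parquet("modis_atml2_with_insitu.parquet")
        cols = ["aerosol_optical_depth", "cloud_top_temperature",
                "water_vapor", "insitu_pm25"]

        vec = VectorAssembler(inputCols=cols, outputCol="features").transform(df)
        corr = Correlation.corr(vec, "features", "pearson").head()[0]
        print(corr.toArray())   # Pearson correlation matrix across the series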

  12. Basic concepts

    Dorner, B.

    1999-01-01

    The basic concepts of neutron scattering as a tool for studying the structure and dynamics of condensed matter are presented. Theoretical aspects are outlined, and the two different cases of coherent and incoherent scattering are described. The issues of resolution, coherence volume and the role of monochromators are also discussed. (K.A.)

  13. Body Basics

    A KidsHealth article series on how the body works, basic human anatomy, and what happens when parts of the body do not work as they should.

  14. Basic Thermodynamics

    Duthil, P

    2014-01-01

    The goal of this paper is to present a general thermodynamic basis that is usable in the context of superconductivity and particle accelerators. The first part recalls the purpose of thermodynamics and summarizes its important concepts. Some applications, from cryogenics to magnetic systems, are covered. In the context of basic thermodynamics, only thermodynamic equilibrium is considered.

  16. Ethanol Basics

    None

    2015-01-30

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  17. Application of the basic constructs of social cognitive theory for predicting mental health in student of Bushehr University Medical Sciences 2012-13

    Makyea Jamali

    2015-01-01

    Background: Mental health is one of the main topics of health assessment in different communities and plays an important role in ensuring dynamism and efficiency, especially among students. Thus, the aim of this study was to apply the basic constructs of social cognitive theory to predict mental health in students of Bushehr University of Medical Sciences. Materials and Methods: This cross-sectional study was conducted using a systematic random sampling method in 428 students of Bushehr University of Medical Sciences in 2012-13. Information was collected using five standard questionnaires: academic self-efficacy, academic stress, multidimensional social support, student outcome expectancy and quality of life (SF-36) scales. After data collection, the data were analyzed in SPSS using Pearson correlation coefficients and multiple linear regression. Results: Mental health had a significant correlation with social support (r = 0.37, P = 0.000), academic stress (r = -0.45, P = 0.000) and academic self-efficacy (r = 0.24, P = 0.000). In the linear regression model, the predictors of mental health were faculty type, evaluation of curricular and extracurricular counseling, self-efficacy (B = 1.49, P = 0.031), academic stress (B = -4.35, P = 0.000) and social support (B = 4.77, P = 0.000). Also, gender, mother's education and father's job had indirect effects on mental health through social support, while acceptance quota and evaluation of curricular counseling had indirect effects on mental health through self-efficacy. Conclusion: Strategies to increase self-efficacy, the creation of a socially supportive environment, and stress reduction, particularly through the organization of curricular and extracurricular counseling sessions, can promote mental health in students.

  18. Does Parental Educational Level Predict Drop-Out from Upper Secondary School for 16- to 24-Year-Olds when Basic Skills Are Accounted For? A Cross Country Comparison

    Lundetrae, Kjersti

    2011-01-01

    Drop-out from upper secondary school is considered a widespread problem, closely connected with youth unemployment. The aim of the current study was to examine whether parents' level of education predicted drop-out for 16-24-year-olds when accounting for basic skills. For this purpose, data from the Norwegian (n = 996) and American (n = 641)…

  19. Wavelet basics

    Chan, Y T

    1995-01-01

    Since the study of wavelets is a relatively new area, with much of the research coming from mathematicians, most of the literature uses terminology, concepts and proofs that may, at times, be difficult and intimidating for the engineer. Wavelet Basics has therefore been written as an introductory book for scientists and engineers. The mathematical presentation has been kept simple, the concepts being presented in elaborate detail in a terminology that engineers will find familiar. Difficult ideas are illustrated with examples which will also aid in the development of an intuitive insight. Chapter 1 reviews the basics of signal transformation and discusses the concepts of duals and frames. Chapter 2 introduces the wavelet transform, contrasts it with the short-time Fourier transform and clarifies the names of the different types of wavelet transforms. Chapter 3 links multiresolution analysis, orthonormal wavelets and the design of digital filters. Chapter 4 gives a tour d'horizon of topics of current interest: wave...

  20. Education: The Basics. The Basics

    Wood, Kay

    2011-01-01

    Everyone knows that education is important, we are confronted daily by discussion of it in the media and by politicians, but how much do we really know about education? "Education: The Basics" is a lively and engaging introduction to education as an academic subject, taking into account both theory and practice. Covering the schooling system, the…

  1. Flight-Determined, Subsonic, Lateral-Directional Stability and Control Derivatives of the Thrust-Vectoring F-18 High Angle of Attack Research Vehicle (HARV), and Comparisons to the Basic F-18 and Predicted Derivatives

    Iliff, Kenneth W.; Wang, Kon-Sheng Charles

    1999-01-01

    The subsonic, lateral-directional stability and control derivatives of the thrust-vectoring F-18 High Angle of Attack Research Vehicle (HARV) are extracted from flight data using a maximum likelihood parameter identification technique. State noise is accounted for in the identification formulation and is used to model the uncommanded forcing functions caused by unsteady aerodynamics. Preprogrammed maneuvers provided independent control surface inputs, eliminating problems of identifiability related to correlations between the aircraft controls and states. The HARV derivatives are plotted as functions of angles of attack between 10° and 70° and compared to flight estimates from the basic F-18 aircraft and to predictions from ground and wind tunnel tests. Unlike maneuvers of the basic F-18 aircraft, the HARV maneuvers were very precise and repeatable, resulting in tightly clustered estimates with small uncertainty levels. Significant differences were found between flight and prediction; however, some of these differences may be attributed to differences in the range of sideslip or input amplitude over which a given derivative was evaluated, and to differences between the HARV external configuration and that of the basic F-18 aircraft, upon which most of the prediction was based. Some HARV derivative fairings have been adjusted using basic F-18 derivatives (with low uncertainties) to help account for differences in variable ranges and the lack of HARV maneuvers at certain angles of attack.

  2. Checking the predictive accuracy of basic symptoms against ultra high-risk criteria and testing of a multivariable prediction model: Evidence from a prospective three-year observational study of persons at clinical high-risk for psychosis.

    Hengartner, M P; Heekeren, K; Dvorsky, D; Walitza, S; Rössler, W; Theodoridou, A

    2017-09-01

    The aim of this study was to critically examine the prognostic validity of various clinical high-risk (CHR) criteria alone and in combination with additional clinical characteristics. A total of 188 CHR-positive persons from the region of Zurich, Switzerland (mean age 20.5 years; 60.2% male), meeting ultra-high-risk (UHR) and/or basic symptoms (BS) criteria, were followed over three years. The test battery included the Structured Interview for Prodromal Syndromes (SIPS), verbal IQ and many other screening tools. Conversion to psychosis was defined according to ICD-10 criteria for schizophrenia (F20) or brief psychotic disorder (F23). Altogether n = 24 persons developed manifest psychosis within three years, and according to Kaplan-Meier survival analysis, the projected conversion rate was 17.5%. The predictive accuracy of UHR was statistically significant but poor (area under the curve [AUC] = 0.65). A rethinking of binary at-risk criteria is necessary in order to improve the prognosis of psychotic disorders.

  3. Basic principles

    Wilson, P.D.

    1996-01-01

    Some basic explanations are given of the principles underlying the nuclear fuel cycle, starting with the physics of atomic and nuclear structure and continuing with nuclear energy and reactors, fuel and waste management and finally a discussion of economics and the future. An important aspect of the fuel cycle concerns the possibility of ''closing the back end'' i.e. reprocessing the waste or unused fuel in order to re-use it in reactors of various kinds. The alternative, the ''oncethrough'' cycle, discards the discharged fuel completely. An interim measure involves the prolonged storage of highly radioactive waste fuel. (UK)

  4. Basic electronics

    Tayal, DC

    2010-01-01

    The second edition of this book incorporates the comments and suggestions of my friends and students who have critically studied the first edition. In this edition the changes and additions have been made and subject matter has been rearranged at some places. The purpose of this text is to provide a comprehensive and up-to-date study of the principles of operation of solid state devices, their basic circuits and application of these circuits to various electronic systems, so that it can serve as a standard text not only for universities and colleges but also for technical institutes. This book

  5. Correction for the Hematocrit Bias in Dried Blood Spot Analysis Using a Nondestructive, Single-Wavelength Reflectance-Based Hematocrit Prediction Method.

    Capiau, Sara; Wilk, Leah S; De Kesel, Pieter M M; Aalders, Maurice C G; Stove, Christophe P

    2018-02-06

    The mean bias obtained with Bland and Altman analysis was -0.015 and the limits of agreement were -0.061 and 0.031, indicating that the simplified, noncontact Hct prediction method even outperforms the original method. In addition, using caffeine as a model compound, it was demonstrated that this simplified Hct prediction method can effectively be used to implement a Hct-dependent correction factor to DBS-based results to alleviate the Hct bias.

  6. Publisher Correction

    Turcot, Valérie; Lu, Yingchang; Highland, Heather M

    2018-01-01

    In the published version of this paper, the name of author Emanuele Di Angelantonio was misspelled. This error has now been corrected in the HTML and PDF versions of the article.

  7. Author Correction

    Grundle, D S; Löscher, C R; Krahmann, G

    2018-01-01

    A correction to this article has been published and is linked from the HTML and PDF versions of this paper. The error has not been fixed in the paper.

  8. Predicting adolescent problematic online game use from teacher autonomy support, basic psychological needs satisfaction, and school engagement: a 2-year longitudinal study.

    Yu, Chengfu; Li, Xian; Zhang, Wei

    2015-04-01

    Problematic online game use (POGU) has become a serious global public health concern among adolescents. However, its influencing factors and mediating mechanisms remain largely unknown. This study provides the first longitudinal design to test stage-environment fit theory empirically in POGU. A total of 356 Chinese students reported on teacher autonomy support, basic psychological needs satisfaction, school engagement, and POGU in the autumn of their 7th-9th grade years. Path analyses supported the proposed pathway: 7th grade teacher autonomy support increased 8th grade basic psychological needs satisfaction, which in turn increased 9th grade school engagement, which ultimately decreased 9th grade POGU. Furthermore, 7th grade teacher autonomy support directly increased 9th grade school engagement, which in turn decreased 9th grade POGU. These findings suggest that teacher autonomy support is an important protective predictor of adolescent POGU, and basic psychological needs satisfaction and school engagement are the primary mediators in this association.
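
    Path analyses of this longitudinal kind are often approximated as a chain of regressions, one per hypothesized path. A minimal sketch with statsmodels on synthetic data follows; the variable names and effect sizes are hypothetical and do not reproduce the study's estimates:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 356
      support = rng.normal(size=n)                               # teacher autonomy support (grade 7)
      needs = 0.5 * support + rng.normal(size=n)                 # needs satisfaction (grade 8)
      engage = 0.5 * needs + 0.3 * support + rng.normal(size=n)  # school engagement (grade 9)
      pogu = -0.4 * engage + rng.normal(size=n)                  # problematic online game use (grade 9)

      paths = {
          "needs ~ support": (needs, support),
          "engage ~ needs + support": (engage, np.c_[needs, support]),
          "pogu ~ engage": (pogu, engage),
      }
      for name, (y, X) in paths.items():
          fit = sm.OLS(y, sm.add_constant(X)).fit()
          print(name, np.round(fit.params[1:], 2))               # slope estimate(s) along each path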

  9. Issues and Importance of "Good" Starting Points for Nonlinear Regression for Mathematical Modeling with Maple: Basic Model Fitting to Make Predictions with Oscillating Data

    Fox, William

    2012-01-01

    The purpose of our modeling effort is to predict future outcomes. We assume the data collected are both accurate and relatively precise. For our oscillating data, we examined several mathematical modeling forms for predictions. We also examined both ignoring the oscillations as an important feature and including the oscillations as an important…
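
    The core issue the paper raises, the sensitivity of nonlinear least squares to its starting point, is easy to reproduce. Below is a small scipy sketch fitting a sinusoid, where a naive frequency guess can stall in a local minimum while an FFT-informed guess recovers the true parameters; the data are synthetic, not from the paper:

      import numpy as np
      from scipy.optimize import curve_fit

      def model(t, a, b, omega, phi):
          return a + b * np.sin(omega * t + phi)

      rng = np.random.default_rng(0)
      t = np.linspace(0, 10, 200)
      y = 5 + 2 * np.sin(1.3 * t + 0.4) + rng.normal(0, 0.2, t.size)

      # Naive start: a frequency far from the truth often lands in a local minimum
      try:
          p_bad, _ = curve_fit(model, t, y, p0=[0, 1, 10.0, 0])
          print("naive start:", np.round(p_bad, 2))
      except RuntimeError:
          print("naive start: fit failed to converge")

      # Informed start: estimate the frequency from the dominant FFT peak first
      freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
      omega0 = 2 * np.pi * freqs[np.argmax(np.abs(np.fft.rfft(y - y.mean())))]
      p_good, _ = curve_fit(model, t, y, p0=[y.mean(), y.std(), omega0, 0])
      print("informed start:", np.round(p_good, 2))   # close to (5, 2, 1.3, 0.4)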

  10. Working toward Literacy in Correctional Education ESL

    Gardner, Susanne

    2014-01-01

    Correctional Education English as a Second Language (ESL) literacy programs vary from state to state, region to region. Some states enroll their correctional ESL students in adult basic education (ABE) classes; other states have separate classes and programs. At the Maryland Correctional Institution in Jessup, the ESL class is a self-contained…

  11. Forward induction reasoning and correct beliefs

    Perea y Monsuwé, Andrés

    2017-01-01

    All equilibrium concepts implicitly make a correct beliefs assumption, stating that a player believes that his opponents are correct about his first-order beliefs. In this paper we show that in many dynamic games of interest, this correct beliefs assumption may be incompatible with a very basic form of forward induction reasoning.

  12. Publisher Correction

    Stokholm, Jakob; Blaser, Martin J.; Thorsen, Jonathan

    2018-01-01

    The originally published version of this Article contained an incorrect version of Figure 3 that was introduced following peer review and inadvertently not corrected during the production process. Both versions contain the same set of abundance data, but the incorrect version has the children...

  13. Publisher Correction

    Flachsbart, Friederike; Dose, Janina; Gentschew, Liljana

    2018-01-01

    The original version of this Article contained an error in the spelling of the author Robert Häsler, which was incorrectly given as Robert Häesler. This has now been corrected in both the PDF and HTML versions of the Article....

  14. Correction to

    Roehle, Robert; Wieske, Viktoria; Schuetz, Georg M

    2018-01-01

    The original version of this article, published on 19 March 2018, unfortunately contained a mistake. The following correction has therefore been made in the original: The names of the authors Philipp A. Kaufmann, Ronny Ralf Buechel and Bernhard A. Herzog were presented incorrectly....

  15. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
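
    As a schematic of the modeling approach, probabilistic binary prediction via logistic regression, the following sketch fits and scores such a model on synthetic stand-in predictors. The real models used ARPS/RUC fields and GOES measurements; everything below is hypothetical:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      # Hypothetical predictors, e.g. humidity, temperature, vertical velocity
      rng = np.random.default_rng(1)
      X = rng.normal(size=(2000, 3))
      logit = 1.5 * X[:, 0] + 0.8 * X[:, 1] - 0.3 * X[:, 2]
      y = (rng.random(2000) < 1 / (1 + np.exp(-logit))).astype(int)  # contrail occurrence (0/1)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = LogisticRegression().fit(X_tr, y_tr)
      print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")       # cf. the ~75% reported above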

  16. Corrective Jaw Surgery

    Full Text Available Orthognathic surgery is performed to correct the misalignment of jaws.

  17. Beam dynamics in rf guns and emittance correction techniques

    Serafini, L.

    1994-01-01

    In this paper we present a general review of beam dynamics in a laser-driven rf gun. The peculiarity of such an accelerating structure versus other conventional multi-cell linac structures is underlined on the basis of the Panofsky-Wenzel theorem, which is found to give a theoretical background for the well known Kim's model. A basic explanation for some proposed methods to correct rf induced emittance growth is also derived from the theorem. We also present three emittance correction techniques for the recovery of space-charge induced emittance growth, namely the optimum distributed disk-like bunch technique, the use of rf spatial harmonics to correct spherical aberration induced by space charge forces and the technique of emittance filtering by clipping the electron beam. The expected performances regarding the beam quality achievable with different techniques, as predicted by scaling laws and simulations, are analyzed, and, where available, compared to experimental results. (orig.)

  18. Do implicit motives and basic psychological needs interact to predict well-being and flow? : Testing a universal hypothesis and a matching hypothesis

    Schüler, Julia; Brandstätter, Veronika; Sheldon, Kennon M.

    2013-01-01

    Self-Determination Theory (Deci and Ryan in Intrinsic motivation and self-determination in human behavior. Plenum Press, New York, 1985) suggests that certain experiences, such as competence, are equally beneficial to everyone’s well-being (universal hypothesis), whereas Motive Disposition Theory (McClelland in Human motivation. Scott, Foresman, Glenview, IL, 1985) predicts that some people, such as those with a high achievement motive, should benefit particularly from such experiences (match...

  19. Near-infrared spectra of Penicillium camemberti strains separated by extended multiplicative signal correction improved prediction of physical and chemical variations

    Decker, Marianne; Nielsen, Per Væggemose; Martens, Harald

    2005-01-01

    signal correction (TWEMSC) preprocessing, whereby three patterns of variation in near-infrared (NIR) log(1/R) spectra of fungal colonies could be separated mathematically: (1) physical light scattering and its wavelength dependency, (2) differences in light absorption of water due to varying sample...

  20. Linear network error correction coding

    Guang, Xuan

    2014-01-01

    There are two main approaches in the theory of network error correction coding. In this SpringerBrief, the authors summarize some of the most important contributions following the classic approach, which represents messages by sequences, similar to algebraic coding, and also briefly discuss the main results following the other approach, which uses the theory of rank metric codes for network error correction by representing messages as subspaces. This book starts by establishing the basic linear network error correction (LNEC) model and then characterizes two equivalent descriptions. Distances an
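
    For readers new to the classic approach, the mechanics of linear error correction are easiest to see in a toy single-error-correcting block code. The sketch below implements the Hamming(7,4) code in numpy as an illustration of linear coding basics, not of the network codes treated in the book:

      import numpy as np

      # Systematic Hamming(7,4): G = [I | P], H = [P^T | I]
      P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
      G = np.hstack([np.eye(4, dtype=int), P])
      H = np.hstack([P.T, np.eye(3, dtype=int)])

      def encode(msg):                     # msg: length-4 0/1 vector
          return msg @ G % 2

      def correct(received):               # fixes any single flipped bit
          s = H @ received % 2             # syndrome
          if s.any():
              # the syndrome equals the column of H at the error position
              err = int(np.argmax((H.T == s).all(axis=1)))
              received = received.copy()
              received[err] ^= 1
          return received

      msg = np.array([1, 0, 1, 1])
      cw = encode(msg)
      cw_err = cw.copy(); cw_err[2] ^= 1   # inject a single-bit error
      assert (correct(cw_err) == cw).all()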

  1. From basic physics to mechanisms of toxicity: the ``liquid drop'' approach applied to develop predictive classification models for toxicity of metal oxide nanoparticles

    Sizochenko, Natalia; Rasulev, Bakhtiyor; Gajewicz, Agnieszka; Kuz'min, Victor; Puzyn, Tomasz; Leszczynski, Jerzy

    2014-10-01

    Many metal oxide nanoparticles are able to cause persistent stress to live organisms, including humans, when discharged to the environment. To understand the mechanism of metal oxide nanoparticles' toxicity and reduce the number of experiments, the development of predictive toxicity models is important. In this study, performed on a series of nanoparticles, comparative quantitative structure-activity relationship (nano-QSAR) analyses of their toxicity towards E. coli and HaCaT cells were established. A new approach for representation of nanoparticles' structure is presented. For description of the supramolecular structure of nanoparticles the "liquid drop" model was applied. It is expected that the proposed approach could be of general use for predictions related to nanomaterials. In addition, fragmental simplex descriptors and several ligand-metal binding characteristics were calculated. The developed nano-QSAR models were validated and reliably predict the toxicity of all studied metal oxide nanoparticles. Based on the comparative analysis of contributed properties in both models, the LDM-based descriptors were revealed to have an almost similar level of contribution to toxicity in both cases, while other parameters (van der Waals interactions, electronegativity and metal-ligand binding characteristics) have unequal contribution levels. In addition, the models developed here suggest different mechanisms of nanotoxicity for these two types of cells.

  2. Supine Lateral Bending Radiographs Predict the Initial In-brace Correction of the Providence Brace in Patients With Adolescent Idiopathic Scoliosis

    Ohrt-Nissen, Søren; Hallager, Dennis Winge; Gehrchen, Poul Martin

    2016-01-01

    ± 10°). The mean difference for thoracic curves was 0.2° (LOA ± 8°), for thoracolumbar/lumbar curves 0.9° (LOA ± 10°), and for double major curves 0.4° (LOA ± 16°). CONCLUSION: SLBR provide a close estimation of the expected in-brace correction, with a mean difference of less than one degree. SLBR could

  3. Electroweak corrections

    Beenakker, W.J.P.

    1989-01-01

    The prospect of high accuracy measurements investigating the weak interactions, which are expected to take place at the electron-positron storage ring LEP at CERN and the linear collider SLC at SLAC, offers the possibility to study also the weak quantum effects. In order to distinguish whether the measured weak quantum effects lie within the margins set by the standard model or bear traces of new physics, one has to go beyond the lowest order and include electroweak radiative corrections (EWRC) in theoretical calculations. These higher-order corrections can also offer information about two particles present in the Glashow-Salam-Weinberg (GSW) model but not yet discovered: the top quark and the Higgs boson. In ch. 2 the GSW standard model of electroweak interactions is described. In ch. 3 special techniques are described for the determination of integrals which are responsible for numerical instabilities, caused by large cancelling terms, encountered in the calculation of EWRC effects, together with methods needed to handle the extensive algebra typical for EWRC. In ch. 4 various aspects of EWRC effects are discussed, in particular their dependence on the unknown model parameters, the masses of the top quark and the Higgs boson. The processes discussed are the production of heavy fermions in electron-positron annihilation and the fermionic decay of the Z gauge boson. (H.W.). 106 refs.; 30 figs.; 6 tabs.; schemes

  4. Universality of quantum gravity corrections.

    Das, Saurya; Vagenas, Elias C

    2008-11-28

    We show that the existence of a minimum measurable length and the related generalized uncertainty principle (GUP), predicted by theories of quantum gravity, influence all quantum Hamiltonians. Thus, they predict quantum gravity corrections to various quantum phenomena. We compute such corrections to the Lamb shift, the Landau levels, and the tunneling current in a scanning tunneling microscope. We show that these corrections can be interpreted in two ways: (a) either that they are exceedingly small, beyond the reach of current experiments, or (b) that they predict upper bounds on the quantum gravity parameter in the GUP, compatible with experiments at the electroweak scale. Thus, more accurate measurements in the future should either be able to test these predictions, or further tighten the above bounds and predict an intermediate length scale between the electroweak and the Planck scale.

  5. Uncorrected and Corrected Distance Visual Acuity, Predictability, Efficacy, and Safety after Femtosecond Laser in Situ Keratomileusis (FS-LASIK) and Refractive Lenticule extraction (ReLEx) for Moderate and High Myopia

    Vestergaard, Anders; Justesen, Birgitte Larsen; Melsen, Charlotte

    Title: Uncorrected and Corrected Distance Visual Acuity, Predictability, Efficacy, and Safety after Femtosecond Laser in Situ Keratomileusis (FS-LASIK) and Refractive Lenticule Extraction (ReLEx) for Moderate and High Myopia. Vestergaard A., Justesen B., Melsen C., Lyhne N., Department of Ophthalmology, Odense University Hospital, Denmark. Purpose: To compare predictability, efficacy and safety after femtosecond LASIK (FS-LASIK) with ReLEx. Setting: Department of Ophthalmology, Odense University Hospital, Denmark. Methods: Retrospective study of results after FS-LASIK and ReLEx (including ReLEx flex, ReLEx pseudo-smile, and ReLEx smile). In total, 228 eyes were treated with FS-LASIK and 83 eyes with ReLEx, at the Department of Ophthalmology, Odense University Hospital, in the period of April to November 2011. Only otherwise healthy myopic eyes with up to 3.00 D of astigmatism and with CDVA ≤ 0.30 (logMAR) before surgery were included in this study. FS-LASIK flaps...

  6. Quantum error correction for beginners

    Devitt, Simon J; Nemoto, Kae; Munro, William J

    2013-01-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation is now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future. (review article)

  7. Stem Cell Basics

    Stem Cell Basics I. Introduction: What are stem cells, and ...

  8. Notes on basic materials (1)

    Donald, R.

    1976-01-01

    This lecture was a revision of basic material, intended for students who were mainly postgraduate at the end of their first year in experimental high energy physics. The subject headings include the following: notation and generalities; classification of particles; symmetry arguments; higher symmetries; SU(3) isoscalar factors; predictions from SU(3); charm; relativistic wave equations; Feynman graphical techniques. (U.K.)

  9. Diagnostic value of thallium-201 myocardial perfusion IQ-SPECT without and with computed tomography-based attenuation correction to predict clinically significant and insignificant fractional flow reserve: A single-center prospective study.

    Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki

    2017-12-01

    The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels using visual semiquantitative interpretations of corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images without and with CT-AC (r = -0.584 and r = -0.568, respectively, both P […]), and the system can predict FFR at an optimal cut-off […].

  10. Population Size Predicts Lexical Diversity, but so Does the Mean Sea Level --Why It Is Important to Correctly Account for the Structure of Temporal Data.

    Koplenig, Alexander; Müller-Spitzer, Carolin

    2016-01-01

    In order to demonstrate why it is important to correctly account for the (serial dependent) structure of temporal data, we document an apparently spectacular relationship between population size and lexical diversity: for five out of seven investigated languages, there is a strong relationship between population size and lexical diversity of the primary language in this country. We show that this relationship is the result of a misspecified model that does not consider the temporal aspect of the data by presenting a similar but nonsensical relationship between the global annual mean sea level and lexical diversity. Given the fact that in the recent past, several studies were published that present surprising links between different economic, cultural, political and (socio-)demographical variables on the one hand and cultural or linguistic characteristics on the other hand, but seem to suffer from exactly this problem, we explain the cause of the misspecification and show that it has profound consequences. We demonstrate how simple transformation of the time series can often solve problems of this type and argue that the evaluation of the plausibility of a relationship is important in this context. We hope that our paper will help both researchers and reviewers to understand why it is important to use special models for the analysis of data with a natural temporal ordering.
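
    The paper's central point, that two trending series correlate spuriously unless serial dependence is modeled, can be reproduced in a few lines; differencing is one simple transformation of the kind the authors recommend. The series below are synthetic and purely illustrative:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100
      t = np.arange(n)
      # Two unrelated trending series (think population size and mean sea level)
      x = 0.5 * t + rng.normal(0, 2, n)
      y = 0.3 * t + rng.normal(0, 2, n)

      print(np.corrcoef(x, y)[0, 1])                    # spuriously high (~0.97)
      print(np.corrcoef(np.diff(x), np.diff(y))[0, 1])  # near zero after differencing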

  11. Customized versus population-based growth curves: prediction of low body fat percent at term corrected gestational age following preterm birth.

    Law, Tameeka L; Katikaneni, Lakshmi D; Taylor, Sarah N; Korte, Jeffrey E; Ebeling, Myla D; Wagner, Carol L; Newman, Roger B

    2012-07-01

    Compare customized versus population-based growth curves for identification of small-for-gestational-age (SGA) and body fat percent (BF%) among preterm infants. Prospective cohort study of 204 preterm infants classified as SGA or appropriate-for-gestational-age (AGA) by population-based and customized growth curves. BF% was determined by air-displacement plethysmography. Differences between groups were compared using bivariable and multivariable linear and logistic regression analyses. Customized curves reclassified 30% of the preterm infants as SGA. SGA infants identified by customized method only had significantly lower BF% (13.8 ± 6.0) than the AGA (16.2 ± 6.3, p = 0.02) infants and similar to the SGA infants classified by both methods (14.6 ± 6.7, p = 0.51). Customized growth curves were a significant predictor of BF% (p = 0.02), whereas population-based growth curves were not a significant independent predictor of BF% (p = 0.50) at term corrected gestational age. Customized growth potential improves the differentiation of SGA infants and low BF% compared with a standard population-based growth curve among a cohort of preterm infants.

  12. Health beliefs affect the correct replacement of daily disposable contact lenses: Predicting compliance with the Health Belief Model and the Theory of Planned Behaviour.

    Livi, Stefano; Zeri, Fabrizio; Baroni, Rossella

    2017-02-01

    To assess the compliance of Daily Disposable Contact Lenses (DDCLs) wearers with replacing lenses at a manufacturer-recommended replacement frequency. To evaluate the ability of two different Health Behavioural Theories (HBT), the Health Belief Model (HBM) and the Theory of Planned Behaviour (TPB), in predicting compliance. A multi-centre survey was conducted using a questionnaire completed anonymously by contact lens wearers during the purchase of DDCLs. Three hundred and fifty-four questionnaires were returned. The survey comprised 58.5% females and 41.5% males (mean age 34±12 years). Twenty-three percent of respondents were non-compliant with the manufacturer-recommended replacement frequency (re-using DDCLs at least once). The main reason for re-using DDCLs was "to save money" (35%). Predictions of compliance behaviour (past behaviour or future intentions) on the basis of the two HBT were investigated through logistic regression analysis: both TPB factors (subjective norms and perceived behavioural control) were significant (P […] behaviour and future intentions) and perceived benefit (only for past behaviour) as significant factors (P […] behavioural control of daily replacement (behavioural control) are of paramount importance in improving compliance. With reference to the HBM, it is important to warn DDCLs wearers of the severity of a contact-lens-related eye infection, and to underline the possibility of its prevention. Copyright © 2016 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  13. [Efficacy of absorbance ratio of ELISA antibodies [corrected] for hepatitis C virus of 3rd generation in the prediction of viremia evaluated by PCR].

    Vázquez-Avila, Isidro; Vera-Peralta, Jorge Manuel; Alvarez-Nemegyei, José; Rodríguez-Carvajal, Otilia

    2007-01-01

    In order to decrease the burden of suffering and the costs derived from confirmatory molecular assays, a better strategy is badly needed to decrease the rate of false positive results of the enzyme-linked immunoassay (ELISA) for detection of hepatitis C virus (HCV) antibodies (Anti-HCV). To establish the best cutoff of the S/CO ratio in subjects with a positive result of a microparticle, third generation ELISA assay for Anti-HCV, for predicting viremia as detected by polymerase chain reaction (PCR) assay. Using the result of the PCR assay as "gold standard", a ROC curve was built with the S/CO ratio values in subjects with a positive result for the ELISA HCV assay. Fifty-two subjects (30 male, 22 female, 40 +/- 12.5 years old) were included. Thirty-four (65.3%) had a positive RNA HCV PCR assay. The area under the curve was 0.99 (95% CI: 0.98-1.0). The optimal cutoff for the S/CO ratio was established at 29: sensitivity: 97%; specificity: 100%; PPV: 100%; NPV: 94%. Setting the cutoff of the S/CO at 29 results in a high predictive value for viremia as detected by PCR in subjects with a positive ELISA HCV assay. This knowledge may result in better decision making for the clinical follow-up of those subjects with a positive result in the ELISA screening assay for HCV infection.
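
    The cutoff-selection step can be sketched as follows: build a ROC curve from S/CO scores and pick the threshold maximizing Youden's J = sensitivity + specificity - 1. The data below are synthetic stand-ins, not the study's measurements, and Youden's J is one common criterion, not necessarily the one the authors used:

      import numpy as np
      from sklearn.metrics import roc_curve

      # y: PCR result (1 = viremia), score: ELISA S/CO ratio (synthetic stand-ins)
      rng = np.random.default_rng(3)
      y = np.r_[np.ones(34), np.zeros(18)].astype(int)
      score = np.r_[rng.normal(60, 15, 34), rng.normal(10, 8, 18)]

      fpr, tpr, thresholds = roc_curve(y, score)
      best = np.argmax(tpr - fpr)                     # Youden's J statistic
      print(f"optimal S/CO cutoff ~ {thresholds[best]:.1f}")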

  14. Low RMRratio as a surrogate marker for energy deficiency, the choice of predictive equation vital for correctly identifying male and female ballet dancers at risk

    Staal, Sarah; Sjödin, Anders Mikael; Fahrenholtz, Ida Lysdahl

    2018-01-01

    Ballet dancers are reported to have an increased risk for energy deficiency with or without disordered eating (DE) behavior. A low ratio between measured (m) and predicted (p) resting metabolic rate (RMRratio) […]. We aimed to evaluate the prevalence of suppressed RMR using different methods to calculate pRMR and to explore associations with additional markers of energy deficiency. Female (n=20) and male (n=20) professional ballet dancers, 19-35 years of age, were enrolled. mRMR was assessed by respiratory calorimetry (ventilated open hood). p[…]% hypotension. Forty percent of females had elevated LEAF-Q score, and 50% were underweight. Suppressed RMR was associated with elevated LEAF-Q score in females and with higher training volume in males. In conclusion, professional ballet dancers are at risk for energy deficiency. The number of identified

  15. Low RMRratio as a Surrogate Marker for Energy Deficiency, the Choice of Predictive Equation Vital for Correctly Identifying Male and Female Ballet Dancers at Risk.

    Staal, Sarah; Sjödin, Anders; Fahrenholtz, Ida; Bonnesen, Karen; Melin, Anna Katarina

    2018-06-22

    Ballet dancers are reported to have an increased risk for energy deficiency with or without disordered eating behavior. A low ratio between measured (m) and predicted (p) resting metabolic rate (RMR ratio) serves as a surrogate marker for energy deficiency. We aimed to evaluate the prevalence of suppressed RMR using different methods to calculate pRMR and to explore associations with additional markers of energy deficiency. Female (n = 20) and male (n = 20) professional ballet dancers, 19-35 years of age, were enrolled. mRMR was assessed by respiratory calorimetry (ventilated open hood). pRMR was determined using the Cunningham and Harris-Benedict equations, and different tissue compartments derived from whole-body dual-energy X-ray absorptiometry assessment. The protocol further included assessment of body composition and bone mineral density, blood pressure, disordered eating (Eating Disorder Inventory-3), and, for females, the Low Energy Availability in Females Questionnaire. The prevalence of suppressed RMR was generally high but also clearly dependent on the method used to calculate pRMR, ranging from 25% to 80% in males and 35% to 100% in females. Five percent had low bone mineral density, whereas 10% had disordered eating and 25% had hypotension. Forty percent of females had an elevated Low Energy Availability in Females Questionnaire score and 50% were underweight. Suppressed RMR was associated with an elevated Low Energy Availability in Females Questionnaire score in females and with higher training volume in males. In conclusion, professional ballet dancers are at risk for energy deficiency. The number of identified dancers at risk varies greatly depending on the method used to predict RMR when using the RMR ratio as a marker for energy deficiency.
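
    A sketch of the RMR-ratio computation with the two named prediction equations, using their standard published forms; the example dancer's values and the ~0.90 suppression cutoff are assumptions for illustration, and the two equations can disagree on who is flagged, which is the abstract's point:

      def p_rmr_cunningham(ffm_kg):
          # Cunningham (1980): RMR [kcal/day] from fat-free mass
          return 500.0 + 22.0 * ffm_kg

      def p_rmr_harris_benedict(weight_kg, height_cm, age_yr, sex):
          # Harris-Benedict (1919) equations
          if sex == "male":
              return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
          return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

      def rmr_ratio(m_rmr, p_rmr):
          # values below ~0.90 are often read as suppressed RMR (assumed cutoff)
          return m_rmr / p_rmr

      # Hypothetical female dancer: measured 1250 kcal/day, 45 kg FFM, 52 kg, 168 cm, 24 y
      print(round(rmr_ratio(1250, p_rmr_cunningham(45)), 2))                        # ~0.84
      print(round(rmr_ratio(1250, p_rmr_harris_benedict(52, 168, 24, "female")), 2))  # ~0.93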

  16. General relativity basics and beyond

    Date, Ghanashyam

    2015-01-01

    A Broad Perspective on the Theory of General Relativity and Its Observable Implications General Relativity: Basics and Beyond familiarizes students and beginning researchers with the basic features of the theory of general relativity as well as some of its more advanced aspects. Employing the pedagogical style of a textbook, it includes essential ideas and just enough background material needed for readers to appreciate the issues and current research. Basics The first five chapters form the core of an introductory course on general relativity. The author traces Einstein’s arguments and presents examples of space-times corresponding to different types of gravitational fields. He discusses the adaptation of dynamics in a Riemannian geometry framework, the Einstein equation and its elementary properties, and different phenomena predicted or influenced by general relativity. Beyond Moving on to more sophisticated features of general relativity, the book presents the physical requirements of a well-defined de...

  17. Advanced hardware design for error correcting codes

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs advanced error correcting techniques.

  18. Basic Research Firing Facility

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  19. Basic Cake Decorating Workbook.

    Bogdany, Mel

    Included in this student workbook for basic cake decorating are the following: (1) Drawings of steps in a basic way to ice a layer cake, how to make a paper cone, various sizes of flower nails, various sizes and types of tin pastry tubes, and special rose tubes; (2) recipes for basic decorating icings (buttercream, rose paste, and royal icing);…

  20. Class action litigation in correctional psychiatry.

    Metzner, Jeffrey L

    2002-01-01

    Class action litigation has been instrumental in jail and prison reform during the past two decades. Correctional mental health systems have significantly benefited from such litigation. Forensic psychiatrists have been crucial in the litigation process and the subsequent evolution of correctional mental health care systems. This article summarizes information concerning basic demographics of correctional populations and costs of correctional health care and provides a brief history of such litigation. The role of psychiatric experts, with particular reference to standards of care, is described. Specifically discussed are issues relevant to suicide prevention, the prevalence of mentally ill inmates in supermax prisons, and discharge planning.

  1. Corrections to the free-nucleon values of the single-particle matrix elements of the M1 and Gamow-Teller operators, from a comparison of shell-model predictions with sd-shell data

    Brown, B.A.; Wildenthal, B.H.

    1983-01-01

    The magnetic dipole moments of states in mirror pairs of the sd-shell nuclei and the strengths of the Gamow-Teller beta decays which connect them are compared with predictions based on mixed-configuration shell-model wave functions. From this analysis we extract the average effective values of the single-particle matrix elements of the l, s, and [Y(2)×s](1) components of the M1 and Gamow-Teller operators acting on nucleons in the 0d5/2, 1s1/2, and 0d3/2 orbits. These results are compared with the recent calculations by Towner and Khanna of the corrections to the free-nucleon values of these matrix elements which arise from the effects of isobar currents, mesonic-exchange currents, and mixing with configurations outside the sd shell.

  2. Can adaptive threshold-based metabolic tumor volume (MTV) and lean body mass corrected standard uptake value (SUL) predict prognosis in head and neck cancer patients treated with definitive radiotherapy/chemoradiotherapy?

    Akagunduz, Ozlem Ozkaya; Savas, Recep; Yalman, Deniz; Kocacelebi, Kenan; Esassolak, Mustafa

    2015-11-01

    To evaluate the predictive value of adaptive threshold-based metabolic tumor volume (MTV), maximum standardized uptake value (SUVmax) and maximum lean body mass corrected SUV (SULmax) measured on pretreatment positron emission tomography and computed tomography (PET/CT) imaging in head and neck cancer patients treated with definitive radiotherapy/chemoradiotherapy. Pretreatment PET/CT scans of the 62 patients with locally advanced head and neck cancer who were treated consecutively between May 2010 and February 2013 were reviewed retrospectively. The maximum FDG uptake of the primary tumor was defined according to SUVmax and SULmax. Multiple threshold levels between 60% and 10% of the SUVmax and SULmax were tested with intervals of 5% to 10% in order to define the most suitable threshold value for the metabolic activity of each patient's tumor (adaptive threshold). MTV was calculated according to this value. We evaluated the relationship of mean values of MTV, SUVmax and SULmax with treatment response, local recurrence, distant metastasis and disease-related death. Receiver-operating characteristic (ROC) curve analysis was done to obtain optimal predictive cut-off values for MTV and SULmax, which were found to have predictive value. Local recurrence-free (LRFS), disease-free (DFS) and overall survival (OS) were examined according to these cut-offs. Forty-six patients had complete response, 15 had partial response, and 1 had stable disease 6 weeks after the completion of treatment. Median follow-up of the entire cohort was 18 months. Of 46 complete responders, 10 had local recurrence, and of 16 partial or no responders, 10 had local progression. Eighteen patients died. Adaptive threshold-based MTV had significant predictive value for treatment response (p=0.011), local recurrence/progression (p=0.050), and disease-related death (p=0.024). SULmax had predictive value for local recurrence/progression (p=0.030). ROC curve analysis revealed a cut-off value of 14.00 mL for
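
    SUV and SUL differ only in the mass used for normalization. A sketch follows, assuming the commonly used James lean-body-mass formula (the record does not state which LBM formula was used) and the unit conventions given in the comments; the example values are hypothetical:

      def lbm_james(weight_kg, height_cm, sex):
          # James formula for lean body mass, a common choice for SUL (an assumption here)
          if sex == "male":
              return 1.10 * weight_kg - 128.0 * (weight_kg / height_cm) ** 2
          return 1.07 * weight_kg - 148.0 * (weight_kg / height_cm) ** 2

      def suv(conc_kbq_ml, dose_mbq, mass_kg):
          # SUV = tissue activity concentration / (injected dose / body mass),
          # assuming tissue density ~1 g/mL so kBq/mL ~ kBq/g
          return conc_kbq_ml / ((dose_mbq * 1000.0) / (mass_kg * 1000.0))

      def sul(conc_kbq_ml, dose_mbq, weight_kg, height_cm, sex):
          # SUL: substitute lean body mass for total body weight
          return suv(conc_kbq_ml, dose_mbq, lbm_james(weight_kg, height_cm, sex))

      # Hypothetical example: 5 kBq/mL lesion, 300 MBq injected, 90 kg male, 175 cm
      print(round(suv(5.0, 300.0, 90.0), 2), round(sul(5.0, 300.0, 90.0, 175.0, "male"), 2))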

  3. Predictive factors for neuromotor abnormalities at the corrected age of 12 months in very low birth weight premature infants Fatores preditivos para anormalidades neuromotoras aos 12 meses de idade corrigida em prematuros de muito baixo peso

    Rosane Reis de Mello

    2009-06-01

    Full Text Available BACKGROUND: The increase in survival of premature newborns has sparked growing interest in the prediction of their long-term neurodevelopment. OBJECTIVE: To estimate the incidence of neuromotor abnormalities at the corrected age of 12 months and to identify the predictive factors associated with altered neuromotor development in very low birth weight premature infants. METHOD: Cohort study. The sample included 100 premature infants. The outcome was neuromotor development at 12 months, classified by the Bayley Scale (PDI) and neurological assessment (tonus, reflexes, posture). A multivariate logistic regression model was constructed. Neonatal variables and neuromotor abnormalities up to 6 months of corrected age were selected by bivariate analysis. RESULTS: Mean birth weight was 1126 g (SD: 240). Abnormal neuromotor development was present in 60 children at 12 months corrected age. CONCLUSION: According to the model, patients with a diagnosis including bronchopulmonary dysplasia, hypertonia of the lower extremities, and truncal hypotonia had a 94.0% probability of neuromotor involvement at 12 months.

  4. Updating the Skating Multistage Aerobic Test and Correction for V̇O2max Prediction Using a New Skating Economy Index in Elite Youth Ice Hockey Players.

    Allisse, Maxime; Bui, Hung Tien; Léger, Luc; Comtois, Alain-Steve; Leone, Mario

    2018-05-07

    Allisse, M, Bui, HT, Léger, L, Comtois, A-S, and Leone, M. Updating the skating multistage aerobic test and correction for V̇O2max prediction using a new skating economy index in elite youth ice hockey players. J Strength Cond Res XX(X): 000-000, 2018. A number of field tests, including the skating multistage aerobic test (SMAT), have been developed to predict V̇O2max in ice hockey players. The SMAT, like most field tests, assumes that participants who reach a given stage have the same oxygen uptake, which is not usually true. Thus, the objectives of this research are to update the V̇O2 values during the SMAT using a portable breath-by-breath metabolic analyzer and to propose a simple index of skating economy to improve the prediction of oxygen uptake. Twenty-six elite hockey players (age 15.8 ± 1.3 years) participated in this study. The oxygen uptake was assessed using a portable metabolic analyzer (K4b²) during an on-ice maximal shuttle skate test. To develop an index of skating economy called the skating stride index (SSI), the number of skating strides was compiled for each stage of the test. The SMAT enabled the prediction of the V̇O2max (ml·kg⁻¹·min⁻¹) from the maximal velocity (m·s⁻¹) and the SSI (skating strides·kg⁻¹) using the following regression equation: V̇O2max = (14.94 × maximal velocity) + (3.68 × SSI) - 24.98 (r = 0.95, SEE = 1.92). This research allowed for the update of the oxygen uptake values of the SMAT and proposed a simple measure of skating efficiency for a more accurate evaluation of V̇O2max in elite youth hockey players. By comparing the highest and lowest observed SSI scores in our sample, it was noted that the V̇O2 values can vary by up to 5 ml·kg⁻¹·min⁻¹. Our results suggest that skating economy should be included in the prediction of V̇O2max to improve prediction accuracy.
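
    The reported regression translates directly into code; only the example inputs below are hypothetical:

      def smat_vo2max(max_velocity, ssi):
          """Predicted VO2max (ml/kg/min) from the regression reported above:
          maximal velocity in m/s, skating stride index in strides/kg
          (r = 0.95, SEE = 1.92)."""
          return 14.94 * max_velocity + 3.68 * ssi - 24.98

      # Hypothetical player: top SMAT velocity 5.0 m/s, SSI of 4.0 strides/kg
      print(f"{smat_vo2max(5.0, 4.0):.1f} ml/kg/min")   # ~64.4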

  5. Prediction of Curve Correction Using Alternate Level Pedicle Screw Placement in Patients With Adolescent Idiopathic Scoliosis (AIS) Lenke 1 and 2 Using Supine Side Bending (SB) and Fulcrum Bending (FB) Radiograph.

    Kwan, Mun Keong; Zeyada, Hassan E; Chan, Chris Yin Wei

    2015-10-15

    Prospective cohort study. To compare side bending (SB) and fulcrum bending (FB) radiographs in patients with adolescent idiopathic scoliosis (AIS) and the effect of curve magnitude and AR curves on curve correctability. The prediction of correction using side bending flexibility (SBF) and fulcrum bending flexibility (FBF) in alternate level pedicle screw (PS) configuration, and the effect of curve magnitude and AR curves, are not well understood. 100 AIS Lenke 1 and 2 patients were recruited. Curve magnitude was stratified into G1 (41°-60°), G2 (61°-80°), G3 (>80°). The main thoracic (MT) curves were subclassified into AR curves [Miyanji F, Pawelek JB, Van Valin SE, et al. Is the lumbar modifier useful in surgical decision making? Defining two distinct Lenke 1A curve patterns. Spine 2008;33:2545-51]. Preoperatively SBF and FBF were determined, whereas postoperative parameters were correction rate (CR), fulcrum bending correction index (FBCI), and side bending correction index (SBCI). Correlation tests were carried out between SBF and FBF versus CR for the cohort. There were 38 (G1), 42 (G2), and 20 (G3) patients. 34% were AR curves. SBF for G1, G2, and G3 was 61.3 ± 14.4, 59.2 ± 16.2 and 43.1 ± 13.1% (P = 0.000), whereas FBF for G1, G2, and G3 was 71.1 ± 16.5, 58.3 ± 18.1 and 52.7 ± 17.1% (P = 0.000). The CR was G1 (74.5 ± 11.5%), G2 (69.2 ± 12.7%), and G3 (70.2 ± 8.6%). FBCI was 1.11 ± 0.3 (G1), 1.28 ± 0.4 (G2), and 1.48 ± 0.6 (G3). SBCI was 1.26 ± 0.2 (G1), 1.50 ± 0.5 (G2), and 1.72 ± 0.4 (G3). There was a strong correlation between SBF and FBF versus CR for G1 and G2. For G3, a very strong correlation was established between SBF versus CR (r = 0.846, r = 0.716) and FBF versus CR (r = 0.700, r = 0.540). AR curves demonstrated higher SBF and FBF. CR remained almost constant across G1, G2, and G3, while SBCI and FBCI increased significantly from G1 to G3. Correlation between SBF and FBF and CR was strong for G1 and G2, and very strong for G3. AR curves showed better correctability with SB and FB films.
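
    The flexibility and correction-index quantities above follow standard definitions in the scoliosis literature (correction index = correction rate / bending-film flexibility); a minimal sketch with hypothetical Cobb angles:

      def flexibility_pct(pre_cobb, bending_cobb):
          # SBF or FBF: % reduction of the Cobb angle on the bending film
          return (pre_cobb - bending_cobb) / pre_cobb * 100.0

      def correction_rate_pct(pre_cobb, post_cobb):
          # CR: % reduction of the Cobb angle achieved by surgery
          return (pre_cobb - post_cobb) / pre_cobb * 100.0

      def correction_index(cr_pct, flexibility):
          # FBCI or SBCI: achieved correction relative to the bending-film prediction
          return cr_pct / flexibility

      # Hypothetical G1 curve: 55° preop, 23° on fulcrum bending, 14° postop
      fbf = flexibility_pct(55, 23)                # ~58%
      cr = correction_rate_pct(55, 14)             # ~75%
      print(round(correction_index(cr, fbf), 2))   # ~1.28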

  6. Corrective Jaw Surgery

    Full Text Available Orthognathic surgery is performed to correct the misalignment of jaws.

  7. Basic digital signal processing

    Lockhart, Gordon B

    1985-01-01

    Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC program, continuous and discrete time signals including analog signals, Fourier analysis, the discrete Fourier transform, signal energy, and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete time unit impulse, discrete-time convolution, and the alternative structure for second order infinite impulse response (IIR) sections.

  8. Hydromechanics - basic properties

    Lee, Sung Tak; Lee, Je Geun

    1987-03-01

    This book covers the basic properties of hydromechanics: concepts and definitions, mass, force and weight, and perfect fluids and perfect gases; hydrostatics, including the basic equation of hydrostatics and the relative equilibrium of fluids; and the kinematics of hydromechanics, including methods of describing flow. It also treats the basic applications: the momentum equation, the energy equation and applications of the Bernoulli equation, applications of momentum theory, inviscid flow, and fluid measurement.

  9. Basic molecular spectroscopy

    Gorry, PA

    1985-01-01

    BASIC Molecular Spectroscopy discusses the utilization of the Beginner's All-purpose Symbolic Instruction Code (BASIC) programming language in molecular spectroscopy. The book is comprised of five chapters that provide an introduction to molecular spectroscopy through programs written in BASIC. The coverage of the text includes rotational spectra, vibrational spectra, and Raman and electronic spectra. The book will be of great use to students who are currently taking a course in molecular spectroscopy.

  10. Surgical correction of postoperative astigmatism

    Lindstrom Richard

    1990-01-01

    Full Text Available The photokeratoscope has increased the understanding of the aspheric nature of the cornea as well as a better understanding of normal corneal topography. This has significantly affected the development of newer and more predictable models of surgical astigmatic correction. Relaxing incisions effectively flatten the steeper meridian an equivalent amount as they steepen the flatter meridian. The net change in spherical equivalent is, therefore, negligible. Poor predictability is the major limitation of relaxing incisions. Wedge resection can correct large degrees of postkeratoplasty astigmatism, Resection of 0.10 mm of tissue results in approximately 2 diopters of astigmatic correction. Prolonged postoperative rehabilitation and induced irregular astigmatism are limitations of the procedure. Transverse incisions flatten the steeper meridian an equivalent amount as they steepen the flatter meridian. Semiradial incisions result in two times the amount of flattening in the meridian of the incision compared to the meridian 90 degrees away. Combination of transverse incisions with semiradial incisions describes the trapezoidal astigmatic keratotomy. This procedure may correct from 5.5 to 11.0 diopters dependent upon the age of the patient. The use of the surgical keratometer is helpful in assessing a proper endpoint during surgical correction of astigmatism.

  11. Strategies for Determining Correct Cytochrome P450 Contributions in Hepatic Clearance Predictions: In Vitro-In Vivo Extrapolation as Modelling Approach and Tramadol as Proof-of Concept Compound.

    T'jollyn, Huybrecht; Snoeys, Jan; Van Bocxlaer, Jan; De Bock, Lies; Annaert, Pieter; Van Peer, Achiel; Allegaert, Karel; Mannens, Geert; Vermeulen, An; Boussery, Koen

    2017-06-01

    Although the measurement of cytochrome P450 (CYP) contributions in metabolism assays is straightforward, determination of the actual in vivo contributions can be challenging: how representative are in vitro CYP contributions of those in vivo? This article proposes an improved strategy for the determination of in vivo CYP enzyme-specific metabolic contributions, based on in vitro data, using an in vitro-in vivo extrapolation (IVIVE) approach. The approach is exemplified using tramadol as model compound and CYP2D6 and CYP3A4 as the enzymes involved. Metabolism data for tramadol and for the probe substrates midazolam (CYP3A4) and dextromethorphan (CYP2D6) were gathered in human liver microsomes (HLM) and recombinant human enzyme systems (rhCYP). From these probe substrates, an activity-adjustment factor (AAF) was calculated per CYP enzyme, for the determination of correct hepatic clearance contributions. As a reference, tramadol CYP contributions were scaled back from in vivo data (retrograde approach) and compared with the ones derived in vitro. In this view, the AAF is an enzyme-specific factor, calculated from reference probe activity measurements in vitro and in vivo, that allows appropriate scaling of a test drug's in vitro activity to the 'healthy volunteer' population level. Calculation of an AAF thus accounts for any 'experimental' or 'batch-specific' activity difference between in vitro HLM and in vivo derived activity. In this specific HLM batch, an AAF of 0.91 was calculated for CYP3A4 and 1.97 for CYP2D6. This implies that, in this batch, the in vitro CYP3A4 activity is 1.10-fold higher and the CYP2D6 activity 1.97-fold lower compared to in vivo derived CYP activities. This study shows that, in cases where the HLM pool does not represent the typical mean population CYP activities, AAF correction of in vitro metabolism data optimizes CYP contributions in the prediction of hepatic clearance. Therefore, in vitro parameters for any test compound
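
    The AAF logic can be sketched in a few lines. The intrinsic-clearance split below is hypothetical, while the AAF values are the ones reported above:

      def activity_adjustment_factor(probe_act_in_vivo, probe_act_in_vitro):
          # AAF = in vivo probe activity / in vitro probe activity for one CYP;
          # an AAF of 0.91 means this batch's in vitro activity is 1/0.91 = 1.10-fold too high
          return probe_act_in_vivo / probe_act_in_vitro

      def corrected_contributions(clint_in_vitro, aaf):
          # Scale each CYP's in vitro intrinsic clearance by its AAF, then renormalize
          adjusted = {cyp: cl * aaf[cyp] for cyp, cl in clint_in_vitro.items()}
          total = sum(adjusted.values())
          return {cyp: cl / total for cyp, cl in adjusted.items()}

      # Hypothetical in vitro CLint split (10 vs 5 uL/min/mg) with the reported AAFs
      print(corrected_contributions({"CYP3A4": 10.0, "CYP2D6": 5.0},
                                    {"CYP3A4": 0.91, "CYP2D6": 1.97}))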

  12. Finding Basic Writing's Place.

    Sheridan-Rabideau, Mary P.; Brossell, Gordon

    1995-01-01

    Posits that basic writing serves a vital function by providing writing support for at-risk students and serves the needs of a growing student population that universities accept yet feel needs additional writing instruction. Concludes that the basic writing classroom is the most effective educational support for at-risk students and their writing.…

  13. Biomass Energy Basics | NREL

    Biomass Energy Basics. We have used biomass energy, or "bioenergy," for thousands of years, ever since people started burning wood to cook food or to keep warm. Wood is still the largest biomass energy resource today, but other sources of biomass can also be used, including plants, residues from agriculture or forestry, and the organic component of municipal and industrial wastes. Even the fumes from landfills (which are methane, the main component in natural gas) can be used as a biomass energy source.

  14. Wind Energy Basics | NREL

    Wind Energy Basics. We have been harnessing the wind's energy for hundreds of years, with windmills used for tasks such as pumping water or grinding grain. Today, the windmill's modern equivalent, a wind turbine, can use the wind's energy to generate electricity. Wind turbines are mounted on a tower to capture the most energy. At 100 feet (30 meters) or more aboveground, they can take advantage of the faster and less turbulent wind.

  15. Solar Energy Basics | NREL

    Solar Energy Basics. Solar is the Latin word for sun, a powerful source of energy that can be used to heat, cool, and light our homes and businesses. That's because more energy from the sun falls on the earth in one hour than is used by everyone in the world in one year. A variety of technologies convert sunlight to usable energy for buildings. The most commonly used solar technologies for

  16. Learning Visual Basic NET

    Liberty, Jesse

    2009-01-01

    Learning Visual Basic .NET is a complete introduction to VB.NET and object-oriented programming. By using hundreds of examples, this book demonstrates how to develop various kinds of applications--including those that work with databases--and web services. Learning Visual Basic .NET will help you build a solid foundation in .NET.

  17. Health Insurance Basics

    What Exactly Is Health Insurance? Health insurance is a plan that people buy ...

  18. Body Basics Library

    Body Basics articles explain just how each body system, part, and process works. Use this medical library to find out about basic human anatomy, how ...

  19. From basic needs to basic rights.

    Facio, A

    1995-06-01

    After arriving at an understanding that basic rights refer to all human needs, it is clear that a recognition of the basic needs of female humans must precede the realization of their rights. The old Women in Development (WID) framework only understood women's needs from an androcentric perspective which was limited to practical interests. Instead, women's primary need is to be free from their subordination to men. Such an understanding places all of women's immediate needs in a new light. A human rights approach to development would see women not as beneficiaries but as people entitled to enjoy the benefits of development. Discussion of what equality before the law should mean to women began at the Third World Conference on Women in Nairobi where the issue of violence against women was first linked to development. While debate continues about the distinction between civil and political rights and economic, social, and cultural rights, the realities of women's lives do not permit such a distinction. The concept of the universality of human rights did not become codified until the UN proclaimed the Universal Declaration of Human Rights in 1948. The declaration has been criticized by feminists because the view of human rights it embodies has been too strongly influenced by a liberal Western philosophy which stresses individual rights and because it is ambiguous on the distinction between human rights and the rights of a citizen. The protection of rights afforded by the Declaration, however, should not be viewed as a final achievement but as an ongoing struggle. International conferences have led to an analysis of the human-rights approach to sustainable development which concludes that women continue to face the routine denial of their rights. Each human right must be redefined from the perspective of women's needs, which must also be redefined. Women must forego challenging the concept of the universality of human rights in order to overcome the argument of cultural

  20. Basic rocks in Finland

    Piirainen, T.; Gehoer, S.; Iljina, M.; Kaerki, A.; Paakkola, J.; Vuollo, J.

    1992-10-01

    Basic igneous rocks, containing less than 52% SiO2, constitute an important part of the Finnish Archaean and Proterozoic crust. In the Archaean crust there exist two units which contain the majority of the basic rocks. The Archaean basic rocks are metavolcanics situated in the Greenstone Belts of Eastern Finland. They are divided into two units. The greenstones of the lower one are tholeiites, komatiites and basaltic komatiites. The upper consists of a bimodal series of volcanics, the basic rocks of which are Fe-tholeiites, basaltic komatiites and komatiites. Proterozoic basic rocks are divided into seven groups according to their ages. The Proterozoic igneous activity started with the voluminous basic magmatism 2.44 Ga ago. During this stage formed the layered intrusions and related dykes of Northern Finland. The 2.2 Ga old basic rocks are situated at the margins of the Karelian formations. The 2.1 Ga Fe-tholeiitic magmatic activity is widespread in Eastern and Northern Finland. The basic rocks of the 1.97 Ga age group occur within the Karelian Schist Belts as obducted ophiolite complexes, but they also occur as tholeiitic diabase dykes cutting the Karelian schists and the Archaean basement. The intrusions and volcanics of the 1.9 Ga basic igneous activity are mostly encountered around the Granitoid Complex of Central Finland. Subjotnian, 1.6 Ga tholeiitic diabases are situated around the Rapakivi massifs of Southern Finland, and Postjotnian, 1.2 Ga diabases in Western Finland, where they form dykes cutting Svecofennian rocks.

  1. Quantum electronics basic theory

    Fain, V M; Sanders, J H

    1969-01-01

    Quantum Electronics, Volume 1: Basic Theory is a condensed and generalized account of the extensive research and rapid progress in this field. It is translated from the Russian. The volume describes the basic theory of quantum electronics and shows how the concepts and equations used in quantum electronics arise from the basic principles of theoretical physics. The book then briefly discusses the interaction of an electromagnetic field with matter. The text also covers the quantum theory of relaxation processes, in which a quantum system approaches an equilibrium state, and explains…

  2. Basic stress analysis

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics covered…

  3. [Spirometry - basic examination of the lung function].

    Kociánová, Jana

    Spirometry is one of the basic internal examination methods, similar to, for example, blood pressure measurement or ECG recording. It is used to detect or assess the extent of ventilatory disorders. Indications include respiratory symptoms, laboratory anomalies, smoking, inhalation risks and more. Its performance and evaluation should be among the basic skills of pulmonologists, internists, allergologists, pediatricians and sports physicians. The results essentially influence correct diagnosis and choice of treatment. Spirometry must therefore be performed under standardized conditions and assessed accurately and clearly, so that clinical questions can be answered. Key words: acceptability - calibration - contraindication - evaluation - indication - parameters - spirometry - standardization.

  4. A higher twist correction to heavy quark production

    Brodsky, S.J.; Gunion, J.F.; Soper, D.E.

    1987-06-01

    The leading twist prediction for heavy quark production is discussed, together with a model for a higher twist correction that may be important for charm production. The correction arises from the interaction of the charm quark with the spectator quarks.

  5. Basic Financial Accounting

    Wiborg, Karsten

    This textbook on Basic Financial Accounting is targeted at students of economics at universities and business colleges taking an introductory subject in the external dimension of the company's economic reporting, including bookkeeping, etc. The book includes the following subjects…

  6. HIV Treatment: The Basics

    HIV Treatment: The Basics. Last Reviewed: March 22, 2018.

  7. Basics of SCI Rehabilitation

    Full Text Available

  8. Powassan (POW) Virus Basics

    What is Powassan virus? Powassan virus is a tickborne flavivirus that is … A fact sheet formatted for download (Powassan Virus Disease Fact Sheet, PDF) is available.

  9. Brain Basics: Understanding Sleep

    Brain Basics: Understanding Sleep covers the anatomy of sleep and sleep stages. Without sleep you can't form or maintain the pathways in your brain that let you learn and create new memories …

  10. Basics of SCI Rehabilitation

    Full Text Available

  11. Basics of SCI Rehabilitation

    Full Text Available

  12. Basics of SCI Rehabilitation

    Full Text Available

  13. Basics of SCI Rehabilitation

    Full Text Available

  14. Basics of SCI Rehabilitation

    Full Text Available

  15. Physical Activity Basics

    How much physical activity do you need? Regular physical activity helps improve …

  16. Radionuclide Basics: Iodine

    Iodine (chemical symbol I) is a chemical element. All 37 isotopes of iodine … Topics covered: iodine in the environment, iodine sources, iodine and health.

  17. Basic Finite Element Method

    Lee, Byeong Hae

    1992-02-01

    This book describes the basic finite element method, covering data and the 'black box' concept, writing of data, the definition of a vector and of a matrix, matrix multiplication, matrix addition, and the unit matrix; the concept of the stiffness matrix in terms of spring force and displacement; the governing equation of an elastic body; the finite element method itself; and Fortran methods and programming, such as the organization of a computer, the order of programming, data cards and Fortran cards, a finite element program, and its application to a nonelastic problem.
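
    The stiffness-matrix idea summarized above translates directly into a few lines of code. Below is a minimal sketch, in Python rather than the book's Fortran, of assembling and solving a two-spring system; all names and values are illustrative.

        import numpy as np

        # Two springs in series: node0 --k1-- node1 --k2-- node2
        k1, k2 = 100.0, 200.0           # spring stiffnesses [N/m]
        K = np.zeros((3, 3))            # global stiffness matrix

        # Assemble each element's 2x2 stiffness block into K
        for (i, j), k in {(0, 1): k1, (1, 2): k2}.items():
            K[np.ix_([i, j], [i, j])] += k * np.array([[1, -1], [-1, 1]])

        F = np.array([0.0, 0.0, 10.0])  # 10 N load on the free end (node 2)
        u = np.zeros(3)                 # boundary condition: u[0] = 0
        u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
        print(u)                        # node 2 moves 10/100 + 10/200 = 0.15 m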

  18. Development NGOs: Basic Facts

    Aldashev, Gani; Navarra, Cecilia

    2017-01-01

    This paper systematizes the results of the empirical literature on development non-governmental organizations (NGOs), drawing both from quantitative and qualitative analyses, and constructs a set of basic facts about these organizations. These basic facts concern the size of the development NGO sector and its evolution, the funding of NGOs, the allocation of NGO aid and projects across beneficiary countries, the relationship of NGOs with beneficiaries, and the phenomenon of globalization of d...

  19. Corrections Education. Washington's Community and Technical Colleges

    Washington State Board for Community and Technical Colleges, 2015

    2015-01-01

    The Washington State Department of Corrections contracts with community colleges to provide basic education and job training at each of the state's 12 adult prisons so that, upon release, individuals are more likely to get jobs and less likely to return. Washington State community colleges build a bridge for offenders to successfully re-enter…

  20. Readers in Adult Basic Education.

    Barnes, Adrienne E; Kim, Young-Suk; Tighe, Elizabeth L; Vorstius, Christian

    The present study explored the reading skills of a sample of 48 adults enrolled in a basic education program in northern Florida, United States. Previous research has reported on reading component skills for students in adult education settings, but little is known about eye movement patterns or their relation to reading skills for this population. In this study, reading component skills including decoding, language comprehension, and reading fluency are reported, as are eye movement variables for connected-text oral reading. Eye movement comparisons between individuals with higher and lower oral reading fluency revealed within- and between-subject effects for word frequency and word length as well as group and word frequency interactions. Bivariate correlations indicated strong relations between component skills of reading, eye movement measures, and both the Test of Adult Basic Education (Reading subtest) and the Woodcock-Johnson III Diagnostic Reading Battery Passage Comprehension assessments. Regression analyses revealed the utility of decoding, language comprehension, and lexical activation time for predicting achievement on both the Woodcock-Johnson III Passage Comprehension and the Test of Adult Basic Education Reading Comprehension.

  1. Power corrections to exclusive processes in QCD

    Mankiewicz, Lech

    2002-02-01

    In practice, the applicability of the twist expansion crucially depends on the magnitude of power corrections to the leading-twist amplitude. I illustrate this point by considering explicit examples of two hard exclusive processes in QCD. In the case of the γ*γ → ππ amplitude, power corrections are small enough that it should be possible to describe current experimental data by the leading-twist QCD prediction. The photon helicity-flip amplitude in DVCS on a nucleon receives large kinematical power corrections which screen the leading-twist prediction up to large values of the hard photon virtuality.

  2. NWS Corrections to Observations

    National Oceanic and Atmospheric Administration, Department of Commerce — Form B-14 is the National Weather Service form entitled 'Notice of Corrections to Weather Records.' The forms are used to make corrections to observations on forms...

  3. Corrective Jaw Surgery

    Full Text Available … more surgeries depending on the extent of the repair needed …

  4. Corrective Jaw Surgery

    Full Text Available … Corrective jaw, or orthognathic, surgery is performed by … Correction of Common Dentofacial Deformities: the information provided here is not intended as a substitute …

  5. Earthquake prediction

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  6. "The next Big Five Inventory (BFI-2): Developing and assessing a hierarchical model with 15 facets to enhance bandwidth, fidelity, and predictive power": Correction to Soto and John (2016).

    2017-07-01

    Reports an error in "The Next Big Five Inventory (BFI-2): Developing and Assessing a Hierarchical Model With 15 Facets to Enhance Bandwidth, Fidelity, and Predictive Power" by Christopher J. Soto and Oliver P. John (Journal of Personality and Social Psychology, Advance Online Publication, Apr 7, 2016, np). In the article, all citations to McCrae and Costa (2008), except for the instance in which it appears in the first paragraph of the introduction, should instead appear as McCrae and Costa (2010). The complete citation should read as follows: McCrae, R. R., & Costa, P. T. (2010). NEO Inventories professional manual. Lutz, FL: Psychological Assessment Resources. The attribution to the BFI-2 items that appears in the Table 6 note should read as follows: BFI-2 items adapted from "Conceptualization, Development, and Initial Validation of the Big Five Inventory-2," by C. J. Soto and O. P. John, 2015, Paper presented at the biennial meeting of the Association for Research in Personality. Copyright 2015 by Oliver P. John and Christopher J. Soto. The complete citation in the References list should appear as follows: Soto, C. J., & John, O. P. (2015, June). Conceptualization, development, and initial validation of the Big Five Inventory-2. Paper presented at the biennial meeting of the Association for Research in Personality, St. Louis, MO. Available from http://www.colby.edu/psych/personality-lab/ All versions of this article have been corrected. (The following abstract of the original article appeared in record 2016-17156-001.) Three studies were conducted to develop and validate the Big Five Inventory-2 (BFI-2), a major revision of the Big Five Inventory (BFI). Study 1 specified a hierarchical model of personality structure with 15 facet traits nested within the Big Five domains, and developed a preliminary item pool to measure this structure. Study 2 used conceptual and empirical criteria to construct the BFI-2 domain and facet scales from the preliminary item pool…

  7. Basic Electromagnetism and Materials

    Moliton, André

    2007-01-01

    Basic Electromagnetism and Materials is the product of many years of teaching basic and applied electromagnetism. This textbook can be used to teach electromagnetism to a wide range of undergraduate science majors in physics, electrical engineering or materials science. However, by making lesser demands on mathematical knowledge than competing texts, and by emphasizing electromagnetic properties of materials and their applications, this textbook is uniquely suited to students of materials science. Many competing texts focus on the study of wave propagation in either the microwave or optical domain, whereas Basic Electromagnetism and Materials covers the entire electromagnetic domain and the physical response of materials to these waves. Professor André Moliton is Director of the Unité de Microélectronique, Optoélectronique et Polymères (Université de Limoges, France), which brings together three groups studying the optoelectronics of molecular and polymer layers, micro-optoelectronic systems for telecommunications…

  8. Basic properties of semiconductors

    Landsberg, PT

    2013-01-01

    Since Volume 1 was published in 1982, the centres of interest in the basic physics of semiconductors have shifted. Volume 1 was called Band Theory and Transport Properties in the first edition, but the subject has broadened to such an extent that Basic Properties is now a more suitable title. Seven chapters have been rewritten by the original authors. However, twelve chapters are essentially new, with the bulk of this work being devoted to important current topics which give this volume an almost encyclopaedic form. The first three chapters discuss various aspects of modern band theory and the …

  9. Basic set theory

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An …

  10. Comprehensive basic mathematics

    Veena, GR

    2005-01-01

    Salient features: As per the II PUC Basic Mathematics syllabus of Karnataka. Provides an introduction to various basic mathematical techniques and the situations where these could be usefully employed. The language is simple and the material is self-explanatory, with a large number of illustrations. Assists the reader in gaining proficiency to solve a diverse variety of problems. A special capsule containing a gist and list of formulae titled "REMEMBER!", plus an additional chapterwise-arranged question bank and 3 model papers in a separate section, "EXAMINATION CORNER".

  11. Ecology and basic laws

    Mayer-Tasch, P.C.

    1980-01-01

    The author sketches the critical relation between ecology and basic law - critical in more than one sense. He points out the incompatibility of the constitutional state and the atomic state, which is due to the constitutional order being jeopardised by nuclear policy. He traces the continuously rising awareness of pollution and the modern youth movement back to their common root, i.e. the awakening youth movement of the turn of the century. Eventually, he considers economic, political, and social decentralization as a feasible alternative which would considerably relieve our basic living conditions from the threatening forms of civilization now prevailing. (HSCH) [de]

  12. Experimental tests and theoretical predictions for electroweak processes

    Martinelli, G.; Istituto Nazionale di Fisica Nucleare, Frascati

    1987-01-01

    In sect. 2, I will briefly recall the basic ingredients of the standard model and I will define the relevant parameters. Low-energy processes which enter into the determination of neutral-current couplings to fermions (in particular sin²θ_W) are presented in sect. 3. Radiative corrections to these processes are discussed in sect. 4. In sect. 5 the measurements of the W and Z⁰ masses at the SPS collider are described and compared with theoretical predictions including one-loop radiative corrections. (orig./BBO)

  13. Precompound Reactions: Basic Concepts

    Weidenmueller, H. A.

    2008-01-01

    Because of the non-zero nuclear equilibration time, the compound-nucleus scattering model fails when the incident energy exceeds 10 or 20 MeV, and precompound reactions become important. Basic ideas used in the quantum-statistical approaches to these reactions are described

  14. Basic Tuberculosis Facts

    2012-03-12

    In this podcast, Dr. Kenneth Castro, Director of the Division of Tuberculosis Elimination, discusses basic TB prevention, testing, and treatment information.  Created: 3/12/2012 by National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP).   Date Released: 3/12/2012.

  15. Basic Exchange Rate Theories

    J.G.M. van Marrewijk (Charles)

    2005-01-01

    This four-chapter overview of basic exchange rate theories discusses (i) the elasticity and absorption approach, (ii) the (long-run) implications of the monetary approach, (iii) the short-run effects of monetary and fiscal policy under various economic conditions, and (iv) the transition…

  16. Basic SPSS tutorial

    Grotenhuis, H.F. te; Matthijssen, A.C.B.

    2015-01-01

    This supplementary book for the social, behavioral, and health sciences helps readers with no prior knowledge of IBM® SPSS® Statistics, statistics, or mathematics learn the basics of SPSS. Designed to reduce fear and build confidence, the book guides readers through point-and-click sequences using …

  17. Basic Skills Assessment

    Yin, Alexander C.; Volkwein, J. Fredericks

    2010-01-01

    After surveying 1,827 students in their final year at eighty randomly selected two-year and four-year public and private institutions, American Institutes for Research (2006) reported that approximately 30 percent of students in two-year institutions and nearly 20 percent of students in four-year institutions have only basic quantitative…

  18. Basic physics for all

    Kumar, B N

    2012-01-01

    This is a simple, concise book for both physics and non-physics students, presenting basic facts in straightforward form and conveying fundamental principles and theories of physics. This book will be helpful as a supplement to class teaching and as an aid to those who have difficulty in mastering concepts and principles.

  19. Basic pharmaceutical technology

    Angelovska, Bistra; Drakalska, Elena

    2017-01-01

    The lecture deals with the basics of pharmaceutical technology as an applied discipline of pharmaceutical science, whose main subject of study is the formulation and manufacture of drugs. In a broad sense, pharmaceutical technology is the science of formulation, preparation, stabilization and determination of the quality of medicines prepared in the pharmacy or in the pharmaceutical industry.

  20. Basic radiation oncology

    Beyzadeoglu, M. M.; Ebruli, C.

    2008-01-01

    Basic Radiation Oncology is an all-in-one book. It is an up-to-date, bedside-oriented book integrating radiation physics, radiobiology and clinical radiation oncology. It includes the essentials of all aspects of radiation oncology, with more than 300 practical illustrations in black-and-white and color. The layout and presentation are very practical and enriched with many pearl boxes. Key studies, particularly randomized ones, are also included at the end of each clinical chapter. Basic knowledge of all high-tech radiation teletherapy units, such as tomotherapy, CyberKnife, and proton therapy, is also given. The first 2 sections review concepts that are crucial in radiation physics and radiobiology. The remaining 11 chapters describe treatment regimens for the main cancer sites and tumor types. Basic Radiation Oncology will greatly help meet the need for a practical and bedside-oriented oncology book for residents, fellows, and clinicians of radiation, medical and surgical oncology, as well as medical students, physicians and medical physicists interested in clinical oncology. The English edition of the book Temel Radyasyon Onkolojisi is being published by Springer Heidelberg this year with the updated 2009 AJCC staging.

  1. Bottled Water Basics

    Table of Contents: Bottled water basics; Advice for people with severely compromised immune systems (sidebar); Know what you're buying; Taste considerations; Bottled water terms (sidebar); Begin by reading the …

  2. Monte Carlo: Basics

    Murthy, K. P. N.

    2001-01-01

    An introduction to the basics of Monte Carlo is given. The topics covered include: sample space, events, probabilities, random variables, mean, variance, covariance, characteristic function, Chebyshev inequality, law of large numbers, central limit theorem (stable distribution, Levy distribution), random numbers (generation and testing), random sampling techniques (inversion, rejection, sampling from a Gaussian, Metropolis sampling), analogue Monte Carlo and importance sampling (exponential b…
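
    Of the sampling techniques listed, rejection sampling is perhaps the quickest to demonstrate. A minimal Python sketch (illustrative, not taken from these notes): to sample from the density f(x) = 2x on [0, 1], draw from a uniform proposal and accept with probability f(x)/M, where M = 2 bounds f.

        import random

        def sample_2x():
            # Rejection sampling for f(x) = 2x on [0, 1]
            while True:
                x = random.random()                  # draw from the uniform proposal
                if random.random() < (2 * x) / 2.0:  # accept with probability f(x)/M
                    return x

        samples = [sample_2x() for _ in range(100_000)]
        print(sum(samples) / len(samples))           # ~2/3, the true mean of f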

  3. Ethanol Basics (Fact Sheet)

    2015-01-01

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  4. Basic Soils. Revision.

    Montana State Univ., Bozeman. Dept. of Agricultural and Industrial Education.

    This curriculum guide is designed for use in teaching a course in basic soils that is intended for college freshmen. Addressed in the individual lessons of the unit are the following topics: the way in which soil is formed, the physical properties of soil, the chemical properties of soil, the biotic properties of soil, plant-soil-water…

  5. Investigating Complexity Using Excel and Visual Basic.

    Zetie, K. P.

    2001-01-01

    Shows how some of the simple ideas in complexity can be investigated using a spreadsheet and a macro written in Visual Basic. Shows how the sandpile model of Bak, Tang, and Wiesenfeld can be simulated and animated. The model produces results that cannot easily be predicted from its properties. (Author/MM)
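
    The same model is easy to reproduce outside a spreadsheet. Here is a minimal Python analogue of the Bak-Tang-Wiesenfeld sandpile (the article itself uses Excel and Visual Basic; this sketch only mirrors the rules): grains are dropped at random sites, and any site reaching 4 grains topples, giving one grain to each neighbour, with grains lost at the open boundary.

        import numpy as np

        N = 20
        grid = np.zeros((N, N), dtype=int)
        rng = np.random.default_rng(0)

        for _ in range(5000):
            i, j = rng.integers(N, size=2)   # drop a grain at a random site
            grid[i, j] += 1
            while True:
                over = np.argwhere(grid >= 4)
                if len(over) == 0:
                    break
                for a, b in over:            # topple every unstable site
                    grid[a, b] -= 4
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if 0 <= na < N and 0 <= nb < N:
                            grid[na, nb] += 1

        print(grid.max())  # always below 4 once all avalanches have settled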

  6. Transportation Emissions: some basics

    Kontovas, Christos A.; Psaraftis, Harilaos N.

    2016-01-01

    Transportation is the backbone of international trade and a key engine driving globalization. However, there is growing concern that the Earth's atmospheric composition is being altered by human activities, including transportation, which can lead to climate change. Air pollution from transportation, and especially carbon dioxide emissions, is at the center stage of discussion by the world community through various international treaties, such as the Kyoto Protocol. The transportation sector also emits non-CO2 pollutants that have important effects on air quality, climate, and public health. The main purpose of this chapter is to introduce some basic concepts that are relevant in the quest for green transportation logistics. First, we present the basics of estimating emissions from transportation activities, the current statistics and future trends, as well as the total impact of air emissions…

  7. Basic Emotions: A Reconstruction

    Mason, William A.; Capitanio, John P.

    2016-01-01

    Emotionality is a basic feature of behavior. The argument over whether the expression of emotions is based primarily on culture (constructivism, nurture) or biology (natural forms, nature) will never be resolved because both alternatives are untenable. The evidence is overwhelming that at all ages and all levels of organization, the development of emotionality is epigenetic: The organism is an active participant in its own development. To ascribe these effects to “experience” was the best that could be done for many years. With the rapid acceleration of information on how changes in organization are actually brought about, it is a good time to review, update, and revitalize our views of experience in relation to the concept of basic emotion. PMID:27110280

  8. Basic electronic circuits

    Buckley, P M

    1980-01-01

    In the past, the teaching of electricity and electronics has more often than not been carried out from a theoretical and often highly academic standpoint. Fundamentals and basic concepts have often been presented with no indication of their practical applications, and all too frequently they have been illustrated by artificially contrived laboratory experiments bearing little relationship to the outside world. The course comes in the form of fourteen fairly open-ended constructional experiments or projects. Each experiment has associated with it a construction exercise and an explanation. The basic idea behind this dual presentation is that the student can embark on each circuit following only the briefest possible instructions, and that an open-ended approach is thereby not prejudiced by an initial lengthy encounter with the theory behind the project, this being a sure way to dampen enthusiasm at the outset. As the investigation progresses, questions inevitably arise. Descriptions of the phenomena encountered…

  9. Basic linear algebra

    Blyth, T S

    2002-01-01

    Basic Linear Algebra is a text for first year students leading from concrete examples to abstract theorems, via tutorial-type exercises. More exercises (of the kind a student may expect in examination papers) are grouped at the end of each section. The book covers the most important basics of any first course on linear algebra, explaining the algebra of matrices with applications to analytic geometry, systems of linear equations, difference equations and complex numbers. Linear equations are treated via Hermite normal forms which provides a successful and concrete explanation of the notion of linear independence. Another important highlight is the connection between linear mappings and matrices leading to the change of basis theorem which opens the door to the notion of similarity. This new and revised edition features additional exercises and coverage of Cramer's rule (omitted from the first edition). However, it is the new, extra chapter on computer assistance that will be of particular interest to readers:...
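
    Since the new edition adds Cramer's rule, a quick stand-in example may help fix the idea: each unknown is a ratio of determinants, x_i = det(A_i)/det(A), where A_i is A with column i replaced by b. The Python sketch below is illustrative only, not the book's own material; Cramer's rule suits tiny systems but is numerically poor for large ones.

        import numpy as np

        def cramer(A, b):
            d = np.linalg.det(A)
            x = np.empty(len(b))
            for i in range(len(b)):
                Ai = A.copy()
                Ai[:, i] = b                 # replace column i with b
                x[i] = np.linalg.det(Ai) / d
            return x

        A = np.array([[2.0, 1.0], [1.0, 3.0]])
        b = np.array([3.0, 5.0])
        print(cramer(A, b))                  # [0.8, 1.4]
        print(np.linalg.solve(A, b))         # same answer, the stable way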

  10. Basics of statistical physics

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and for this reason requires a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles and attempts to explain these in simple terms, supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Einstein…

  11. Emulsion Science Basic Principles

    Leal-Calderon, Fernando; Schmitt, Véronique

    2007-01-01

    Emulsions are generally made out of two immiscible fluids like oil and water, one being dispersed in the second in the presence of surface-active compounds. They are used as intermediate or end products in a huge range of areas including the food, chemical, cosmetic, pharmaceutical, paint, and coating industries. Beyond this broad domain of technological interest, emulsions raise a variety of fundamental questions at the frontier between physics and chemistry. This book aims to give an overview of the most recent advances in emulsion science. The basic principles, covering aspects of emulsions from their preparation to their destruction, are presented in close relation to both the fundamental physics and the applications of these materials. The book is intended to help scientists and engineers in formulating new materials by giving them the basics of emulsion science.

  12. Integral marketing auditing: Basic features

    Rakić Mira

    2006-01-01

    Full Text Available Instead of emphasizing the "primacy of planning" in the management process, marketing control should be viewed as an activity that counterbalances marketing planning and strategy, and not as a "post hoc" adjunct to the planning function. Rather, it is a separate marketing function which, by continuously checking on the validity of plans, provides time and flexibility to an organization. In this view, therefore, marketing planning and control are counterbalancing processes performed simultaneously. The essence of feedback control is a measurement of the actual and desired states after action has been taken, and a subsequent correction of activities. In feedforward control the activities are corrected by predicting whether current activities would lead to desired states. The single most important reason why feedforward control differs from feedback control is that its use of information is prognostic: while feedback control tries to solve problems that have occurred, feedforward control tries to discover problems waiting to occur. It is also important to distinguish between financial, non-financial and multiple control. Financial control of marketing activities involves control of sales revenue, profitability and return on marketing investment (ROMI). Non-financial control of marketing activities consists of control of market share, customer satisfaction, customer loyalty, brand equity and customer equity. Key components of multiple control of marketing activities are control of efficiency, control of effectiveness and the marketing audit. Marketing performance measures have moved in three consistent directions over the years: from financial to non-financial output measures, from output to input measures, and from unidimensional to multidimensional measures.
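
    Of the financial measures named, ROMI is the easiest to make concrete, though definitions vary across firms. One common form, used here purely as an illustration with made-up figures, is incremental margin attributable to marketing, minus marketing spend, divided by marketing spend.

        def romi(incremental_revenue, margin_rate, spend):
            # (incremental margin - spend) / spend; one of several definitions in use
            return (incremental_revenue * margin_rate - spend) / spend

        print(romi(incremental_revenue=500_000, margin_rate=0.4, spend=120_000))
        # 0.67: the campaign returned 67 cents of net margin per dollar spent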

  13. Basics of Computer Networking

    Robertazzi, Thomas

    2012-01-01

    Springer Brief Basics of Computer Networking provides a non-mathematical introduction to the world of networks. This book covers both technology for wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. Written in a very accessible style for the interested layman by the author of a widely used textbook with many years of experience explaining concepts to the beginner.

  14. Risk communication basics

    Corrado, P.G.

    1995-01-01

    In low-trust, high-concern situations, 50% of your credibility comes from perceived empathy and caring, demonstrated in the first 30 s you come in contact with someone. There is no second chance for a first impression. These and other principles contained in this paper provide you with a basic level of understanding of risk communication. The principles identified are time-tested caveats and will assist you in effectively communicating technical information

  15. Risk communication basics

    Corrado, P.G. [Lawrence Livermore National Laboratory, CA (United States)

    1995-12-31

    In low-trust, high-concern situations, 50% of your credibility comes from perceived empathy and caring, demonstrated in the first 30 s you come in contact with someone. There is no second chance for a first impression. These and other principles contained in this paper provide you with a basic level of understanding of risk communication. The principles identified are time-tested caveats and will assist you in effectively communicating technical information.

  16. Basic nucleonics. 2. ed.

    Guzman, M.E.

    1989-01-01

    This book is oriented mainly towards professionals who are not physicists or experts in the nuclear sciences: physicians planning to specialize in nuclear medicine or radiotherapy, and technicians involved in nuclear applications. The book covers the fundamental concepts of nuclear science and technology in a simple and ordered fashion. Theory is illustrated with appropriate exercises and answers. In 17 chapters plus 3 appendices on mathematics, basic concepts are covered in nuclear science, radioactivity, radiation and matter, nuclear reactions, X rays, shielding and radioprotection.

  17. Basic of Neutron NDA

    Trahan, Alexis Chanel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives of this presentation are to introduce the basic physics of neutron production, interactions and detection; identify the processes that generate neutrons; explain the most common neutron production mechanisms, spontaneous and induced fission and (α,n) reactions; describe the properties of neutrons from different sources; recognize the advantages of neutron measurement techniques; recognize common neutron interactions; explain neutron cross section measurements; describe the fundamentals of ³He detector function and design; and differentiate between passive and active assay techniques.

  18. Shoulder arthroscopy: the basics.

    Farmer, Kevin W; Wright, Thomas W

    2015-04-01

    Shoulder arthroscopy is a commonly performed and accepted procedure for a wide variety of pathologies. Surgeon experience, patient positioning, knowledge of surgical anatomy, proper portal placement, and proper use of instrumentation can improve technical success and minimize complication risks. This article details the surgical anatomy, indications, patient positioning, portal placement, instrumentation, and complications for basic shoulder arthroscopy. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  19. Basic accelerator optics

    CERN. Geneva. Audiovisual Unit

    1985-01-01

    A complete derivation, from first principles, of the concepts and methods applied in linear accelerator and beamline optics will be presented. Particle motion and beam motion in systems composed of linear magnets, as well as weak and strong focusing and special insertions are treated in mathematically simple terms, and design examples for magnets and systems are given. This series of five lectures is intended to provide all the basic tools required for the design and operation of beam optical systems.
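
    The linear formalism these lectures derive reduces, in one transverse plane, to multiplying 2x2 transfer matrices acting on the ray state (x, x'). A minimal sketch of that standard result (illustrative Python, not course material):

        import numpy as np

        def drift(L):      # field-free drift of length L [m]
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):  # thin-lens quadrupole, focal length f [m]
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # Beamline: drift 2 m, focusing quad with f = 1 m, drift 1 m.
        # Matrices multiply right to left: the rightmost element acts first.
        M = drift(1.0) @ thin_quad(1.0) @ drift(2.0)
        x0 = np.array([0.005, 0.0])   # ray 5 mm off-axis, parallel to the axis
        print(M @ x0)                 # [0, -0.005]: crosses the axis at the focus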

  20. Basic concepts in oceanography

    Small, L.F.

    1997-01-01

    Basic concepts in oceanography include major wind patterns that drive ocean currents, and the effects that the earth's rotation, positions of land masses, and temperature and salinity have on oceanic circulation and hence global distribution of radioactivity. Special attention is given to coastal and near-coastal processes such as upwelling, tidal effects, and small-scale processes, as radionuclide distributions are currently most associated with coastal regions. (author)

  1. Basic Financial Accounting

    Wiborg, Karsten

    This textbook on Basic Financial Accounting is targeted at students of economics at universities and business colleges taking an introductory subject in the external dimension of the company's economic reporting, including bookkeeping, etc. The book includes the following subjects: business entities, the transformation process, types of businesses, stakeholders, legislation, the annual report, the VAT system, double-entry bookkeeping, inventories, and year-end cash flow analysis.

  2. Wall Correction Model for Wind Tunnels with Open Test Section

    Sørensen, Jens Nørkær; Shen, Wen Zhong; Mikkelsen, Robert Flemming

    2004-01-01

    … the corrections from the model are in very good agreement with the CFD computations, demonstrating that one-dimensional momentum theory is a reliable way of predicting corrections for wall interference in wind tunnels with closed as well as open cross sections. Keywords: Wind tunnel correction, momentum theory…
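
    The paper's tunnel-specific correction is not reproduced in this abstract, but the one-dimensional momentum theory it builds on starts from the unbounded actuator-disc relation CT = 4a(1 - a) between thrust coefficient and axial induction factor. A small Python sketch of that baseline relation (assumed standard background, not the authors' model):

        import numpy as np

        def induction(CT):
            # smaller root of 4a^2 - 4a + CT = 0, valid for CT <= 1
            return 0.5 * (1.0 - np.sqrt(1.0 - CT))

        CT = 0.8
        a = induction(CT)
        print(a)                 # ~0.276
        print(4 * a * (1 - a))   # recovers CT = 0.8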

  3. Electromagnetic corrections to baryon masses

    Durand, Loyal; Ha, Phuoc

    2005-01-01

    We analyze the electromagnetic contributions to the octet and decuplet baryon masses using the heavy-baryon approximation in chiral effective field theory and methods we developed in earlier analyses of the baryon masses and magnetic moments. Our methods connect simply to Morpurgo's general parametrization of the electromagnetic contributions and to semirelativistic quark models. Our calculations are carried out including the one-loop mesonic corrections to the basic electromagnetic interactions, so to two loops overall. We find that to this order in the chiral loop expansion there are no three-body contributions. The Coleman-Glashow relation and other sum rules derived in quark models with only two-body terms therefore continue to hold, and violations involve at least three-loop processes and can be expected to be quite small. We present the complete formal results and some estimates of the matrix elements here. Numerical calculations will be presented separately

  4. Assessment and correction of BCC_CSM's performance in capturing leading modes of summer precipitation over North Asia

    Gong, Zhiqiang

    2017-11-07

    This article examines the ability of the Beijing Climate Center Climate System Model (BCC_CSM) to predict accurately, and to capture the leading modes of, the summer precipitation over North Asia (NA). A combined dynamic-statistical approach for improving the prediction accuracy and the prediction of the leading modes of NA summer precipitation is proposed. Our results show that BCC_CSM can capture part of the spatial anomaly features of the first two leading modes of NA summer precipitation. Moreover, BCC_CSM reproduces the relationships whereby the first and second empirical orthogonal function modes (EOF1 and EOF2) of NA summer precipitation correspond, respectively, to the development of El Niño and La Niña conditions in the tropical East Pacific. Nevertheless, BCC_CSM exhibits limited prediction skill over most of NA, and shows deficiencies in reproducing the spatial patterns of EOF1 and EOF2 over central NA and the interannual variability of EOF2. Possible reasons are that the model is unable to capture the correct relationships among the basic climate elements over central NA, lacks the ability to reproduce a consistent zonal atmospheric pattern over NA, and is biased in predicting the relevant sea surface temperature (SST) modes over the tropical Pacific and Indian Ocean regions. Based on the proposed dynamic-statistical correction approach, the anomaly correlation coefficients of the corrected EOF1/EOF2 with the tropical Indian Ocean SST improve from 0.18/0.36 for BCC_CSM's original prediction to 0.51/0.62. The proposed correction approach thus suggests that BCC_CSM's skill in predicting summer precipitation over NA, and its ability to capture the dominant modes, can be improved by incorporating appropriate historical analogue information.
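
    The EOF analysis referred to throughout is standard: the leading modes of a (time x space) anomaly field are its right singular vectors. A generic sketch on synthetic data (this is not the BCC_CSM pipeline, only the textbook computation):

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.standard_normal((40, 500))   # stand-in anomaly field (time, space)
        X -= X.mean(axis=0)                  # remove the time mean

        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        eof1, eof2 = Vt[0], Vt[1]            # spatial patterns of EOF1, EOF2
        pcs = U * S                          # principal-component time series
        explained = S**2 / np.sum(S**2)
        print(explained[:2])                 # variance fractions of EOF1/EOF2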

  5. Catalyst in Basic Oleochemicals

    Eva Suyenty

    2007-10-01

    Full Text Available Currently Indonesia is the world's largest palm oil producer, with production volume reaching 16 million tonnes per annum. The high crude oil and ethylene prices of the last 3-4 years have contributed to the healthy demand growth for basic oleochemicals: fatty acids and fatty alcohols. Oleochemicals are starting to replace crude-oil-derived products in various applications. As widely practiced in the petrochemical industry, catalysts play a very important role in the production of basic oleochemicals. Catalytic reactions abound in the production of oleochemicals: nickel-based catalysts are used in the hydrogenation of unsaturated fatty acids; sodium methylate catalyst in the transesterification of triglycerides; sulfonic-acid-based polystyrene resin catalyst in the esterification of fatty acids; and copper chromite/copper zinc catalyst in the high-pressure hydrogenation of methyl esters or fatty acids to produce fatty alcohols. To maintain long catalyst life, it is crucial to ensure the absence of catalyst poisons and inhibitors in the feed. The preparation of nickel and copper chromite catalysts proceeds by precipitation, filtration, drying, and calcination. Sodium methylate is derived from the direct reaction of sodium metal and methanol under inert gas. The sulfonic-acid-based polystyrene resin is derived from sulfonation of polystyrene crosslinked with divinylbenzene. © 2007 BCREC UNDIP. All rights reserved. [Presented at Symposium and Congress of MKICS 2007, 18-19 April 2007, Semarang, Indonesia] doi:10.9767/bcrec.2.2-3.6.22-31

  6. Uranium: a basic evaluation

    Crull, A.W.

    1978-01-01

    All energy sources and technologies, including uranium and the nuclear industry, are needed to provide power. Public misunderstanding of the nature of uranium and how it works as a fuel may jeopardize nuclear energy as a major option. Basic chemical facts about uranium ore and uranium fuel technology are presented. Some of the major policy decisions that must be made include the enrichment, stockpiling, and pricing of uranium. Investigations and lawsuits pertaining to uranium markets are reviewed, and the point is made that oil companies will probably have to divest their non-oil energy activities. Recommendations for nuclear policies that have been made by the General Accounting Office are discussed briefly

  7. C# Database Basics

    Schmalz, Michael

    2012-01-01

    Working with data and databases in C# certainly can be daunting if you're coming from VB6, VBA, or Access. With this hands-on guide, you'll shorten the learning curve considerably as you master accessing, adding, updating, and deleting data with C#: basic skills you need if you intend to program with this language. No previous knowledge of C# is necessary. By following the examples in this book, you'll learn how to tackle several database tasks in C#, such as working with SQL Server, building data entry forms, and using data in a web service. The book's code samples will help you get started…

  8. Electrical installation calculations basic

    Kitcher, Christopher

    2013-01-01

    All the essential calculations required for basic electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job. An essential aid to the City & Guilds certificates at Levels 2 and 3 …

  9. Basic structural dynamics

    Anderson, James C

    2012-01-01

    A concise introduction to structural dynamics and earthquake engineering. Basic Structural Dynamics serves as a fundamental introduction to the topic of structural dynamics. Covering single- and multiple-degree-of-freedom systems while providing an introduction to earthquake engineering, the book keeps the coverage succinct and on topic at a level that is appropriate for undergraduate and graduate students. Through dozens of worked examples based on actual structures, it also introduces readers to MATLAB, a powerful software package for solving both simple and complex structural dynamics problems…
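
    The book's worked examples are in MATLAB; as a stand-in, here is a minimal Python sketch of the simplest structural dynamics computation, free vibration of an undamped single-degree-of-freedom system integrated by central differences (all values illustrative):

        import numpy as np

        m, k = 1.0, 4.0 * np.pi**2        # chosen so the natural period is 1 s
        dt, steps = 0.001, 2001
        u = np.zeros(steps)
        u[0] = 1.0                        # initial displacement, zero velocity
        u[1] = u[0] - 0.5 * dt**2 * (k / m) * u[0]
        for n in range(1, steps - 1):     # central-difference time stepping
            u[n + 1] = 2 * u[n] - u[n - 1] - dt**2 * (k / m) * u[n]
        print(u[1000])                    # after one full period, u is back near 1.0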

  10. Basic heat transfer

    Bacon, D H

    2013-01-01

    Basic Heat Transfer aims to help readers use a computer to solve heat transfer problems and to promote greater understanding by changing data values and observing the effects, which is necessary in design and optimization calculations. The book is concerned with applications including insulation and heating in buildings and pipes, temperature distributions in solids for steady-state and transient conditions, the determination of surface heat transfer coefficients for convection in various situations, radiation heat transfer in grey body problems, the use of finned surfaces, and simple heat exchangers…
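
    For instance, the steady-state temperature-distribution problems mentioned reduce to small linear systems once discretized. A minimal sketch (Python rather than the book's BASIC; the straight-line exact solution makes a handy check):

        import numpy as np

        # Steady 1-D conduction, d2T/dx2 = 0, with fixed surface temperatures.
        n = 9                                # interior nodes
        T_left, T_right = 100.0, 20.0        # boundary temperatures [degC]
        A = (np.diag([-2.0] * n)
             + np.diag([1.0] * (n - 1), 1)
             + np.diag([1.0] * (n - 1), -1))
        b = np.zeros(n)
        b[0] -= T_left                       # move boundary values to the RHS
        b[-1] -= T_right
        print(np.linalg.solve(A, b))         # linear profile from 100 to 20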

  11. Back to basics audio

    Nathan, Julian

    1998-01-01

    Back to Basics Audio is a thorough yet approachable handbook on audio electronics theory and equipment. The first part of the book discusses electrical and audio principles. Those principles form a basis for understanding the operation of equipment and systems, covered in the second section. Finally, the author addresses planning and installation of a home audio system. Julian Nathan joined the audio service and manufacturing industry in 1954 and moved into motion picture engineering and production in 1960. He installed and operated recording theaters in Sydney, Australia…

  12. Machine shop basics

    Miller, Rex

    2004-01-01

    Use the right tool the right way. Here, fully updated to include new machines and electronic/digital controls, is the ultimate guide to basic machine shop equipment and how to use it. Whether you're a professional machinist, an apprentice, a trade student, or a handy homeowner, this fully illustrated volume helps you define tools and use them properly and safely. It's packed with review questions for students, and loaded with answers you need on the job. Mark Richard Miller is a Professor and Chairman of the Industrial Technology Department at Texas A&M University in Kingsville, Texas…

  13. Basic bladder neurophysiology.

    Clemens, J Quentin

    2010-11-01

    Maintenance of normal lower urinary tract function is a complex process that requires coordination between the central nervous system and the autonomic and somatic components of the peripheral nervous system. This article provides an overview of the basic principles that are recognized to regulate normal urine storage and micturition, including bladder biomechanics, relevant neuroanatomy, neural control of lower urinary tract function, and the pharmacologic processes that translate the neural signals into functional results. Finally, the emerging role of the urothelium as a sensory structure is discussed. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Basic research projects

    1979-04-01

    The research programs under the cognizance of the Office of Energy Research (OER) are directed toward the discovery of natural laws and new knowledge, and toward improved understanding of the physical and biological sciences as related to the development, use, and control of energy. The ultimate goal is to develop a scientific underlay for the overall DOE effort and the fundamental principles of natural phenomena, so that these phenomena may be understood and new principles formulated. The DOE-OER outlay activities include three major programs: High Energy Physics, Nuclear Physics, and Basic Energy Sciences. Taken together, these programs represent some 30 percent of the Nation's Federal support of basic research in the energy sciences. The research activities of OER involve more than 6,000 scientists and engineers working in some 17 major Federal Research Centers and at more than 135 different universities and industrial firms throughout the United States. Contract holders in the areas of high-energy physics, nuclear physics, materials sciences, nuclear science, chemical sciences, engineering, mathematics, geosciences, advanced energy projects, and biological energy research are listed. Funding trends for recent years are outlined.

  15. Basic scattering theory

    Queen, N.M.

    1978-01-01

    This series of lectures on basic scattering theory was given as part of a course for postgraduate high energy physicists, and was designed to acquaint the student with some of the basic language and formalism used for the phenomenological description of nuclear reactions and of the decay processes studied in elementary particle interactions. Well established and model independent aspects of scattering theory, which are the basis of S-matrix theory, are considered. The subject is treated under the following headings: the S-matrix, cross sections and decay rates, phase space, relativistic kinematics, the Mandelstam variables, the flux factor, two-body phase space, Dalitz plots, other kinematic plots, two-particle reactions, unitarity, the partial-wave expansion, resonances (single-channel case), multi-channel resonances, analyticity and crossing, dispersion relations, the one-particle exchange model, the density matrix, mathematical properties of the density matrix, the density matrix in scattering processes, the density matrix in decay processes, and the helicity formalism. Some exercises for the students are included. (U.K.)
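
    For reference, the partial-wave expansion named in that list has the standard form (a textbook result, not quoted from the lectures themselves):

        f(\theta) = \frac{1}{k} \sum_{\ell=0}^{\infty} (2\ell + 1)\,
                    e^{i\delta_\ell} \sin\delta_\ell\, P_\ell(\cos\theta),
        \qquad
        \sigma_{\mathrm{tot}} = \frac{4\pi}{k^2} \sum_{\ell=0}^{\infty}
                    (2\ell + 1) \sin^2\delta_\ell ,

    where \delta_\ell are the phase shifts and P_\ell the Legendre polynomials.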

  16. Basic and clinical immunology

    Chinen, Javier; Shearer, William T.

    2003-01-01

    Progress in immunology continues to grow exponentially every year. New applications of this knowledge are being developed for a broad range of clinical conditions. Conversely, the study of primary and secondary immunodeficiencies is helping to elucidate the intricate mechanisms of the immune system. We have selected a few of the most significant contributions to the fields of basic and clinical immunology published between October 2001 and October 2002. Our choice of topics in basic immunology included the description of T-bet as a determinant factor for T(H)1 differentiation, the role of the activation-induced cytosine deaminase gene in B-cell development, the characterization of CD4(+)CD25(+) regulatory T cells, and the use of dynamic imaging to study MHC class II transport and T-cell and dendritic cell membrane interactions. Articles related to clinical immunology that were selected for review include the description of immunodeficiency caused by caspase 8 deficiency; a case series report on X-linked agammaglobulinemia; the mechanism of action, efficacy, and complications of intravenous immunoglobulin; mechanisms of autoimmunity diseases; and advances in HIV pathogenesis and vaccine development. We also reviewed two articles that explore the possible alterations of the immune system caused by spaceflights, a new field with increasing importance as human space expeditions become a reality in the 21st century.

  17. Achieving year 2000 readiness: basic processes

    1999-03-01

    This document provides an approach for addressing safety and operability concerns related to the Year 2000 (Y2K) issue. Although it was prepared for nuclear power plants, the methods described are applicable to other nuclear installations and to other industrial concerns. The basic goal was to provide a brief but comprehensive approach that may be used to discover, understand and correct Y2K-related problems. The document relies on certain basic expectations of the facility that would apply to any programme: ownership, management, knowledgeable participants, thorough application of the approach, documentation of efforts, quality assurance of products and compliance with all regulatory requirements. The IAEA has been and will continue to be involved with Member States to assist them in implementing this document and achieving Y2K readiness.

  18. Achieving year 2000 readiness: basic processes

    NONE

    1999-03-01

    This document provides an approach for addressing safety and operability concerns related to the Year 2000 (Y2K) issue. Although it was prepared for nuclear power plants, the methods described are applicable to other nuclear installations and to other industrial concerns. The basic goal was to provide a brief but comprehensive approach that may be used to discover, understand and correct Y2K-related problems. The document relies on certain basic expectations of the facility that would apply to any programme: ownership, management, knowledgeable participants, thorough application of the approach, documentation of efforts, quality assurance of products and compliance with all regulatory requirements. The IAEA has been and will continue to be involved with Member States to assist them in implementing this document and achieving Y2K readiness. 12 refs, 3 figs

  19. Basics and application of PSpice

    Choi, Pyeong; Cho, Yong Beom; Mok, Hyeong Su; Baek, Dong CHeol

    2006-03-01

    This book comprises nineteen chapters, which introduce the basics and applications of PSpice. The contents are: What is PSpice?, introduction to PSpice, PSpice simulation, DC analysis, parametric analysis, transient analysis, parametric analysis and measurements, Monte Carlo analysis, changing device characteristics, and ABM applications; and, on the circuits side, the elementary laws of circuits, basic R.L.C. circuits, basic diode DC circuits, basic transistor and FET circuits, basic op-amp circuits, basic digital circuits, analog and digital circuit practice, digital circuit application and practice, and ABM circuit application and practice.

  20. Correction of Neonatal Hypovolemia

    V. V. Moskalev

    2007-01-01

    Full Text Available Objective: to evaluate the efficiency of hydroxyethyl starch solution (6% refortane, Berlin-Chemie) versus fresh frozen plasma in correcting neonatal hypovolemia. Materials and methods. In 12 neonatal infants with hypocoagulation, hypovolemia was corrected with fresh frozen plasma (10 ml/kg body weight). In 13 neonates, it was corrected with 6% refortane infusion in a dose of 10 ml/kg. Doppler echocardiography was used to study central hemodynamic parameters, and Doppler study was employed to examine regional blood flow in the anterior cerebral and renal arteries. Results. Infusion of 6% refortane and fresh frozen plasma at a rate of 10 ml/hour over an hour was found to normalize the parameters of central hemodynamics and regional blood flow. Conclusion. Comparative analysis of the findings suggests that 6% refortane is the drug of choice in correcting neonatal hypovolemia. Fresh frozen plasma should be infused in hemostatic disorders.

  1. Corrective Jaw Surgery

    Full Text Available … It is important to understand that your treatment, which will probably include orthodontics before and after surgery, … to realistically estimate the time required for your treatment. Correction of Common Dentofacial Deformities: the information provided …

  2. Corrective Jaw Surgery

    ... misalignment of jaws and teeth. Surgery can improve chewing, speaking and breathing. While the patient's appearance may ... indicate the need for corrective jaw surgery: Difficulty chewing, or biting food Difficulty swallowing Chronic jaw or ...

  3. Corrective Jaw Surgery

    ... It can also invite bacteria that lead to gum disease. ...

  4. Corrective Jaw Surgery

    ... is performed by an oral and maxillofacial surgeon (OMS) to correct a wide range of minor and ... when sleeping, including snoring) Your dentist, orthodontist and OMS will work together to determine whether you are ...

  5. ESPlannerBASIC CANADA

    Laurence Kotlikoff

    2015-02-01

    Traditional financial planning is based on a fundamental rule of thumb: aim to save enough for retirement to replace 80 per cent of your pre-retirement income with income from pensions and assets. Millions of Canadians follow this formula. Yet, there is no guarantee this approach is consistent with a savings plan that will allow them to experience their optimal standard of living — given their income — throughout their working lives. Consumption smoothing happens when a consumer projects her income and her non-discretionary expenses (such as mortgage payments) all the way up until the end of her life, and is able to determine her household discretionary spending power over time, to achieve the smoothest living standard path possible without going into debt. When consumption smoothing is calculated accurately, a person’s lifestyle should be roughly the same whether she is in her 30s with small children, in her 50s with kids in college, or in retirement, with adult children. Consumption smoothing allows that to happen. But while it is conceptually straightforward, consumption smoothing requires the use of advanced numerical techniques. Now, Canadian families have access to a powerful consumption-smoothing tool: ESPlannerBASIC Canada. This free, secure and confidential online tool will allow Canadian families to safely and securely enter their earnings and other financial resources, and will calculate for them how much they can spend and how much they should save in order to maintain their lifestyle from now until they die, without going into debt. It will also calculate how much life insurance they should buy, to ensure that household living standards are not affected after a family member dies. Users can easily and instantly run “what-if” scenarios to see how retiring early (or later), changing jobs, adjusting retirement contributions, having children, moving homes, timing RRSP withdrawals, and other financial and lifestyle decisions would
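
    As a minimal sketch of the consumption-smoothing arithmetic described above (assuming a fixed real interest rate and ignoring the borrowing constraints, taxes and survivor benefits that ESPlannerBASIC Canada actually models; all numbers are made up):

        import numpy as np

        # Find the constant annual discretionary spending whose present value
        # exhausts lifetime resources (income minus non-discretionary expenses).
        r = 0.02                                          # assumed real interest rate
        years = 60                                        # remaining planning horizon
        income = np.concatenate([np.full(35, 70_000.0),   # earnings until retirement
                                 np.zeros(25)])           # nothing afterwards
        fixed = np.full(years, 12_000.0)                  # e.g. mortgage-like expenses

        disc = (1.0 + r) ** -np.arange(years)             # discount factors
        pv_resources = ((income - fixed) * disc).sum()
        smooth_spending = pv_resources / disc.sum()
        print(round(smooth_spending))                     # smooth annual spending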

  6. ICT: isotope correction toolbox.

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

    Isotope tracer experiments are an invaluable technique to analyze and study the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences--in particular for complex isotopic systems--is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present isotope correction toolbox, a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. Isotope correction toolbox is written in the multi-platform programming language Perl and, therefore, can be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from: https://github.com/jungreuc/isotope_correction_toolbox/ {christian.jungreuthmayer@boku.ac.at,juergen.zanghellini@boku.ac.at} Supplementary data are available at Bioinformatics online.
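
    To illustrate the kind of correction involved, here is a minimal sketch for the simple single-stage MS case, assuming natural 13C is the only interfering isotope and inverting a binomial correction matrix by least squares; ICT itself goes further and handles tandem (MS/MS) mass isotopomer data:

        import numpy as np
        from math import comb

        P13C = 0.0107   # natural abundance of carbon-13

        def correction_matrix(n_atoms):
            """C[i, j]: probability that a molecule with j labeled carbons is
            observed at mass shift i due to natural 13C in the other atoms."""
            C = np.zeros((n_atoms + 1, n_atoms + 1))
            for j in range(n_atoms + 1):
                for i in range(j, n_atoms + 1):
                    k, n = i - j, n_atoms - j
                    C[i, j] = comb(n, k) * P13C**k * (1 - P13C)**(n - k)
            return C

        def correct_mid(measured):
            """Least-squares correction of a measured mass-isotopomer distribution."""
            C = correction_matrix(len(measured) - 1)
            x, *_ = np.linalg.lstsq(C, measured, rcond=None)
            x = np.clip(x, 0.0, None)
            return x / x.sum()

        print(correct_mid(np.array([0.70, 0.22, 0.08])))   # M+0, M+1, M+2 fractions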

  7. Flatland optics. II. Basic experiments.

    Lohmann, A W; Wang, D; Pe'er, A; Friesem, A A

    2001-05-01

    In "Flatland optics: fundamentals" [J. Opt. Soc. Am. A 17, 1755 (2000)] we described the basic principles of two-dimensional (2D) optics and showed that a wavelength lambda in three-dimensional (3D) space (x,y,z) may appear in Flatland (x,z) as a wave with another wavelength, lambda = lambda/cosalpha. The tilt angle alpha can be modified by a 3D (Spaceland) individual who then is able to influence the 2D optics in a way that must appear to be magical to 2D Flatland individuals-in the spirit of E. A. Abbott's science fiction story [Flatland, a Romance of Many Dimensions, 6th ed. (Dover, New York, 1952)] of 1884. We now want to establish the reality or objectivity of the 2D wavelength lambda by some basic experiments similar to those that demonstrated roughly 200 years ago the wave nature of light. Specifically, we describe how to measure the 2D wavelength lambda by mean of five different arrangements that involve Young's biprism configuration, Talbot's self-imaging effect, measuring the focal length of a Fresnel zone plate, and letting light be diffracted by a double slit and by a grating. We also performed experiments with most of these arrangements. The results reveal that the theoretical wavelength, as predicted by our Flatland optics theory, does indeed coincide with the wavelength lambda as measured by Flatland experiments. Finally, we present an alternative way to understand Flatland optics in the spatial frequency domains of Flatland and Spaceland.

  8. Cloud computing basics

    Srinivasan, S

    2014-01-01

    Cloud Computing Basics covers the main aspects of this fast-moving technology so that both practitioners and students will be able to understand cloud computing. The author highlights the key aspects of this technology that a potential user might want to investigate before deciding to adopt this service. This book explains how cloud services can be used to augment existing services such as storage, backup and recovery. It addresses the details of how cloud security works and what users must be prepared for when they move their data to the cloud. The book also discusses how businesses could prepare for compliance with the laws as well as industry standards such as the Payment Card Industry.

  9. Basic semiconductor physics

    Hamaguchi, Chihiro

    2017-01-01

    This book presents a detailed description of basic semiconductor physics. The text covers a wide range of important phenomena in semiconductors, from the simple to the advanced. Four different methods of energy band calculations in the full band region are explained: local empirical pseudopotential, non-local pseudopotential, KP perturbation and tight-binding methods. The effective mass approximation and electron motion in a periodic potential, Boltzmann transport equation and deformation potentials used for analysis of transport properties are discussed. Further, the book examines experiments and theoretical analyses of cyclotron resonance in detail. Optical and transport properties, magneto-transport, two-dimensional electron gas transport (HEMT and MOSFET) and quantum transport are reviewed, while optical transition, electron-phonon interaction and electron mobility are also addressed. Energy and electronic structure of a quantum dot (artificial atom) are explained with the help of Slater determinants. The...

  10. Basic category theory

    Leinster, Tom

    2014-01-01

    At the heart of this short introduction to category theory is the idea of a universal property, important throughout mathematics. After an introductory chapter giving the basic definitions, separate chapters explain three ways of expressing universal properties: via adjoint functors, representable functors, and limits. A final chapter ties all three together. The book is suitable for use in courses or for independent study. Assuming relatively little mathematical background, it is ideal for beginning graduate students or advanced undergraduates learning category theory for the first time. For each new categorical concept, a generous supply of examples is provided, taken from different parts of mathematics. At points where the leap in abstraction is particularly great (such as the Yoneda lemma), the reader will find careful and extensive explanations. Copious exercises are included.

  11. Energy the basics

    Schobert, Harold

    2013-01-01

    People rarely stop to think about where the energy they use to power their everyday lives comes from, and when they do it is often to ask a worried question: is mankind's energy usage killing the planet? How do we deal with nuclear waste? What happens when the oil runs out? Energy: The Basics answers these questions, but it also does much more. In this engaging yet even-handed introduction, readers are introduced to: the concept of 'energy' and what it really means; the ways energy is currently generated and the sources used; new and emerging energy technologies such as solar power and biofuels; and the impacts of energy use on the environment, including climate change. Featuring explanatory diagrams, tables, a glossary and an extensive further reading list, this book is the ideal starting point for anyone interested in the impact and future of the world's energy supply.

  12. Basic ionizing physic radiation

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    To become an expert in this field, a radiographer must first master radiation physics; that is why the second chapter discusses radiation physics. The topics that must be covered include the atom and molecule, atomic structure, the proton, isotopes, half-life, types of radiation, and basic formulae such as those for shielding, half-life, half-value layer, tenth-value layer and more. All of this must be mastered by radiographers who want to know the technique in more detail, because this technique is a combination of theory and practice. Once they fail the theory they cannot go further with the technique, and to master it one cannot depend on theory alone; theory and practice must go together.
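
    A small sketch of the shielding formulae mentioned above (standard narrow-beam attenuation; the attenuation coefficient below is an assumed placeholder value):

        import numpy as np

        # I = I0 * exp(-mu * x); half-value layer HVL = ln 2 / mu,
        # tenth-value layer TVL = ln 10 / mu.
        mu = 0.5                          # assumed linear attenuation coefficient, 1/cm

        hvl = np.log(2.0) / mu
        tvl = np.log(10.0) / mu

        def transmitted(I0, x):
            """Intensity transmitted through a shield of thickness x (cm)."""
            return I0 * np.exp(-mu * x)

        print(f"HVL = {hvl:.2f} cm, TVL = {tvl:.2f} cm")
        print(transmitted(100.0, 2 * hvl))   # two half-value layers leave 25 percent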

  13. 15. Basic economic indicators

    Carless, J.; Dow, B.; Farivari, R.; O'Connor, J.; Fox, T.; Tunstall, D.; Mentzingen, M.

    1992-01-01

    The clear value of economic data and analysis to decisionmakers has motivated them to mandate the creation of extensive global economic data sets. This chapter contains a set of these basic economic data, which provides the context for understanding the causes and the consequences of many of the decisions that affect the world's resources. Many traditional economic indicators fail to account for the depletion or deterioration of natural resources, the long-term consequences of such depletion, the equitable distribution of income within a country, or the sustainability of current economic practices. The type of measurement shown here, however, is still useful in showing the great differences between the wealthiest and the poorest countries. Tables are given on the following: Gross national product and official development assistance 1969-89; External debt indicators 1979-89; Central government expenditures; and World commodity indexes and prices 1975-89

  14. Chernobyl versus Basic Law

    Sauer, G W

    1986-01-01

    The author discusses the terms 'remaining risk to be accepted' and 'remainder of the aggregate risk', and explains the line of action to be adopted in compliance with the Constitution in order to respond to the event at Chernobyl: The Constitution demands maximum acceptable limits to be defined as low as possible. The author discusses the various dose estimations and the contradictions to be observed in this context. He states that the Chernobyl accident has done most harm to our legal system, as the basic right of freedom from injury has been ploughed under with the radioactivity that covered the soil after the Chernobyl accident. But, he says, a positive effect is that the idea of abandoning nuclear power as too dangerous a technology has gained more widespread acceptance. (HSCH).

  15. Basic engineering mathematics

    Bird, John

    2014-01-01

    Introductory mathematics written specifically for students new to engineering Now in its sixth edition, Basic Engineering Mathematics is an established textbook that has helped thousands of students to succeed in their exams. John Bird's approach is based on worked examples and interactive problems. This makes it ideal for students from a wide range of academic backgrounds as the student can work through the material at their own pace. Mathematical theories are explained in a straightforward manner, being supported by practical engineering examples and applications in order to ensure that readers can relate theory to practice. The extensive and thorough topic coverage makes this an ideal text for introductory level engineering courses. This title is supported by a companion website with resources for both students and lecturers, including lists of essential formulae, multiple choice tests, full solutions for all 1,600 further questions contained within the practice exercises, and biographical information on t...

  16. Chernobyl versus Basic Law?

    Sauer, G.W.

    1986-01-01

    The author discusses the terms 'remaining risk to be accepted' and 'remainder of the aggregate risk', and explains the line of action to be adopted in compliance with the Constitution in order to respond to the event at Chernobyl: The Constitution demands maximum acceptable limits to be defined as low as possible. The author discusses the various dose estimations and the contradictions to be observed in this context. He states that the Chernobyl accident has done most harm to our legal system, as the basic right of freedom from injury has been ploughed under with the radioactivity that covered the soil after the Chernobyl accident. But, he says, a positive effect is that the idea of abandoning nuclear power as too dangerous a technology has gained more widespread acceptance. (HSCH) [de

  17. Basic real analysis

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....

  18. Magnetism basics and applications

    Stefanita, Carmen-Gabriela

    2012-01-01

    This textbook is aimed at engineering students who are likely to come across magnetics applications in their professional practice. Whether designing lithography equipment containing ferromagnetic brushes, or detecting defects in aeronautics, some basic knowledge of 21st century magnetism is needed. From the magnetic tape on the pocket credit card to the read head in a personal computer, people run into magnetism in many products. Furthermore, in a variety of disciplines tools of the trade exploit magnetic principles, and many interdisciplinary laboratory research areas cross paths with magnetic phenomena that may seem mysterious to the untrained mind. Therefore, this course offers a broad coverage of magnetism topics encountered more often in this millennium, revealing key concepts on which many practical applications rest. Some traditional subjects in magnetism are discussed in the first half of the book, followed by areas likely to spark the curiosity of those more interested in today’s technological achi...

  19. Atomic Basic Blocks

    Scheler, Fabian; Mitzlaff, Martin; Schröder-Preikschat, Wolfgang

    The decision to use a time-triggered or an event-triggered approach for a real-time system is difficult and has far-reaching consequences, above all because these two approaches are tied to very different control-flow abstractions, which make a later migration to the other paradigm very hard or even impossible. We therefore propose the use of an intermediate representation that is independent of the control-flow abstraction in use. For this purpose we use Atomic Basic Blocks (ABB), which are based on basic blocks, and on top of them build a tool, the Real-Time Systems Compiler (RTSC), that supports the migration between time-triggered and event-triggered systems.

  20. Corrective Justice vs. Social Justice in the Aftermath of War

    Pablo Kalmanovitz

    2010-11-01

    How do we justify the practice of corrective justice for losses suffered during armed conflicts? This article seeks to show the force and relevance of this question, and to argue that, in cases of massively destructive wars, social justice should gain priority over corrective justice. Starting from a liberal Rawlsian conception of the relationship between corrective and social justice, it is argued that, paradoxically, the more destructive a war is, the less normative force corrective rights have and the higher priority policies of social justice, which guarantee basic rights to all citizens, should have.

  1. Basic Phage Mathematics.

    Abedon, Stephen T; Katsaounis, Tena I

    2018-01-01

    Basic mathematical descriptions are useful in phage ecology, applied phage ecology such as in the course of phage therapy, and also toward keeping track of expected phage-bacterial interactions as seen during laboratory manipulation of phages. The most basic mathematical descriptor of phages is their titer, that is, their concentration within stocks, experimental vessels, or other environments. Various phenomena can serve to modify phage titers, and indeed phage titers can vary as a function of how they are measured. An important aspect of how changes in titers can occur results from phage interactions with bacteria. These changes tend to vary in degree as a function of bacterial densities within environments, and particularly densities of those bacteria that are susceptible to or at least adsorbable by a given phage type. Using simple mathematical models one can describe phage-bacterial interactions that give rise particularly to phage adsorption events. With elaboration one can consider changes in both phage and bacterial densities as a function of both time and these interactions. In addition, phages along with their impact on bacteria can be considered as spatially constrained processes. In this chapter we consider the simpler of these concepts, providing in particular detailed verbal explanations toward facile mathematical insight. The primary goal is to stimulate a more informed use and manipulation of phages and phage populations within the laboratory as well as toward more effective phage application outside of the laboratory, such as during phage therapy. More generally, numerous issues and approaches to the quantification of phages are considered along with the quantification of individual, ecological, and applied properties of phages.
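
    A minimal sketch of the adsorption model this chapter builds on, the mass-action decline of free phages, dP/dt = -k*B*P, so P(t) = P0*exp(-k*B*t); the rate constant and densities below are illustrative assumptions:

        import numpy as np

        k = 2.5e-9    # adsorption rate constant, mL/min (typical order of magnitude)
        B = 1e8       # density of adsorbable bacteria, cells/mL (assumed)
        P0 = 1e4      # initial free-phage titer, PFU/mL (assumed)

        t = np.linspace(0.0, 10.0, 6)            # minutes
        P = P0 * np.exp(-k * B * t)              # free phages remaining unadsorbed
        for ti, pi in zip(t, P):
            print(f"t = {ti:4.1f} min   free phage = {pi:10.1f} PFU/mL")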

  2. Basic Energy Sciences at NREL

    Moon, S.

    2000-01-01

    NREL's Center for Basic Sciences performs fundamental research for DOE's Office of Science. Our mission is to provide fundamental knowledge in the basic sciences and engineering that will underpin new and improved renewable energy technologies

  3. Geological Corrections in Gravimetry

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for the known geology to gravity data can be traced back to the first quarter of the 20th century. Later on, mostly in areas with sedimentary cover, at local and regional scales, the correction known as gravity stripping has been in use since the mid 1960s, provided that there was enough geological information. Stripping at regional to global scales became possible after releasing the CRUST 2.0 and later CRUST 1.0 models in the years 2000 and 2013, respectively. Especially the latter model provides quite a new view on the relevant geometries and on the topographic and crustal densities as well as on the crust/mantle density contrast. Thus, the isostatic corrections, which have been often used in the past, can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a priori geometry and density grids which can be of either rectangular or spherical/ellipsoidal type, with cells of the shapes of rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but, for instance, also along vertical lines, which can shed some additional light on the nature of the geological correction. The software can work at a variety of scales and considers the input information to an optional distance from the calculation point, up to the antipodes. Our main objective is to treat the geological correction as an alternative to accounting for the topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to the possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under the contract APVV-0827-12.
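
    To make the idea of evaluating the effect of an a priori density model concrete, here is a crude sketch that collapses each grid cell to a point mass (the software described works with proper rectangular, tesseroid or triangular cells; all values below are made up):

        import numpy as np

        G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

        def gz_point_masses(station, centers, drho, volumes):
            """Downward vertical attraction at `station` from cells collapsed to
            point masses with density contrasts drho (kg/m^3) and volumes (m^3)."""
            d = centers - station
            r = np.linalg.norm(d, axis=1)
            dz = station[2] - centers[:, 2]      # positive if the cell lies below
            return np.sum(G * drho * volumes * dz / r**3)

        # one 100 m cube with +300 kg/m^3 contrast, centred 200 m below the station
        centers = np.array([[0.0, 0.0, -200.0]])
        gz = gz_point_masses(np.zeros(3), centers, np.array([300.0]), np.array([1e6]))
        print(f"{gz * 1e8:.1f} microGal")        # 1 m/s^2 = 1e8 microGal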

  4. Significant and Basic Innovations in Urban Planning

    Kolyasnikov, V. A.

    2017-11-01

    The article considers the features of the development of innovative urban planning in the USSR and Russia in the XVIII - XX centuries. Innovative urban planning is defined as activity on the creation of innovations and their implementation to obtain a socio-economic, political, environmental or other effect. In the course of urban development history this activity represents a cyclic wave process in which there are phases of rise and fall. The study of cyclic waves in the development of innovative urban planning uses the concept of selecting basic and epochal innovations. This concept was developed by scientists for the study of cyclic wave processes in economics. Its adaptation to the conditions of innovative urban planning development allows one to introduce the concepts of “basic innovation” and “significant innovation” in the theory and practice of the formation of settlements and their systems, as well as to identify opportunities to highlight these innovations in the history of Russian urban planning. From these positions, six innovation waves in urban development over the past 300 years are investigated. The observed basic innovations in the domestic urban sphere show that urban development is a vital area for ensuring the country’s geopolitical security. Basic innovations are translated in time and modernized under new conditions of urban planning development. In this regard, we can predict the development of four basic innovations in post-Soviet Russia.

  5. Unrenormalizable theories can be predictive

    Kubo, J

    2003-01-01

    Unrenormalizable theories contain infinitely many free parameters. Considering these theories in terms of the Wilsonian renormalization group (RG), we suggest a method for removing this large ambiguity. Our basic assumption is the existence of a maximal ultraviolet cutoff in a cutoff theory, and we require that the theory be so fine tuned as to reach the maximal cutoff. The theory so obtained behaves as a local continuum theory to the shortest distance. In concrete examples of the scalar theory we find that at least in a certain approximation to the Wilsonian RG, this requirement enables us to make unique predictions in the infrared regime in terms of a finite number of independent parameters. Therefore, this method might provide a way for calculating quantum corrections in a low-energy effective theory of quantum gravity. (orig.)

  6. Performance Evaluation of Blind Tropospheric Delay correction ...

    lekky

    and Temperature 2 wet (GPT2w) models) for tropospheric delay correction, ... In practice, a user often employs a certain troposphere model based on the popularity ... comparisons between some of the models have been carried out in the past for .... prediction of meteorological parameter values, which are then used to ...

  7. Position Error Covariance Matrix Validation and Correction

    Frisbee, Joe, Jr.

    2016-01-01

    In order to calculate operationally accurate collision probabilities, the position error covariance matrices predicted at times of closest approach must be sufficiently accurate representations of the position uncertainties. This presentation will discuss why the Gaussian distribution is a reasonable expectation for the position uncertainty and how this assumed distribution type is used in the validation and correction of position error covariance matrices.

  8. Possibly Large Corrections to the Inflationary Observables

    Bartolo, N

    2008-01-01

    We point out that the theoretical predictions for the inflationary observables may be generically altered by the presence of fields which are heavier than the Hubble rate during inflation and whose dynamics is usually neglected. They introduce corrections which may be easily larger than both the second-order contributions in the slow-roll parameters and the accuracy expected in the forthcoming experiments.

  9. Robust Active Label Correction

    Kremer, Jan; Sha, Fei; Igel, Christian

    2018-01-01

    Active label correction addresses the problem of learning from input data for which noisy labels are available (e.g., from imprecise measurements or crowd-sourcing) and each true label can be obtained at a significant cost (e.g., through additional measurements or human experts). To minimize... To select labels for correction, we adopt the active learning strategy of maximizing the expected model change. We consider the change in regularized empirical risk functionals that use different pointwise loss functions for patterns with noisy and true labels, respectively. Different loss functions for the noisy data lead to different active label correction algorithms. If loss functions consider the label noise rates, these rates are estimated during learning, where importance weighting compensates for the sampling bias. We show empirically that viewing the true label as a latent variable and computing...

  10. Generalised Batho correction factor

    Siddon, R.L.

    1984-01-01

    There are various approximate algorithms available to calculate the radiation dose in the presence of a heterogeneous medium. The Webb and Fox product over layers formulation of the generalised Batho correction factor requires determination of the number of layers and the layer densities for each ray path. It has been shown that the Webb and Fox expression is inefficient for the heterogeneous medium which is expressed as regions of inhomogeneity rather than layers. The inefficiency of the layer formulation is identified as the repeated problem of determining for each ray path which inhomogeneity region corresponds to a particular layer. It has been shown that the formulation of the Batho correction factor as a product over inhomogeneity regions avoids that topological problem entirely. The formulation in terms of a product over regions simplifies the computer code and reduces the time required to calculate the Batho correction factor for the general heterogeneous medium. (U.K.)

  11. THE SECONDARY EXTINCTION CORRECTION

    Zachariasen, W. H.

    1963-03-15

    It is shown that Darwin's formula for the secondary extinction correction, which has been universally accepted and extensively used, contains an appreciable error in the x-ray diffraction case. The correct formula is derived. As a first-order correction for secondary extinction, Darwin showed that one should use an effective absorption coefficient μ + gQ, where an unpolarized incident beam is presumed. The new derivation shows that the effective absorption coefficient is μ + 2gQ(1 + cos^4(2θ))/(1 + cos^2(2θ))^2, which gives μ + gQ at θ = 0° and θ = 90°, but μ + 2gQ at θ = 45°. Darwin's theory remains valid when applied to neutron diffraction. (auth)

  12. Basic concepts of epidemiology

    Savitz, D.A.

    1984-01-01

    Epidemiology can be defined simply as the science of the distribution and determinants of disease in human populations. As a descriptive tool, epidemiology can aid health care service providers, for example, in allocation of resources. In its analytic capacity, the epidemiologic approach can help identify determinants of disease through the study of human populations. Epidemiology is primarily an observational rather than experimental methodology, with corresponding strengths and limitations. Relative to other approaches for assessing disease etiology and impacts of potential health hazards, epidemiology has a rather unique role that is complementary to, but independent of, both basic biologic sciences and clinical medicine. Experimental biologic sciences such as toxicology and physiology provide critical information on biologic mechanisms of disease required for causal inference. Clinical medicine often serves as the warning system that provides etiologic clues to be pursued through systematic investigation. The advantage of the epidemiologic approach is its reliance on human field experience, that is, the real world. While laboratory experimentation is uniquely well suited to defining potential hazards, it can neither determine whether human populations have actually been affected nor quantify that effect. Building all the complexities of human behavior and external factors into a laboratory study or mathematical model is impossible. By studying the world as it exists, epidemiology examines the integrated, summarized product of the myriad factors influencing health

  13. Basic operator theory

    Gohberg, Israel

    2001-01-01

    ... application of linear operators on a Hilbert space. We begin with a chapter on the geometry of Hilbert space and then proceed to the spectral theory of compact self-adjoint operators; operational calculus is next presented as a natural outgrowth of the spectral theory. The second part of the text concentrates on Banach spaces and linear operators acting on these spaces. It includes, for example, the three 'basic principles' of linear analysis and the Riesz-Fredholm theory of compact operators. Both parts contain plenty of applications. All chapters deal exclusively with linear problems, except for the last chapter which is an introduction to the theory of nonlinear operators. In addition to the standard topics in functional analysis, we have presented relatively recent results which appear, for example, in Chapter VII. In general, in writing this book, the authors were strongly influenced by recent developments in operator theory which affected the choice of topics, proofs and exercises. One ...

  14. Basics of aerothermodynamics

    Hirschel, Ernst Heinrich

    2015-01-01

    This successful book gives an introduction to the basics of aerothermodynamics, as applied in particular to winged re-entry vehicles and airbreathing hypersonic cruise and acceleration vehicles. The book gives a review of the issues of transport of momentum, energy and mass, real-gas effects as well as inviscid and viscous flow phenomena. In this second, revised edition the chapters with the classical topics of aerothermodynamics more or less were left untouched. The access to some single topics of practical interest was improved. Auxiliary chapters were put into an appendix. The recent successful flights of the X-43A and the X-51A indicate that the dawn of sustained airbreathing hypersonic flight now has arrived. This proves that the original approach of the book to put emphasis on viscous effects and the aerothermodynamics of radiation-cooled vehicle surfaces was timely. This second, revised edition even more accentuates these topics. A new, additional chapter treats examples of viscous thermal surface eff...

  15. Nanodesign: some basic questions

    Schommers, Wolfram

    2013-01-01

    There is no doubt that nanoscience will be the dominant direction for technology in this century, and that this science will influence our lives to a large extent as well as open completely new perspectives on all scientific and technological disciplines. To be able to produce optimal nanosystems with tailor-made properties, it is necessary to analyze and construct such systems in advance by adequate theoretical and computational methods. Since we work in nanoscience and nanotechnology at the ultimate level, we have to apply the basic laws of physics. What methods and tools are relevant here? The book gives an answer to this question. The background of the theoretical methods and tools is critically discussed, and also the world view on which these physical laws are based. Such a debate is not only of academic interest but is of highly general concern, and this is because we constantly move in nanoscience and nanotechnology between two extreme poles, between infinite life and total destruction. On the one ...

  16. Basic Data on Biogas

    NONE

    2012-07-01

    Renewable gases such as biogas and biomethane are considered key energy carriers as society replaces fossil fuels with renewable alternatives. In Sweden, almost 80 % of the fossil fuels are used in the transport sector. Therefore, the focus in Sweden has been to use the produced biogas in this sector as vehicle gas. Basic Data on Biogas contains an overview of production, utilisation, climate effects etc. of biogas from a Swedish perspective. The purpose is to give an easy overview of the current situation in Sweden for politicians, decision makers and the interested public. 1.4 TWh of biogas is produced annually in Sweden at approximately 230 facilities. The 135 wastewater treatment plants that produce biogas contribute around half of the production. In order to reduce the sludge volume, biogas has been produced at wastewater treatment plants for decades. New biogas plants are mainly co-digestion plants and farm plants. The landfilling of organic waste has been banned since 2005, thus the biogas produced in landfills is decreasing.

  17. Remediating Remediation: From Basic Writing to Writing across the Curriculum

    Faulkner, Melissa

    2013-01-01

    This article challenges faculty members and administrators to rethink current definitions of remediation. First year college students are increasingly placed into basic writing courses due to a perceived inability to use English grammar correctly, but it must be acknowledged that all students will encounter the need for remediation as they attempt…

  18. Basic Energy Sciences Program Update

    None, None

    2016-01-04

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) supports fundamental research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels to provide the foundations for new energy technologies and to support DOE missions in energy, environment, and national security. The research disciplines covered by BES—condensed matter and materials physics, chemistry, geosciences, and aspects of physical biosciences— are those that discover new materials and design new chemical processes. These disciplines touch virtually every aspect of energy resources, production, conversion, transmission, storage, efficiency, and waste mitigation. BES also plans, constructs, and operates world-class scientific user facilities that provide outstanding capabilities for imaging and spectroscopy, characterizing materials of all kinds ranging from hard metals to fragile biological samples, and studying the chemical transformation of matter. These facilities are used to correlate the microscopic structure of materials with their macroscopic properties and to study chemical processes. Such experiments provide critical insights to electronic, atomic, and molecular configurations, often at ultrasmall length and ultrafast time scales.

  19. Bryant J. correction formula

    Tejera R, A.; Cortes P, A.; Becerril V, A.

    1990-03-01

    For the practical application of the method proposed by J. Bryant, the authors carried out a series of small corrections, related to the background, the dead time of the detectors and channels, the resolving time of the coincidences, the accidental coincidences, the decay scheme, the gamma efficiency of the beta detector and the beta efficiency of the gamma detector. The calculation of the correction formula is presented in the development of the present report, with 25 combinations presented of the probability of the first state existing at the moment of one disintegration and the second state at the moment of the following disintegration. (Author)

  20. Model Correction Factor Method

    Christensen, Claus; Randrup-Thomsen, Søren; Morsing Johannesen, Johannes

    1997-01-01

    The model correction factor method is proposed as an alternative to traditional polynomial-based response surface techniques in structural reliability, considering a computationally time-consuming limit state procedure as a 'black box'. The class of polynomial functions is replaced by a limit... The advantage of the model correction factor method is that, in simpler form, not using gradient information on the original limit state function or only using this information once, a drastic reduction of the number of limit state evaluations is obtained together with good approximations of the reliability. Methods...

  1. Correction procedures for C-14 dates

    McKerrell, H.

    1975-01-01

    There are two quite separate criteria to satisfy before accepting as valid the corrections to C-14 dates which have been indicated for some years now by the bristlecone pine calibration. Firstly the correction figures have to be based upon all the available tree-ring data and derived in a manner that is mathematically sound, and secondly the correction figures have to produce accurate results on C-14 dates from archaeological test samples of known historical date, these covering as wide a period as possible. Neither of these basic prerequisites has yet been fully met. Thus the two-fold purpose of this paper is to bring together, and to compare with an independently based procedure, the various correction curves or tables that have been published up to Spring 1974, as well as to detail the correction results on reliable, historically dated Egyptian, Helladic and Minoan test samples from 3100 B.C. The nomenclature followed is strictly that adopted by the primary dating journal Radiocarbon, all C-14 dates quoted thus relate to the 5568 year half-life and the standard AD/BC system. (author)

  2. Attenuation correction for SPECT

    Hosoba, Minoru

    1986-01-01

    Attenuation correction is required for the reconstruction of a quantitative SPECT image. A new method for detecting body contours, which are important for the correction of tissue attenuation, is presented. The effect of body contours, detected by the newly developed method, on the reconstructed images was evaluated using various techniques for attenuation correction. The count rates in the specified region of interest in the phantom image by the Radial Post Correction (RPC) method, the Weighted Back Projection (WBP) method, Chang's method were strongly affected by the accuracy of the contours, as compared to those by Sorenson's method. To evaluate the effect of non-uniform attenuators on the cardiac SPECT, computer simulation experiments were performed using two types of models, the uniform attenuator model (UAM) and the non-uniform attenuator model (NUAM). The RPC method showed the lowest relative percent error (%ERROR) in UAM (11 %). However, 20 to 30 percent increase in %ERROR was observed for NUAM reconstructed with the RPC, WBP, and Chang's methods. Introducing an average attenuation coefficient (0.12/cm for Tc-99m and 0.14/cm for Tl-201) in the RPC method decreased %ERROR to the levels for UAM. Finally, a comparison between images, which were obtained by 180 deg and 360 deg scans and reconstructed from the RPC method, showed that the degree of the distortion of the contour of the simulated ventricles in the 180 deg scan was 15 % higher than that in the 360 deg scan. (Namekawa, K.)
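
    For intuition, a minimal sketch of a first-order Chang-type correction factor for a uniform attenuator inside a circular body contour (a toy geometry of my own, not the RPC or WBP implementations compared in the study):

        import numpy as np

        MU = 0.12    # average attenuation coefficient for Tc-99m, 1/cm (as above)
        R = 10.0     # assumed circular body-contour radius, cm

        def chang_factors(xs, ys, n_angles=64):
            """Correction factor 1 / mean_over_angles(exp(-MU * path to contour))."""
            thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
            out = np.zeros((len(ys), len(xs)))
            for iy, y in enumerate(ys):
                for ix, x in enumerate(xs):
                    if x * x + y * y >= R * R:
                        continue                 # outside the detected contour
                    pu = x * np.cos(thetas) + y * np.sin(thetas)
                    d = -pu + np.sqrt(pu**2 - (x * x + y * y) + R * R)
                    out[iy, ix] = 1.0 / np.mean(np.exp(-MU * d))
            return out

        grid = np.linspace(-9.0, 9.0, 19)
        C = chang_factors(grid, grid)
        print(C[9, 9])    # centre of the contour: exactly exp(MU * R), about 3.3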

  3. Text Induced Spelling Correction

    Reynaert, M.W.C.

    2004-01-01

    We present TISC, a language-independent and context-sensitive spelling checking and correction system designed to facilitate the automatic removal of non-word spelling errors in large corpora. Its lexicon is derived from a very large corpus of raw text, without supervision, and contains word

  4. Ballistic deficit correction

    Duchene, G.; Moszynski, M.; Curien, D.

    1991-01-01

    The EUROGAM data-acquisition has to handle a large number of events/s. Typical in-beam experiments using heavy-ion fusion reactions assume the production of about 50 000 compound nuclei per second deexciting via particle and γ-ray emissions. The very powerful γ-ray detection of EUROGAM is expected to produce high-fold event rates as large as 10^4 events/s. Such high count rates introduce, in a common dead time mode, large dead times for the whole system associated with the processing of the pulse, its digitization and its readout (from the preamplifier pulse up to the readout of the information). In order to minimize the dead time, the shaping time constant τ, usually about 3 μs for large volume Ge detectors, has to be reduced. Smaller shaping times, however, will adversely affect the energy resolution due to ballistic deficit. One possible solution is to operate the linear amplifier, with a somewhat smaller shaping time constant (in the present case we choose τ = 1.5 μs), in combination with a ballistic deficit compensator. The ballistic deficit can be corrected in different ways using a Gated Integrator, a hardware correction or even a software correction. In this paper we present a comparative study of the software and hardware corrections as well as gated integration

  5. Correctness of concurrent processes

    E.R. Olderog (Ernst-Rüdiger)

    1989-01-01

    A new notion of correctness for concurrent processes is introduced and investigated. It is a relationship P sat S between process terms P built up from operators of CCS [Mi 80], CSP [Ho 85] and COSY [LTS 79] and logical formulas S specifying sets of finite communication sequences as in

  6. Error Correcting Codes -34 ...

    information and coding theory. A large-scale relay computer had failed to deliver the expected results due to a hardware fault. Hamming, one of the active proponents of computer usage, was determined to find an efficient means by which computers could detect and correct their own faults. A mathematician by training.
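
    To make this concrete, a minimal sketch of the kind of single-error-correcting code Hamming went on to devise, the classic Hamming(7,4) code (the example is mine, not from the article):

        import numpy as np

        G = np.array([[1, 0, 0, 0, 0, 1, 1],     # generator matrix [I | P]
                      [0, 1, 0, 0, 1, 0, 1],
                      [0, 0, 1, 0, 1, 1, 0],
                      [0, 0, 0, 1, 1, 1, 1]])
        H = np.array([[0, 1, 1, 1, 1, 0, 0],     # parity-check matrix [P^T | I]
                      [1, 0, 1, 1, 0, 1, 0],
                      [1, 1, 0, 1, 0, 0, 1]])

        def encode(data):
            return data @ G % 2

        def correct(received):
            syndrome = H @ received % 2
            if syndrome.any():
                # the syndrome equals the column of H at the flipped position
                pos = np.where((H.T == syndrome).all(axis=1))[0][0]
                received = received.copy()
                received[pos] ^= 1
            return received

        word = encode(np.array([1, 0, 1, 1]))
        word[2] ^= 1                   # inject a single-bit fault
        print(correct(word))           # the fault is located and corrected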

  7. Measured attenuation correction methods

    Ostertag, H.; Kuebler, W.K.; Doll, J.; Lorenz, W.J.

    1989-01-01

    Accurate attenuation correction is a prerequisite for the determination of exact local radioactivity concentrations in positron emission tomography. Attenuation correction factors range from 4-5 in brain studies to 50-100 in whole body measurements. This report gives an overview of the different methods of determining the attenuation correction factors by transmission measurements using an external positron emitting source. The long-lived generator nuclide 68Ge/68Ga is commonly used for this purpose. The additional patient dose from the transmission source is usually a small fraction of the dose due to the subsequent emission measurement. Ring-shaped transmission sources as well as rotating point or line sources are employed in modern positron tomographs. By masking a rotating line or point source, random and scattered events in the transmission scans can be effectively suppressed. The problems of measured attenuation correction are discussed: Transmission/emission mismatch, random and scattered event contamination, counting statistics, transmission/emission scatter compensation, transmission scan after administration of activity to the patient. By using a double masking technique simultaneous emission and transmission scans become feasible. (orig.)
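
    A minimal numerical sketch of how the attenuation correction factors (ACFs) come out of a blank scan and a transmission scan (toy Poisson counts, not real scanner data):

        import numpy as np

        rng = np.random.default_rng(0)

        # simulated line-integral attenuation for 100 sinogram bins
        true_att = np.exp(-rng.uniform(1.5, 4.5, size=100))

        blank = rng.poisson(200.0, size=100)       # blank scan (no patient)
        trans = rng.poisson(200.0 * true_att)      # transmission scan (patient in)

        acf = blank / np.maximum(trans, 1)         # attenuation correction factors
        emission = rng.poisson(50.0, size=100)     # uncorrected emission counts
        corrected = emission * acf                 # attenuation-corrected data
        print(acf.min(), acf.max())                # roughly the 4-100 range quoted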

  8. Corrective Jaw Surgery

    ... their surgery, orthognathic surgery is performed to correct functional problems. Jaw Surgery can have a dramatic effect on many aspects of life. Following are some of the conditions that may ... front, or side Facial injury Birth defects Receding lower jaw and ...

  9. Error Correcting Codes

    successful consumer products of all time - the Compact Disc (CD) digital audio .... We can make ... only 2t additional parity check symbols are required, to be able to correct t .... display information (containing music related data and a table.

  10. Error Correcting Codes

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  11. Error Correcting Codes

    Resonance – Journal of Science Education, Volume 2, Issue 3, March .... Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  12. 10. Correctness of Programs

    Resonance – Journal of Science Education, Volume 3, Issue 4. Algorithms - Correctness of Programs. R K Shyamasundar. Series Article. Author affiliation: Computer Science Group, Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India.

  13. DOE-2 basics

    1991-08-01

    DOE-2 provides the building design and research communities with an up-to-date, unbiased, well-documented public-domain computer program for building energy analysis. DOE-2 predicts the hourly energy use and energy cost of a building given hourly weather information and a description of the building and its HVAC equipment and utility rate structure. DOE-2 is a portable FORTRAN program that can be used on a large variety of computers, including PC's. Using DOE-2, designers can determine the choice of building parameters that improve energy efficiency while maintaining thermal comfort. The purpose of DOE-2 is to aid in the analysis of energy usage in buildings; it is not intended to be the sole source of information relied upon for the design of buildings. The judgment and experience of the architect/engineer still remain the most important elements of building design.

  14. DOE-2 basics

    1991-08-01

    DOE-2 provides the building design and research communities with an up-to-date, unbiased, well-documented public-domain computer program for building energy analysis. DOE-2 predicts the hourly energy use and energy cost of a building given hourly weather information and a description of the building and its HVAC equipment and utility rate structure. DOE-2 is a portable FORTRAN program that can be used on a large variety of computers, including PC's. Using DOE-2, designers can determine the choice of building parameters that improve energy efficiency while maintaining thermal comfort. The purpose of DOE-2 is to aid in the analysis of energy usage in buildings; it is not intended to be the sole source of information relied upon for the design of buildings. The judgment and experience of the architect/engineer still remain the most important elements of building design.

  15. Corrected ROC analysis for misclassified binary outcomes.

    Zawistowski, Matthew; Sussman, Jeremy B; Hofer, Timothy P; Bentley, Douglas; Hayward, Rodney A; Wiitala, Wyndy L

    2017-06-15

    Creating accurate risk prediction models from Big Data resources such as Electronic Health Records (EHRs) is a critical step toward achieving precision medicine. A major challenge in developing these tools is accounting for imperfect aspects of EHR data, particularly the potential for misclassified outcomes. Misclassification, the swapping of case and control outcome labels, is well known to bias effect size estimates for regression prediction models. In this paper, we study the effect of misclassification on accuracy assessment for risk prediction models and find that it leads to bias in the area under the curve (AUC) metric from standard ROC analysis. The extent of the bias is determined by the false positive and false negative misclassification rates as well as disease prevalence. Notably, we show that simply correcting for misclassification while building the prediction model is not sufficient to remove the bias in AUC. We therefore introduce an intuitive misclassification-adjusted ROC procedure that accounts for uncertainty in observed outcomes and produces bias-corrected estimates of the true AUC. The method requires that misclassification rates are either known or can be estimated, quantities typically required for the modeling step. The computational simplicity of our method is a key advantage, making it ideal for efficiently comparing multiple prediction models on very large datasets. Finally, we apply the correction method to a hospitalization prediction model from a cohort of over 1 million patients from the Veterans Health Administration's EHR. Implementations of the ROC correction are provided for Stata and R. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
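
    A small simulation sketch of the bias and one mixture-based adjustment consistent with the description above (a simplified estimator of my own for the known-rates case, not necessarily the exact procedure of the paper):

        import numpy as np

        rng = np.random.default_rng(1)

        def auc(scores, labels):
            pos, neg = scores[labels == 1], scores[labels == 0]
            return (pos[:, None] > neg[None, :]).mean()   # rank-based AUC

        n, prev = 4000, 0.3
        y = (rng.random(n) < prev).astype(int)            # true outcomes
        scores = rng.normal(y.astype(float), 1.0)         # risk scores

        fn, fp = 0.2, 0.1                                 # misclassification rates
        flip = rng.random(n) < np.where(y == 1, fn, fp)
        y_obs = np.where(flip, 1 - y, y)                  # observed noisy labels

        # P(true case | observed case) and P(true control | observed control)
        w1 = prev * (1 - fn) / (prev * (1 - fn) + (1 - prev) * fp)
        w0 = (1 - prev) * (1 - fp) / ((1 - prev) * (1 - fp) + prev * fn)

        # the observed AUC mixes the true AUC A as:
        # a_obs = w1*w0*A + (w1*(1-w0) + (1-w1)*w0)/2 + (1-w1)*(1-w0)*(1-A)
        a_obs = auc(scores, y_obs)
        half = (w1 * (1 - w0) + (1 - w1) * w0) / 2.0
        a_corr = (a_obs - half - (1 - w1) * (1 - w0)) / (w1 * w0 - (1 - w1) * (1 - w0))

        print(auc(scores, y), a_obs, a_corr)    # true, biased, bias-corrected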

  16. A quantitative comparison of corrective and perfective maintenance

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  17. Coastal Zone Color Scanner atmospheric correction - Influence of El Chichon

    Gordon, Howard R.; Castano, Diego J.

    1988-01-01

    The addition of an El Chichon-like aerosol layer in the stratosphere is shown to have very little effect on the basic CZCS atmospheric correction algorithm. The additional stratospheric aerosol is found to increase the total radiance exiting the atmosphere, thereby increasing the probability that the sensor will saturate. It is suggested that in the absence of saturation the correction algorithm should perform as well as in the absence of the stratospheric layer.

  18. Visual Basic 2012 programmer's reference

    Stephens, Rod

    2012-01-01

    The comprehensive guide to Visual Basic 2012 Microsoft Visual Basic (VB) is the most popular programming language in the world, with millions of lines of code used in businesses and applications of all types and sizes. In this edition of the bestselling Wrox guide, Visual Basic expert Rod Stephens offers novice and experienced developers a comprehensive tutorial and reference to Visual Basic 2012. This latest edition introduces major changes to the Visual Studio development platform, including support for developing mobile applications that can take advantage of the Windows 8 operating system

  19. Intestinal Permeability: The Basics

    Ingvar Bjarnason

    1995-01-01

    The authors review some of the more fundamental principles underlying the noninvasive assessment of intestinal permeability in humans, the choice of test markers and their analyses, and the practical aspects of test dose composition and how these can be changed to allow the specific assessment of regional permeability changes and other intestinal functions. The implications of increased intestinal permeability in the pathogenesis of human disease are discussed in relation to findings in patients with Crohn’s disease. A common feature of increased intestinal permeability is the development of a low grade enteropathy, and while quantitatively similar changes may be found in Crohn’s disease these seem to predict relapse of disease. Moreover, factors associated with relapse of Crohn’s disease have in common an action to increase intestinal permeability. While increased intestinal permeability does not seem to be important in the etiology of Crohn’s disease it may be a central mechanism in the clinical relapse of disease.

  20. On the Limitations of Variational Bias Correction

    Moradi, Isaac; Mccarty, Will; Gelaro, Ronald

    2018-01-01

    Satellite radiances are the largest dataset assimilated into Numerical Weather Prediction (NWP) models; however, the data are subject to errors and uncertainties that need to be accounted for before assimilation into the NWP models. Variational bias correction uses the time series of observation minus background departures to estimate the observation bias. This technique does not distinguish between the background error, forward operator error, and observation error, so that all these errors are summed up together and counted as observation error. We identify some sources of observation errors (e.g., antenna emissivity, non-linearity in the calibration, and antenna pattern) and show the limitations of variational bias correction in estimating these errors.
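
    For intuition, a toy static analogue of the scheme: regress the O-B departures on a few air-mass predictors and use the fitted linear combination as the bias estimate (the operational variational scheme updates such coefficients inside the analysis itself; the predictors and numbers below are invented):

        import numpy as np

        rng = np.random.default_rng(2)

        n = 500
        predictors = np.column_stack([
            np.ones(n),               # constant offset
            rng.normal(size=n),       # stand-in for a layer-thickness predictor
            rng.normal(size=n),       # stand-in for a surface-temperature predictor
        ])
        beta_true = np.array([0.8, 0.3, -0.2])
        omb = predictors @ beta_true + rng.normal(scale=0.5, size=n)   # noisy O-B

        beta_hat, *_ = np.linalg.lstsq(predictors, omb, rcond=None)
        bias_estimate = predictors @ beta_hat    # subtracted from the observations
        print(beta_hat)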

  1. Infinite-degree-corrected stochastic block model

    Herlau, Tue; Schmidt, Mikkel Nørgaard; Mørup, Morten

    2014-01-01

    In stochastic block models, which are among the most prominent statistical models for cluster analysis of complex networks, clusters are defined as groups of nodes with statistically similar link probabilities within and between groups. A recent extension by Karrer and Newman [Karrer and Newman...... corrected stochastic block model as a nonparametric Bayesian model, incorporating a parameter to control the amount of degree correction that can then be inferred from data. Additionally, our formulation yields principled ways of inferring the number of groups as well as predicting missing links...
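
    For intuition, a minimal sketch that samples a network from a degree-corrected block model of the Karrer-Newman type (all parameter values are arbitrary):

        import numpy as np

        rng = np.random.default_rng(3)

        n, K = 60, 2
        z = rng.integers(K, size=n)              # group assignments
        theta = rng.gamma(2.0, 1.0, size=n)      # per-node degree parameters
        theta /= theta.mean()
        omega = np.array([[0.15, 0.02],          # within/between-group rates
                          [0.02, 0.12]])

        rate = theta[:, None] * theta[None, :] * omega[z][:, z]
        A = rng.random((n, n)) < np.minimum(rate, 1.0)
        A = np.triu(A, 1)                        # undirected, no self-loops
        A = A | A.T
        print(A.sum(axis=1)[:10])                # heterogeneous degrees within groups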

  2. Correction of refractive errors

    Vladimir Pfeifer

    2005-10-01

    Background: Spectacles and contact lenses are the most frequently used, the safest and the cheapest way to correct refractive errors. The development of keratorefractive surgery has brought new opportunities for correction of refractive errors in patients who need to be less dependent on spectacles or contact lenses. Until recently, RK was the most commonly performed refractive procedure for nearsighted patients. Conclusions: The introduction of the excimer laser in refractive surgery has given new opportunities for remodelling the cornea. The laser energy can be delivered on the stromal surface as in PRK, or deeper in the corneal stroma by means of lamellar surgery. In LASIK the flap is created with a microkeratome, in LASEK with ethanol, and in epi-LASIK the ultra-thin flap is created mechanically.

  3. PS Booster Orbit Correction

    Chanel, M; Rumolo, G; Tomás, R; CERN. Geneva. AB Department

    2008-01-01

    At the end of the 2007 run, orbit measurements were carried out in the 4 rings of the PS Booster (PSB) for different working points and beam energies. The aim of these measurements was to provide the necessary input data for a PSB realignment campaign during the 2007/2008 shutdown. Currently, only very few corrector magnets can be operated reliably in the PSB; therefore the orbit correction has to be achieved by displacing (horizontally and vertically) and/or tilting some of the defocusing quadrupoles (QDs). In this report we first describe the orbit measurements, followed by a detailed explanation of the orbit correction strategy. Results and conclusions are presented in the last section.

  4. Error-correction coding

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  5. Electroweak corrections to H->ZZ/WW->4 leptons

    Bredenstein, A.; Denner, A.; Dittmaier, S.; Weber, M.M.

    2006-01-01

    We provide predictions for the decays H->ZZ->4 leptons and H->WW->4 leptons including the complete electroweak O(α) corrections and improvements by higher-order final-state radiation and two-loop corrections proportional to $G_\mu^2 M_H^4$. The gauge-boson resonances are described in the complex-mass scheme. We find corrections at the level of 1-8% for the partial widths.

  6. Basic Energy Sciences at NREL

    Moon, S.

    2000-12-04

    NREL's Center for Basic Sciences performs fundamental research for DOE's Office of Science. Our mission is to provide fundamental knowledge in the basic sciences and engineering that will underpin new and improved renewable energy technologies.

  7. BASIC Instructional Program: System Documentation.

    Dageforde, Mary L.

    This report documents the BASIC Instructional Program (BIP), a "hands-on laboratory" that teaches elementary programming in the BASIC language, as implemented in the MAINSAIL language, a machine-independent revision of SAIL which should facilitate implementation of BIP on other computing systems. Eight instructional modules which make up…

  8. Solar Photovoltaic Technology Basics | NREL

    Solar cells, also called photovoltaic (PV) cells, convert sunlight directly into electricity. Scientists discovered that silicon (an element found in sand) created an electric charge when exposed to sunlight. Soon solar cells were being used to power space satellites and smaller items like calculators and watches.

  9. Solar Process Heat Basics | NREL

    Commercial and industrial buildings may use the same solar technologies (photovoltaics, passive heating, daylighting, and water heating) that are used for residential buildings. These nonresidential buildings can also use solar energy technologies that would be ...

  10. Basics of LASIK Eye Surgery

    The Basics of LASIK Eye Surgery ... Surgical Alternatives to LASIK ... LASIK Basics: If you wear glasses or contact lenses, ...

  11. Fuel Cell Vehicle Basics | NREL

    Researchers are developing fuel cells that can be ... [Photo: a prototype hydrogen fuel cell electric vehicle, a silver four-door sedan with "hydrogen fuel cell electric" across the front and rear doors.]

  12. Children and Their Basic Needs.

    Prince, Debra Lindsey; Howard, Esther M.

    2002-01-01

    Describes obstacles presented by poverty in the fulfillment of the basic needs of children. Individually addresses Maslow's five basic needs with regard to children reared in poverty: (1) physiological needs; (2) safety needs; (3) belonging and love needs; (4) self-esteem needs; and (5) self-actualization needs. (Author/SD)

  13. Accurately Detecting Students' Lies regarding Relational Aggression by Correctional Instructions

    Dickhauser, Oliver; Reinhard, Marc-Andre; Marksteiner, Tamara

    2012-01-01

    This study investigates the effect of correctional instructions when detecting lies about relational aggression. Based on models from the field of social psychology, we predict that correctional instruction will lead to a less pronounced lie bias and to more accurate lie detection. Seventy-five teachers received videotapes of students' true denial…

  14. Measurement and correction of accelerator optics

    Zimmerman, F.

    1998-06-01

    This report reviews procedures and techniques for measuring, correcting and controlling various optics parameters of an accelerator, including the betatron tune, beta function, betatron coupling, dispersion, chromaticity, momentum compaction factor, and beam orbit. The techniques described are not only indispensable for the basic set-up of an accelerator, but the same methods can also be used to study more esoteric questions such as dynamic aperture limitations or wakefield effects. The different procedures are illustrated by examples from several accelerators: storage rings as well as linacs and transport lines.

  15. Geometrical E-beam proximity correction for raster scan systems

    Belic, Nikola; Eisenmann, Hans; Hartmann, Hans; Waas, Thomas

    1999-04-01

    High pattern fidelity is a basic requirement for the generation of masks containing sub-micron structures and for direct writing. Increasing needs, mainly emerging from OPC at mask level and x-ray lithography, require a correction of the e-beam proximity effect. Most e-beam writers are raster scan systems. This paper describes a new method for geometrical pattern correction in order to provide a correction solution for e-beam systems that are not able to apply variable doses.

  16. Brain Image Motion Correction

    Jensen, Rasmus Ramsbøl; Benjaminsen, Claus; Larsen, Rasmus

    2015-01-01

    The application of motion tracking is wide, including: industrial production lines, motion interaction in gaming, computer-aided surgery and motion correction in medical brain imaging. Several devices for motion tracking exist using a variety of different methodologies. In order to use such devices...... offset and tracking noise in medical brain imaging. The data are generated from a phantom mounted on a rotary stage and have been collected using a Siemens High Resolution Research Tomograph for positron emission tomography. During acquisition the phantom was tracked with our latest tracking prototype...

  17. Basic entwinements: unassuming analogue inserts in basic digital modeling (courses)

    Wiesner, Thomas

    2012-01-01

    Ubiquitous, basic digital modelling tools are currently deployed with relative ease in architecture schools during the course of first-year studies. While these first architectural project essays sometimes communicate matter with already quite impressive professional outlooks, a certain disparit...

  18. Basic dynamics at a multiple resonance

    Ferraz-Mello, S.; Yokoyama, T.

    The problem of multiple resonance is dealt with as it occurs in Celestial Mechanics and in non-linear Mechanics. In perturbation theory small divisors occur as a consequence of the fact that the flows in the phase space of the real system and the flows in the phase space of the so-called undisturbed system are not homeomorphic at all. Whatever the perturbation technique we adopt, the first step is to correct the topology of the undisturbed flows. It is shown that at a multiple resonance we are led to dynamical systems that are generally non-integrable. The basic representatives of these systems are the n-pendulums $\ddot{\theta}_k = \sum_j A_{jk} \sin\theta_j$. Multiple resonances are classified as syndetic or asyndetic following the eigenvalues of a quadratic form. Some degenerate cases are also presented. (Author) [pt
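
    The system can be explored numerically; a sketch for n = 2 with a hypothetical coupling matrix A, integrating $\ddot{\theta}_k = \sum_j A_{jk} \sin\theta_j$ with SciPy:

```python
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-1.0, 0.2],
              [0.2, -1.0]])        # hypothetical coupling matrix A[j, k]

def rhs(t, y):
    theta, omega = y[:2], y[2:]
    # theta_k'' = sum_j A[j, k] * sin(theta_j)
    return np.concatenate([omega, np.sin(theta) @ A])

sol = solve_ivp(rhs, (0.0, 50.0), [0.5, 0.1, 0.0, 0.0], max_step=0.05)
```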

  19. Discipline and Grievance Procedures: Juvenile Detention and Correctional Facilities.

    Illinois Univ., Champaign. Community Research Center.

    The purpose of sound disciplinary practices and grievance procedures in juvenile detention and correctional facilities is outlined and a philosophy on discipline and grievance procedures is discussed. The use of secure confinement or restriction as a means of treatment, and the effects of restriction are considered. The basics of good discipline…

  20. Predictive Maintenance (PdM) Centralization for Significant Energy Savings

    Smith, Dale

    2010-09-15

    Cost-effective predictive maintenance (PdM) technologies and basic energy calculations can mine energy savings from processes or maintenance activities. Centralizing and packaging this information correctly empowers facility maintenance and reliability professionals to build financial justification and support for strategies and personnel to weather global economic downturns and competition. Attendees will learn how to: systematically build a 'pilot project' for applying PdM and tracking systems; break down a typical electrical bill to calculate energy savings; use return on investment (ROI) calculations to identify the best and highest value options, strategies and tips for substantiating energy reduction maintenance strategies.
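
    The energy-savings and ROI arithmetic referred to above reduces to a few lines; a toy version with placeholder numbers:

```python
# All figures are illustrative placeholders, not data from the talk.
kw_saved = 15.0               # demand reduction from a corrected fault
hours_per_year = 6000.0       # operating hours
rate_per_kwh = 0.10           # utility rate in $/kWh

annual_savings = kw_saved * hours_per_year * rate_per_kwh   # $9,000/yr
project_cost = 20000.0
roi = annual_savings / project_cost                         # 0.45 (45%/yr)
payback_years = project_cost / annual_savings               # ~2.2 years
```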

  1. Radiografias em inclinação lateral como fator preditivo da correção cirúrgica na escoliose idiopática do adolescente Bending radiographs as a predictive factor in surgical correction of adolescent idiopathic scoliosis

    Alberto Ofenhejm Gotfryd

    2011-10-01

    Full Text Available OBJECTIVE: To evaluate the use of active lateral bending radiographs in the supine position as a predictive factor for the surgical correction of the main thoracic curve in patients with adolescent idiopathic scoliosis (AIS). METHODS: Twenty patients with AIS of Lenke types 1A and 1B, operated by the posterior route using constructs with pedicle screws only, were evaluated clinically and radiographically. Curve flexibility was calculated from active supine lateral bending radiographs. The preoperative values for the main thoracic curve were entered into a mathematical equation proposed by Cheung et al., with the aim of predicting the expected angular result of the surgical correction. The difference between the predicted and actual postoperative results was then tested for statistical significance. RESULTS: There was statistical significance for all the cases studied with respect to the preoperatively predicted value and the immediate postoperative radiographic findings (p ...

  2. Dopamine reward prediction error coding

    Schultz, Wolfram

    2016-01-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less...
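
    The prediction error itself is a simple quantity; a minimal Rescorla-Wagner-style sketch (variable names are illustrative, not from the review):

```python
# delta = received reward - predicted reward; the value estimate then moves
# a fraction alpha toward the outcome.
def rw_update(value, reward, alpha=0.1):
    delta = reward - value   # >0: better than predicted, <0: worse, 0: fully predicted
    return value + alpha * delta, delta

value = 0.0
for reward in [1.0, 1.0, 0.0, 1.0]:
    value, delta = rw_update(value, reward)
    print(f"reward={reward} prediction_error={delta:+.3f} value={value:.3f}")
```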

  3. Water Column Correction for Coral Reef Studies by Remote Sensing

    Zoffoli, Maria Laura; Frouin, Robert; Kampel, Milton

    2014-01-01

    Human activity and natural climate trends constitute a major threat to coral reefs worldwide. Models predict a significant reduction in reef spatial extension together with a decline in biodiversity in the relatively near future. In this context, monitoring programs to detect changes in reef ecosystems are essential. In recent years, coral reef mapping using remote sensing data has benefited from instruments with better resolution and computational advances in storage and processing capabilities. However, the water column represents an additional complexity when extracting information from submerged substrates by remote sensing that demands a correction of its effect. In this article, the basic concepts of bottom substrate remote sensing and water column interference are presented. A compendium of methodologies developed to reduce water column effects in coral ecosystems studied by remote sensing that include their salient features, advantages and drawbacks is provided. Finally, algorithms to retrieve the bottom reflectance are applied to simulated data and actual remote sensing imagery and their performance is compared. The available methods are not able to completely eliminate the water column effect, but they can minimize its influence. Choosing the best method depends on the marine environment, available input data and desired outcome or scientific application. PMID:25215941
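
    As a concrete illustration of the correction problem (not one of the specific algorithms compared in the article), a single-band, two-flow (Beer-Lambert) model can be inverted for bottom reflectance when depth and attenuation are known; all coefficients below are hypothetical:

```python
import numpy as np

# Two-flow model: R = R_deep + (R_bottom - R_deep) * exp(-2 * K * z).
# Inverting for the bottom reflectance R_bottom:
def bottom_reflectance(R, R_deep, K, z):
    return R_deep + (R - R_deep) * np.exp(2.0 * K * z)

R_obs, R_deep = 0.05, 0.02   # observed and optically-deep reflectance
K, z = 0.1, 5.0              # diffuse attenuation [1/m], depth [m]
print(bottom_reflectance(R_obs, R_deep, K, z))  # ~0.102
```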

  4. Water Column Correction for Coral Reef Studies by Remote Sensing

    Maria Laura Zoffoli

    2014-09-01

    Full Text Available Human activity and natural climate trends constitute a major threat to coral reefs worldwide. Models predict a significant reduction in reef spatial extension together with a decline in biodiversity in the relatively near future. In this context, monitoring programs to detect changes in reef ecosystems are essential. In recent years, coral reef mapping using remote sensing data has benefited from instruments with better resolution and computational advances in storage and processing capabilities. However, the water column represents an additional complexity when extracting information from submerged substrates by remote sensing that demands a correction of its effect. In this article, the basic concepts of bottom substrate remote sensing and water column interference are presented. A compendium of methodologies developed to reduce water column effects in coral ecosystems studied by remote sensing that include their salient features, advantages and drawbacks is provided. Finally, algorithms to retrieve the bottom reflectance are applied to simulated data and actual remote sensing imagery and their performance is compared. The available methods are not able to completely eliminate the water column effect, but they can minimize its influence. Choosing the best method depends on the marine environment, available input data and desired outcome or scientific application.

  5. Water column correction for coral reef studies by remote sensing.

    Zoffoli, Maria Laura; Frouin, Robert; Kampel, Milton

    2014-09-11

    Human activity and natural climate trends constitute a major threat to coral reefs worldwide. Models predict a significant reduction in reef spatial extension together with a decline in biodiversity in the relatively near future. In this context, monitoring programs to detect changes in reef ecosystems are essential. In recent years, coral reef mapping using remote sensing data has benefited from instruments with better resolution and computational advances in storage and processing capabilities. However, the water column represents an additional complexity when extracting information from submerged substrates by remote sensing that demands a correction of its effect. In this article, the basic concepts of bottom substrate remote sensing and water column interference are presented. A compendium of methodologies developed to reduce water column effects in coral ecosystems studied by remote sensing that include their salient features, advantages and drawbacks is provided. Finally, algorithms to retrieve the bottom reflectance are applied to simulated data and actual remote sensing imagery and their performance is compared. The available methods are not able to completely eliminate the water column effect, but they can minimize its influence. Choosing the best method depends on the marine environment, available input data and desired outcome or scientific application.

  6. Revisiting the Operating Room Basics

    Tushar Chakravorty

    2015-12-01

    Full Text Available Young doctors walking into the operating room are eager to develop their skills to become efficient and knowledgeable professionals in future. But precious little is done to actively develop the basic practical skills of the budding doctors. They remain unaware of the layout of the operating room and of OR etiquette, and often lack a sound scientific understanding of the basic operating room protocols and of the importance of their meticulous execution. This article stresses the need to teach the basics of OR protocol and to improve the confidence of the young doctor by strengthening his foundation, showing him that attention to the basics of medical care and empathy for the patient can really make a difference to the outcome of a treatment.

  7. New Federalism: Back to Basics.

    Durenberger, Dave

    1983-01-01

    The senator explains the basic concepts of New Federalism, including a rethinking of responsibilities and intergovernmental relations and a reconsideration of the role of state and local government. (SK)

  8. Basic statements of relativity theory

    Wolfgang Muschik

    2010-04-01

    Full Text Available Some basic statements of relativity theory, starting out with geometry and observers up to Einstein's field equations, are collected in systematic order without any proof, to serve as a short survey of tools and results.

  9. Dental Health: The Basic Facts

    Dental Health: The Basic Facts (Multiple Sclerosis). People with a chronic disease may neglect their general health and wellness, research shows. Dental care is no exception. A tendency to focus ...

  10. Basic principles of concrete structures

    Gu, Xianglin; Zhou, Yong

    2016-01-01

    Based on the latest versions of the design codes for both buildings and bridges (GB50010-2010 and JTG D62-2004), this book starts from steel and concrete materials, whose properties are very important to the mechanical behavior of concrete structural members. Step by step, analyses of reinforced and prestressed concrete members under basic loading types (tension, compression, flexure, shearing and torsion) and environmental actions are introduced. The characteristic that distinguishes this book from other textbooks on concrete structures is its greater emphasis on the basic theories of reinforced concrete and on the application of those theories to the design of new structures and the analysis of existing ones. Examples and problems in each chapter are carefully designed to cover every important knowledge point. As a basic course for undergraduates majoring in civil engineering, this course is different from both the previously learnt mechanics courses and the design courses to be learnt. Compa...

  11. Transforming Defense Basic Research Strategy

    Fountain, Augustus W

    2004-01-01

    ... technologies for development. With a basic research budget less than half that of the National Science Foundation, and a mere fraction of that of the NIH, the DoD can no longer afford to pursue lofty science education goals...

  12. Transforming Defense Basic Research Strategy

    Fountain, Augustus W

    2004-01-01

    .... Public funding of basic research for the DoD during the Cold War was successful because it minimized risk through taking maximum advantage of long term research projects that produced rather mature...

  13. Basic hypergeometry of supersymmetric dualities

    Gahramanov, Ilmar, E-mail: ilmar.gahramanov@aei.mpg.de [Max Planck Institute for Gravitational Physics (Albert Einstein Institute), Am Mühlenberg 1, D14476 Potsdam (Germany); Institut für Physik und IRIS Adlershof, Humboldt-Universität zu Berlin, Zum Grossen Windkanal 6, D12489 Berlin (Germany); Institute of Radiation Problems ANAS, B.Vahabzade 9, AZ1143 Baku (Azerbaijan); Department of Mathematics, Khazar University, Mehseti St. 41, AZ1096, Baku (Azerbaijan); Rosengren, Hjalmar, E-mail: hjalmar@chalmers.se [Department of Mathematical Sciences, Chalmers University of Technology and University of Gothenburg, SE-412 96 Göteborg (Sweden)

    2016-12-15

    We introduce several new identities combining basic hypergeometric sums and integrals. Such identities appear in the context of superconformal index computations for three-dimensional supersymmetric dual theories. We give both analytic proofs and physical interpretations of the presented identities.

  14. Basic HIV/AIDS Statistics

    HIV/AIDS Basic Statistics: HIV and ... HIV. Interested in learning more about CDC's HIV statistics? Terms, Definitions, and Calculations Used in CDC HIV ...

  15. Influences of misprediction costs on solar flare prediction

    Huang, Xin; Wang, HuaNing; Dai, XingHua

    2012-10-01

    The costs of mispredicting flaring and non-flaring samples differ between applications of solar flare prediction. Hence, solar flare prediction is considered a cost-sensitive problem. A cost-sensitive solar flare prediction model is built by modifying the basic decision tree algorithm. The inconsistency rate, with an exhaustive search strategy, is used to determine the optimal combination of magnetic field parameters in an active region. These selected parameters are applied as the inputs of the solar flare prediction model. The performance of the cost-sensitive solar flare prediction model is evaluated for different thresholds of solar flares. It is found that, as the cost of wrongly predicting flaring samples as non-flaring increases, more flaring samples are correctly predicted and more non-flaring samples are wrongly predicted; higher thresholds of solar flares require a larger cost for this misclassification. This can be considered a guideline for choosing a proper cost to meet the requirements of different applications.
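
    The cost-sensitivity idea can be approximated with class weights in an off-the-shelf decision tree; the paper modifies the tree algorithm itself, so the following scikit-learn sketch with synthetic data is only an analogue:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))   # stand-ins for selected magnetic field parameters
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

# Penalize predicting a flaring sample (class 1) as non-flaring 5x more heavily.
cost_of_missed_flare = 5.0
clf = DecisionTreeClassifier(class_weight={0: 1.0, 1: cost_of_missed_flare})
clf.fit(X, y)
print("training accuracy:", (clf.predict(X) == y).mean())
```

    Raising the class-1 weight shifts the tree toward catching more flares at the price of more false alarms, which is the trade-off described above.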

  16. Basic Testing of the DUCHAMP Source Finder

    Westmeier, T.; Popping, A.; Serra, P.

    2012-01-01

    This paper presents and discusses the results of basic source finding tests in three dimensions (using spectroscopic data cubes) with DUCHAMP, the standard source finder for the Australian Square Kilometre Array Pathfinder. For this purpose, we generated different sets of unresolved and extended HI model sources. These models were then fed into DUCHAMP, using a range of different parameters and methods provided by the software. The main aim of the tests was to study the performance of DUCHAMP on sources with different parameters and morphologies and assess the accuracy of DUCHAMP's source parametrisation. Overall, we find DUCHAMP to be a powerful source finder capable of reliably detecting sources down to low signal-to-noise ratios and accurately measuring their position and velocity. In the presence of noise in the data, DUCHAMP's measurements of basic source parameters, such as spectral line width and integrated flux, are affected by systematic errors. These errors are a consequence of the effect of noise on the specific algorithms used by DUCHAMP for measuring source parameters in combination with the fact that the software only takes into account pixels above a given flux threshold and hence misses part of the flux. In scientific applications of DUCHAMP these systematic errors would have to be corrected for. Alternatively, DUCHAMP could be used as a source finder only, and source parametrisation could be done in a second step using more sophisticated parametrisation algorithms.

  17. Pressure ulcers: Back to the basics.

    Agrawal, Karoon; Chauhan, Neha

    2012-05-01

    Pressure ulcer in an otherwise sick patient is a matter of concern for the care givers as well as the medical personnel. A lot has been done to understand the disease process. So much so that USA and European countries have established advisory panels in their respective continents. Since the establishment of these organizations, the understanding of the pressure ulcer has improved significantly. The authors feel that the well documented and well publicized definition of pressure ulcer is somewhat lacking in the correct description of the disease process. Hence, a modified definition has been presented. This disease is here to stay. In the process of managing these ulcers the basic pathology needs to be understood well. Pressure ischemia is the main reason behind the occurrence of ulceration. Different extrinsic and intrinsic factors have been described in detail with review of literature. There are a large number of risk factors causing ulceration. The risk assessment scales have eluded the surgical literature and mostly remained in nursing books and websites. These scales have been reproduced for completion of the basics on decubitus ulcer. The classification of the pressure sores has been given in a comparative form to elucidate that most of the classifications are the same except for minor variations. The management of these ulcers is ever evolving but the age old saying of "prevention is better than cure" suits this condition the most.

  18. Basic self-disturbance, neurocognition and metacognition

    Koren, Dan; Scheyer, Ravit; Reznik, Noa

    2017-01-01

    AIM: The goal of this pilot study was to assess the association between basic self-disturbance (SD) and deficits in neurocognitive and metacognitive functioning among help-seeking adolescents with and without attenuated psychosis syndrome (APS). METHODS: Sixty-one non-psychotic, help-seeking adolescents ...... recognition) domains. After each answer, subjects were also requested to indicate their level of confidence in the answer and to decide whether they desired it to be "counted" toward their total score on the task. Each volunteered answer earned a 5-cent gain if correct, but an equal fine if wrong. RESULTS: ......, it was not moderated by the presence of APS. CONCLUSIONS: These pilot results provide preliminary support for a modest association between SD and metacognition, which is not reducible to neurocognition and APS. In addition, they raise an intriguing possibility regarding metacognitive monitoring and control being...

  19. Pressure ulcers: Back to the basics

    Karoon Agrawal

    2012-01-01

    Full Text Available Pressure ulcer in an otherwise sick patient is a matter of concern for the care givers as well as the medical personnel. A lot has been done to understand the disease process. So much so that USA and European countries have established advisory panels in their respective continents. Since the establishment of these organizations, the understanding of the pressure ulcer has improved significantly. The authors feel that the well documented and well publicized definition of pressure ulcer is somewhat lacking in the correct description of the disease process. Hence, a modified definition has been presented. This disease is here to stay. In the process of managing these ulcers the basic pathology needs to be understood well. Pressure ischemia is the main reason behind the occurrence of ulceration. Different extrinsic and intrinsic factors have been described in detail with review of literature. There are a large number of risk factors causing ulceration. The risk assessment scales have eluded the surgical literature and mostly remained in nursing books and websites. These scales have been reproduced for completion of the basics on decubitus ulcer. The classification of the pressure sores has been given in a comparative form to elucidate that most of the classifications are the same except for minor variations. The management of these ulcers is ever evolving but the age old saying of "prevention is better than cure" suits this condition the most.

  20. Pressure ulcers: Back to the basics

    Agrawal, Karoon; Chauhan, Neha

    2012-01-01

    Pressure ulcer in an otherwise sick patient is a matter of concern for the care givers as well as the medical personnel. A lot has been done to understand the disease process. So much so that USA and European countries have established advisory panels in their respective continents. Since the establishment of these organizations, the understanding of the pressure ulcer has improved significantly. The authors feel that the well documented and well publicized definition of pressure ulcer is somewhat lacking in the correct description of the disease process. Hence, a modified definition has been presented. This disease is here to stay. In the process of managing these ulcers the basic pathology needs to be understood well. Pressure ischemia is the main reason behind the occurrence of ulceration. Different extrinsic and intrinsic factors have been described in detail with review of literature. There are a large number of risk factors causing ulceration. The risk assessment scales have eluded the surgical literature and mostly remained in nursing books and websites. These scales have been reproduced for completion of the basics on decubitus ulcer. The classification of the pressure sores has been given in a comparative form to elucidate that most of the classifications are the same except for minor variations. The management of these ulcers is ever evolving but the age old saying of “prevention is better than cure” suits this condition the most. PMID:23162223

  1. Basic petroleum research. Final report

    Roesjoe, Bjarne; Stiksrud, Helge

    2004-01-01

    An overview of projects in the field of basic petroleum research (PetroForsk) is presented. A brief presentation of some of the projects is included, as well as political comments on the value of these projects. The research program Basic Petroleum Research (PetroForsk) was established in 1998 and ended in 2004. The program has been part of the Research Council of Norway's long-term effort in petroleum research (ml)

  2. Distribution load forecast with interactive correction of horizon loads

    Glamochanin, V.; Andonov, D.; Gagovski, I.

    1994-01-01

    This paper presents an interactive distribution load forecast application that performs distribution load forecasting with interactive correction of horizon loads. It consists of two major parts, implemented in Fortran and Visual Basic. The Fortran part is used for the forecast computations and comprises two methods: Load Transfer Coupling Curve Fitting (LTCCF) and Load Forecast Using Curve Shape Clustering (FUCSC). LTCCF is used to 'correct' data contaminated by load transfer among neighboring distribution areas. FUCSC uses curve shape clustering to forecast the distribution loads of small areas; the forecast for each small area is obtained from the shape of the corresponding cluster curve. The comparison of the forecasted loads of an area with historical data is used as a tool for correcting the estimated horizon load. The Visual Basic part provides a flexible, interactive, user-friendly environment. (author). 5 refs., 3 figs
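
    A rough sketch of the curve-shape-clustering step behind FUCSC, with synthetic load histories normalized so that only curve shape matters (the actual FUCSC algorithm is not reproduced here):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
histories = rng.gamma(5.0, 1.0, size=(200, 10))            # 200 small areas, 10 years of load
shapes = histories / histories.sum(axis=1, keepdims=True)  # normalize out magnitude

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(shapes)
cluster_curves = km.cluster_centers_   # representative growth-curve shapes
area_cluster = km.labels_              # each area forecasts with its cluster's shape
```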

  3. Study of tip loss corrections using CFD rotor computations

    Shen, Wen Zhong; Zhu, Wei Jun; Sørensen, Jens Nørkær

    2014-01-01

    Tip loss correction is known to play an important role in engineering predictions of wind turbine performance. There are two different types of tip loss corrections: tip corrections on momentum theory and tip corrections on airfoil data. In this paper, we study the latter using detailed CFD...... computations for wind turbines with sharp tips. Using the technique of determination of angle of attack and the CFD results for a NordTank 500 kW rotor, airfoil data are extracted and a new tip loss function on airfoil data is derived. To validate, BEM computations with the new tip loss function are carried out...... and compared with CFD results for the NordTank 500 kW turbine and the NREL 5 MW turbine. Comparisons show that BEM with the new tip loss function correctly predicts the loading near the blade tip....
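
    For orientation, the classical Prandtl/Glauert tip loss factor used in momentum-theory corrections is given below; the new tip loss function on airfoil data derived in the paper is not reproduced:

```python
import numpy as np

def prandtl_tip_loss(B, r, R, phi):
    """B: number of blades, r: local radius, R: tip radius, phi: flow angle [rad]."""
    f = B * (R - r) / (2.0 * r * np.sin(phi))
    return (2.0 / np.pi) * np.arccos(np.exp(-f))

# Example: 3-bladed rotor of 40 m radius at 95% span with a 5-degree flow angle.
print(prandtl_tip_loss(B=3, r=0.95 * 40.0, R=40.0, phi=np.deg2rad(5.0)))
```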

  4. Development of the Heated Length Correction Factor

    Park, Ho-Young; Kim, Kang-Hoon; Nahm, Kee-Yil; Jung, Yil-Sup; Park, Eung-Jun

    2008-01-01

    The Critical Heat Flux (CHF) on a nuclear fuel rod is a function of the flow channel geometry and the flow conditions. According to the choice of explanatory variables, there are three hypotheses to explain CHF on a uniformly heated vertical rod (the inlet condition, exit condition, and local condition hypotheses). Under the inlet condition hypothesis, CHF is characterized as a function of system pressure, rod diameter, rod length, mass flow and inlet subcooling. Under the exit condition hypothesis, exit quality substitutes for inlet subcooling. Generally, the heated length effect on CHF in the exit condition hypothesis is smaller than that of the other variables. Heated length is usually excluded from the local condition hypothesis, which describes CHF with only local fluid conditions. Most commercial plants currently use empirical CHF correlations based on the local condition hypothesis. An empirical CHF correlation is developed by fitting the selected sensitive local variables to CHF test data using multiple non-linear regression. Because this kind of method cannot convey physical meaning, it is difficult to reflect the proper effect of complex geometry. Hence, the recent CHF correlation development strategy of nuclear fuel vendors is to first build a basic CHF correlation consisting of basic flow variables (local fluid conditions) and then to compensate with additional geometrical correction factors. Because the functional forms of the correction factors are determined separately from independent test data representing the corresponding geometry, they can be applied directly to other CHF correlations with only minor coefficient modification.

  5. Truncation correction for oblique filtering lines

    Hoppe, Stefan; Hornegger, Joachim; Lauritsch, Guenter; Dennerlein, Frank; Noo, Frederic

    2008-01-01

    State-of-the-art filtered backprojection (FBP) algorithms often define the filtering operation to be performed along oblique filtering lines in the detector. A limited scan field of view leads to the truncation of those filtering lines, which causes artifacts in the final reconstructed volume. In contrast to the case where filtering is performed solely along the detector rows, no methods are available for the case of oblique filtering lines. In this work, the authors present two novel truncation correction methods which effectively handle data truncation in this case. Method 1 (basic approach) handles data truncation in two successive preprocessing steps by applying a hybrid data extrapolation method, which is a combination of a water cylinder extrapolation and a Gaussian extrapolation. It is independent of any specific reconstruction algorithm. Method 2 (kink approach) uses similar concepts for data extrapolation as the basic approach but needs to be integrated into the reconstruction algorithm. Experiments are presented from simulated data of the FORBILD head phantom, acquired along a partial-circle-plus-arc trajectory. The theoretically exact M-line algorithm is used for reconstruction. Although the discussion is focused on theoretically exact algorithms, the proposed truncation correction methods can be applied to any FBP algorithm that exposes oblique filtering lines.
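
    A one-dimensional sketch of the extrapolation idea, extending a truncated profile with a smooth Gaussian tail so the filtering step sees no hard cutoff (this illustrates only one ingredient of the authors' hybrid water-cylinder-plus-Gaussian method):

```python
import numpy as np

def gaussian_extrapolate(profile, pad, sigma=20.0):
    """Append a Gaussian decay from the last measured value down toward zero."""
    edge = profile[-1]
    tail = edge * np.exp(-0.5 * (np.arange(1, pad + 1) / sigma) ** 2)
    return np.concatenate([profile, tail])

row = np.linspace(1.0, 0.6, 128)             # synthetic truncated filtering line
extended = gaussian_extrapolate(row, pad=64)
```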

  6. RCRA corrective action and closure

    1995-02-01

    This information brief explains how RCRA corrective action and closure processes affect one another. It examines the similarities and differences between corrective action and closure, regulators' interests in RCRA facilities undergoing closure, and how the need to perform corrective action affects the closure of DOE's permitted facilities and interim status facilities

  7. Rethinking political correctness.

    Ely, Robin J; Meyerson, Debra E; Davidson, Martin N

    2006-09-01

    Legal and cultural changes over the past 40 years ushered unprecedented numbers of women and people of color into companies' professional ranks. Laws now protect these traditionally underrepresented groups from blatant forms of discrimination in hiring and promotion. Meanwhile, political correctness has reset the standards for civility and respect in people's day-to-day interactions. Despite this obvious progress, the authors' research has shown that political correctness is a double-edged sword. While it has helped many employees feel unlimited by their race, gender, or religion,the PC rule book can hinder people's ability to develop effective relationships across race, gender, and religious lines. Companies need to equip workers with skills--not rules--for building these relationships. The authors offer the following five principles for healthy resolution of the tensions that commonly arise over difference: Pause to short-circuit the emotion and reflect; connect with others, affirming the importance of relationships; question yourself to identify blind spots and discover what makes you defensive; get genuine support that helps you gain a broader perspective; and shift your mind-set from one that says, "You need to change," to one that asks, "What can I change?" When people treat their cultural differences--and related conflicts and tensions--as opportunities to gain a more accurate view of themselves, one another, and the situation, trust builds and relationships become stronger. Leaders should put aside the PC rule book and instead model and encourage risk taking in the service of building the organization's relational capacity. The benefits will reverberate through every dimension of the company's work.

  8. Basic considerations in predicting error probabilities in human task performance

    Fleishman, E.A.; Buffardi, L.C.; Allen, J.A.; Gaskins, R.C. III

    1990-04-01

    It is well established that human error plays a major role in the malfunctioning of complex systems. This report takes a broad look at the study of human error and addresses the conceptual, methodological, and measurement issues involved in defining and describing errors in complex systems. In addition, a review of existing sources of human reliability data and approaches to human performance data base development is presented. Alternative task taxonomies, which are promising for establishing the comparability on nuclear and non-nuclear tasks, are also identified. Based on such taxonomic schemes, various data base prototypes for generalizing human error rates across settings are proposed. 60 refs., 3 figs., 7 tabs

  9. Basic notes on statistics for probabilistic service life predictions

    Siemes, A.J.M.

    1999-01-01

    Everybody has some notion of what is meant by the term probability. Answers like 'probably yes' or 'probably no' are part of daily language. The word 'probably' indicates that we are not completely certain about the present status or future development of the item under discussion: two or more

  10. Temporal Deductive Verification of Basic ASM Models

    Daho, Hocine El-Habib; University of Oran; Benhamamouch, Djillali; University of Oran

    2010-01-01

    Abstract State Machines (ASMs, for short) provide a practical new computational model which has been applied in the area of software engineering for systems design and analysis. However, reasoning about ASM models occurs not within a formal deductive system, but basically in the classical informal proof style of mathematics. Several formal verification approaches for proving the correctness of ASM models have been investigated. In this paper we consider the use of the TLA+ logic for the deductive...

  11. A basic introduction to statistics for the orthopaedic surgeon.

    Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef

    2012-02-01

    Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.
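
    As one worked example of how these concepts interact, the sample size needed to keep both error rates in check can be computed with a standard power analysis (here with statsmodels, assuming a two-sample t-test and a medium effect size; the numbers are illustrative):

```python
from statsmodels.stats.power import TTestIndPower

# Type I error rate alpha = 0.05, Type II error rate beta = 0.20 (power = 0.80),
# standardized effect size (Cohen's d) = 0.5.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(round(n_per_group))   # about 64 patients per group
```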

  12. Structure-based design of ligands for protein basic domains: Application to the HIV-1 Tat protein

    Filikov, Anton V.; James, Thomas L.

    1998-05-01

    A methodology has been developed for designing ligands to bind a flexible basic protein domain where the structure of the domain is essentially known. It is based on an empirical binding free energy function developed for highly charged complexes and on Monte Carlo simulations in internal coordinates with both the ligand and the receptor being flexible. HIV-1 encodes a transactivating regulatory protein called Tat. Binding of the basic domain of Tat to TAR RNA is required for efficient transcription of the viral genome. The structure of a biologically active peptide containing the Tat basic RNA-binding domain is available from NMR studies. The goal of the current project is to design a ligand which will bind to that basic domain and potentially inhibit the TAR-Tat interaction. The basic domain contains six arginine and two lysine residues. Our strategy was to design a ligand for arginine first and then a superligand for the basic domain by joining arginine ligands with a linker. Several possible arginine ligands were obtained by searching the Available Chemicals Directory with DOCK 3.5 software. Phytic acid, which can potentially bind multiple arginines, was chosen as a building block for the superligand. Calorimetric binding studies of several compounds to methylguanidine and Arg-/Lys-containing peptides were performed. The data were used to develop an empirical binding free energy function for prediction of affinity of the ligands for the Tat basic domain. Modeling of the conformations of the complexes with both the superligand and the basic domain being flexible has been carried out via Biased Probability Monte Carlo (BPMC) simulations in internal coordinates (ICM 2.6 suite of programs). The simulations used parameters to ensure correct folding, i.e., consistent with the experimental NMR structure of a 25-residue Tat peptide, from a random starting conformation. Superligands for the basic domain were designed by joining together two molecules of phytic acid with

  13. Color correction optimization with hue regularization

    Zhang, Heng; Liu, Huaping; Quan, Shuxue

    2011-01-01

    Previous work has suggested that observers are capable of judging the quality of an image without any knowledge of the original scene. When no reference is available, observers can extract the apparent objects in an image and compare them with the typical colors of similar objects recalled from their memories. Some generally agreed upon research results indicate that although perfect colorimetric rendering is not conspicuous and color errors can be well tolerated, the appropriate rendition of certain memory colors such as skin, grass, and sky is an important factor in the overall perceived image quality. These colors are appreciated in a fairly consistent manner and are memorized with slightly different hues and higher color saturation. The aim of color correction for a digital color pipeline is to transform the image data from a device-dependent color space to a target color space, usually through a color correction matrix which in its most basic form is optimized through linear regression between the two sets of data in the two color spaces, in the sense of minimized Euclidean color error. Unfortunately, this method can result in objectionable distortions if the color error biases certain colors undesirably. In this paper, we propose a color correction optimization method with preferred color reproduction in mind, through hue regularization, and present some experimental results.
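
    A minimal sketch of that basic form, fitting a 3x3 color correction matrix by least squares between corresponding patches in the two color spaces (the patch data below are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(4)
device_rgb = rng.uniform(size=(24, 3))      # e.g. a 24-patch chart, device space
M_true = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.0, 0.1, 0.9]])
target = device_rgb @ M_true.T              # corresponding target-space values

# Least-squares fit minimizing Euclidean color error over the patch set.
M_fit, *_ = np.linalg.lstsq(device_rgb, target, rcond=None)
corrected = device_rgb @ M_fit
print(np.abs(corrected - target).max())
```

    Hue regularization, as proposed in the paper, constrains hue shifts (for example in memory colors) during this optimization; its specific form is not shown here.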

  14. Japan's new basic energy plan

    Duffield, John S.; Woodall, Brian

    2011-01-01

    In June 2010, the Japanese cabinet adopted a new Basic Energy Plan (BEP). This was the third such plan that the government has approved since the passage of the Basic Act on Energy Policy in 2002, and it represents the most significant statement of Japanese energy policy in more than four years, since the publication of the New National Energy Strategy (NNES) in 2006. Perhaps more than its predecessors, moreover, the new plan establishes a number of ambitious targets as well as more detailed measures for achieving those targets. Among the targets are a doubling of Japan's 'energy independence ratio,' a doubling of the percentage of electricity generated by renewable sources and nuclear power, and a 30 percent reduction in energy-related CO 2 emissions, all by 2030. This paper explains the origins of the 2010 BEP and why it was adopted. It then describes the content of the plan and how it differs from the NNES. A third section analyzes the appropriateness of the new goals and targets contained in the BEP and their feasibility, finding that achievement of many of the targets was likely to be quite challenging even before the March 2011 earthquake, tsunami, and nuclear crisis. - Highlights: → Origins of Japan's new Basic Energy Plan. → Content of Japan's new Basic Energy Plan. → Feasibility of achieving the targets in Japan's new Basic Energy Plan. → Impact of 2011 earthquake and tsunami on Japanese energy policy.

  15. Correction to toporek (2014).

    2015-01-01

    Reports an error in "Pedagogy of the privileged: Review of Deconstructing Privilege: Teaching and Learning as Allies in the Classroom" by Rebecca L. Toporek (Cultural Diversity and Ethnic Minority Psychology, 2014[Oct], Vol 20[4], 621-622). This article was originally published online incorrectly as a Brief Report. The article authored by Rebecca L. Toporek has been published correctly as a Book Review in the October 2014 print publication (Vol. 20, No. 4, pp. 621-622. http://dx.doi.org/10.1037/a0036529). (The following abstract of the original article appeared in record 2014-42484-006.) Reviews the book, Deconstructing Privilege: Teaching and Learning as Allies in the Classroom edited by Kim A. Case (2013). The purpose of this book is to provide a collection of resources for those teaching about privilege directly, much of this volume may be useful for expanding the context within which educators teach all aspects of psychology. Understanding the history and systems of psychology, clinical practice, research methods, assessment, and all the core areas of psychology could be enhanced by consideration of the structural framework through which psychology has developed and is maintained. The book presents a useful guide for educators, and in particular, those who teach about systems of oppression and privilege directly. For psychologists, this guide provides scholarship and concrete strategies for facilitating students' awareness of multiple dimensions of privilege across content areas. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  16. Radiation protection: A correction

    1972-01-01

    An error in translation inadvertently distorted the sense of a paragraph in the article entitled 'Ecological Aspects of Radiation Protection', by Dr. P. Recht, which appeared in the Bulletin, Volume 14, No. 2 earlier this year. In the English text the error appears on Page 28, second paragraph, which reads, as published: 'An instance familiar to radiation protection specialists, which has since come to be regarded as a classic illustration of this approach, is the accidental release at the Windscale nuclear centre in the north of England.' In the French original of this text no reference was made, or intended, to the accidental release which took place in 1957; the reference was to the study of the critical population group exposed to routine releases from the centre, as the footnote made clear. A more correct translation of the relevant sentence reads: 'A classic example of this approach, well-known to radiation protection specialists, is that of releases from the Windscale nuclear centre, in the north of England.' A second error appeared in the footnote already referred to. In all languages, the critical population group studied in respect of the Windscale releases is named as that of Cornwall; the reference should be, of course, to that part of the population of Wales who eat laver bread. (author)

  17. Thermodynamics of Error Correction

    Pablo Sartori

    2015-12-01

    Full Text Available Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  18. Cross plane scattering correction

    Shao, L.; Karp, J.S.

    1990-01-01

    Most previous scattering correction techniques for PET are based on assumptions made for a single transaxial plane and are independent of axial variations. These techniques will incorrectly estimate the scattering fraction for volumetric PET imaging systems since they do not take the cross-plane scattering into account. In this paper, the authors propose a new point source scattering deconvolution method (2-D). The cross-plane scattering is incorporated into the algorithm by modeling a scattering point source function. In the model, the scattering dependence on both the axial and transaxial directions is reflected in the exponential fitting parameters, and these parameters are directly estimated from a limited number of measured point response functions. The authors' results, comparing the standard in-plane point source deconvolution to their cross-plane source deconvolution, show that for a small source the former technique overestimates the scatter fraction in the plane of the source and underestimates the scatter fraction in adjacent planes. In addition, the authors also propose a simple approximation technique for deconvolution

  19. Adult Basic Education: Aligning Adult Basic Education and Postsecondary Education

    Texas Higher Education Coordinating Board, 2008

    2008-01-01

    In 2007, the 80th Texas Legislature included a rider to the General Appropriations Act for the Texas Higher Education Coordinating Board. The rider directed the agency to coordinate with the Texas Education Agency to develop and implement plans to align adult basic education with postsecondary education. The Coordinating Board, in collaboration…

  20. E-Basics: Online Basic Training in Program Evaluation

    Silliman, Ben

    2016-01-01

    E-Basics is an online training in program evaluation concepts and skills designed for youth development professionals, especially those working in nonformal science education. Ten hours of online training in seven modules is designed to prepare participants for mentoring and applied practice, mastery, and/or team leadership in program evaluation.…

  1. Basic research for environmental restoration

    1990-12-01

    The Department of Energy (DOE) is in the midst of a major environmental restoration effort to reduce the health and environmental risks resulting from past waste management and disposal practices at DOE sites. This report describes research needs in environmental restoration and complements a previously published document, DOE/ER-0419, Evaluation of Mid-to-Long Term Basic Research for Environmental Restoration. Basic research needs have been grouped into five major categories patterned after those identified in DOE/ER-0419: (1) environmental transport and transformations; (2) advanced sampling, characterization, and monitoring methods; (3) new remediation technologies; (4) performance assessment; and (5) health and environmental effects. In addition to basic research, this document deals with education and training needs for environmental restoration. 2 figs., 6 tabs

  2. Basic research for environmental restoration

    1990-12-01

    The Department of Energy (DOE) is in the midst of a major environmental restoration effort to reduce the health and environmental risks resulting from past waste management and disposal practices at DOE sites. This report describes research needs in environmental restoration and complements a previously published document, DOE/ER-0419, Evaluation of Mid-to-Long Term Basic Research for Environmental Restoration. Basic research needs have been grouped into five major categories patterned after those identified in DOE/ER-0419: (1) environmental transport and transformations; (2) advanced sampling, characterization, and monitoring methods; (3) new remediation technologies; (4) performance assessment; and (5) health and environmental effects. In addition to basic research, this document deals with education and training needs for environmental restoration. 2 figs., 6 tabs.

  3. [Etiopathogenesis of diarrhea and basic principles of diagnosis and therapy].

    Ehrmann, J

    2002-06-01

    Acute diarrhoea is the second most frequent disease worldwide, after acute inflammations of the airways. Chronic diarrhoea is a less frequent disease; nevertheless, the GP, the specialist in internal medicine and the gastroenterologist encounter it very frequently. For a correct understanding of the basic diagnostic and therapeutic principles of diarrhoea, knowledge of its etiopathogenesis is necessary. In the submitted review the author mentions the functions of the small and large intestine and their part in the development of diarrhoea. He also gives the definition and classification of diarrhoea. The author presents the basic characteristics of osmotic, secretory, inflammatory and motor diarrhoea, and of diarrhoea associated with increased intestinal filtration. The basic diagnostic and therapeutic principles are given in the last part of the review, which has an educational character.

  4. Basic linear partial differential equations

    Treves, Francois

    1975-01-01

    Focusing on the archetypes of linear partial differential equations, this text for upper-level undergraduates and graduate students features most of the basic classical results. The methods, however, are decidedly nontraditional: in practically every instance, they tend toward a high level of abstraction. This approach recalls classical material to contemporary analysts in a language they can understand, as well as exploiting the field's wealth of examples as an introduction to modern theories.The four-part treatment covers the basic examples of linear partial differential equations and their

  5. Stereochemistry basic concepts and applications

    Nógrádi, M

    2013-01-01

    Stereochemistry: Basic Concepts and Applications is a three-chapter text that introduces the basic principles and concepts of stereochemistry, as well as its application to organic chemistry. Chapter 1 describes first the stereochemistry of the ground state, specifically the configuration and conformation of organic compounds, as well as the most important methods for its investigation. This chapter also deals with the kinetics of conformational changes and provides an overview of so-called "applied stereochemistry". Chapter 2 focuses on the analysis of the internal motions of

  6. Modeling coherent errors in quantum error correction

    Greenbaum, Daniel; Dutton, Zachary

    2018-01-01

    Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ε) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ε^-(dn-1) error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.
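
    The qualitative difference matters already for a single uncorrected qubit: coherent rotations add amplitudes, so the flip probability grows roughly quadratically with the number of cycles, whereas the stochastic (Pauli-twirled) model accumulates probabilities roughly linearly. A minimal sketch of this contrast (not the paper's repetition-code calculation; the angle and cycle counts are illustrative):

    ```python
    import numpy as np

    # Coherent X-rotation by angle eps per cycle vs. its Pauli approximation,
    # a stochastic X flip with the same per-cycle probability p = sin^2(eps).
    eps = 0.01
    p = np.sin(eps) ** 2

    for n in (10, 100, 1000):
        coherent = np.sin(n * eps) ** 2              # amplitudes add: ~ (n*eps)^2
        pauli = 0.5 * (1.0 - (1.0 - 2.0 * p) ** n)   # probabilities add: ~ n*eps^2
        print(f"n={n:5d}  coherent flip={coherent:.2e}  Pauli flip={pauli:.2e}")
    ```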

  7. Exemplar-based human action pose correction.

    Shen, Wei; Deng, Ke; Bai, Xiang; Leyvand, Tommer; Guo, Baining; Tu, Zhuowen

    2014-07-01

    The launch of Xbox Kinect has built a very successful computer vision product and made a big impact on the gaming industry. This sheds light on a wide variety of potential applications related to action recognition. The accurate estimation of human poses from the depth image is universally a critical step. However, existing pose estimation systems exhibit failures when facing severe occlusion. In this paper, we propose an exemplar-based method to learn to correct the initially estimated poses. We learn an inhomogeneous systematic bias by leveraging the exemplar information within a specific human action domain. Furthermore, as an extension, we learn a conditional model by incorporation of pose tags to further increase the accuracy of pose correction. In the experiments, significant improvements on both joint-based skeleton correction and tag prediction are observed over the contemporary approaches, including what is delivered by the current Kinect system. Our experiments for the facial landmark correction also illustrate that our algorithm can improve the accuracy of other detection/estimation systems.

  8. Comparison of classical methods for blade design and the influence of tip correction on rotor performance

    Sørensen, Jens Nørkær; Okulov, Valery; Mikkelsen, Robert Flemming

    2016-01-01

    The classical blade-element/momentum (BE/M) method, which is used together with different types of corrections (e.g. the Prandtl or Glauert tip correction), is today the most basic tool in the design of wind turbine rotors. However, there are other classical techniques based on a combination...
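
    The Prandtl tip correction mentioned above has a simple closed form, F = (2/π) arccos(e^(-f)) with f = B(R - r) / (2 r sin φ), where B is the number of blades, R the tip radius, r the local radius and φ the local flow angle. A short sketch with illustrative rotor numbers:

    ```python
    import numpy as np

    def prandtl_tip_loss(r, R, B, phi):
        """Prandtl tip-loss factor F in (0, 1]; phi is the flow angle in radians."""
        f = B * (R - r) / (2.0 * r * np.sin(phi))
        return (2.0 / np.pi) * np.arccos(np.exp(-f))

    # Illustrative 3-bladed, 50 m rotor at a 7-degree local flow angle
    r = np.array([10.0, 25.0, 40.0, 47.5, 49.5])
    print(prandtl_tip_loss(r, R=50.0, B=3, phi=np.radians(7.0)))
    ```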

  9. Food systems in correctional settings

    Smoyer, Amy; Kjær Minke, Linda

    Food is a central component of life in correctional institutions and plays a critical role in the physical and mental health of incarcerated people and the construction of prisoners' identities and relationships. An understanding of the role of food in correctional settings and the effective management of food systems may improve outcomes for incarcerated people and help correctional administrators to maximize their health and safety. This report summarizes existing research on food systems in correctional settings and provides examples of food programmes in prison and remand facilities, including a case study of food-related innovation in the Danish correctional system. It offers specific conclusions for policy-makers, administrators of correctional institutions and prison-food-service professionals, and makes proposals for future research.

  10. Corrective justice and contract law

    Martín Hevia

    2010-06-01

    This article suggests that the central aspects of contract law in various jurisdictions can be explained within the idea of corrective justice. The article is divided into three parts. The first part distinguishes between corrective justice and distributive justice. The second part describes contract law. The third part focuses on actions for breach of contract and within that context reflects upon the idea of corrective justice.

  11. Corrective justice and contract law

    Martín Hevia

    2010-01-01

    This article suggests that the central aspects of contract law in various jurisdictions can be explained within the idea of corrective justice. The article is divided into three parts. The first part distinguishes between corrective justice and distributive justice. The second part describes contract law. The third part focuses on actions for breach of contract and within that context reflects upon the idea of corrective justice.

  12. Welding. Performance Objectives. Basic Course.

    Vincent, Kenneth

    Several intermediate performance objectives and corresponding criterion measures are listed for each of eight terminal objectives for a basic welding course. The materials were developed for a 36-week (2 hours daily) course designed to teach the fundamentals of welding shop work and to familiarize students with the operation of the welding shop…

  13. Thermionics basic principles of electronics

    Jenkins, J; Ashhurst, W

    2013-01-01

    Basic Principles of Electronics, Volume I : Thermionics serves as a textbook for students in physics. It focuses on thermionic devices. The book covers topics on electron dynamics, electron emission, and the themionic vacuum diode and triode. Power amplifiers, oscillators, and electronic measuring equipment are studied as well. The text will be of great use to physics and electronics students, and inventors.

  14. Guarani Basic Course, Part II.

    Blair, Robert W.; And Others

    This volume of the basic course in Guarani (the indigenous language of Paraguay) contains the core stage, or class-instructional phase, of the ten units presented in Volume One. These units contain explanations, exercises, dialogues, various types of pattern drills, suggestions for games and communication activities, and various types of…

  15. Guarani Basic Course, Part I.

    Blair, Robert W.; And Others

    This is the first in a two-volume basic course in Guarani, the indigenous language of Paraguay. The volume consists of an introduction to the Guarani language, some general principles for adult language-learning, and ten instructional units. Because the goal of the course is to encourage and lead the learner to communicate in Guarani in class and…

  16. Accounting & Finance; a Basic Introduction

    drs. Ewoud Jansen

    2011-01-01

    The book is about finance and accounting, subjects widely discussed in many other books. What sets this book apart from most others is that it discusses all the basic aspects of finance and accounting in one single textbook. Three areas of interest are discussed: Financial Management; Management

  17. Basic biology in health physics

    Wells, J.

    1976-10-01

    This report describes the consequences of the interaction of ionizing radiation with living cells and tissues. The basic processes of living cells, which are relevant to an understanding of health physics problems, are outlined with particular reference to cell-death, cancer induction and genetic effects. (author)

  18. Basic Concepts of Surface Physics

    Degras, D A

    1974-07-01

    The basic concepts of surface physics are given in this paper, which deals mainly with the thermodynamics of metal surfaces. One also finds a short review of vibrational and electronic properties. Written for a Summer School, the text provides numerous references.

  19. Play Therapy: Basics and Beyond.

    Kottman, Terry

    This book provides an atheoretical orientation to basic concepts involved in play therapy and an introduction to different skills used in play therapy. The demand for mental health professionals and school counselors who have training and expertise in using play as a therapeutic tool when working with children has increased tremendously. In response to…

  20. Vocational Interests and Basic Values.

    Sagiv, Lilach

    2002-01-01

    Study 1 (n=97) provided evidence of the correlation of Holland's model of vocational interests with Schwartz' theory of basic values. Realistic career interests did not correlate with values. Study 2 (n=545) replicated these findings, showing a better match for individuals who had reached a career decision in counseling than for the undecided.…

  1. Basic safety principles: Lessons learned

    Erp, J.B. van [Argonne National Lab., IL (United States)]

    1997-09-01

    The presentation reviews the following issues: basic safety principles and lessons learned; some conclusions from the Kemeny report on the accident at TMI; some recommendations from the Kemeny report on the accident at TMI; conclusions and recommendations from the Rogovin report on the accident at TMI; instrumentation deficiencies (from Rogovin report).

  2. Health care marketing: Basic features

    Gajić-Stevanović Milena

    2006-01-01

    The paper discusses the importance of, as well as the challenges facing, the health care sector in many countries. Particular attention is devoted to the preconditions and/or basic requirements that have to be developed in order to make the health sector function. Focusing on end users and employing marketing tools is the right orientation.

  3. Photodynamic Therapy (PDT) - Basic Principles

    Bhaskar G Maiya. Series Article, Resonance – Journal of Science Education, Volume 5, Issue 4, April 2000, pp 6-18. Permanent link: https://www.ias.ac.in/article/fulltext/reso/005/04/0006-0018

  4. Basic Income on the Agenda

    Groot, Loek; Veen, van der Robert-Jan

    2000-01-01

    Persisting unemployment, poverty and social exclusion, labour market flexibility, job insecurity and higher wage inequality, changing patterns of work and family life are among the factors that exert pressure on welfare states in Europe. This book explores the potential of an unconditional basic

  5. Basic standards for radiation protection

    Webb, G.A.M.

    1982-01-01

    The basic standards for radiation protection have been based, for many years, on the recommendations of the International Commission on Radiological Protection. The three basic standards recommended by the Commission may be summarized as 'justification, optimization of protection and adherence to dose limitations'. The applications of these basic principles to different aspects of protection are briefly summarized, and the particular ways in which they have been applied to waste are described in more detail. The application of dose limits, both in the control of occupational exposure and in regulating routine discharges of radioactive effluents, is straightforward in principle, although the measurement and calculational requirements may be substantial. Secondary standards such as derived limits may be extremely useful, and the principles underlying their derivation will be described. Optimization of protection is inherently a more difficult concept to apply, and the various techniques used will be outlined, with particular emphasis on the use of cost-benefit analysis as recommended by the ICRP. A review will be given of the problems involved in extending these basic concepts of the ICRP to probabilistic analyses such as those required for assessing the consequences of accidents or disruptive events in long-term repositories. The particular difficulties posed by the very long timescales involved in the assessment of waste management practices will be discussed in some detail. (orig./RW)

  6. Basic safety principles: Lessons learned

    Erp, J.B. van

    1997-01-01

    The presentation reviews the following issues: basic safety principles and lessons learned; some conclusions from the Kemeny report on the accident at TMI; some recommendations from the Kemeny report on the accident at TMI; conclusions and recommendations from the Rogovin report on the accident at TMI; instrumentation deficiencies (from Rogovin report)

  7. Basic semantics of product sounds

    Özcan Vieira, E.; Van Egmond, R.

    2012-01-01

    Product experience is a result of sensory and semantic experiences with product properties. In this paper, we focus on the semantic attributes of product sounds and explore the basic components for product sound related semantics using a semantic differential paradigm and factor analysis. With two

  8. Getting Back to Basics (& Acidics)

    Rhodes, Sam

    2006-01-01

    This article describes a few novel acid-base experiments intended to introduce students to the basic concepts of acid-base chemistry and provide practical examples that apply directly to the study of biology and the human body. Important concepts such as the reaction between carbon dioxide and water, buffers and protein denaturation, are covered.…

  9. Breast Cancer Basics and You

    Feature: Screening For Breast Cancer. Breast Cancer Basics and You. Past Issues / Summer 2014 ... more than 232,670 new cases of female breast cancer in the United States in 2014. More than ...

  10. Computer Assisted Instruction in Basic.

    1983-09-28

    Contents include the program purpose, flowcharts for the main menu and for lessons one through six, tests 1-6, and a program library. Software support was limited to off-the-shelf packages. All of the computers were purchased with Beginner's All-purpose Symbolic Instruction Code (BASIC), a word

  11. Prediction of customer behaviour through datamining assets

    Naděžda Chalupová

    2009-01-01

    Business managers responsible for the commercial success or failure of an organization have to gain the knowledge needed for sound decision-making. Such knowledge is sophisticated information hidden in enterprise data, and one way to extract it is to use so-called data-mining assets. The paper deals with the application of selected basic methods of knowledge discovery in databases to the customer-provider relationship, and it shows how the acquired knowledge can serve as a basis for managerial decisions that improve customer relationship management. It addresses prediction, whose aim is to forecast the future behaviour of objects on the basis of some of their attributes. Knowledge acquired this way, as the output of prediction, can markedly help the enterprise manager plan marketing strategies, for example so-called cross-selling and up-selling. The contribution describes the whole data-processing workflow: from cleaning the data, through preparing it for the mining task, to processing it with the SAS Enterprise Miner tool. Regression analysis, a neural network and a decision tree, whose principles are also briefly explained in the paper, were used for knowledge mining. The estimation of customer behaviour was tested by two mining tasks differing in the attributes used and in the number of categories of one of the predictive attributes. The results of the two tasks are compared with the help of prediction fruitfulness charts.
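
    A minimal illustration of the prediction step described above. The study used SAS Enterprise Miner; here scikit-learn stands in, and the customer attributes, data and target column are hypothetical:

    ```python
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import classification_report

    # Hypothetical customer records; "buys_upsell" is the behaviour to predict.
    df = pd.DataFrame({
        "tenure_months": [3, 25, 14, 60, 8, 41, 2, 33],
        "monthly_spend": [20, 90, 55, 120, 15, 80, 10, 70],
        "complaints":    [2, 0, 1, 0, 3, 0, 4, 1],
        "buys_upsell":   [0, 1, 1, 1, 0, 1, 0, 1],
    })
    X, y = df.drop(columns="buys_upsell"), df["buys_upsell"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
    print(classification_report(y_te, model.predict(X_te), zero_division=0))
    ```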

  12. Evaluation of disorder predictions in CASP9

    Monastyrskyy, Bohdan

    2011-01-01

    Lack of stable three-dimensional structure, or intrinsic disorder, is a common phenomenon in proteins. Naturally unstructured regions are proven to be essential for the function of many proteins, and therefore identification of such regions is an important issue. CASP has been assessing the state of the art in predicting disorder regions from amino acid sequence since 2002. Here, we present the results of the evaluation of the disorder predictions submitted to CASP9. The assessment is based on the evaluation measures and procedures used in previous CASPs. The balanced accuracy and the Matthews correlation coefficient were chosen as the basic measures for evaluating the correctness of binary classifications. The area under the receiver operating characteristic curve was the measure of choice for evaluating probability-based predictions of disorder. The CASP9 methods are shown to perform slightly better than the CASP7 methods but not better than the methods in CASP8. It was also shown that the capability of most CASP9 methods to predict disorder decreases with increasing minimum disorder segment length.
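
    The three assessment measures named above are standard and easy to reproduce. A small sketch with made-up residue labels and predicted disorder probabilities (not CASP data):

    ```python
    import numpy as np
    from sklearn.metrics import (balanced_accuracy_score, matthews_corrcoef,
                                 roc_auc_score)

    y_true = np.array([1, 0, 0, 1, 1, 0, 0, 0, 1, 0])  # 1 = disordered residue
    p_pred = np.array([0.9, 0.2, 0.4, 0.7, 0.6, 0.1, 0.3, 0.5, 0.8, 0.2])
    y_pred = (p_pred >= 0.5).astype(int)               # binary classification

    print("balanced accuracy:", balanced_accuracy_score(y_true, y_pred))
    print("MCC:              ", matthews_corrcoef(y_true, y_pred))
    print("ROC AUC:          ", roc_auc_score(y_true, p_pred))
    ```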

  13. Magnetic measurements of the correction and adjustment magnets of the main ring

    Trbojevic, D.

    1986-07-01

    Correction magnets correct the field imperfections and alignment errors of the main quadrupole and bend magnets. For reducing and controlling chromaticity there are 186 sextupoles and 78 octupoles, while for suppressing various resonances there are 12 normal and 18 skew sextupoles and 24 normal and 19 skew quadrupoles. Beam positions are individually controlled by 108 horizontal and 108 skew dipoles. This report includes the results of all the Main Ring correction and adjustment magnet harmonic measurements. The measurement principle and basic equations are described
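
    Harmonic measurements of this kind are conventionally reduced by Fourier analysis of the field sampled on a circle inside the magnet aperture, each harmonic corresponding to one multipole (dipole, quadrupole, sextupole, ...). A generic sketch with synthetic data, not the report's actual procedure:

    ```python
    import numpy as np

    N = 64
    theta = 2 * np.pi * np.arange(N) / N
    # Synthetic field on the measurement circle: a dipole plus a 1% sextupole error
    B = 1.0 * np.cos(theta) + 0.01 * np.cos(3 * theta)

    coeff = np.fft.rfft(B) / (N / 2)       # amplitude of each angular harmonic
    for n in range(1, 5):
        print(f"harmonic {n}: {abs(coeff[n]):.4f}")
    ```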

  14. Task Refinement for Autonomous Robots using Complementary Corrective Human Feedback

    Cetin Mericli

    2011-06-01

    A robot can perform a given task through a policy that maps its sensed state to appropriate actions. We assume that a hand-coded controller can achieve such a mapping only for the basic cases of the task. Refining the controller becomes harder and more tedious and error prone as the complexity of the task increases. In this paper, we present a new learning from demonstration approach to improve the robot's performance through the use of corrective human feedback as a complement to an existing hand-coded algorithm. The human teacher observes the robot as it performs the task using the hand-coded algorithm and takes over control to correct the behavior when the robot selects a wrong action to execute. Corrections are captured as new state-action pairs, and the default controller output is replaced by the demonstrated corrections during autonomous execution when the current state of the robot is judged to be similar to a previously corrected state in the correction database. The proposed approach is applied to a complex ball dribbling task performed against stationary defender robots in a robot soccer scenario, where physical Aldebaran Nao humanoid robots are used. The results of our experiments show an improvement in the robot's performance when the default hand-coded controller is augmented with corrective human demonstration.
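
    A minimal sketch of the correction-reuse idea: keep a database of (state, corrected action) pairs and override the hand-coded controller whenever the current state is close to a previously corrected one. The state encoding, distance threshold and action names below are hypothetical, not the authors' implementation:

    ```python
    import numpy as np

    corrections = []  # list of (state_vector, corrected_action) pairs

    def hand_coded_controller(state):
        return "dribble_forward"  # placeholder default policy

    def act(state, threshold=0.5):
        state = np.asarray(state, dtype=float)
        for s, action in corrections:
            if np.linalg.norm(state - s) < threshold:  # similar corrected state
                return action                          # replay the demonstration
        return hand_coded_controller(state)

    # The teacher takes over once and corrects the action in a given state:
    corrections.append((np.array([1.0, 0.2, -0.3]), "turn_left"))
    print(act([1.05, 0.25, -0.28]))  # near the corrected state -> "turn_left"
    print(act([5.0, 0.0, 0.0]))      # elsewhere -> default "dribble_forward"
    ```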

  15. Correct acceptance weighs more than correct rejection: a decision bias induced by question framing.

    Kareev, Yaakov; Trope, Yaacov

    2011-02-01

    We propose that in attempting to detect whether an effect exists or not, people set their decision criterion so as to increase the number of hits and decrease the number of misses, at the cost of increasing false alarms and decreasing correct rejections. As a result, we argue, if one of two complementary events is framed as the positive response to a question and the other as the negative response, people will tend to predict the former more often than the latter. Performance in a prediction task with symmetric payoffs and equal base rates supported our proposal. Positive responses were indeed more prevalent than negative responses, irrespective of the phrasing of the question. The bias, slight but consistent and significant, was evident from early in a session and then remained unchanged to the end. A regression analysis revealed that, in addition, individuals' decision criteria reflected their learning experiences, with the weight of hits being greater than that of correct rejections.
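
    The trade-off described above is standard signal-detection bookkeeping: lowering the decision criterion buys hits (and fewer misses) at the price of false alarms (and fewer correct rejections). A sketch under the equal-variance Gaussian model, with illustrative numbers:

    ```python
    from scipy.stats import norm

    d_prime = 1.0  # fixed sensitivity; noise mean 0, signal mean d_prime
    for c in (0.5, 0.0, -0.3):      # lower criterion c = more "yes" responses
        hit = norm.sf(c - d_prime)  # P("yes" | signal present)
        fa = norm.sf(c)             # P("yes" | noise only)
        print(f"c={c:+.1f}  hits={hit:.2f}  misses={1 - hit:.2f}  "
              f"false alarms={fa:.2f}  correct rejections={1 - fa:.2f}")
    ```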

  16. Unpacking Corrections in Mobile Instruction

    Levin, Lena; Cromdal, Jakob; Broth, Mathias

    2017-01-01

    that the practice of unpacking the local particulars of corrections (i) provides for the instructional character of the interaction, and (ii) is highly sensitive to the relevant physical and mobile contingencies. These findings contribute to the existing literature on the interactional organisation of correction...

  17. Atmospheric correction of satellite data

    Shmirko, Konstantin; Bobrikov, Alexey; Pavlov, Andrey

    2015-11-01

    The atmosphere accounts for more than 90% of all radiation measured by satellite. Because of this, atmospheric correction plays an important role in separating the water-leaving radiance from the signal and in evaluating the concentrations of various water constituents (chlorophyll-A, DOM, CDOM, etc.). The elimination of the atmosphere's intrinsic radiance from the remote sensing signal is referred to as atmospheric correction.

  18. Stress Management in Correctional Recreation.

    Card, Jaclyn A.

    Current economic conditions have created additional sources of stress in the correctional setting. Often, recreation professionals employed in these settings also add to inmate stress. One of the major factors limiting stress management in correctional settings is a lack of understanding of the value, importance, and perceived freedom of leisure.…

  19. Wall correction model for wind tunnels with open test section

    Sørensen, Jens Nørkær; Shen, Wen Zhong; Mikkelsen, Robert Flemming

    2006-01-01

    In the paper we present a correction model for wall interference on rotors of wind turbines or propellers in wind tunnels. The model, which is based on a one-dimensional momentum approach, is validated against results from CFD computations using a generalized actuator disc principle. In the model...... good agreement with the CFD computations, demonstrating that one-dimensional momentum theory is a reliable way of predicting corrections for wall interference in wind tunnels with closed as well as open cross sections....
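
    For reference, the one-dimensional momentum relation on which such correction models build links the axial induction factor a to the thrust coefficient through CT = 4a(1 - a); the paper's tunnel-interference terms are not reproduced here. A minimal sketch:

    ```python
    import numpy as np

    def thrust_coefficient(a):
        """CT = 4a(1 - a) from 1-D momentum theory (valid up to a of roughly 0.4)."""
        return 4.0 * a * (1.0 - a)

    for a in np.linspace(0.0, 0.4, 5):
        print(f"a = {a:.1f}  ->  CT = {thrust_coefficient(a):.3f}  "
              f"(far-wake velocity factor {1.0 - 2.0 * a:.2f})")
    ```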

  20. Software Package for Optics Measurement and Correction in the LHC

    Aiba, M; Tomas, R; Vanbavinckhove, G

    2010-01-01

    A software package has been developed for the LHC on-line optics measurement and correction. This package includes several different algorithms to measure phase advance, beta functions, dispersion, coupling parameters and even some non-linear terms. A Graphical User Interface provides visualization tools to compare measurements to model predictions, fit analytical formula, localize error sources and compute and send corrections to the hardware.

  1. Basic Science for a Secure Energy Future

    Horton, Linda

    2010-03-01

    Anticipating a doubling in the world's energy use by the year 2050 coupled with an increasing focus on clean energy technologies, there is a national imperative for new energy technologies and improved energy efficiency. The Department of Energy's Office of Basic Energy Sciences (BES) supports fundamental research that provides the foundations for new energy technologies and supports DOE missions in energy, environment, and national security. The research crosses the full spectrum of materials and chemical sciences, as well as aspects of biosciences and geosciences, with a focus on understanding, predicting, and ultimately controlling matter and energy at electronic, atomic, and molecular levels. In addition, BES is the home for national user facilities for x-ray, neutron, nanoscale sciences, and electron beam characterization that serve over 10,000 users annually. To provide a strategic focus for these programs, BES has held a series of "Basic Research Needs" workshops on a number of energy topics over the past 6 years. These workshops have defined a number of research priorities in areas related to renewable, fossil, and nuclear energy, as well as cross-cutting scientific grand challenges. These directions have helped to define the research for the recently established Energy Frontier Research Centers (EFRCs) and are foundational for the newly announced Energy Innovation Hubs. This overview will review the current BES research portfolio, including the EFRCs and user facilities, will highlight past research that has had an impact on energy technologies, and will discuss future directions as defined through the BES workshops and research opportunities.

  2. Error-correction coding for digital communications

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.
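
    A concrete instance of the group codes and syndrome decoding treated in the book is the (7,4) Hamming code, which corrects any single bit error. A self-contained sketch over GF(2):

    ```python
    import numpy as np

    G = np.array([[1,0,0,0,1,1,0],   # generator matrix, systematic form
                  [0,1,0,0,1,0,1],
                  [0,0,1,0,0,1,1],
                  [0,0,0,1,1,1,1]])
    H = np.array([[1,1,0,1,1,0,0],   # parity-check matrix: H @ G.T = 0 (mod 2)
                  [1,0,1,1,0,1,0],
                  [0,1,1,1,0,0,1]])

    msg = np.array([1, 0, 1, 1])
    received = msg @ G % 2
    received[2] ^= 1                           # channel flips one bit

    syndrome = H @ received % 2                # nonzero syndrome flags an error
    for pos in range(7):                       # syndrome equals a column of H
        if np.array_equal(H[:, pos], syndrome):
            received[pos] ^= 1                 # correct the single-bit error
    print("decoded message:", received[:4])    # systematic part carries the data
    ```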

  3. Transport Task Force workshop: basic experiments highlights

    Linford, R.K. (Los Alamos National Lab., NM (USA)); Luckhardt, S. (Massachusetts Inst. of Tech., Cambridge, MA (USA)); Lyon, J.F. (Oak Ridge National Lab., TN (USA)); Navratil, G.A. (Columbia Univ., New York, NY (USA)); Schoenberg, K.F. (Los Alamos National Lab., NM (USA))

    1990-01-01

    Selected topics are summarized from the Basic Experiments session of the Transport Task Force Workshop held August 21-24, 1989, in San Diego, California. This session included presentations on paradigm experiments, stellarators, reversed-field pinches, and advanced tokamaks. Recent advances in all of these areas illustrate the importance of these experiments in advancing our understanding of toroidal transport. Progress has been made in measuring the details of particle diffusion, isolating specific modes, measuring fluctuation variations with field geometry and beta, and comparing all these with theoretical predictions. The development of experimental tools for determining which fluctuations dominate transport are also reported. Continued significant advances are anticipated in a number of areas highlighted. (author).

  4. Transport Task Force workshop: basic experiments highlights

    Linford, R.K.; Luckhardt, S.; Lyon, J.F.; Navratil, G.A.; Schoenberg, K.F.

    1990-01-01

    Selected topics are summarized from the Basic Experiments session of the Transport Task Force Workshop held August 21-24, 1989, in San Diego, California. This session included presentations on paradigm experiments, stellarators, reversed-field pinches, and advanced tokamaks. Recent advances in all of these areas illustrate the importance of these experiments in advancing our understanding of toroidal transport. Progress has been made in measuring the details of particle diffusion, isolating specific modes, measuring fluctuation variations with field geometry and beta, and comparing all these with theoretical predictions. The development of experimental tools for determining which fluctuations dominate transport are also reported. Continued significant advances are anticipated in a number of areas highlighted. (author)

  5. Basics of modern mathematical statistics

    Spokoiny, Vladimir

    2015-01-01

    This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.

  6. Basic concepts in computational physics

    Stickler, Benjamin A

    2016-01-01

    This new edition is a concise introduction to the basic methods of computational physics. Readers will discover the benefits of numerical methods for solving complex mathematical problems and for the direct simulation of physical processes. The book is divided into two main parts: Deterministic methods and stochastic methods in computational physics. Based on concrete problems, the first part discusses numerical differentiation and integration, as well as the treatment of ordinary differential equations. This is extended by a brief introduction to the numerics of partial differential equations. The second part deals with the generation of random numbers, summarizes the basics of stochastics, and subsequently introduces Monte-Carlo (MC) methods. Specific emphasis is on MARKOV chain MC algorithms. The final two chapters discuss data analysis and stochastic optimization. All this is again motivated and augmented by applications from physics. In addition, the book offers a number of appendices to provide the read...

  7. Basic metallurgy for nondestructive testing

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    This chapter provides the reader with basic knowledge of metallurgy for nondestructive testing. One of the main applications of nondestructive testing is to detect discontinuities and mass defects in metal. As we already know, metals are widely used in many applications, as systems, components and engineering products. Steel and iron are the metals usually used in industry, especially heavy industries such as the gas and petroleum, chemical, power generation, automobile, and military-equipment industries. Because of this, basic knowledge of metallurgy is needed by the NDT practitioner. Combining metallurgy with data from radiographic testing allows the radiographer to interpret the quality of the inspected metal well and to make a sound decision on whether to accept a given product, system or component.

  8. Magnetic resonance imaging the basics

    Constantinides, Christakis

    2014-01-01

    Magnetic resonance imaging (MRI) is a rapidly developing field in basic applied science and clinical practice. Research efforts in this area have already been recognized with five Nobel prizes awarded to seven Nobel laureates in the past 70 years. Based on courses taught at The Johns Hopkins University, Magnetic Resonance Imaging: The Basics provides a solid introduction to this powerful technology. The book begins with a general description of the phenomenon of magnetic resonance and a brief summary of Fourier transformations in two dimensions. It examines the fundamental principles of physics for nuclear magnetic resonance (NMR) signal formation and image construction and provides a detailed explanation of the mathematical formulation of MRI. Numerous image quantitative indices are discussed, including (among others) signal, noise, signal-to-noise, contrast, and resolution. The second part of the book examines the hardware and electronics of an MRI scanner and the typical measurements and simulations of m...

  9. Positron emission tomography basic sciences

    Townsend, D W; Valk, P E; Maisey, M N

    2003-01-01

    Essential for students, science and medical graduates who want to understand the basic science of Positron Emission Tomography (PET), this book describes the physics, chemistry, technology and overview of the clinical uses behind the science of PET and the imaging techniques it uses. In recent years, PET has moved from high-end research imaging tool used by the highly specialized to an essential component of clinical evaluation in the clinic, especially in cancer management. Previously being the realm of scientists, this book explains PET instrumentation, radiochemistry, PET data acquisition and image formation, integration of structural and functional images, radiation dosimetry and protection, and applications in dedicated areas such as drug development, oncology, and gene expression imaging. The technologist, the science, engineering or chemistry graduate seeking further detailed information about PET, or the medical advanced trainee wishing to gain insight into the basic science of PET will find this book...

  10. Health insurance basic actuarial models

    Pitacco, Ermanno

    2014-01-01

    Health Insurance aims at filling a gap in actuarial literature, attempting to solve the frequent misunderstanding in regards to both the purpose and the contents of health insurance products (and ‘protection products’, more generally) on the one hand, and the relevant actuarial structures on the other. In order to cover the basic principles regarding health insurance techniques, the first few chapters in this book are mainly devoted to the need for health insurance and a description of insurance products in this area (sickness insurance, accident insurance, critical illness covers, income protection, long-term care insurance, health-related benefits as riders to life insurance policies). An introduction to general actuarial and risk-management issues follows. Basic actuarial models are presented for sickness insurance and income protection (i.e. disability annuities). Several numerical examples help the reader understand the main features of pricing and reserving in the health insurance area. A short int...

  11. HMPT: Basic Radioactive Material Transportation

    Hypes, Philip A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-29

    Hazardous Materials and Packaging and Transportation (HMPT): Basic Radioactive Material Transportation Live (#30462, suggested one time) and Test (#30463, required initially and every 36 months) address the Department of Transportation's (DOT's) function-specific training requirements [required for hazardous material (HAZMAT) handlers, packagers, and shippers] of the HMPT Los Alamos National Laboratory (LANL) Labwide training. This course meets the requirements of 49 CFR 172, Subpart H, Section 172.704(a)(ii), Function-Specific Training.

  12. Basic Principles of Wastewater Treatment

    Von Sperling, Marcos

    2007-01-01

    "Basic Principles of Wastewater Treatment is the second volume in the series Biological Wastewater Treatment, and focusses on the unit operations and processes associated with biological wastewater treatment. The major topics covered are: microbiology and ecology of wastewater treatment reaction kinetics and reactor hydraulics conversion of organic and inorganic matter sedimentation aeration The theory presented in this volume forms the basis upon which the other books...

  13. Positron emission tomography. Basic principles

    Rodriguez, Jose Luis; Massardo, Teresa; Gonzalez, Patricio

    2001-01-01

    The basic principles of the positron emission tomography (PET) technique are reviewed. It allows one to obtain functional images from the gamma rays produced by the annihilation of a positron, a positive beta particle. This paper analyzes positron emitter production in a cyclotron, its general mechanisms, and the various detection systems. The most important clinical applications are also mentioned, related to oncological uses of fluor-18-deoxyglucose.

  14. Basic statistics for social research

    Hanneman, Robert A; Riddle, Mark D

    2012-01-01

    A core statistics text that emphasizes logical inquiry, not math. Basic Statistics for Social Research teaches core general statistical concepts and methods that all social science majors must master to understand (and do) social research. Its use of mathematics and theory are deliberately limited, as the authors focus on the use of concepts and tools of statistics in the analysis of social science data, rather than on the mathematical and computational aspects. Research questions and applications are taken from a wide variety of subfields in sociology, and each chapter is organized arou

  15. A basic guide to investing.

    Smith, Michael C

    2006-03-01

    Today's investors have many choices. From seemingly simple investments, such as stocks, bonds and cash, to more complicated option strategies, there is a dizzying array of investment vehicles that can leave even the most seasoned investor a bit confused. In discussions with our clients, one common thread is the desire to learn more about the various types of investments available. Following is a basic guide to the most common investments and the risks inherent in those choices.

  16. Basic Military Justice Handbook. Revision

    1989-01-01

    unmistakable odor of burning marijuana outside the accused's barracks room, acted correctly when he demanded entry to the room and placed all occupants under...conspirator in furtherance of the conspiracy to be the act of all the conspirators. Suppose, therefore, that A and B agree to burn down the Naval Justice...September 19CY, without authority, fail to go at the time prescribed to his appointed place of duty, to wit: the 0600 restricted muster on the fantail. (2

  17. BLAS- BASIC LINEAR ALGEBRA SUBPROGRAMS

    Krogh, F. T.

    1994-01-01

    The Basic Linear Algebra Subprogram (BLAS) library is a collection of FORTRAN callable routines for employing standard techniques in performing the basic operations of numerical linear algebra. The BLAS library was developed to provide a portable and efficient source of basic operations for designers of programs involving linear algebraic computations. The subprograms available in the library cover the operations of dot product, multiplication of a scalar and a vector, vector plus a scalar times a vector, Givens transformation, modified Givens transformation, copy, swap, Euclidean norm, sum of magnitudes, and location of the largest magnitude element. Since these subprograms are to be used in an ANSI FORTRAN context, the cases of single precision, double precision, and complex data are provided for. All of the subprograms have been thoroughly tested and produce consistent results even when transported from machine to machine. BLAS contains Assembler versions and FORTRAN test code for any of the following compilers: Lahey F77L, Microsoft FORTRAN, or IBM Professional FORTRAN. It requires the Microsoft Macro Assembler and a math co-processor. The PC implementation allows individual arrays of over 64K. The BLAS library was developed in 1979. The PC version was made available in 1986 and updated in 1988.
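
    The operations listed above map directly onto one-liners in a modern array language. A sketch of the level-1 set using NumPy equivalents (illustrative; not the FORTRAN library itself):

    ```python
    import numpy as np

    x = np.array([3.0, -4.0, 1.0])
    y = np.array([1.0, 2.0, 2.0])
    a = 2.0

    print(np.dot(x, y))          # DOT : dot product
    print(a * x)                 # SCAL: scalar times a vector
    print(a * x + y)             # AXPY: vector plus a scalar times a vector
    print(np.linalg.norm(x))     # NRM2: Euclidean norm
    print(np.sum(np.abs(x)))     # ASUM: sum of magnitudes
    print(np.argmax(np.abs(x)))  # AMAX: location of the largest-magnitude element
    ```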

  18. Hurdles in Basic Science Translation

    Christina J. Perry

    2017-07-01

    In the past century there have been incredible advances in the field of medical research, but what hinders translation of this knowledge into effective treatment for human disease? There is an increasing focus on the failure of many research breakthroughs to be translated through the clinical trial process and into medical practice. In this mini review, we will consider some of the reasons that findings in basic medical research fail to become translated through clinical trials and into basic medical practices. We focus in particular on the way that human disease is modeled, the understanding we have of how our targets behave in vivo, and also some of the issues surrounding reproducibility of basic research findings. We will also look at some of the ways that have been proposed for overcoming these issues. It appears that there needs to be a cultural shift in the way we fund, publish and recognize quality control in scientific research. Although this is a daunting proposition, we hope that with increasing awareness and focus on research translation and the hurdles that impede it, the field of medical research will continue to inform and improve medical practice across the world.

  19. Research into basic rocks types

    1993-06-01

    Teollisuuden Voima Oy (TVO) has carried out research into basic rock types in Finland. The research programme was implemented in parallel with the preliminary site investigations for radioactive waste disposal in 1991-1993. The programme had two main objectives: firstly, to study the properties of the basic rock types and compare them with the other rock types under investigation; secondly, to carry out an inventory of rock formations consisting of basic rock types and potentially suitable for final disposal. A study of the environmental factors important for final disposal was made of the formations identified. In total 159 formations exceeding the size of 4 km² were identified in the inventory. Of these formations 97 were intrusive igneous rock types and 62 originally extrusive volcanic rock types. Deposits consisting of ore minerals, industrial minerals or building stones related to these formations were studied. Environmental factors like natural resources, protected areas or potential restrictions on land use were also studied.

  20. The complete NLO corrections to dijet hadroproduction

    Frederix, R.; Frixione, S.; Hirschi, V.; Pagani, D.; Shao, H.-S.; Zaro, M.

    2017-04-01

    We study the production of jets in hadronic collisions, by computing all contributions proportional to α_S^n α^m, with n + m = 2 and n + m = 3. These correspond to leading and next-to-leading order results, respectively, for single-inclusive and dijet observables in a perturbative expansion that includes both QCD and electroweak effects. We discuss issues relevant to the definition of hadronic jets in the context of electroweak corrections, and present sample phenomenological predictions for the 13-TeV LHC. We find that both the leading and next-to-leading order contributions largely respect the relative hierarchy established by the respective coupling-constant combinations.

  1. Basic visual dysfunction allows classification of patients with schizophrenia with exceptional accuracy.

    González-Hernández, J A; Pita-Alcorta, C; Padrón, A; Finalé, A; Galán, L; Martínez, E; Díaz-Comas, L; Samper-González, J A; Lencer, R; Marot, M

    2014-10-01

    Basic visual dysfunctions are commonly reported in schizophrenia; however, their value as diagnostic tools remains uncertain. This study reports a novel electrophysiological approach using checkerboard visual evoked potentials (VEP). Sources of spectral-resolution VEP components C1, P1 and N1 were estimated by LORETA, and the band effects (BSE) on these estimated sources were explored in each subject. BSEs were Z-transformed for each component, and relationships with clinical variables were assessed. Clinical effects were evaluated by ROC curves and predictive values. Forty-eight patients with schizophrenia (SZ) and 55 healthy controls participated in the study. For each of the 48 patients, the three VEP components were localized to both dorsal and ventral brain areas and also deviated from a normal distribution. P1 and N1 deviations were independent of treatment, illness chronicity or gender. Results from LORETA also suggest that deficits in the thalamus, posterior cingulum, precuneus, superior parietal and medial occipitotemporal areas were associated with symptom severity. While positive symptoms were more strongly related to sensory processing deficits (P1), negative symptoms were more strongly related to perceptual processing dysfunction (N1). Clinical validation revealed positive and negative predictive values for correctly classifying SZ of 100% and 77%, respectively. Classification in an additional independent sample of 30 SZ corroborated these results. In summary, this novel approach revealed basic visual dysfunctions in all patients with schizophrenia, suggesting these visual dysfunctions represent a promising candidate as a biomarker for schizophrenia. Copyright © 2014 Elsevier B.V. All rights reserved.
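
    The two predictive values reported above follow directly from a 2x2 confusion matrix. A sketch with hypothetical counts chosen only to reproduce the reported 100%/77% figures, not the study's actual tabulation:

    ```python
    def predictive_values(tp, fp, tn, fn):
        ppv = tp / (tp + fp)  # P(patient | classified as patient)
        npv = tn / (tn + fn)  # P(control | classified as control)
        return ppv, npv

    ppv, npv = predictive_values(tp=40, fp=0, tn=50, fn=15)
    print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # -> PPV = 100%, NPV = 77%
    ```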

  2. Basic Color Terms in Estonian Sign Language

    Hollman, Liivi; Sutrop, Urmas

    2011-01-01

    The article is written in the tradition of Brent Berlin and Paul Kay's theory of basic color terms. According to this theory there is a universal inventory of eleven basic color categories from which the basic color terms of any given language are always drawn. The number of basic color terms varies from 2 to 11 and in a language having a fully…

  3. Corrective action program at Krsko NPP

    Skaler, F; Divjak, G; Kavsek, D [NPP Krsko, Krsko (Slovenia)]

    2004-07-01

    The Krsko NPP has developed software that enables electronic reporting of all kinds of deviations and suggestions for improvement at the plant. All employees and permanent subcontractors have access to the system and can report deviations. The NPP has a centralized decision process for the distribution of reported deviations, and at this point all direct actions are electronically tracked. The immediate benefits of this new tool were: the reporting threshold has been lowered; the number of reporting people has increased; there is one computerized form for all processes; the decision about which process will resolve a deviation is centralized; all types of deviation are in the same environment; our experience of the processes is incorporated in the program; work that has been done is controlled; archiving is electronic only. Software basic data: the Corrective Action Program application is a WEB application. Data is stored in an Oracle 8.1.7i database. Users access the application through a PL/SQL gateway on Oracle 9i Application Server 1.0.2, using Microsoft Internet Explorer browsers (version 5 or later). Reports are implemented with Oracle Reports 6i. Menus are designed with Apycom Java Menus and Buttons v4.23. Our presentation will include: the basic idea; implementation change management; a demonstration of the program. (author)

  4. Corrective action program at Krsko NPP

    Skaler, F.; Divjak, G.; Kavsek, D.

    2004-01-01

    The Krsko NPP has developed software that enables electronic reporting of all kinds of deviations and suggestions for improvement at the plant. All employees and permanent subcontractors have access to the system and can report deviations. The NPP has a centralized decision process for the distribution of reported deviations, and at this point all direct actions are electronically tracked. The immediate benefits of this new tool were: the reporting threshold has been lowered; the number of reporting people has increased; there is one computerized form for all processes; the decision about which process will resolve a deviation is centralized; all types of deviation are in the same environment; our experience of the processes is incorporated in the program; work that has been done is controlled; archiving is electronic only. Software basic data: the Corrective Action Program application is a WEB application. Data is stored in an Oracle 8.1.7i database. Users access the application through a PL/SQL gateway on Oracle 9i Application Server 1.0.2, using Microsoft Internet Explorer browsers (version 5 or later). Reports are implemented with Oracle Reports 6i. Menus are designed with Apycom Java Menus and Buttons v4.23. Our presentation will include: the basic idea; implementation change management; a demonstration of the program. (author)

  5. Correction to Kane et al. (2016).

    Kane, Michael J; Meier, Matt E; Smeekens, Bridget A; Gross, Georgina M; Chun, Charlotte A; Silvia, Paul J; Kwapil, Thomas R

    2016-12-01

    Reports an error in "Individual differences in the executive control of attention, memory, and thought, and their associations with schizotypy" by Michael J. Kane, Matt E. Meier, Bridget A. Smeekens, Georgina M. Gross, Charlotte A. Chun, Paul J. Silvia and Thomas R. Kwapil ( Journal of Experimental Psychology: General , 2016[Aug], Vol 145[8], 1017-1048). There were errors in Table 3 and Table 7 (these transcription errors were limited to descriptive statistics in the Tables and did not affect any inferential statistics). In Table 3, the ARRO-TUT and LETT-TUT variables had incorrect values for Mean [95% CI], SD, Skew, Kurtosis, and N. In Table 7, the same values (plus Min and Max) were incorrect for the SEM-SART variable. The correct values for these measures are presented in the correction (the values for Min and Max were correct as set in Table 3, but are repeated below for clarity). (The following abstract of the original article appeared in record 2016-29680-001.) A large correlational study took a latent-variable approach to the generality of executive control by testing the individual-differences structure of executive-attention capabilities and assessing their prediction of schizotypy, a multidimensional construct (with negative, positive, disorganized, and paranoid factors) conveying risk for schizophrenia. Although schizophrenia is convincingly linked to executive deficits, the schizotypy literature is equivocal. Subjects completed tasks of working memory capacity (WMC), attention restraint (inhibiting prepotent responses), and attention constraint (focusing visual attention amid distractors), the latter 2 in an effort to fractionate the "inhibition" construct. We also assessed mind-wandering propensity (via in-task thought probes) and coefficient of variation in response times (RT CoV) from several tasks as more novel indices of executive attention. WMC, attention restraint, attention constraint, mind wandering, and RT CoV were correlated but separable

  6. Agronomic Use of Basic Slag

    Fabio Oliveiri de Nobile

    2015-01-01

    Modern civilization has, in recent years, increased the demand for products derived from iron and steel, stimulating the growth of the national steelmaking sector and, consequently, the generation of the industrial residue called basic slag. In this context, the recycling of residues can contribute to solving problems of industries that give priority to production excellence and quality. On the other hand, there is a primary production sector in Brazil, agriculture, with a large cultivated area on acid soils of low fertility, these factors being admittedly determinative for plant production under tropical conditions. There is thus a scenario of two primary production sectors which, although distinct, present potential for interaction: on one side, a product is available with properties similar to traditional liming materials and fertilizers; on the other, a production sector that is highly dependent on such products. The interaction between these two sectors also helps to preserve the environment, bringing a certain sustainability to the production systems of postmodern civilization, which will be the challenge of this new century. Considering the current possibility of recycling these industrial residues in agriculture, three important factors have to be taken into account: first, the proper use of an abundant, available and promising industrial residue; second, a propitious agricultural environment, namely acid soil of low fertility; and third, a responsive and important socio-economic crop, sugar cane, considering its vast cultivated area. In the national literature, few works have dealt with the use of basic slag or evaluated the response of crops to its application. Thus, the present work had as its aim to gather information from the literature concerning the characterization and production of basic slag in Brazil, as well

  7. Does correcting astigmatism with toric lenses improve driving performance?

    Cox, Daniel J; Banton, Thomas; Record, Steven; Grabman, Jesse H; Hawkins, Ronald J

    2015-04-01

    Driving is a vision-based activity of daily living that impacts safety. Because visual disruption can compromise driving safety, contact lens wearers with astigmatism may pose a driving safety risk if they experience residual blur from spherical lenses that do not correct their astigmatism or if they experience blur from toric lenses that rotate excessively. Given that toric lens stabilization systems are continually improving, this preliminary study tested the hypothesis that astigmats wearing toric contact lenses, compared with spherical lenses, would exhibit better overall driving performance and driving-specific visual abilities. A within-subject, single-blind, crossover, randomized design was used to evaluate driving performance in 11 young adults with astigmatism (-0.75 to -1.75 diopters cylinder). Each participant drove a highly immersive, virtual reality driving simulator (210 degrees field of view) with (1) no correction, (2) spherical contact lens correction (ACUVUE MOIST), and (3) toric contact lens correction (ACUVUE MOIST for Astigmatism). Tactical driving skills such as steering, speed management, and braking, as well as operational driving abilities such as visual acuity, contrast sensitivity, and foot and arm reaction time, were quantified. There was a main effect for type of correction on driving performance (p = 0.05). Correction with toric lenses resulted in significantly safer tactical driving performance than no correction, whereas spherical lens correction did not significantly differ in driving safety from no correction (p = 0.118). Operational tests differentiated corrected from uncorrected performance for both spherical (p = 0.008) and toric (p = 0.011) lenses, but they were not sensitive enough to differentiate toric from spherical lens conditions. Given previous research showing that deficits in these tactical skills are predictive of future real-world collisions, these preliminary data suggest that correcting low to moderate astigmatism with toric lenses may be important to driving safety. Their

  8. Paediatric airway management: basic aspects

    Holm-Knudsen, R J; Rasmussen, L S

    2009-01-01

    Paediatric airway management is a great challenge, especially for anaesthesiologists working in departments with a low number of paediatric surgical procedures. The paediatric airway is substantially different from the adult airway and obstruction leads to rapid desaturation in infants and small children. This paper aims at providing the non-paediatric anaesthesiologist with a set of safe and simple principles for basic paediatric airway management. In contrast to adults, most children with difficult airways are recognised before induction of anaesthesia but problems may arise in all children...

  9. Basic Elements of Knowledge Management

    Marcin W. Staniewski

    2007-10-01

    The article is a review of basic knowledge management terminology. It presents descriptions of knowledge resource levels (data, information, knowledge, and wisdom), knowledge sources (internal, external), and knowledge typology (implicit/tacit, or individual/social). Moreover, the article characterizes the knowledge management process, the knowledge management system and the main knowledge management strategies (codification, personalization). At the end of the article, the knowledge-creating process (the concept of the knowledge creation spiral) is mentioned, together with the role of information technology (IT) and organizational culture as the main elements supporting knowledge management implementation in organizations.

  10. Sound Symbolism in Basic Vocabulary

    Søren Wichmann

    2010-04-01

    The relationship between meanings of words and their sound shapes is to a large extent arbitrary, but it is well known that languages exhibit sound symbolism effects violating arbitrariness. Evidence for sound symbolism is typically anecdotal, however. Here we present a systematic approach. Using a selection of basic vocabulary in nearly one half of the world’s languages we find commonalities among sound shapes for words referring to same concepts. These are interpreted as due to sound symbolism. Studying the effects of sound symbolism cross-linguistically is of key importance for the understanding of language evolution.
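
    A toy sketch, under stated assumptions (the mini-lexicon and the simple initial-segment tally are hypothetical, not the paper's actual method), of how sound-meaning commonalities can be counted across languages: for each concept, compare how often a word-initial segment occurs against that segment's overall frequency in the sample.

        # Hypothetical mini-lexicon: {language: {concept: word}}
        from collections import Counter

        lexicon = {
            "lang_a": {"nose": "nena", "breast": "mama"},
            "lang_b": {"nose": "nis",  "breast": "mimi"},
            "lang_c": {"nose": "nuna", "breast": "moma"},
        }

        # Baseline: frequency of each word-initial segment over the whole sample
        overall = Counter(word[0] for forms in lexicon.values() for word in forms.values())
        total = sum(overall.values())

        # Per-concept tallies; a large excess over baseline hints at sound symbolism
        for concept in ("nose", "breast"):
            per = Counter(forms[concept][0] for forms in lexicon.values())
            n = sum(per.values())
            for seg, k in per.items():
                print(f"{concept}: initial /{seg}/ in {k}/{n} languages "
                      f"(baseline {overall[seg]}/{total})")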

  11. Security basics for computer architects

    Lee, Ruby B

    2013-01-01

    Design for security is an essential aspect of the design of future computers. However, security is not well understood by the computer architecture community. Many important security aspects have evolved over the last several decades in the cryptography, operating systems, and networking communities. This book attempts to introduce the computer architecture student, researcher, or practitioner to the basic concepts of security and threat-based design. Past work in different security communities can inform our thinking and provide a rich set of technologies for building architectural support for...

  12. Combustion from basics to applications

    Lackner, Maximilian; Winter, Franz

    2013-01-01

    Combustion, the process of burning, is defined as a chemical reaction between a combustible reactant (the fuel) and an oxidizing agent (such as air) that produces heat and, in most cases, light, while new chemical species (e.g., flue gas components) are formed. This book fills a gap in the market by providing a concise introduction to combustion. Most of the other books currently available are targeted towards experienced users and contain too much detail and/or present the material at a fairly high level. This book provides a brief and clear overview of the combustion basics, suitable for...
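
    A minimal worked example, added here and not taken from the book, of the stoichiometry such an introduction typically starts from: complete combustion of methane, CH4 + 2 O2 -> CO2 + 2 H2O, and the resulting air-fuel ratio by mass (taking air as roughly 23.2% O2 by mass).

        # Stoichiometric air-fuel ratio for methane (illustrative values)
        M_CH4, M_O2 = 16.04, 32.00       # molar masses, g/mol
        o2_moles_per_fuel = 2            # from CH4 + 2 O2 -> CO2 + 2 H2O
        mass_o2 = o2_moles_per_fuel * M_O2
        mass_air = mass_o2 / 0.232       # air is ~23.2% O2 by mass
        afr = mass_air / M_CH4
        print(f"stoichiometric AFR for methane: {afr:.1f} : 1 by mass")  # ~17.2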

  13. Administration of medicines. Midwifery basics.

    Baston, Helen

    2002-04-01

    Midwifery Basics is a series of articles that cover the main clinical skills underpinning midwifery practice. The series uses National Occupational Standards (Care Sector Consortium 1998) as a framework to identify the areas of competence that students need to achieve in order to master clinical skills. This format is combined with the use of 'triggers' to prompt the student to identify what she needs to know in order to care for a client in such a situation. The information that follows then enables the student to fill in the gaps in her knowledge.

  14. Basic concepts of materials accounting

    Markin, J.T.

    1989-01-01

    The importance of accounting for nuclear materials to the efficient, safe, and economical operation of nuclear facilities is introduced, and the following topics are covered: the material balance equation; item control areas; material balance uncertainty; decision procedures for materials accounting; conventional and near-real-time accounting; regulatory requirements of the US Department of Energy and the Nuclear Regulatory Commission; and a summary of the development of a materials accounting system implementing the basic concepts described, with sections on problem definition, system objectives, and system design.
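
    A minimal sketch of the material balance relation the abstract refers to; the standard form is MUF (material unaccounted for) = (beginning inventory + receipts) - (removals + ending inventory). The quantities below are hypothetical.

        # Material balance over one accounting period (all quantities in kg)
        def muf(beginning, receipts, removals, ending):
            """Material unaccounted for: book inventory minus measured ending inventory."""
            return (beginning + receipts) - (removals + ending)

        print(muf(beginning=120.0, receipts=15.0, removals=10.0, ending=124.6))  # 0.4 kg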

  15. Basic optics of effect materials.

    Jones, Steven A

    2010-01-01

    Effect materials derive their color and effect primarily from thin-film interference. Effect materials have evolved over the decades from simple guanine crystals to the complex multilayer optical structures of today. The development of new complex effect materials requires an understanding of the optics of effect materials. Such an understanding would also benefit the cosmetic formulator as these new effect materials are introduced. The root of this understanding begins with basic optics. This paper covers the nature of light, interference of waves, thin-film interference, color from interference, and color travel.
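
    A minimal worked example of the thin-film interference the paper builds on, assuming normal incidence and a half-wave phase shift at only one interface, so that reflection maxima satisfy 2*n*d = (m + 1/2)*lambda. The film parameters are hypothetical.

        # Visible reflectance peaks of a hypothetical high-index film
        n, d = 1.8, 300.0                 # refractive index, thickness in nm
        for m in range(5):
            wavelength = 2 * n * d / (m + 0.5)
            if 380 <= wavelength <= 780:  # visible band, nm
                print(f"order m={m}: reflectance peak near {wavelength:.0f} nm")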

  16. Basic reactions induced by radiation

    Charlesby, A.

    1980-01-01

    This paper summarises some of the basic reactions resulting from exposure to high energy radiation. In the initial stages energy is absorbed, but not necessarily at random, giving radical and ion species which may then react to promote the final chemical change. However, it is possible to intervene at intermediate stages to modify or reduce the radiation effect. Under certain conditions enhanced reactions are also possible. Several expressions are given to calculate radiation yield in terms of energy absorbed. Some analogies between radiation-induced reactions in polymers, and those studied in radiobiology are outlined. (author)
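
    One standard expression of the radiation yields mentioned here, worked through as an illustration (the numbers are mine, not the author's): the G value gives species formed per 100 eV absorbed, which converts to a chemical yield for a given dose and sample mass.

        # Chemical yield from a G value (illustrative values)
        EV = 1.602e-19          # joules per electronvolt
        AVOGADRO = 6.022e23     # species per mole

        def chemical_yield_mol(g_per_100ev, dose_gy, mass_kg):
            """Moles of product for G (species/100 eV), dose (Gy = J/kg), mass (kg)."""
            g_mol_per_joule = g_per_100ev / (100 * EV) / AVOGADRO
            return g_mol_per_joule * dose_gy * mass_kg

        # e.g. G = 3.0 radicals per 100 eV, 25 kGy dose, 1 g sample -> ~7.8e-6 mol
        print(chemical_yield_mol(3.0, 25e3, 1e-3))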

  17. Basic radiotherapy physics and biology

    Chang, David S; Das, Indra J; Mendonca, Marc S; Dynlacht, Joseph R

    2014-01-01

    This book is a concise and well-illustrated review of the physics and biology of radiation therapy intended for radiation oncology residents, radiation therapists, dosimetrists, and physicists. It presents topics that are included on the Radiation Therapy Physics and Biology examinations and is designed with the intent of presenting information in an easily digestible format with maximum retention in mind. The inclusion of mnemonics, rules of thumb, and reader-friendly illustrations throughout the book helps to make difficult concepts easier to grasp. Basic Radiotherapy Physics and Biology is a...

  18. Basic photovoltaic principles and methods

    Hersch, P.; Zweibel, K.

    1982-02-01

    This book presents a nonmathematical explanation of the theory and design of photovoltaic (PV) solar cells and systems. The basic elements of PV are introduced: the photovoltaic effect, physical aspects of solar cell efficiency, the typical single-crystal silicon solar cell, and advances in single-crystal silicon solar cells. This is followed by the design of systems constructed from individual cells, including possible ways of putting cells together and the equipment needed for a practical producer of electrical energy. The future of PV is then discussed. (LEW)
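
    A minimal worked example of the cell-efficiency quantities such an introduction covers: fill factor and conversion efficiency computed from I-V parameters. The parameter values are illustrative, not taken from the book.

        # Fill factor and efficiency of a hypothetical silicon cell
        v_oc, i_sc = 0.60, 8.0       # open-circuit voltage (V), short-circuit current (A)
        v_mp, i_mp = 0.50, 7.5       # voltage and current at the maximum power point
        area_m2 = 0.0225             # 15 cm x 15 cm cell
        irradiance = 1000.0          # standard test irradiance, W/m^2

        p_max = v_mp * i_mp                          # 3.75 W
        fill_factor = p_max / (v_oc * i_sc)          # ~0.78
        efficiency = p_max / (irradiance * area_m2)  # ~16.7%
        print(f"FF = {fill_factor:.2f}, efficiency = {efficiency:.1%}")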

  19. Automatic computation of radiative corrections

    Fujimoto, J.; Ishikawa, T.; Shimizu, Y.; Kato, K.; Nakazawa, N.; Kaneko, T.

    1997-01-01

    Automated systems are reviewed, focusing on their general structure and the requirements specific to the calculation of radiative corrections. A detailed description of such a system and its performance is presented, taking GRACE as a concrete example. (author)

  20. Publisher Correction: On our bookshelf

    Karouzos, Marios

    2018-03-01

    In the version of this Books and Arts originally published, the book title Spectroscopy for Amateur Astronomy was incorrect; it should have read Spectroscopy for Amateur Astronomers. This has now been corrected.