WorldWideScience

Sample records for actuarial analysis

  1. Impact of actuarial assumptions on pension costs: A simulation analysis

    Science.gov (United States)

    Yusof, Shaira; Ibrahim, Rose Irnawaty

    2013-04-01

    This study investigates the sensitivity of pension costs to changes in the underlying assumptions of a hypothetical pension plan, in order to gauge the relative importance of the various actuarial assumptions via a simulation analysis. Two actuarial assumptions are considered: mortality rates and interest rates. To calculate pension costs, the Accrued Benefit Cost Method is used with the constant amount (CA) and constant percentage of salary (CS) modifications. The mortality assumptions, and the implied mortality experience of the plan, can have a significant impact on pension costs, while the interest rate assumption is inversely related to pension costs. The results have important implications for analysts of pension costs.
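
    The inverse relation between the interest-rate assumption and pension cost noted above can be illustrated with a toy present-value calculation; every parameter below (flat mortality rate, benefit, deferral period) is an illustrative assumption, not the study's calibration.

```python
def pension_cost(i, qx=0.01, benefit=10_000.0, defer=35, payments=20):
    """Present value today of an annual pension of `benefit`, deferred
    `defer` years and then paid for `payments` years, under a flat
    annual mortality rate qx and valuation interest rate i.
    All parameters are illustrative, not taken from the study."""
    v = 1.0 / (1.0 + i)
    return sum(benefit * (1.0 - qx) ** t * v ** t
               for t in range(defer, defer + payments))

# A higher valuation interest rate lowers the computed pension cost
for i in (0.03, 0.05, 0.07):
    print(f"i = {i:.0%}: cost = {pension_cost(i):,.0f}")
```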

  2. Actuarial Valuation.

    Science.gov (United States)

    Teachers Retirement System of Louisiana, Baton Rouge.

    This report presents the results of the actuarial valuation of assets and liabilities as well as funding requirements for the Teachers Retirement System of Louisiana as of June 30, 1996. Data reported include current funding, actuarial assets and valuation assets. These include the Louisiana State University Agriculture and Extension Service Fund,…

  3. Actuarial Studies

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Office of the Actuary in the Centers for Medicare and Medicaid Services (CMS) from time to time conducts studies on various aspects of the Medicare and Medicaid...

  4. An analysis of a three-factor model proposed by the Danish Society of Actuaries for forecasting and risk analysis

    DEFF Research Database (Denmark)

    Jørgensen, Peter Løchte; Slipsager, Søren Kærgaard

    2015-01-01

    This paper provides the explicit solution to the three-factor diffusion model recently proposed by the Danish Society of Actuaries to the Danish life insurance and pensions industry. The solution is obtained by use of the known general solution to multidimensional linear stochastic differential… well-known risk measures under both schemes. Finally, we conduct a sensitivity analysis and find that the relative performance of the two schemes depends on the chosen model parameter estimates…

  5. Health insurance basic actuarial models

    CERN Document Server

    Pitacco, Ermanno

    2014-01-01

    Health Insurance aims at filling a gap in the actuarial literature, attempting to resolve the frequent misunderstanding regarding both the purpose and the contents of health insurance products (and ‘protection products’ more generally) on the one hand, and the relevant actuarial structures on the other. In order to cover the basic principles of health insurance techniques, the first few chapters in this book are mainly devoted to the need for health insurance and a description of insurance products in this area (sickness insurance, accident insurance, critical illness covers, income protection, long-term care insurance, health-related benefits as riders to life insurance policies). An introduction to general actuarial and risk-management issues follows. Basic actuarial models are presented for sickness insurance and income protection (i.e. disability annuities). Several numerical examples help the reader understand the main features of pricing and reserving in the health insurance area. A short int...

  6. An actuarial approach to motor insurance rating

    OpenAIRE

    Coutts, S.M.

    1983-01-01

    This thesis describes an actuarial structure for the practical analysis of motor insurance premium rating. An underlying theme emphasises that judgements are being made taking into account many factors, e.g. economic, statistical and technical, and it is therefore necessary to bring a group of interested persons into the decision process. In addition, even though data are used to explain the proposed methods, it is the framework which is important and not the omission of some of the data e.g. imp...

  7. How Fair Is Actuarial Fairness?

    DEFF Research Database (Denmark)

    Landes, Xavier

    2015-01-01

    Insurance is pervasive in many social settings. As a cooperative device based on risk pooling, it serves to attenuate the adverse consequences of various risks (health, unemployment, natural catastrophes and so forth) by offering policyholders coverage against the losses implied by adverse events… in exchange for the payment of premiums. In the insurance industry, the concept of actuarial fairness serves to establish what could be adequate, fair premiums. Accordingly, premiums paid by policyholders should match as closely as possible their risk exposure (i.e. their expected losses). Such premiums… are the product of the probabilities of losses and the expected losses. This article presents a discussion of the fairness of actuarial fairness through three steps: (1) defining the concept based on its formulation within the insurance industry; (2) determining in which sense it may be about fairness; and (3...
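
    The definition quoted in the abstract, premiums matching each policyholder's expected losses, reduces to a one-line calculation; the numbers below are made up for illustration.

```python
def fair_premium(p_loss, loss_if_event):
    """Actuarially fair premium: the policyholder's expected loss,
    i.e. the probability of the adverse event times the loss it implies."""
    return p_loss * loss_if_event

# Premiums track individual risk exposure, the core of actuarial fairness
low_risk = fair_premium(0.01, 20_000.0)   # 200.0
high_risk = fair_premium(0.05, 20_000.0)  # 1000.0
```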

  8. Mathematical and statistical methods for actuarial sciences and finance

    CERN Document Server

    Pizzi, Claudio

    2014-01-01

    The interaction between mathematicians and statisticians has been shown to be an effective approach for dealing with actuarial, insurance and financial problems, both from an academic perspective and from an operative one. The collection of original papers presented in this volume pursues precisely this purpose. It covers a wide variety of subjects in actuarial, insurance and finance fields, all treated in the light of the successful cooperation between the above two quantitative approaches. The papers published in this volume present theoretical and methodological contributions and their applications to real contexts. With respect to the theoretical and methodological contributions, some of the considered areas of investigation are: actuarial models; alternative testing approaches; behavioral finance; clustering techniques; coherent and non-coherent risk measures; credit scoring approaches; data envelopment analysis; dynamic stochastic programming; financial contagion models; financial ratios; intelli...

  9. Mathematical and statistical methods for actuarial sciences and finance

    CERN Document Server

    Sibillo, Marilena

    2014-01-01

    The interaction between mathematicians and statisticians working in the actuarial and financial fields is producing numerous meaningful scientific results. This volume, comprising a series of four-page papers, gathers new ideas relating to mathematical and statistical methods in the actuarial sciences and finance. The book covers a variety of topics of interest from both theoretical and applied perspectives, including: actuarial models; alternative testing approaches; behavioral finance; clustering techniques; coherent and non-coherent risk measures; credit-scoring approaches; data envelopment analysis; dynamic stochastic programming; financial contagion models; financial ratios; intelligent financial trading systems; mixture normality approaches; Monte Carlo-based methodologies; multicriteria methods; nonlinear parameter estimation techniques; nonlinear threshold models; particle swarm optimization; performance measures; portfolio optimization; pricing methods for structured and non-structured derivatives; r...

  10. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, it deals with life insurance, where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's laws.
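
    The two nineteenth-century mortality laws mentioned here have closed-form survival functions; a minimal sketch with illustrative, unfitted parameters:

```python
import math

def makeham_survival(x, A=0.0007, B=5e-5, c=1.09):
    """Probability that a newborn survives to age x under Makeham's law,
    whose force of mortality is mu(age) = A + B * c**age; setting A = 0
    recovers Gompertz' law. Parameters here are illustrative, not fitted
    to any real table."""
    return math.exp(-A * x - B * (c ** x - 1.0) / math.log(c))

print(makeham_survival(65))  # survival probability from birth to age 65
```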

  11. Actuarial statistics with generalized linear mixed models

    NARCIS (Netherlands)

    K. Antonio; J. Beirlant

    2007-01-01

    Over the last decade the use of generalized linear models (GLMs) in actuarial statistics has received a lot of attention, starting from the actuarial illustrations in the standard text by McCullagh and Nelder [McCullagh, P., Nelder, J.A., 1989. Generalized linear models. In: Monographs on Statistics

  12. Developing an Actuarial Track Utilizing Existing Resources

    Science.gov (United States)

    Rodgers, Kathy V.; Sarol, Yalçin

    2014-01-01

    Students earning a degree in mathematics often seek information on how to apply their mathematical knowledge. One option is to follow a curriculum with an actuarial emphasis designed to prepare students as an applied mathematician in the actuarial field. By developing only two new courses and utilizing existing courses for Validation by…

  13. Modern actuarial risk theory: using R

    NARCIS (Netherlands)

    R. Kaas; M. Goovaerts; J. Dhaene; M. Denuit

    2008-01-01

    Modern Actuarial Risk Theory -- Using R contains what every actuary needs to know about non-life insurance mathematics. It starts with standard material such as utility theory, the individual and collective risk models and basic ruin theory. Other topics are risk measures and premium principles, bonus-malus
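
    The basic ruin theory the book starts from can be illustrated by simulating a discrete-time risk process. Everything below, including the Poisson sampler and all parameters, is a toy sketch under assumed dynamics, not the book's code.

```python
import math
import random

def _poisson(rng, lam):
    """Knuth's method for sampling a Poisson(lam) claim count."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def ruin_probability(u, premium, lam, mean_claim, years=20, n_sims=5_000, seed=7):
    """Monte Carlo estimate of the probability that surplus drops below
    zero within `years`, for a process starting at capital u that
    collects `premium` per year and pays Poisson(lam) claims per year
    with exponential(mean_claim) sizes (illustrative parameters)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_sims):
        surplus = float(u)
        for _ in range(years):
            surplus += premium
            for _ in range(_poisson(rng, lam)):
                surplus -= rng.expovariate(1.0 / mean_claim)
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_sims
```

    As expected, more initial capital lowers the estimated ruin probability, e.g. compare `ruin_probability(0, 11, 1.0, 10)` with `ruin_probability(50, 11, 1.0, 10)`.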

  14. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  15. Actuarial Analysis of Social Security Pension Fund Balance

    Institute of Scientific and Technical Information of China (English)

    郭翠; 柳向东; 王蕙

    2011-01-01

    The balance of the social security pension fund is a topical issue. On the basis of the partially funded system, the social pooling account and the individual account are studied separately. Taking China's current actual situation as the standard, and using survival theory and cash-flow methods, the existing social security model is modified and a new actuarial model of the social security pension balance is established. Under this model, the pension balance and the fund gap are discussed.

  16. Gender and Extended Actuarial Functions in Pension Insurance

    Directory of Open Access Journals (Sweden)

    Jana Špirková

    2012-12-01

    This paper brings an analysis of the impact of a ban on the use of gender in insurance, with special stress on pension annuities, according to the requirements of the European Court of Justice. The paper gives a state-of-the-art overview of known and extended actuarial functions which relate to modeling the premium of endowment, term life insurance and pension annuities. Moreover, the amounts of pension annuities payable m-thly per year in a model of the third-pillar pension are modeled and analyzed for different interest rates, using life tables for both genders and unisex.

  17. Actuarial pricing of energy efficiency projects: lessons foul and fair

    International Nuclear Information System (INIS)

    Recent market convulsions in the energy industry have generated a plethora of post-mortem analyses on a wide range of issues, including accounting rules, corporate governance, commodity markets, and energy policy. While most of these analyses have focused on business practices related to wholesale energy trading, there has been limited analysis of retail energy services, particularly energy efficiency projects. We suggest that there were several business concepts and strategies in the energy efficiency arena whose inherent value may have been masked by the larger failure of companies such as Enron. In this paper, we describe one such concept, namely, actuarial pricing of energy efficiency projects, which leverages a portfolio-based approach to risk management. First, we discuss the business drivers, contrasting this approach with conventional industry practice. We then describe the implementation of this approach, including an actuarial database, pricing curves, and a pricing process compatible with commodity pricing. We conclude with a discussion of the prospects and barriers for the further development of transparent and quantifiable risk management products for energy efficiency, a prerequisite for developing energy efficiency as a tradeable commodity. We address these issues from an experiential standpoint, drawing mostly on our experience in developing and implementing such strategies at Enron

  18. Actuarial models of life insurance with stochastic interest rate

    Science.gov (United States)

    Wei, Xiang; Hu, Ping

    2009-07-01

    On the basis of the general actuarial model of life insurance, this article studies continuous life insurance actuarial models under stochastic interest rates. It provides the net single premium for life insurance and for a life annuity-due over a period, under de Moivre's law of mortality and Makeham's law of mortality respectively.
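
    Under de Moivre's law the net single premium has a simple closed form, since death in each future year is equally likely. The sketch below uses a deterministic interest rate for clarity (the paper itself treats stochastic rates); the limiting age is an assumption.

```python
def term_insurance_nsp(x, n, i, omega=100):
    """Net single premium of an n-year term insurance of 1, payable at
    the end of the year of death, for a life aged x under de Moivre's
    law: deaths are uniform up to limiting age omega, so the probability
    of dying in year k+1 is 1/(omega - x) for each k < omega - x."""
    v = 1.0 / (1.0 + i)
    q = 1.0 / (omega - x)
    return sum(q * v ** (k + 1) for k in range(min(n, omega - x)))

# With no discounting, whole-life cover (n = omega - x) costs exactly 1
print(term_insurance_nsp(30, 70, 0.0))  # 1.0 up to float rounding
```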

  19. Actuarial survival of a large Canadian cohort of preterm infants

    Directory of Open Access Journals (Sweden)

    Ohlsson Arne

    2005-11-01

    Background The increased survival of preterm and very low birth weight infants in recent years has been well documented, but continued surveillance is required in order to monitor the effects of new therapeutic interventions. Gestation and birth weight specific survival rates most accurately reflect the outcome of perinatal care. Our aims were to determine survival to discharge for a large Canadian cohort of preterm infants admitted to the neonatal intensive care unit (NICU), and to examine the effect of gender on survival and the effect of increasing postnatal age on predicted survival. Methods Outcomes for all 19,507 infants admitted to 17 NICUs throughout Canada between January 1996 and October 1997 were collected prospectively. Babies with congenital anomalies were excluded from the study population. Gestation and birth weight specific survival was calculated for all infants with birth weight … Results Survival to discharge at 24 weeks gestation was 54%, compared to 82% at 26 weeks and 95% at 30 weeks. In infants with birth weights 600–699 g, survival to discharge was 62%, compared to 79% at 700–799 g and 96% at 1,000–1,099 g. In infants born at 24 weeks gestational age, survival was higher in females, but there were no significant gender differences above 24 weeks gestation. Actuarial analysis showed that the risk of death was highest in the first 5 days. For infants born at 24 weeks gestation, estimated survival probabilities to 48 hours, 7 days and 4 weeks were 88% (CI 84, 92%), 70% (CI 64, 76%) and 60% (CI 53, 66%) respectively. For smaller birth weights, female survival probabilities were higher than males' for the first 40 days of life. Conclusion Actuarial analysis provides useful information when counseling parents and highlights the importance of frequently revising the prediction of long-term survival, particularly after the first few days of life.
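
    The actuarial (life-table) estimator behind this kind of survival analysis can be sketched in a few lines. The half-interval treatment of withdrawals is the classical actuarial convention; the cohort numbers below are invented for illustration.

```python
def life_table_survival(n0, intervals):
    """Actuarial survival curve for a starting cohort of n0 subjects.
    `intervals` lists (deaths, withdrawals) per time interval; subjects
    withdrawn during an interval count as at risk for half of it."""
    surv, n, curve = 1.0, float(n0), []
    for deaths, withdrawals in intervals:
        n_eff = n - withdrawals / 2.0    # effective number at risk
        surv *= 1.0 - deaths / n_eff     # conditional survival this interval
        curve.append(surv)
        n -= deaths + withdrawals
    return curve

# Illustrative cohort: cumulative survival after each of three intervals
print(life_table_survival(100, [(10, 0), (9, 0), (8, 10)]))
```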

  20. 1st International Congress on Actuarial Science and Quantitative Finance

    CERN Document Server

    Garrido, José; Hernández-Hernández, Daniel; ICASQF

    2015-01-01

    Featuring contributions from industry and academia, this volume includes chapters covering a diverse range of theoretical and empirical aspects of actuarial science and quantitative finance, including portfolio management, derivative valuation, risk theory and the economics of insurance. Developed from the First International Congress on Actuarial Science and Quantitative Finance, held at the Universidad Nacional de Colombia in Bogotá in June 2014, this volume highlights different approaches to issues arising from industries in the Andean and Caribbean regions. Contributions address topics such as reverse mortgage schemes and urban dynamics, modeling spot price dynamics in the electricity market, and optimizing calibration and pricing with SABR models.

  1. The Role of an Actuarial Director in the Development of an Introductory Program

    Science.gov (United States)

    Staples, Susan G.

    2014-01-01

    We describe the roles and duties of a director in developing an introductory actuarial program. Degree plan design, specialized exam courses, internship classes, coordination of efforts with Economics and Finance Departments, opportunities for creating a minor in actuarial mathematics, actuarial clubs, career advice, and interaction with actuarial…

  2. 77 FR 63337 - Renewal of Charter of Advisory Committee on Actuarial Examinations

    Science.gov (United States)

    2012-10-16

    ... Joint Board for the Enrollment of Actuaries announces the renewal of the charter of the Advisory... From the Federal Register Online via the Government Publishing Office JOINT BOARD FOR THE ENROLLMENT OF ACTUARIES Renewal of Charter of Advisory Committee on Actuarial Examinations AGENCY:...

  3. Actuarial Modeling of Life Insurance Using Decrement Models

    Directory of Open Access Journals (Sweden)

    Luptáková Iveta Dirgová

    2014-07-01

    The aim of this paper is to elucidate decrement models and their use in actuarial calculations in life insurance. The first part deals with the most often used decrement model, the mortality table. The second part gives an example, based on a simple model, illustrating the creation of a multiple-decrement table from single-decrement tables for a group of decrements, and its use in insurance mathematical calculations.

  4. Optimization of the Actuarial Model of Defined Contribution Pension Plan

    Directory of Open Access Journals (Sweden)

    Yan Li

    2014-01-01

    The paper focuses on the actuarial models of defined contribution pension plans. Through assumptions and calculations, the expected replacement ratios of three different defined contribution pension plans are compared. In particular, the more significant factors are put forward in the further cost and risk analyses. In order to assess the current status, the paper finds a relationship between the replacement ratio and the pension investment rate using econometric methods. Based on an appropriate investment rate of 6%, an expected replacement ratio of 20% is reached.
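
    The link between the investment rate and the expected replacement ratio can be sketched with a deterministic accumulation model; every parameter below is hypothetical, not the paper's calibration.

```python
def replacement_ratio(contrib_rate, salary_growth, invest_rate,
                      years, annuity_factor):
    """Replacement ratio of a defined contribution plan: contributions
    of contrib_rate * salary accumulate at invest_rate; at retirement
    the fund buys a pension via annuity_factor (price of 1 per year);
    the ratio is the first pension over the final salary."""
    salary, fund = 1.0, 0.0
    for _ in range(years):
        fund = fund * (1.0 + invest_rate) + contrib_rate * salary
        salary *= 1.0 + salary_growth
    final_salary = salary / (1.0 + salary_growth)  # last salary actually earned
    return (fund / annuity_factor) / final_salary

# A higher investment rate raises the expected replacement ratio
print(replacement_ratio(0.08, 0.03, 0.06, 35, 15.0))
```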

  5. Of pacemakers and statistics: the actuarial method extended.

    Science.gov (United States)

    Dussel, J; Wolbarst, A B; Scott-Millar, R N; Obel, I W

    1980-01-01

    Pacemakers cease functioning because of either natural battery exhaustion (nbe) or component failure (cf). A study of four series of pacemakers shows that a simple extension of the actuarial method, so as to incorporate Normal statistics, makes possible a quantitative differentiation between the two modes of failure. This involves the separation of the overall failure probability density function PDF(t) into constituent parts pdfnbe(t) and pdfcf(t). The approach should allow a meaningful comparison of the characteristics of different pacemaker types. PMID:6160497
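
    The separation the abstract describes, an overall failure density split into a natural-battery-exhaustion part and a component-failure part, can be sketched as a two-component mixture. The Normal form for battery exhaustion follows the abstract; the exponential form for component failure and every parameter are assumptions for illustration, not the paper's fit.

```python
import math

def failure_pdf(t, p_cf=0.15, cf_rate=0.05, nbe_mean=8.0, nbe_sd=1.0):
    """Mixture failure density PDF(t) = p_cf*pdf_cf(t) + (1-p_cf)*pdf_nbe(t):
    component failure (cf) modelled with a constant hazard (exponential),
    natural battery exhaustion (nbe) as a Normal about the design life.
    Units of t: years since implantation (illustrative)."""
    pdf_cf = cf_rate * math.exp(-cf_rate * t)
    z = (t - nbe_mean) / nbe_sd
    pdf_nbe = math.exp(-0.5 * z * z) / (nbe_sd * math.sqrt(2.0 * math.pi))
    return p_cf * pdf_cf + (1.0 - p_cf) * pdf_nbe
```

    With these parameters the density peaks near the battery design life, with a small early-failure contribution from the component-failure mode.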

  6. 75 FR 22754 - Federal Advisory Committee; Department of Defense Board of Actuaries; Charter Renewal

    Science.gov (United States)

    2010-04-30

    ... of the Secretary Federal Advisory Committee; Department of Defense Board of Actuaries; Charter... Defense gives notice that it is renewing the charter for the Department of Defense Board of Actuaries (hereafter referred to as the Board). FOR FURTHER INFORMATION CONTACT: Jim Freeman, Deputy Advisory...

  7. 42 CFR 457.431 - Actuarial report for benchmark-equivalent coverage.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Actuarial report for benchmark-equivalent coverage... GRANTS TO STATES State Plan Requirements: Coverage and Benefits § 457.431 Actuarial report for benchmark-equivalent coverage. (a) To obtain approval for benchmark-equivalent health benefits coverage described...

  8. 42 CFR 440.340 - Actuarial report for benchmark-equivalent coverage.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Actuarial report for benchmark-equivalent coverage... AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS SERVICES: GENERAL PROVISIONS Benchmark Benefit and Benchmark-Equivalent Coverage § 440.340 Actuarial report for benchmark-equivalent coverage....

  9. A Comparison of Logistic Regression, Neural Networks, and Classification Trees Predicting Success of Actuarial Students

    Science.gov (United States)

    Schumacher, Phyllis; Olinsky, Alan; Quinn, John; Smith, Richard

    2010-01-01

    The authors extended previous research by 2 of the authors who conducted a study designed to predict the successful completion of students enrolled in an actuarial program. They used logistic regression to determine the probability of an actuarial student graduating in the major or dropping out. They compared the results of this study with those…

  10. 75 FR 6360 - Federal Advisory Committee; DoD Medicare-Eligible Retiree Health Care Board of Actuaries

    Science.gov (United States)

    2010-02-09

    ... used in the valuation of benefits under DoD retiree health care programs for Medicare-eligible beneficiaries. Agenda --Meeting objective (Board) Approve actuarial assumptions and methods needed for...) --September 30, 2008 Actuarial Valuation Results (DoD Office of the Actuary) --September 30, 2009...

  11. Mating Reverses Actuarial Aging in Female Queensland Fruit Flies.

    Directory of Open Access Journals (Sweden)

    Sarsha Yap

    Animals that have a long pre-reproductive adult stage often employ mechanisms that minimize aging over this period in order to preserve reproductive lifespan. In a remarkable exception, one tephritid fruit fly exhibits substantial pre-reproductive aging but then mitigates this aging during a diet-dependent transition to the reproductive stage, after which life expectancy matches that of newly emerged flies. Here, we ascertain the role of nutrients, sexual maturation and mating in the mitigation of previous aging in female Queensland fruit flies. Flies were provided one of three diets: 'sugar', 'essential', or 'yeast-sugar'. The essential diet contained sugar and micronutrients found in yeast but lacked maturation-enabling protein. At days 20 and 30, a subset of flies on the sugar diet were switched to the essential or yeast-sugar diet, and some yeast-sugar fed flies were mated 10 days later. Complete mitigation of actuarial aging was only observed in flies that were switched to a yeast-sugar diet and mated, indicating that mating is key. Identifying the physiological processes associated with mating promises novel insights into repair mechanisms for aging.

  12. Modelización financiero-actuarial de un seguro de dependencia = Long Term Care Insurance Actuarial Model

    Directory of Open Access Journals (Sweden)

    Herranz Peinado, Patricia

    2008-01-01

    Spain has followed the trend of other countries regarding coverage for dependent persons, that is, those who need help to perform the basic tasks of daily living, through the approval of the Ley de Promoción de la Autonomía Personal y Atención a las personas en situación de Dependencia, which is based on public financing. Despite the efforts made to develop this law, it is not producing the expected results, and private coverage products are needed to meet the demands of those who require them. Given the few studies that still exist on this subject in Spain, establishing an approximation to the premiums of private long-term care insurance can serve as a reference for insurers analysing its commercialization. This work addresses a series of questions responding to two goals: first, to establish a financial-actuarial model to support the design of private products covering long-term care; and second, to approximate the actuarial technical bases leading to the quantification of the premiums.

  13. 78 FR 773 - Hartford Financial Services Group, Inc., Commercial/Actuarial/Information Delivery Services (IDS...

    Science.gov (United States)

    2013-01-04

    .../ Information Delivery Services (IDS)/Corporate & Financial Reporting Group, Hartford, CT; Notice of Affirmative... workers of Hartford Financial Services Group, Inc., Commercial/ Actuarial/Information Delivery Services... supply of financial services. Specifically, the workers provide business and ] information...

  14. Formulating a stochastic discounting model with actuarial and risk management applications

    OpenAIRE

    Constantinos T. Artikis

    2012-01-01

    Stochastic discounting models are generally recognized as extremely strong analytical tools for a very wide variety of fundamental areas in the actuarial discipline. The paper is mainly devoted to the formulation, investigation and application in the actuarial discipline of a stochastic discounting model. It is shown that the formulated stochastic discounting model can substantially support the role of proactivity in making insurance decisions. JEL Classification: C51 Keywords: Stocha...

  15. Youth Actuarial Risk Assessment Tool (Y-ARAT): The development of an actuarial risk assessment instrument for predicting general offense recidivism on the basis of police records

    NARCIS (Netherlands)

    C.E. van der Put

    2013-01-01

    Estimating the risk for recidivism is important for many areas of the criminal justice system. In the present study, the Youth Actuarial Risk Assessment Tool (Y-ARAT) was developed for juvenile offenders based solely on police records, with the aim to estimate the risk of general recidivism among la

  16. The lifecontingencies Package: Performing Financial and Actuarial Mathematics Calculations in R

    Directory of Open Access Journals (Sweden)

    Giorgio Alfredo Spedicato

    2013-11-01

    It is possible to model life contingency insurances with the lifecontingencies R package, which is capable of performing financial and actuarial mathematics calculations. Its functions permit one to determine both the expected value and the stochastic distribution of insured benefits. Therefore, life insurance coverage can be priced and portfolios' risk-based capital requirements can be assessed. This paper briefly summarizes the theory regarding life contingencies that is based on financial mathematics and demographic concepts. Then, with the aid of applied examples, it shows how the lifecontingencies package can be a useful tool for executing routine, deterministic, or stochastic calculations for life-contingencies actuarial mathematics.
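
    The kind of expected-present-value calculation the package automates can be written directly from a life-table survivorship column; the toy lx below is invented for illustration, not a real table.

```python
def annuity_due_epv(lx, x, i):
    """Expected present value of a whole-life annuity-due of 1 per year
    for a life aged x, computed from a survivorship column lx, where
    lx[a] is the number of survivors at age a."""
    v = 1.0 / (1.0 + i)
    return sum((lx[x + k] / lx[x]) * v ** k for k in range(len(lx) - x))

toy_lx = [1000, 900, 750, 500, 200, 0]   # invented survivorship column
print(annuity_due_epv(toy_lx, 0, 0.05))
```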

  17. 76 FR 67572 - Medicare Program; Medicare Part B Monthly Actuarial Rates, Premium Rate, and Annual Deductible...

    Science.gov (United States)

    2011-11-01

    ... the Medicare Supplementary Medical Insurance (SMI) program beginning January 1, 2012. In addition... account of the SMI trust. For 2012, the total of these brand-name drug fees will be $2.8 billion. The..., rehabilitation and psychiatric hospitals, etc. TABLE 5--Actuarial Status of the Part B Account in the SMI...

  18. 75 FR 68790 - Medicare Program; Medicare Part B Monthly Actuarial Rates, Premium Rate, and Annual Deductible...

    Science.gov (United States)

    2010-11-09

    ... the Medicare Supplementary Medical Insurance (SMI) program beginning January 1, 2011. In addition... importers of brand-name prescription drugs will pay a fee that is allocated to the Part B account of the SMI...--Actuarial Status of the Part B Account in the SMI Trust Fund Under Three Sets of Assumptions for...

  19. 78 FR 64943 - Medicare Program; Medicare Part B Monthly Actuarial Rates, Premium Rate, and Annual Deductible...

    Science.gov (United States)

    2013-10-30

    ... the Medicare Supplementary Medical Insurance (SMI) program beginning January 1, 2014. In addition... have paid a fee that is allocated to the Part B account of the SMI trust. For 2014, the total of these... hospitals, etc. Table 5--Actuarial Status of the Part B Account in the SMI Trust Fund Under Three Sets...

  20. 77 FR 69850 - Medicare Program; Medicare Part B Monthly Actuarial Rates, Premium Rate, and Annual Deductible...

    Science.gov (United States)

    2012-11-21

    ... the Medicare Supplementary Medical ] Insurance (SMI) program beginning January 1, 2013. In addition... account of the SMI trust. For 2013, the total of these brand-name drug fees is estimated to be $2.7...--Actuarial Status of the Part B Account in the SMI Trust Fund Under Three Sets of Assumptions for...

  1. 5 CFR 839.1119 - How is the actuarial reduction for TSP computed?

    Science.gov (United States)

    2010-01-01

    ... computed? 839.1119 Section 839.1119 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... Government contributions and earnings on those Government contributions forms the basis for the actuarial reduction. OPM will divide the Government contributions and earnings by the present value factor for...

  2. The longevity risk of the Dutch Actuarial Association’s projection model

    NARCIS (Netherlands)

    F. Peters (Frederick); W.J. Nusselder (Wilma); J.P. Mackenbach (Johan)

    2012-01-01

    Accurate assessment of the risk that arises from further increases in life expectancy is crucial for the financial sector, in particular for pension funds and life insurance companies. The Dutch Actuarial Association presented a revised projection model in 2010, while in the same yea

  3. Academic Attributes of College Freshmen that Lead to Success in Actuarial Studies in a Business College

    Science.gov (United States)

    Smith, Richard Manning; Schumacher, Phyllis

    2006-01-01

    The authors studied beginning undergraduate actuarial concentrators in a business college. They identified four variables (math Scholastic Aptitude Test [SAT] score, verbal SAT score, percentile rank in high school graduating class, and percentage score on a college mathematics placement exam) that were available for entering college students that…

  4. Actuarial calculation for PSAK-24 purposes post-employment benefit using market-consistent approach

    Science.gov (United States)

    Effendie, Adhitya Ronnie

    2015-12-01

    In this paper we use a market-consistent approach to calculate the present value of obligation of a company's post-employment benefit in accordance with PSAK-24 (the Indonesian accounting standard). We set actuarial assumptions such as the Indonesian TMI 2011 mortality tables for mortality, an accumulated salary function for wages, a disability assumption scaled to mortality, and a pre-defined turnover rate for termination. For the economic assumption, we use a binomial tree method with an estimated discount rate as its average movement. In accordance with PSAK-24, the Projected Unit Credit method is adapted to determine the present value of the obligation (actuarial liability), so we use this method with a modification in its discount function.
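As a rough illustration of the Projected Unit Credit idea the abstract describes — attributing benefit to past service, then discounting with interest and survival — here is a minimal single-decrement sketch. The function shape and all numbers are simplifications for illustration, not the paper's model.

```python
def puc_obligation(accrued_benefit: float, years_to_retirement: int,
                   discount_rate: float, survival_prob: float) -> float:
    """Present value of the benefit attributed to past service under a
    Projected Unit Credit sketch: discount to today and weight by the
    probability of remaining in service until retirement."""
    discount = (1.0 + discount_rate) ** years_to_retirement
    return accrued_benefit * survival_prob / discount

# Hypothetical inputs: a benefit of 120,000 payable in 10 years,
# a 6% discount rate, and a 90% probability of reaching retirement.
pv = puc_obligation(120_000.0, 10, 0.06, 0.90)
```

Lowering the discount rate raises the obligation, which is why the paper's treatment of the discount function matters.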

  5. Alice in actuarial-land: through the looking glass of changing Static-99 norms.

    Science.gov (United States)

    Sreenivasan, Shoba; Weinberger, Linda E; Frances, Allen; Cusworth-Walker, Sarah

    2010-01-01

    The Static-99, an actuarial rating method, is employed to conduct sexual violence risk assessment in legal contexts. The proponents of the Static-99 dismiss clinical judgment as not empirical. Two elements must be present to apply an actuarial risk model to a specific individual: sample representativeness and uniform measurement of outcome. This review demonstrates that both of these elements are lacking in the normative studies of the Static-99 and its revised version, the Static-99R. Studies conducted since the publication of the Static-99 have not replicated the original norms. Sexual recidivism rates for the same Static-99 score vary widely, from low to high, depending on the sample used. A hypothetical case example is presented to illustrate how the solitary application of the Static-99 or Static-99R recidivism rates to the exclusion of salient clinical factors for identifying sexual dangerousness can have serious consequences for public safety. PMID:20852227

  6. Fair value: actuarial accounting for the markets... or for the accountants?

    OpenAIRE

    Jerman, Lambert

    2013-01-01

    Fair value accounting under IAS-IFRSs is often presented as market accounting that results from expression of the financial requirements of business management and accounting practice. By showing that fair value has the features of actuarial accounting, and is the product of a conceptual shift made necessary by the contemporary context and thus in dissonance with certain aspects of current accounting practice, this article demonstrates that fair value accounting in fact represents an opportun...

  7. Actuarial modeling of cost of voluntary pension insurance of the population of the region

    Directory of Open Access Journals (Sweden)

    Mikhailova Svetlana Sergeevna

    2013-06-01

    Full Text Available In article approach to determination of net value of the contract of pension insurance for the man's and female population, considering regional demographic features is offered. Results of actuarial calculation of the size pure net - rates of individual pension insurance are presented, "sensitivity" of cost of insurance is defined by methods of statistical modeling to key parameters of a pension product for the region population.

  8. ESTIMATION OF ACTUARIAL LOSS FUNCTIONS AND THE TAIL INDEX USING TRANSFORMATIONS IN KERNEL DENSITY ESTIMATION

    OpenAIRE

    Montserrat Guillen; Jens Perch Nielsen; Catalina Bolance

    2000-01-01

    In this paper we concentrate on the estimation of loss functions using nonparametric methods. We focus on the parametric transformation approach to kernel smoothing introduced by Wand, Marron and Ruppert (1991) and compare it with the standard kernel estimator and the multiplicative bias correction method (Hjort and Glad, 1995 and Jones, Linton and Nielsen, 1995). We advocate in this paper that the transformation method behaves excellently when it comes to estimating actuarial and financial l...

  9. ActuArial Accounting – A Branch of the Financial Accounting

    OpenAIRE

    Gheorghe V. Lepadatu; Doina Maria Tilea

    2010-01-01

    The opening of accounting toward actuarial calculation is a natural consequence of its evolving spirit. At the origin of the international accounting standards lies the framework for preparing, presenting and disclosing the financial statements. The framework stands as a reference matrix, as a standard of standards, as a constitution of financial accounting. According to the international framework, the financial statements use different evaluation bases: the historical cost, the current c...

  10. El balance actuarial como indicador de la solvencia del sistema de reparto

    OpenAIRE

    María del Carmen Boado Penas; Carlos Vidal- Meliá

    2008-01-01

    The aim of this work is twofold: on the one hand, to demonstrate the utility of the actuarial balance as an element of transparency, an indicator of the solvency, sustainability and financial solidity of the pay-as-you-go system and a tool capable of providing positive incentives to improve the financial management of the system, eliminating or at least reducing the traditional divergence between the planning horizon of the politicians and that of the system itself; and on the other, to make ...

  11. THE EVOLUTION AND FUTURE OF SOCIAL SECURITY IN AFRICA: AN ACTUARIAL PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    Fatima Badat

    2015-09-01

    Full Text Available Social Security in most African countries has evolved significantly in terms of perspectives, motives, governance as well as innovation of benefits and administration. African countries are slowly, one by one, beginning to reassess the role of social security in correcting several social ills. Empowerment programs and grants are increasingly being provided via social security to women and the youth. From the roots of social security, even very low income countries, some of which have recently experienced several years of civil war and extreme economic hardships, have begun to improve benefit structures and amounts, which include national medical benefits. The attention being provided to social security and how it fits into a nation’s plans to lift itself out of poverty is increasingly involving the actuarial profession from international organisations such as ILO and ISSA as well as consulting actuaries and academics. Assessing and ensuring sustainability of social security benefits requires actuarial valuations to take long-term consequences involving demographic changes into account in the face of providing the benefits in the short term; asset liability modelling to ensure adequate resources are held; ensuring that results are appropriately reported and communicated to key stakeholders; as well as developing long-term strategic plans and dynamic systems surrounding all of these issues. In this paper, the role of actuaries is brought to the centre of the increasingly changing face and evolving culture of social security in taking Africa closer to poverty alleviation.

  12. Optimization of Actuarial Model for Individual Account of Rural Social Pension Insurance

    Institute of Scientific and Technical Information of China (English)

    Wenxian; CAO

    2013-01-01

    This paper first analyzes different payment methods for the individual account and the pension replacement rate under each payment method. Results show that it would be more scientific and reasonable for the individual account of the new rural social pension insurance to adopt an actuarial model in which contributions are paid as a proportion of income and benefits are periodic payments of variable amount. The Guiding Opinions on New Rural Social Pension Insurance set forth that the individual account should be paid at a fixed amount, with the insured voluntarily selecting a payment level from the criteria set by the State, and that the monthly pension is the total amount of the individual account divided by 139. Therefore, reform should start from the continuation of current policy, adjusting the payment level in line with the growth of rural residents' per capita net income; when conditions permit, a transition to contributions proportional to income and periodic payments of variable amount can be realized.

  13. NEW METHOD TO OPTION PRICING FOR THE GENERAL BLACK-SCHOLES MODEL-AN ACTUARIAL APPROACH

    Institute of Scientific and Technical Information of China (English)

    闫海峰; 刘三阳

    2003-01-01

    Using the physical probability measure of the price process and the principle of fair premium, the results of Mogens Bladt and Tina Hviid Rydberg are generalized. In the two cases of paying intermediate dividends and no intermediate dividends, the Black-Scholes model is generalized to the case where the risk-less asset (bond or bank account) earns a time-dependent interest rate and the risky asset (stock) has a time-dependent continuously compounding expected rate of return and volatility. In these cases the exact pricing formula and put-call parity of the European option are obtained. A general approach to option pricing is given for the general Black-Scholes setting in which the risky asset (stock) has a continuously compounding expected rate of return and volatility. The exact pricing formula and put-call parity of a European option on a stock whose price process is driven by a general Ornstein-Uhlenbeck (O-U) process are given by the actuarial approach.
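For reference, the constant-coefficient Black-Scholes formulas that the paper generalizes, together with the put-call parity it mentions, can be written as a short sketch. These are the standard textbook formulas, not the paper's time-dependent or O-U extension.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """European call under a constant rate r and volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def bs_put(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """European put, priced directly (not via parity) so that
    put-call parity can be checked numerically."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)
```

Put-call parity, C - P = S - K·e^(-rT), then holds to floating-point accuracy.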

  14. Ruin Probability and Joint Distributions of Some Actuarial Random Vectors in the Compound Pascal Model

    Institute of Scientific and Technical Information of China (English)

    Xian-min Geng; Shu-chen Wan

    2011-01-01

    The compound negative binomial model introduced in this paper is a discrete-time version. We discuss the Markov properties of the surplus process, and study the ruin probability and the joint distributions of actuarial random vectors in this model. By the strong Markov property and the mass function of a defective renewal sequence, we obtain explicit expressions for the ruin probability, the finite-horizon ruin probability, the joint distributions of T, U(T-1), |U(T)| and inf_{0≤n<T₁} U(n) (i.e., the time of ruin, the surplus immediately before ruin, the deficit at ruin, and the maximal deficit from ruin to recovery), and the distributions of some actuarial random vectors.
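The finite-horizon ruin probability in a discrete-time surplus model of this kind can also be estimated by simulation. The sketch below uses a geometric claim count and exponential claim sizes as an illustrative stand-in for the compound negative binomial model; all parameters are hypothetical.

```python
import random

def finite_horizon_ruin_prob(u: float = 10.0, premium: float = 2.5,
                             p: float = 0.5, claim_mean: float = 2.0,
                             horizon: int = 100, n_paths: int = 5_000,
                             seed: int = 42) -> float:
    """Monte Carlo estimate of P(ruin before `horizon`) for the surplus
    process U(n) = u + premium*n - aggregate claims up to period n.
    Each period brings a geometric(p) number of claims (count may be 0),
    each of size Exp(mean=claim_mean)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        for _ in range(horizon):
            surplus += premium
            n_claims = 0
            while rng.random() > p:   # geometric count starting at 0
                n_claims += 1
            for _ in range(n_claims):
                surplus -= rng.expovariate(1.0 / claim_mean)
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths
```

With these defaults the expected claim outgo per period is (1-p)/p · claim_mean = 2.0, below the premium of 2.5, so the safety loading is positive and ruin is not certain.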

  15. If You Build It, Will They Come? Tales of Developing a New Degree Program in Actuarial Science

    Science.gov (United States)

    Marano, Lisa E.

    2014-01-01

    In 2007, the B.S. in Applied Mathematics program consisting of five concentrations, including Actuarial Science, began at West Chester University of Pennsylvania, and we graduated our first class (of one) that December. We describe our program, some ideas to consider when planning your own program, and share some of the successes of our program…

  16. Accuracy of actuarial procedures for assessment of sexual offender recidivism risk may vary across ethnicity.

    Science.gov (United States)

    Långström, Niklas

    2004-04-01

    Little is known about whether the accuracy of tools for assessment of sexual offender recidivism risk holds across ethnic minority offenders. I investigated the predictive validity across ethnicity for the RRASOR and the Static-99 actuarial risk assessment procedures in a national cohort of all adult male sex offenders released from prison in Sweden 1993-1997. Subjects ordered out of Sweden upon release from prison were excluded and the remaining subjects (N = 1303) were divided into three subgroups based on citizenship. Eighty-three percent of the subjects were of Nordic ethnicity, and non-Nordic citizens were either of non-Nordic European (n = 49, hereafter called European) or African/Asian descent (n = 128). The two tools were equally accurate among Nordic and European sexual offenders for the prediction of any sexual and any violent nonsexual recidivism. In contrast, neither measure could differentiate African/Asian sexual or violent recidivists from nonrecidivists. Compared to European offenders, African/Asian offenders had more often sexually victimized a nonrelative or stranger, had higher Static-99 scores, were younger, more often single, and more often homeless. The results require replication, but suggest that the promising predictive validity seen with some risk assessment tools may not generalize across offender ethnicity or migration status. More speculatively, different risk factors or causal chains might be involved in the development or persistence of offending among minority or immigrant sexual abusers. PMID:15208896

  17. El tratamiento actuarial de los periodos de carencia y el contraseguro de primas en el seguro de dependencia.

    Directory of Open Access Journals (Sweden)

    Ricote Gil, Fernando.

    2003-01-01

    Full Text Available Dependency is the state of persons who, for reasons linked to the lack or loss of physical, psychic or intellectual autonomy, need significant assistance and/or care in order to carry out the ordinary activities of daily life. Within private initiative, the insurance sector plays an important role in providing the guarantees derived from this coverage. A fundamental aspect in the study of this coverage is the consideration of waiting periods and the treatment of the counter-insurance of premiums during them. The actuarial risk and its treatment in this coverage are analyzed, together with its effect on the actuarial valuation of premiums for dependency benefits.

  18. Prediction of Basic Math Course Failure Rate in the Physics, Meteorology, Mathematics, Actuarial Sciences and Pharmacy Degree Programs

    Directory of Open Access Journals (Sweden)

    Luis Rojas-Torres

    2014-09-01

    Full Text Available This paper summarizes a study conducted in 2013 to predict the failure rate of math courses taken by Pharmacy, Mathematics, Actuarial Science, Physics and Meteorology students at Universidad de Costa Rica (UCR). Using logistic regression applied to the 2010 cohort, failure rates were predicted for students in the aforementioned programs in one of their introductory math courses (Calculus 101 for Physics and Meteorology, Math Principles for Mathematics and Actuarial Science, and Applied Differential Equations for Pharmacy). For these models, the UCR admission average, the student's gender, and the average of correct answers in the Quantitative Skills Test were used as predictor variables. The most important variable for all models was the Quantitative Skills Test, and the model with the highest correct classification rate was the logistic regression. For the estimated Physics-Meteorology, Pharmacy and Mathematics-Actuarial Science models, correct classifications were 89.8%, 73.6%, and 93.9%, respectively.
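The logistic regression form used in models like this maps a linear combination of predictors to a failure probability. A tiny sketch follows; the intercept and coefficients are hypothetical placeholders, not the study's fitted values.

```python
from math import exp

def failure_probability(quant_score: float, admission_avg: float,
                        intercept: float = 8.0,
                        coef_quant: float = -0.05,
                        coef_adm: float = -0.01) -> float:
    """P(fail) = 1 / (1 + exp(-z)), z = b0 + b1*quant + b2*admission.
    Negative coefficients mean higher scores lower the failure probability."""
    z = intercept + coef_quant * quant_score + coef_adm * admission_avg
    return 1.0 / (1.0 + exp(-z))
```

Classification then follows by thresholding the probability, typically at 0.5.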

  19. The Stochastic Actuarial Models and Simulation of the Retirement Annuity under a Flexible Retirement System

    Institute of Scientific and Technical Information of China (English)

    孙荣

    2016-01-01

    The flexible retirement system is one of the future options for reforming the social endowment insurance system in response to population aging, so studying the actuarial problems of the retirement annuity under this system has important theoretical and practical significance. This paper fits the retirement age with a binomial distribution, the interest rate with a discontinuous stochastic differential equation with jumps, and the force of mortality with a Feller process with jumps, and analyzes the life-insurance actuarial functions of the retirement annuity under the flexible retirement system. Expressions are given for the actuarial present values of the life annuity, the retirement annuity and its second moment, and for the level net premium, and the relevant actuarial functions are estimated by simulation with the model. These formulas provide a theoretical basis for the actuarial analysis of benefits and level premiums under the flexible retirement system, and thus a basis for designing China's old-age insurance reform and for preventing and controlling the risk of pension accounts.

  20. Actuarial Assessment of Sex Offender Recidivism Risk: A Validation of the German version of the Static-99

    OpenAIRE

    Martin Rettenberger; Reinhard Eher

    2006-01-01

    The Static-99 and the RRASOR are actuarial risk assessment tools for evaluating the risk of sexual and violent recidivism in sexual offenders. The Static-99 was developed in 1999 by Karl R. Hanson (Canada) and David Thornton (Great Britain) and is in the mean time regularly used for risk assessment in North America and some countries in Europe. The RRASOR can be described as a predecessor of the Static-99 and was published by Hanson in 1997. At first we translated the revised version of the S...

  1. Properties of actuarially fair and pay-as-you-go health insurance schemes for the elderly. An OLG model approach.

    Science.gov (United States)

    Johansson, P O

    2000-07-01

    The aged dependency ratio or ADR is growing at a fast pace in many countries. This fact causes stress to the economy and might create conflicts of interest between young and old. In this paper the properties of different health insurance systems for the elderly are analysed within an overlapping generations (OLG) model. The properties of actuarial health insurance and different variations of pay-as-you-go (PAYG) health insurance are compared. It turns out that the welfare properties of these contracts are heavily dependent on the economy's dynamic properties. Of particular importance is the magnitude of the rate of population growth relative to the interest rate. In addition, it is shown that public health insurance is associated with an inherent externality resulting in a second-best solution.

  2. Recursos tecnológicos y actividades no presenciales para un mejor aprendizaje de la Estadística Actuarial

    Directory of Open Access Journals (Sweden)

    Antonio Fernández Morales

    2010-12-01

    Full Text Available This article describes the design and the results obtained with an innovative teaching strategy for learning Actuarial Statistics in the degree in Actuarial and Financial Sciences at the Universidad de Málaga. Through intensive use of off-site activities and technological elements that support mobile learning, the aim is to foster autonomous, collaborative and contextualized learning. To this end, activities for identifying and assimilating competencies, a virtual survival laboratory based on an interactive graphical simulator, and a micro-portal for Apple iPhone and iPod touch devices, among others, were designed.

  3. Substance Abuse among High-Risk Sexual Offenders: Do Measures of Lifetime History of Substance Abuse Add to the Prediction of Recidivism over Actuarial Risk Assessment Instruments?

    Science.gov (United States)

    Looman, Jan; Abracen, Jeffrey

    2011-01-01

    There has been relatively little research on the degree to which measures of lifetime history of substance abuse add to the prediction of risk based on actuarial measures alone among sexual offenders. This issue is of relevance in that a history of substance abuse is related to relapse to substance using behavior. Furthermore, substance use has…

  4. ANÁLISIS ACTUARIAL DE LA INDEMNIZACIÓN POR NECESIDAD DE AYUDA DE TERCERA PERSONA ESTABLECIDA EN LA LEY 35/2015, DE 22 DE SEPTIEMBRE

    Directory of Open Access Journals (Sweden)

    Olga Gómez Pérez-Cacho

    2016-06-01

    Full Text Available The Law 35/2015, of September 22nd, establishes a new system for the valuation of damages caused to persons in traffic accidents, which includes compensations for economic damages suffered by the victims, whether loss of income/profit or emerging damage. According to the Law itself, the most relevant of these compensations require an actuarial valuation; the compensation for victims requiring third-party aid is one of them. In this paper, we analyze the actuarial valuation methodology that the “Bases Técnicas Actuariales del Baremo” (Scale Actuarial Technical Basis) sets out for this compensation, as well as the assumed biometric, economic and financial hypotheses and the results obtained. The framework for this is the set of principles and criteria that the Law 35/2015 establishes in general for the system and in particular for this compensation.

  5. Calculo y comparacion de la prima de un reaseguro de salud usando el modelo de opciones de Black-Scholes y el modelo actuarial

    Directory of Open Access Journals (Sweden)

    Luis Eduardo Giron

    2015-12-01

    Full Text Available This study calculates and compares the premium of a reinsurance contract using the Black-Scholes option pricing model and the traditional classical actuarial model. The period of analysis runs from January 2011 to December 2012. The results show that the Black-Scholes model, normally used to value financial options, can also be used to estimate health reinsurance premiums, and that the net premium estimated with this model approximates the one established by the actuarial method, except when the reinsurance deductible is very high (above $200,000,000).

  6. An Actuarial Model for Assessment of Prison Violence Risk Among Maximum Security Inmates

    Science.gov (United States)

    Cunningham, Mark D.; Sorensen, Jon R.; Reidy, Thomas J.

    2005-01-01

    An experimental scale for the assessment of prison violence risk among maximum security inmates was developed from a logistic regression analysis involving inmates serving parole-eligible terms of varying length (n = 1,503), life-without-parole inmates (n = 960), and death-sentenced inmates who were mainstreamed into the general prison population…

  7. Actuarial Assessment of Sex Offender Recidivism Risk: A Validation of the German version of the Static-99

    Directory of Open Access Journals (Sweden)

    Martin Rettenberger

    2006-12-01

    Full Text Available The Static-99 and the RRASOR are actuarial risk assessment tools for evaluating the risk of sexual and violent recidivism in sexual offenders. The Static-99 was developed in 1999 by Karl R. Hanson (Canada) and David Thornton (Great Britain) and is now regularly used for risk assessment in North America and some countries in Europe. The RRASOR can be described as a predecessor of the Static-99 and was published by Hanson in 1997. First, we translated the revised version of the Static-99 (Harris, Phenix, Hanson & Thornton, 2003) and adapted the instrument and the manual to the forensic context in Germany and Austria (Rettenberger & Eher, 2006). In this retrospective study, the interrater reliability and concurrent validity of the RRASOR and of the German adaptation of the Static-99 are presented. Furthermore, we evaluated the predictive accuracy of the Static-99 and the RRASOR and compared their results. The instruments were validated from file information on Austrian sexual offenders convicted between 1968 and 2002. Both the Static-99 and the RRASOR had good interrater reliability and concurrent validity. The Static-99 showed good predictive validity for general (r = .41, AUC = .74), sexual (r = .35, AUC = .74) and violent (r = .41, AUC = .76) recidivism, whereas the predictive accuracy of the RRASOR was moderate for general (r = .29, AUC = .66), sexual (r = .30, AUC = .68) and violent (r = .28, AUC = .67) recidivism. The Static-99 exhibited higher accuracy for the prediction of sexual offender recidivism. Although further validation studies on German-speaking populations of sex offenders are necessary, these results support the utility of the German version of the revised Static-99 in improving risk assessment of sexual offenders.

  8. Risk allocation in a public-private catastrophe insurance system : an actuarial analysis of deductibles, stop-loss, and premiums

    NARCIS (Netherlands)

    Paudel, Y.; Botzen, W. J. W.; Aerts, J. C. J. H.; Dijkstra, T. K.

    2015-01-01

    A public-private (PP) partnership could be a viable arrangement for providing insurance coverage for catastrophe events, such as floods and earthquakes. The objective of this paper is to obtain insights into efficient and practical allocations of risk in a PP insurance system. In particular, this st

  9. La Mortalidad y la Longevidad en la Cuantificación del Riesgo Actuarial para la Población de México

    OpenAIRE

    Ornelas Vargas, Arelly

    2015-01-01

    Planning for the future in the demographic, economic and actuarial fields is crucial. Good planning of social programs, government budgets, actuarial reserves, insurance and pension costs, etc., depends on the use of a good forecasting method. However, constant changes in technology, lifestyles, climate and migration, to mention a few, make predicting certain phenomena no easy task. In particular, mortality and ...

  10. Tablas de vida para cálculo actuarial de rentas vitalicias y retiro programado. Costa Rica circa 2000

    Directory of Open Access Journals (Sweden)

    Luis Rosero Bixby

    2004-01-01

    Full Text Available The complete life tables for Costa Rica for the period 1995-2000 are presented, together with a description of the estimation procedure. This procedure includes a detailed evaluation of the basic data, especially of census age-reporting errors among older adults. Age-exaggeration errors predominate, inflating the population at advanced ages, especially from age 80 onward; for example, the census population aged 95 and over is inflated by 22%. The life tables include an extrapolation of mortality beyond age 100. With a sample of about 7,000 older adults it is determined that the mortality of pension beneficiaries is lower than that of the general population. Life expectancy at birth was 74.6 years for men and 79.4 for women, and at age 60 it was 20.6 and 23.2 years, respectively, in the whole population of Costa Rica, and 22.0 and 25.3 years among pension beneficiaries. To account for the decline in mortality that will probably occur in Costa Rica in the future, the life table projected for 2020-25 is recommended. This period was selected because life expectancy at age 65 is very close to that estimated for the cohort born in 1940, which is considered representative of those who will retire in the short and medium term. The complete 2020-25 table, corrected for the lower mortality of pension beneficiaries, is presented for use in the actuarial calculation of life annuities and programmed withdrawal in the period 2000-05. Life expectancy at age 60 in this table is 23.6 years for men and 26.8 for women, about three years higher than the estimates for the population of Costa Rica in 1995-2000. Updating these estimates every 5 years is recommended.
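Given a life table like the one described, the actuarial present value of a life annuity (the calculation these tables are intended for) reduces to discounting each payment by interest and by the probability of survival. A minimal sketch, with hypothetical survival probabilities and a hypothetical technical interest rate:

```python
def annuity_pv(survival_probs, interest_rate):
    """Actuarial PV of an annuity-due of 1 per year: sum over t of
    v^t * tPx, where survival_probs[t] is the probability of being alive
    at time t (survival_probs[0] = 1 for the payment due immediately)."""
    v = 1.0 / (1.0 + interest_rate)
    return sum((v ** t) * p for t, p in enumerate(survival_probs))

# Hypothetical survival curve for a retiree and a 4% technical rate
probs = [1.00, 0.98, 0.95, 0.91, 0.86]
pv = annuity_pv(probs, 0.04)
```

Lower beneficiary mortality (higher survival probabilities) raises the annuity value, which is why the tables above are corrected for the lower mortality of pension beneficiaries.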

  12. Research on Actuarial Models for the Reform of Endowment Insurance of Institutional Organizations in China

    Institute of Scientific and Technical Information of China (English)

    任勇

    2015-01-01

    Based on the practice of the reform of endowment insurance for institutional organizations in China, and on Western life-insurance actuarial theory and actuarial models, endowment insurance actuarial models are established for new employees, mid-career employees, and staff about to retire during the reform of the endowment insurance system for institutional organizations. Formulas are also given for the actuarial present value of the subsidies that institutional organizations and government finance need to provide for the endowment insurance of these three groups, with the aim of providing a basis for the reform.

  13. Homogeneous Beliefs and Mispricing of the Black-Scholes Option Pricing Model: An Actuarial Approach

    Institute of Scientific and Technical Information of China (English)

    柯政; 秦梦

    2015-01-01

    This paper analyzes four option pricing methods, including the Black-Scholes (BS) martingale method. The actuarial pricing methods given by Mogens Bladt and Zheng Hong are not arbitrage-free and lack an adequate theoretical basis. Moreover, BS pricing is not completely arbitrage-free in a market with homogeneous beliefs: by diversifying across different stocks, profits can be extracted as long as the number of underlying assets is large enough. Taking the same constant drift μ for all investors embodies their homogeneous beliefs, consistent with the weakly efficient real market. Further analysis concludes that, even under homogeneous beliefs, if μt is a previsible process rather than a constant, the expectation in the actuarial price becomes difficult to compute and determine, invalidating the actuarial method. Simulations run in SAS show that, under homogeneous beliefs, the actuarial arbitrage price is significantly higher than the BS martingale price. An empirical test on Hang Seng index options indicates that under homogeneous beliefs the drift is better taken as a common constant than as a previsible process; the test also finds that the actuarial arbitrage price differs little from actual market prices, suggesting the method is fairly effective.
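The actuarial pricing rule discussed above can be sketched with a short Monte Carlo, in the Bladt-style textbook formulation: take the expectation of the payoff under the physical measure, discounting the stock at its real drift μ and the strike at the risk-free rate r. The parameters and this exact formulation are illustrative assumptions, not the paper's model.

```python
import numpy as np

def actuarial_call_price(s0, k, r, mu, sigma, t, z):
    """Actuarial price E[(S_T e^{-mu t} - K e^{-r t})^+] given standard normals z."""
    s_t = s0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(s_t * np.exp(-mu * t) - k * np.exp(-r * t), 0.0)
    return payoff.mean()

rng = np.random.default_rng(0)
z = rng.standard_normal(200_000)

# For a plain call on geometric Brownian motion the factor e^{-mu t} cancels
# the drift inside S_T, so reusing the same draws gives the same price for
# any mu.  All numbers below are hypothetical.
p_low = actuarial_call_price(100.0, 100.0, 0.03, 0.05, 0.2, 1.0, z)
p_high = actuarial_call_price(100.0, 100.0, 0.03, 0.15, 0.2, 1.0, z)
print(p_low, p_high)
```

In this basic formulation the call price is drift-invariant; the drift effects the paper studies arise in its richer settings (previsible drift, many diversified assets).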

  14. Proposal for a financial-actuarial formulation of a dependency insurance and an approach to a practical application

    Directory of Open Access Journals (Sweden)

    Mª Manuela Segovia

    2005-01-01

    Full Text Available In industrialized countries two demographic phenomena converge, longevity and low fertility, producing an ageing population and with it a series of social processes that must be addressed. One of them is coverage of dependency among the elderly, where dependency means the need for help in performing the basic tasks of daily living. Spain is currently in the midst of a parliamentary debate on a law regulating dependency, to cover the advance of a problem that in the coming years will be aggravated by our demographic and social evolution. The law aims to lay the foundations for dependency care through a dependency insurance, covered either publicly by the State or by private insurers. The work presented at this congress is part of a broad study of dependency in Spain; it develops an approach to the actuarial formulation of a dependency insurance based on Markov chains for a multiple-state model, together with a first practical application based on the analysis of disability data for Spain.
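The multiple-state Markov chain formulation mentioned above can be sketched in a few lines: annual transitions between "autonomous", "dependent" and "dead", with a benefit of 1 per year while dependent. The transition probabilities, discount rate and horizon are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical one-year transition matrix over states
# [autonomous, dependent, dead]; rows sum to 1.
P = np.array([
    [0.90, 0.07, 0.03],   # from autonomous
    [0.00, 0.85, 0.15],   # from dependent (no recovery assumed)
    [0.00, 0.00, 1.00],   # dead is absorbing
])
v = 1 / 1.03              # annual discount factor at 3%

state = np.array([1.0, 0.0, 0.0])   # insured starts autonomous
apv = 0.0
for t in range(1, 51):              # 50-year horizon
    state = state @ P               # one-step state distribution
    apv += v**t * state[1]          # expected discounted benefit at time t
print(apv)
```

The actuarial present value of the dependency benefit is just the discounted expected occupancy of the "dependent" state, which is why multi-state chains are the natural tool here.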

  15. CIMPA-UNESCO-MESR-MINECO-MOROCCO research school entitled Statistical Methods and Applications in Finance and Actuarial Science

    CERN Document Server

    Essaky, El; Vives, Josep

    2016-01-01

    This book is the outcome of the CIMPA School on Statistical Methods and Applications in Insurance and Finance, held in Marrakech and Kelaat M'gouna (Morocco) in April 2013. It presents two lectures and seven refereed papers from the school, offering the reader important insights into key topics. The first of the lectures, by Frederic Viens, addresses risk management via hedging in discrete and continuous time, while the second, by Boualem Djehiche, reviews statistical estimation methods applied to life and disability insurance. The refereed papers offer diverse perspectives and extensive discussions on subjects including optimal control, financial modeling using stochastic differential equations, pricing and hedging of financial derivatives, and sensitivity analysis. Each chapter of the volume includes a comprehensive bibliography to promote further research.

  16. Life insurance theory actuarial perspectives

    CERN Document Server

    Vylder, F Etienne

    1997-01-01

    This book is different from all other books on Life Insurance by at least one of the following characteristics 1-4. 1. The treatment of life insurances at three different levels: time-capital, present value and price level. We call time-capital any distribution of a capital over time: (C_1, T_1) + (C_2, T_2) + ... + (C_N, T_N) (*) is the time-capital with amounts C_1, C_2, ..., C_N at moments T_1, T_2, ..., T_N respectively. For instance, let (x) be a life at instant 0 with future lifetime X. Then the whole life insurance A_x is the time-capital (1, X). The whole life annuity ä_x is the time-capital (1,0) + (1,1) + (1,2) + ... + (1, 'X), where 'X is the integer part of X. The present value at 0 of the time-capital (*) is the random variable C_1 v^{T_1} + C_2 v^{T_2} + ... + C_N v^{T_N}. (**) In particular, the present values of A_x and ä_x are v^X and 1 + v + v^2 + ... + v^'X respectively. The price (or premium) of a time-capital is the expectation of its present value. In particular, the prices of A_x and ä_x are A_x = E(v^X) and ä_x = E(1 + v + v^2 + ... + v^'X...
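A toy discrete version of these price definitions can be checked in a few lines: with a curtate lifetime K, the whole life insurance A_x = E(v^{K+1}) and the annuity-due ä_x = E(1 + v + ... + v^K) satisfy the classical identity A_x + d·ä_x = 1, where d = 1 - v. The mortality rates below are invented for the example.

```python
i = 0.04                       # annual effective interest rate
v = 1 / (1 + i)                # discount factor
d = 1 - v                      # rate of discount, d = i/(1+i)
q = [0.1, 0.2, 0.3, 0.5, 1.0]  # hypothetical one-year death probabilities

A_x = 0.0    # price of whole life insurance paying 1 at end of year of death
a_x = 0.0    # price of whole life annuity-due paying 1 at start of each year
p_surv = 1.0
for k, qk in enumerate(q):
    a_x += v**k * p_surv             # annuity payment made if alive at time k
    A_x += v**(k + 1) * p_surv * qk  # death benefit paid at time k+1
    p_surv *= 1 - qk

print(A_x, a_x)
```

Because the last q equals 1 (death is certain by the end of the table), the telescoping identity A_x + d·ä_x = 1 holds exactly, which is a handy sanity check on any implementation.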

  17. Actuarial considerations on genetic testing.

    Science.gov (United States)

    Le Grys, D J

    1997-08-29

    In the UK the majority of life insurers employ relatively liberal underwriting standards so that people can easily gain access to life assurance cover. Up to 95% of applicants are accepted at standard terms. If genetic testing becomes widespread then the buying habits of the public may change. Proportionately more people with a predisposition to major types of disease may take life assurance cover while people with no predisposition may take proportionately less. A model is used to show the possible effect. However, the time-scales are long and the mortality of assured people is steadily improving. The change in buying habits may result in the rate of improvement slowing down. In the whole population, the improvement in mortality is likely to continue and could improve faster if widespread genetic testing results in earlier diagnosis and treatment. Life insurers would not call for genetic tests and need not see the results of previous tests except for very large sums assured. In the UK, life insurers are unlikely to change their underwriting standards, and are extremely unlikely to bring in basic premium rating systems that give discounts on the premium or penalty points according to people's genetic profiles. The implications of widespread genetic testing on medical insurance and some health insurance covers may be more extreme.

  18. Empirical analysis of retirement pension and IFRS adoption effects on accounting information: glance at IT industry.

    Science.gov (United States)

    Kim, JeongYeon

    2014-01-01

    This study reviews new pension accounting with K-IFRS and provides empirical changes in liability for retirement allowances with adoption of K-IFRS. It will help to understand the effect of pension accounting on individual firm's financial report and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to the previous financial report not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only few of them were published. Data analysis shows that the small differences of the actuarial assumption may result in a big change of retirement related liability. Firms within IT industry also have similar behaviors, which means that additional financial regulations for pension accounting are recommended. PMID:25013868
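The sensitivity claim above, that small differences in actuarial assumptions can produce a big change in retirement-related liability, can be illustrated with a toy discounting exercise. The cash-flow pattern and rates are hypothetical, chosen only to show the long-duration effect.

```python
# Present value of a pension deferred 20 years and then paid for 20 years.
# Long-dated cash flows have high duration, so a 0.5 percentage point
# change in the discount rate moves the liability by well over 10%.
def pension_pv(rate, defer=20, years=20, payment=1.0):
    return sum(payment / (1 + rate) ** t for t in range(defer, defer + years))

pv_low = pension_pv(0.030)   # liability at a 3.0% discount rate
pv_high = pension_pv(0.035)  # liability at a 3.5% discount rate
change = pv_low / pv_high - 1
print(round(change, 3))
```

This is why the public announcement of assumptions such as the discount rate matters: without them, reported pension liabilities cannot be compared across firms.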

  2. An Actuarial Model of Ordered-Status Life Insurance under Dual Random Interest Rates

    Institute of Scientific and Technical Information of China (English)

    孙荣

    2012-01-01

    Joint life insurance with an ordered compound status depends on the order of death of the insured persons and is, compared with general joint insurance, of a certain complexity. This paper investigates actuarial functions for joint life insurance of ordered compound status. A dual stochastic model using a reflected Brownian motion and a Poisson process describes the interest rate, and, under the assumption of a uniform distribution of deaths, an actuarial model of an ordered-condition compound status is constructed. Expressions are given for the life annuities, the insurance benefits, the actuarial present value of net premiums, and the second moment of the benefit. Using these expressions, the loss risk of ordered-status joint insurance can be analyzed.
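The dual-random interest idea above can be sketched with a small Monte Carlo: a deterministic force of interest perturbed by a reflected Brownian motion and a Poisson jump term, used to discount an annuity of 1 per year. The model form and all parameters are loose illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_years = 20_000, 10
delta, sigma, lam, jump = 0.03, 0.01, 0.5, 0.005   # hypothetical parameters

dt = 1.0
w = np.cumsum(np.sqrt(dt) * rng.standard_normal((n_paths, n_years)), axis=1)
n_jumps = np.cumsum(rng.poisson(lam * dt, (n_paths, n_years)), axis=1)
t = np.arange(1, n_years + 1)

# Accumulated force of interest Y(t); the reflection |W| keeps the random
# perturbation nonnegative, so every discount factor lies in (0, 1].
y = delta * t + sigma * np.abs(w) + jump * n_jumps
discounts = np.exp(-y)
apv = discounts.sum(axis=1).mean()   # APV of an annuity of 1 per year
print(apv)
```

Because the perturbation only adds to the deterministic force, the random-rate APV is bounded above by the deterministic annuity value, a useful sanity check.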

  3. Actuarial Pricing Models for Reverse Mortgages under Stochastic Interest Rates

    Institute of Scientific and Technical Information of China (English)

    贾念念; 赵雪; 杨文荟

    2015-01-01

    The reverse mortgage is an innovative way of supporting the elderly. To remedy the shortcoming that traditional reverse-mortgage pricing models must be built on a fixed interest rate, this paper models the accumulation function of the force of interest jointly with a Wiener process and a negative binomial distribution, and constructs actuarial pricing models for reverse mortgages under stochastic interest rates, both in general and under the uniform distribution of deaths (UDD) assumption. Finally, Matlab is used to evaluate the stochastic-rate pricing model numerically and to compare how changes in the parameters affect the loan amount. The results show that the actuarial pricing model under stochastic interest rates has a certain attraction for borrowers, that the loan amount is more sensitive to the loan interest rate than to the average annual appreciation rate of the property or its depreciation rate, and that the increasing-annuity model is little affected by changes in the loan rate. These conclusions provide a theoretical basis for the policy design and pricing of reverse mortgages in China.

  4. Combined Insurance Actuarial Model under Stochastic Interest Rates and Dependent Lifetimes

    Institute of Scientific and Technical Information of China (English)

    赵丽霞

    2014-01-01

    By simulating mortality through a common shock model and describing the term structure of interest rates with a Wiener process, an actuarial model for family combined insurance under stochastic interest rates and dependent lifetimes is designed. Theoretical formulas for the level annual premium are then obtained and, on the condition that deaths occur uniformly up to the tail age, approximate methods that are feasible in insurance practice are given. Finally, the influence of the stochastic interest rate and the mortality rate on insurance pricing is analyzed.
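The common shock dependence structure used above has a simple closed form when the individual and shared shocks are exponential: each life fails at the first of its own shock or the shared one, so joint survival exceeds the product of the marginals. The hazard rates below are hypothetical.

```python
import math

# Common shock model: T1 = min(E1, Z), T2 = min(E2, Z) with independent
# exponential shocks E1 ~ Exp(l1), E2 ~ Exp(l2) and shared Z ~ Exp(lz).
l1, l2, lz = 0.02, 0.03, 0.01   # hypothetical hazard rates
t = 10.0

joint = math.exp(-(l1 + l2 + lz) * t)   # P(both lives survive to t)
marg1 = math.exp(-(l1 + lz) * t)        # P(life 1 survives to t)
marg2 = math.exp(-(l2 + lz) * t)        # P(life 2 survives to t)
print(joint, marg1 * marg2)
```

The shared shock is counted once in the joint survival but twice in the product of marginals, which is exactly the positive dependence that makes joint-life premiums differ from the independent-lives calculation.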

  5. ANALYSIS AND CONTEXTUALIZATION OF THE ACTUARIAL AND SOCIAL SECURITY ASPECTS OF LAW 35/2015 REFORMING THE SYSTEM FOR ASSESSING THE DAMAGES CAUSED TO PERSONS IN TRAFFIC ACCIDENTS

    Directory of Open Access Journals (Sweden)

    Luis María Sáez de Jáuregui Sanz

    2016-06-01

    Full Text Available This paper analyzes and contextualizes the actuarial and social security features of Law 35/2015 reforming the system for assessing the damages caused to people in road accidents. The preliminaries of Law 35/2015 began in 2011 with the creation of a Committee of Experts. From there a process began that culminated first, in 2014, in the Advisory Board of Insurance and Pension Funds with the delivery of an articulated text, compensation tables and the actuarial technical bases that configure a new and novel assessment system, and ended, after a parliamentary process with virtually no changes to its fundamentals, with the entry into force on January 1, 2016 of Law 35/2015, which introduces for the first time into the Spanish legal system an actuarial model to compensate people for loss of profits and consequential damages.

  6. Multiple Attenuation Model and Actuarial Present Value of a Pension Scheme Based on the CIR Interest Rate

    Institute of Scientific and Technical Information of China (English)

    李浩; 侯为波; 张增林

    2016-01-01

    With population ageing in China becoming ever more serious, pension schemes are ever more vital to the livelihood security of the old. The theory of life contingencies plays an important role in setting scientific and reasonable prices for insurance products, and the rate of interest is one of the key factors that affect prices. In the long run, the interest rate fluctuates over time with strong random volatility. The authors therefore use the CIR interest rate model, which captures this stochastic volatility, to establish a multiple attenuation model of pensions, and then obtain an analytic formula for the actuarial present value of a deferred annuity.
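A Monte Carlo sketch of the deferred-annuity actuarial present value under a CIR short rate can make the construction concrete. The parameters and the Euler discretisation below are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(2)
kappa, theta, sigma, r0 = 0.3, 0.04, 0.08, 0.03   # hypothetical CIR parameters
n_paths, years, steps = 20_000, 30, 12            # monthly Euler steps
defer = 10                                        # payments start after year 10
dt = 1.0 / steps

r = np.full(n_paths, r0)
integral = np.zeros(n_paths)
discounts = []                                    # E[exp(-int_0^t r ds)] at year ends
for step in range(1, years * steps + 1):
    rp = np.maximum(r, 0.0)                       # full-truncation Euler scheme
    integral += rp * dt
    r = r + kappa * (theta - rp) * dt + sigma * np.sqrt(rp * dt) * rng.standard_normal(n_paths)
    if step % steps == 0:
        discounts.append(np.exp(-integral).mean())

discounts = np.array(discounts)
apv = discounts[defer:].sum()                     # 1 per year, years defer+1..years
print(apv)
```

Since the integrated (floored) rate is nondecreasing on every path, the expected discount factors decrease in time, which the test below checks.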

  7. Life Insurance Actuarial Model with Returnable Premium

    Institute of Scientific and Technical Information of China (English)

    Yanhuai Lang

    2004-01-01

    Insurance is an important aspect of finance. It has been fully developed in the Western developed countries. With the fast development of the market economy in our country, it is necessary to introduce modern foreign theories and techniques and, at the same time, adapt them to our concrete situation. A model of a general life insurance product is established in this paper, covering deferred life annuities, increasing whole life insurance and returnable premiums. Then, by adjusting the various parameters, various insurance products can be obtained.

  8. Actuarial risk measures for financial derivative pricing

    NARCIS (Netherlands)

    M.J. Goovaerts; R.J.A. Laeven

    2008-01-01

    We present an axiomatic characterization of price measures that are superadditive and comonotonic additive for normally distributed random variables. The price representation derived involves a probability measure transform that is closely related to the Esscher transform, and we call it the Esscher
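The Esscher transform that this price representation is related to can be illustrated numerically: tilting a density f(x) by e^{hx} and renormalising. For a normal N(μ, σ²) risk the tilted mean has the closed form μ + hσ², which a simple grid computation reproduces. The parameters are illustrative.

```python
import numpy as np

mu, sigma, h = 0.05, 0.2, 1.5   # hypothetical risk and tilt parameters

# Uniform grid wide enough to cover both the original and tilted densities.
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
f = np.exp(-0.5 * ((x - mu) / sigma) ** 2)   # unnormalised normal density
tilt = f * np.exp(h * x)                     # Esscher-tilted density

# Ratio of Riemann sums on a uniform grid approximates the tilted mean.
tilted_mean = (x * tilt).sum() / tilt.sum()
print(tilted_mean)
```

Loading the mean by hσ² is how the Esscher tilt builds a risk margin into an actuarial price; the axiomatic treatment in the paper characterises when such transforms arise.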

  9. A Quasi Actuarial Prospect for Individual Assessment.

    Science.gov (United States)

    Owens, William A.

    A conceptual model of individual assessment through the use of biodata responses with minimal input information is outlined. The process is considered especially applicable to industrial psychology. A scored autobiographical data form, which measures the individual's past behavior and experiences, provides for assignment to a specific subgroup…

  10. Prognostic factors in nodular lymphomas: a multivariate analysis based on the Princess Margaret Hospital experience

    Energy Technology Data Exchange (ETDEWEB)

    Gospodarowicz, M.K.; Bush, R.S.; Brown, T.C.; Chua, T.

    1984-04-01

    A total of 1,394 patients with non-Hodgkin's lymphoma were treated at the Princess Margaret Hospital between January 1, 1967 and December 31, 1978. Overall actuarial survival of 525 patients with nodular lymphomas was 40% at 12 years; survival of patients with localized (Stage I and II) nodular lymphomas treated with radical radiation therapy was 58%. Significant prognostic factors defined by multivariate analysis included patient's age, stage, histology, tumor bulk, and presence of B symptoms. By combining prognostic factors, distinct prognostic groups have been identified within the overall population. Patients with Stage I and II disease, small or medium bulk, less than 70 years of age achieved 92% 12-year actuarial survival and a 73% relapse-free rate at 12 years of follow-up. These patients represent groups highly curable with irradiation.
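Actuarial survival figures like those above come from the life-table method, in which the effective number at risk in each interval is n - c/2, treating censored patients as at risk for half the interval. A minimal sketch with invented interval counts:

```python
# (entered alive, died, censored) per follow-up year; counts are invented.
intervals = [
    (100, 5, 10),
    (85, 4, 8),
    (73, 3, 6),
]

surv = 1.0
curve = []
for n, d, c in intervals:
    at_risk = n - c / 2.0          # actuarial exposure adjustment
    surv *= 1.0 - d / at_risk      # conditional survival for this interval
    curve.append(surv)             # cumulative survival at interval end

print(curve)
```

Multiplying the conditional interval survivals gives the cumulative actuarial survival curve, the same construction used for the 12-year rates quoted in the abstract.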

  11. Policy Suggestions for the Individual Account of the New Rural Old-age Insurance Based on Actuarial Models

    Institute of Scientific and Technical Information of China (English)

    丁煜

    2011-01-01

    Based on the Guidelines on Launching Pilot Projects of New Rural Old-age Insurance (NROI) issued by the State Council, the paper builds an actuarial model to evaluate the individual account of the NROI. The results indicate that: 1) if contributions stay at fixed amounts, the individual account fund increases regressively and, worse, once the payment period exceeds 30 years the replacement rate falls as payment years increase; 2) if older insurants choose low-level contributions, the individual account pension is not enough to secure a minimum living; 3) under the current subsidy mechanism, insurants are not encouraged to choose higher-level contributions even when they are able to afford them. In view of this, the paper argues that a fixed contribution rate should replace fixed-amount contributions, so that contributions grow with rural residents' per capita net income; that a classified subsidy mechanism should be built to guide older insurants toward higher contribution levels; that the individual account fund should be credited with interest above the one-year bank deposit rate; and that a flexible pension-receiving system should give insurants the chance to choose a higher "retirement" age in exchange for a larger pension.
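The regressive effect in point 1) can be reproduced with a toy accumulation: a fixed annual contribution earning a low interest rate while wages grow faster means the replacement rate (account balance over final-year wage) eventually falls as the contribution period lengthens. All numbers are hypothetical, not the scheme's actual parameters.

```python
def replacement_ratio(years, contrib=100.0, rate=0.03, wage0=1000.0, growth=0.07):
    """Account balance at retirement divided by the final-year wage."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + rate) + contrib   # fixed-amount contribution
    final_wage = wage0 * (1 + growth) ** years
    return balance / final_wage

r30 = replacement_ratio(30)
r40 = replacement_ratio(40)
print(r30, r40)
```

With these illustrative rates the ratio peaks after roughly two decades and then declines, so contributing for 40 years yields a lower replacement rate than contributing for 30, which is the paper's argument for wage-indexed contributions.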

  12. Age of appearance and disappearance of rolandic spikes in 160 children followed as outpatients: an actuarial study

    Directory of Open Access Journals (Sweden)

    MOACIR A. BORGES

    1999-09-01

    Full Text Available This study aims to determine the age at which rolandic spikes (RS) appear and disappear in routine EEGs. METHOD: A hospital-based prospective study was carried out on 412 EEG records of 160 children assisted at the neuropediatric outpatient department between the ages of 1 and 16 years, in the period from March 1989 to March 1998. Recordings were made on 8-channel instruments and the 10/20 system was used to place the electrodes. The children were divided into four age groups (1 to 4; 5 to 8; 9 to 16; 1 to 16), and the actuarial curve method was used, taking the disappearance of the rolandic spikes as the event. RESULTS: RS were distributed between boys and girls in a ratio of 64/36. The mean age at which RS appeared was 7 years (7.2 for boys and 6.6 for girls), and in a small group of children with a normal EEG before the appearance of RS the mean age was 6.8 years. After 7 years of follow-up, the percentage of patients free of RS was 60% for the group aged 1 to 4 at admission and 84% for the groups aged 5 to 8 and 9 to 16. For the total group (1 to 16) the percentage free of RS after 7 years of follow-up was 78%. CONCLUSION: The study shows that RS appeared, on average, at age 7 and are likely to disappear within 7 years of follow-up, regardless of the age group at appearance.

  13. Analytical Investigation of the Actuarial Assumptions of the New Rural Social Pension Insurance System in Western Minority Nationality Areas: Taking Xinjiang as an Example

    Institute of Scientific and Technical Information of China (English)

    田园; 谭春萍

    2014-01-01

    At present, rural Xinjiang has entered an aged society. Designing the new rural social pension insurance system for Xinjiang is an important channel for solving the issues of agriculture, farmers and rural areas and for achieving the equalization of public services between urban and rural areas; especially for Xinjiang's many farmers and herdsmen, whose living standards are low, proper care in old age is an urgent aspiration. An actuarial study of the contribution level, contribution period and benefit level of the Xinjiang new rural social pension insurance system will further improve the rationality of its design. Taking Xinjiang as the object of study against the background of the western minority nationality areas, this paper presents an actuarial analysis of the design of the Xinjiang new rural social pension insurance system, puts forward corresponding countermeasures and suggestions, and provides intellectual support for establishing and perfecting rural old-age insurance in Xinjiang.

  14. Forecast of the Solvency of the New-type Rural Social Endowment Insurance System Based on an Actuarial Model

    Institute of Scientific and Technical Information of China (English)

    李丹; 杨丽

    2011-01-01

    Based on actuarial models of the basic pension payouts and individual account payouts of the new-type rural social endowment insurance, the paper analyzes the solvency of the system, determines the growth rate of state and local fiscal contributions to the endowment insurance fund, points out that more flexible contribution arrangements should be formulated for individual accounts, and puts forward a series of auxiliary countermeasures and measures to ensure the solvency of the rural endowment insurance fund.

  15. ACTUARIAL ESTIMATION OF TECHNICAL PROVISIONS’ ADEQUACY IN LIFE INSURANCE COMPANIES

    OpenAIRE

    Jasmina Selimovic

    2010-01-01

    When assessing the quality of a company's business it is necessary to evaluate the amount of money the company can operate with. Insurance companies, given all the specifics of their business, have to evaluate all technical provisions of the company. Technical provisions, as part of the liabilities in the insurer's balance sheet, are a basic measure of the quality and safety of business operations (i.e. they are the basic guarantee that all obligations to customers will be settled). Technical reserves ar...

  16. The Evolution of an Undergraduate Actuarial Mathematics Program

    Science.gov (United States)

    Kennedy, Kristin; Schumacher, Phyllis

    2014-01-01

    Bryant University was originally a school for business majors and offered only a few mathematics courses. After becoming accredited by the New England Association of Colleges and Universities in the 1960s, the college was required to upgrade its offerings in the area of mathematics. In the 1970s, the department offerings were increased to include…

  17. Multidimensional credibility: a Bayesian analysis of policyholders holding multiple policies

    NARCIS (Netherlands)

    K. Antonio; M. Guillén; A.M. Pérez Martín

    2010-01-01

    Property and casualty actuaries are professional experts in the economic assessment of uncertain events related to non-life insurance products (eg fire, liability or motor insurance). For the construction of a fair and reasonable tariff associated with the risks in their portfolio, actuaries have ma

  18. Multidimensional credibility: a Bayesian analysis of policyholders holding multiple contracts

    NARCIS (Netherlands)

    K. Antonio; M. Guillén; A.M. Pérez Marín

    2011-01-01

    Property and casualty actuaries are professional experts in the economic assessment of uncertain events related to non-life insurance products (e.g. fire, liability or motor insurance). For the construction of a fair and reasonable tariff associated with the risks in their portfolio, actuaries have

  19. Risk-Adjusted Analysis of Relevant Outcome Drivers for Patients after More Than Two Kidney Transplants

    Directory of Open Access Journals (Sweden)

    Lampros Kousoulas

    2015-01-01

    Full Text Available Renal transplantation is the treatment of choice for patients suffering end-stage renal disease, but as the long-term renal allograft survival is limited, most transplant recipients will face graft loss and will be considered for a retransplantation. The goal of this study was to evaluate the patient and graft survival of the 61 renal transplant recipients after second or subsequent renal transplantation, transplanted in our institution between 1990 and 2010, and to identify risk factors related to inferior outcomes. Actuarial patient survival was 98.3%, 94.8%, and 88.2% after one, three, and five years, respectively. Actuarial graft survival was 86.8%, 80%, and 78.1% after one, three, and five years, respectively. Risk-adjusted analysis revealed that only age at the time of last transplantation had a significant influence on patient survival, whereas graft survival was influenced by multiple immunological and surgical factors, such as the number of HLA mismatches, the type of immunosuppression, the number of surgical complications, need of reoperation, primary graft nonfunction, and acute rejection episodes. In conclusion, third and subsequent renal transplantation constitute a valid therapeutic option, but inferior outcomes should be expected among elderly patients, hyperimmunized recipients, and recipients with multiple operations at the site of last renal transplantation.

  20. Introduction to modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, V G

    2011-01-01

    This is an introductory-level text on stochastic modeling. It is suited for undergraduate students in engineering, operations research, statistics, mathematics, actuarial science, business management, computer science, and public policy. It employs a large number of examples to teach the students to use stochastic models of real-life systems to predict their performance, and use this analysis to design better systems. The book is devoted to the study of important classes of stochastic processes: discrete and continuous time Markov processes, Poisson processes, renewal and regenerative processes, semi-Markov processes, queueing models, and diffusion processes. The book systematically studies the short-term and the long-term behavior, cost/reward models, and first passage times. All the material is illustrated with many examples, and case studies. The book provides a concise review of probability in the appendix. The book emphasizes numerical answers to the problems. A collection of MATLAB programs to accompany...

  1. Survivorship analysis of pedicle spinal instrumentation.

    Science.gov (United States)

    McAfee, P C; Weiland, D J; Carlow, J J

    1991-08-01

    Between 1985 and 1989, the senior author performed 120 consecutive pedicle instrumentation cases--78 Steffee (VSP) procedures and 42 procedures using Cotrel-Dubousset instrumentation. Posterolateral or posterior fusions using autogenous iliac bone graft were performed across the instrumented vertebrae in all cases. Survivorship analysis was used to calculate a predicted cumulative rate of success for this series of patients over 10 years postoperative follow-up. The criteria of failure of pedicular instrumentation or "death" of an implant were defined as 1) screw bending, 2) screw breakage, 3) infection, 4) loosening of implants, 5) any rod or plate hardware problems, or 6) removal of hardware due to a neurologic complication. Out of 526 pedicle screws (175 Cotrel-Dubousset screws, 351 VSP screws) there were 22 problem screws (22/526 = 4.18%). Six screws had bent, none were infected, 16 screws had broken, and none were loose. The 22 problem screw events occurred in 12 patients. In seven patients, the instrumentation failure was an incidental radiographic finding, in that patients had a solid posterolateral fusion. The remaining five patients had screw breakage in association with a pseudarthrosis. Life table calculations predicted the survivorship of instrumentation without complications would be 80% at 10 years postoperative follow-up. Actuarial analysis predicted the survivorship of solid posterolateral fusion at 90% at 10 years follow-up. This survivorship rate is similar to those predicted at 10 years follow-up for other more widely used orthopedic surgical implants such as total hip arthroplasty components.

  2. Analysis

    CERN Document Server

    Maurin, Krzysztof

    1980-01-01

    The extraordinarily rapid advances made in mathematics since World War II have resulted in analysis becoming an enormous organism spreading in all directions. Gone for good surely are the days of the great French "courses of analysis" which embodied the whole of the "analytical" knowledge of the times in three volumes, as the classical work of Camille Jordan. Perhaps that is why present-day textbooks of analysis are disproportionately modest relative to the present state of the art. More: they have "retreated" to the state before Jordan and Goursat. In recent years the scene has been changing rapidly: Jean Dieudonné is offering us his monumental Elements d'Analyse (10 volumes) written in the spirit of the great French Cours d'Analyse. To the best of my knowledge, the present book is the only one of its size: starting from scratch, from rational numbers, to be precise, it goes on to the theory of distributions, direct integrals, analysis on complex manifolds, Kähler manifolds, the theory of sheave...

  3. Analysis

    Science.gov (United States)

    Abdelazeem, Maha; El-Sawy, El-Sawy K.; Gobashy, Mohamed M.

    2013-06-01

    Ar Rika fault zone constitutes one of the two major parts of the NW-SE Najd fault system (NFS), which is one of the most prominent structural features located in the east of the center of the Arabian Shield, Saudi Arabia. By using Enhanced Thematic Mapper (ETM+) data and Principal Component Analysis (PCA), surface geological characteristics, the distribution of rock types, and the different trends of linear features and faults are determined in the study area. First and second order magnetic gradients of the geomagnetic field at the north east of Wadi Ar Rika have been calculated in the frequency domain to map both surface and subsurface lineaments and faults. Lineaments, as deduced from previous studies, suggest an extension of the NFS beneath the cover rocks in the study area. In the present study, the integration of magnetic gradients and remote sensing analysis, which resulted in different valuable derivative maps, confirms the subsurface extension of some of the surface features. The 3D Euler deconvolution, the total gradient, and the tilt angle maps have been utilized to determine accurately the distribution of shear zones, the tectonic implications, and the internal structures of the terranes in the Ar Rika quadrangle in three dimensions.

  4. Skull base chordomas: analysis of dose-response characteristics

    International Nuclear Information System (INIS)

    Objective: To extract dose-response characteristics from dose-volume histograms and corresponding actuarial survival statistics for 115 patients with skull base chordomas. Materials and Methods: We analyzed data for 115 patients with skull base chordoma treated with combined photon and proton conformal radiotherapy to doses in the range 66.6 Gy to 79.2 Gy. The data set for each patient included gender, histology, age, tumor volume, prescribed dose, overall treatment time, time to recurrence or time to last observation, target dose-volume histogram, and several dosimetric parameters (minimum/mean/median/maximum target dose, percent of the target volume receiving the prescribed dose, dose to 90% of the target volume, and the equivalent uniform dose (EUD)). Data were analyzed using the Kaplan-Meier survivor function estimate, the proportional hazards (Cox) model, and parametric modeling of the actuarial probability of recurrence. Parameters of dose-response characteristics were obtained using the maximum likelihood method. Results: Local failure developed in 42 (36%) of patients, with an actuarial local control rate at 5 years of 59.2%. The proportional hazards model revealed a significant effect of gender on the probability of recurrence, with female patients having a significantly poorer prognosis (hazard ratio of 2.3 with a p value of 0.008). The Wilcoxon and log-rank tests of the corresponding Kaplan-Meier recurrence-free survival curves confirmed the statistical significance of this effect. The Cox model with stratification by gender showed significance of tumor volume (p=0.01), the minimum target dose (p=0.02), and the EUD (p=0.02). Other parameters were not significant at the α level of significance of 0.05, including the prescribed dose (p=0.21). Parametric analysis using a combined model of tumor control probability (to account for non-uniformity of target dose distribution) and the Weibull failure time model (to account for censoring) allowed us to estimate
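The equivalent uniform dose among the dosimetric parameters is commonly computed with Niemierko's generalized EUD formula, EUD = (sum_i v_i * d_i^a)^(1/a), where v_i are fractional volumes receiving dose d_i and a is a tissue-specific volume-effect parameter. The DVH bins and the value of a below are illustrative assumptions, not values from this chordoma study:

```python
# Generalized Equivalent Uniform Dose (gEUD), Niemierko's formulation:
#   EUD = (sum_i v_i * d_i**a) ** (1/a)
# with v_i the fractional volumes of a differential DVH and a a
# tissue-specific parameter. The DVH and a are hypothetical examples.

def eud(dvh, a):
    """dvh: list of (dose_Gy, fractional_volume) pairs summing to 1."""
    assert abs(sum(v for _, v in dvh) - 1.0) < 1e-9
    return sum(v * d ** a for d, v in dvh) ** (1.0 / a)

# Hypothetical differential DVH for a target: most volume near 70 Gy,
# plus a cold spot at 60 Gy. For tumors a is negative, so the EUD is
# pulled toward the minimum (cold-spot) dose.
dvh = [(60.0, 0.05), (68.0, 0.15), (70.0, 0.60), (72.0, 0.20)]
print(f"gEUD (a = -10): {eud(dvh, -10):.2f} Gy")
print(f"mean dose (a = 1): {eud(dvh, 1):.2f} Gy")
```

With a = 1 the formula reduces to the mean dose; large negative a approaches the minimum target dose, which is why both quantities can carry independent prognostic signal in such analyses.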

  5. Study on the Basic Endowment Insurance System of Enterprise Employees under the Principle of Actuarial Balance: From the Perspective of the "New Insured" Policy

    Institute of Scientific and Technical Information of China (English)

    盖根路; 温净; 贺志群

    2014-01-01

    There are many problems in the basic endowment insurance system for enterprise employees: the system is neither fair nor sustainable, and without reform the goal of national co-ordination of the basic pension cannot be fully realized. Under the principle of actuarial balance between contributions and pension benefits, the existing system must be reformed effectively. Mutual aid and incentives should coexist. The original method of calculating the basic pension, linked to the average wage of employed workers, should be reformed using a regressive ("excess regression") formula. With average life expectancy at retirement as the divisor, and taking mutual-aid (longevity-pooling) factors into account, the basic pension linked to individual indexed average contribution wages should be recalculated. Likewise, using the average remaining life expectancy at retirement and in each year thereafter as the divisor, the personal-account pension calculation should be revised, establishing a mechanism that ties pensions closely to contributions and retirement age. Furthermore, a stable pension adjustment mechanism linked to wage growth should be built, so as to achieve national co-ordination of the basic pension under a fair and sustainable system.

  6. Hazard function analysis for flood planning under nonstationarity

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
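The stationary baseline mentioned above can be checked with a few lines of Monte Carlo: if each year independently exceeds the design flood with probability p, the return period T is geometric with mean 1/p, the discrete-time analogue of the exponential pdf. This sketch uses a hypothetical 50-year design flood and is not the authors' code:

```python
import random

# Monte Carlo check of a basic HFA fact for a *stationary* process:
# with annual exceedance probability p, the return period T (years until
# the first exceedance) is geometric with mean 1/p, the discrete analogue
# of the exponential distribution. Parameters are hypothetical.

def years_to_first_exceedance(p, rng):
    year = 1
    while rng.random() >= p:
        year += 1
    return year

rng = random.Random(42)
p = 0.02  # a "50-year" design flood
samples = [years_to_first_exceedance(p, rng) for _ in range(100_000)]
mean_T = sum(samples) / len(samples)
print(f"empirical mean return period: {mean_T:.1f} years (theory: {1 / p:.0f})")
```

Under nonstationarity, p varies from year to year and this equivalence breaks down, which is precisely the regime the paper explores analytically.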

  7. Loss Distribution Approach (LDA): An Actuarial Methodology Applied to Operational Risk

    Directory of Open Access Journals (Sweden)

    Luis Ceferino Franco Arbeláez

    2008-07-01

    This paper is the result of a research project on the integrated management of operational risk, promoted by the Research Vice-Rectorate of the Universidad de Medellín and co-financed by a brokerage firm. It presents an application of the LDA model, which is based on the collection of historical loss data (frequency and severity) recorded internally in organizations; such data can be supplemented with external data. The losses are classified in a matrix that relates the organization's business lines to operational loss events, and from this matrix the capital charge is calculated. The application was developed for a financial institution. The paper is organized as follows: the first section introduces the subject; the second formally presents the LDA model; the third develops an application; and the fourth presents some conclusions.
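The LDA compounds a frequency distribution with a severity distribution per business-line/event-type cell, and the capital charge is a high quantile of the resulting aggregate loss. A minimal Monte Carlo sketch with a hypothetical Poisson-lognormal cell (all parameters invented for illustration, not taken from the paper):

```python
import math
import random

# Loss Distribution Approach sketch: annual aggregate loss for one
# business-line/event-type cell is a compound sum of N ~ Poisson(lam)
# lognormal severities; the capital charge is read off as the 99.9% VaR
# of the simulated distribution. Parameters are hypothetical.

def poisson(lam, rng):
    """Knuth's multiplication method (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def annual_loss(lam, mu, sigma, rng):
    n = poisson(lam, rng)
    return sum(rng.lognormvariate(mu, sigma) for _ in range(n))

rng = random.Random(1)
losses = sorted(annual_loss(lam=8.0, mu=9.0, sigma=1.2, rng=rng)
                for _ in range(50_000))
expected_loss = sum(losses) / len(losses)
var_999 = losses[int(0.999 * len(losses))]  # 99.9% VaR
print(f"expected annual loss: {expected_loss:,.0f}")
print(f"99.9% VaR (capital charge): {var_999:,.0f}")
```

In practice each cell of the business-line/event-type matrix gets its own fitted frequency and severity distributions, and the per-cell charges are aggregated.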

  8. Five-Year Analysis of Treatment Efficacy and Cosmesis by the American Society of Breast Surgeons MammoSite Breast Brachytherapy Registry Trial in Patients Treated With Accelerated Partial Breast Irradiation

    International Nuclear Information System (INIS)

    Purpose: To present 5-year data on treatment efficacy, cosmetic results, and toxicities for patients enrolled on the American Society of Breast Surgeons MammoSite breast brachytherapy registry trial. Methods and Materials: A total of 1440 patients (1449 cases) with early-stage breast cancer receiving breast-conserving therapy were treated with the MammoSite device to deliver accelerated partial-breast irradiation (APBI) (34 Gy in 3.4-Gy fractions). Of 1449 cases, 1255 (87%) had invasive breast cancer (IBC) (median size, 10 mm) and 194 (13%) had ductal carcinoma in situ (DCIS) (median size, 8 mm). Median follow-up was 54 months. Results: Thirty-seven cases (2.6%) developed an ipsilateral breast tumor recurrence (IBTR), for a 5-year actuarial rate of 3.80% (3.86% for IBC and 3.39% for DCIS). Negative estrogen receptor status (p = 0.0011) was the only clinical, pathologic, or treatment-related variable associated with IBTR in patients with IBC; young age (<50 years; p = 0.0096) and positive margin status (p = 0.0126) were associated with IBTR in those with DCIS. The percentage of breasts with good/excellent cosmetic results at 60 months (n = 371) was 90.6%. Symptomatic breast seromas were reported in 13.0% of cases, and 2.3% developed fat necrosis. A subset analysis of the first 400 consecutive cases enrolled was performed (352 with IBC, 48 with DCIS). With a median follow-up of 60.5 months, the 5-year actuarial rate of IBTR was 3.04%. Conclusion: Treatment efficacy, cosmesis, and toxicity 5 years after treatment with APBI using the MammoSite device are good and similar to those reported with other forms of APBI with similar follow-up.

  9. Actuarial Risk Assessment and Recidivism in a Sample of UK Intellectually Disabled Sexual Offenders

    Science.gov (United States)

    Wilcox, Dan; Beech, Anthony; Markall, Helena F.; Blacker, Janine

    2009-01-01

    This study examines the effectiveness of three risk assessment instruments: Static-99, Risk Matrix 2000 (RM2000) and the Rapid Risk of Sex Offender Recidivism (RRASOR), in predicting sexual recidivism among 27 intellectually disabled sex offenders. The overall sexual offence reconviction rate was 30%, while non-recidivists remained offence-free…

  10. Why not private health insurance? 2. Actuarial principles meet provider dreams.

    Science.gov (United States)

    Deber, R; Gildiner, A; Baranek, P

    1999-09-01

    What do insurers and employers feel about proposals to expand Canadian health care financing through private insurance, in either a parallel stream or a supplementary tier? The authors conducted 10 semistructured, open-ended interviews in the autumn and early winter of 1996 with representatives of the insurance industry and benefits managers working with large employers; respondents were identified using a snowball sampling technique. The respondents felt that proposals for parallel private plans within a competitive market are incompatible with insurance principles, as long as a well-functioning and relatively comprehensive public system continues to exist; the maintenance of a strong public system was both socially and economically desirable. With the exception of serving the niche market for the private management of return-to-work strategies, respondents showed little interest in providing parallel coverage. They were receptive to a larger role for supplementary insurance but cautioned that they are not willing to cover all delisted services. As business executives they stated that they are willing to insure only services and clients that will be profitable. PMID:10497614

  11. Actuarial assessment of future loss scenarios in the German insurance sector

    Science.gov (United States)

    Kubik, A.; Boehm, U.; Born, K.; Broecker, U.; Buechner, M.; Burghoff, O.; Donat, M.; Gerstengarbe, F. W.; Hattermann, F. F.; Held, H.; Kuecken, M.; Leckebusch, G. C.; Ludwig, P.; Nocke, T.; Oesterle, H.; Pardowitz, T.; Pinto, J. G.; Prahl, B. F.; Ulbrich, U.; Werner, P. C.

    2012-04-01

    The German Insurance Association (GDV) analyzed the impacts of climate change on the German insurance market. The work was conducted in cooperation with the Potsdam Institute for Climate Impact Research, Freie Universität Berlin, and the University of Cologne. Different approaches and data sets were used to analyze the impacts of winter storms, hail, and floods. High-resolution records of losses to residential buildings in Germany were provided. These daily records are available at the fine spatial level of administrative districts for 1997-2007. For the period 1984-2008, daily losses to residential buildings were derived from motor vehicle own-damage insurance, which shows a surprisingly high correlation between building losses and motor vehicle losses caused by natural hazards. Loss functions from GDV's own flood risk model were made available to estimate flood losses. As climate change progresses, mean annual losses in private residential building insurance might increase. By 2100, losses due to winter storms could rise by more than 50%, an increase mainly attributable to the intensification of individual, exceptionally severe storms. Climate change will also increase flood losses: by the end of the century, mean losses are expected to be twice as high, though depending on the scenario they could remain constant or triple. Moreover, extreme events with high cumulative losses are expected to become significantly more frequent. Storms with a present-day return period of 50 years might occur every 10 years at the end of the century; floods now returning every 50 years could arise every 25 years. For the first time, hailstorms have also been analyzed; it was noticed that East Germany in particular might be hit more frequently. Despite these findings, i.e., although the cost of insurance against natural hazards might increase, the extent of such an increase in Germany should remain within limits that the insurance companies can master.
    But the industry has to adapt to climate change, and for that purpose stakeholders usually need reliable numbers. Because our results were achieved using ensemble techniques, they inherently display a considerable spread. Despite this, the results are robust across all approaches and climate models; they can therefore be used for strategic decisions, though less so for daily routine business. Higher and more frequent losses will require more venture capital and must be taken into account when implementing the EU directive Solvency II. If we assess these results carefully and act with foresight, manifold activities are available to deal with climate change impacts. Smart portfolio policy can help to reduce risks, and working with limits and deductibles can help to insure highly exposed risks. To this end, GDV offers its member companies widely accepted tools and risk models such as ZÜRS Geo and HQ Kumul, as well as detailed risk statistics. After all, well-directed information policy, increased risk awareness, and preventive action can reduce climate change impacts significantly.

  12. Reproductive effort accelerates actuarial senescence in wild birds : An experimental study

    NARCIS (Netherlands)

    Boonekamp, Jelle J.; Salomons, Martijn; Bouwhuis, Sandra; Dijkstra, Cornelis; Verhulst, Simon

    2014-01-01

    Optimality theories of ageing predict that the balance between reproductive effort and somatic maintenance determines the rate of ageing. Laboratory studies find that increased reproductive effort shortens lifespan, but through increased short-term mortality rather than ageing. In contrast, high fec

  13. THE EVOLUTION AND FUTURE OF SOCIAL SECURITY IN AFRICA: AN ACTUARIAL PERSPECTIVE

    OpenAIRE

    Fatima Badat; Kudzai Chigiji; Johann Söhnge; Krishen Sukdev; Natalie Van Zyl

    2015-01-01

    Social Security in most African countries has evolved significantly in terms of perspectives, motives, governance as well as innovation of benefits and administration. African countries are slowly, one by one, beginning to reassess the role of social security in correcting several social ills. Empowerment programs and grants are increasingly being provided via social security to women and the youth. From the roots of social security, even very low income countries, some of which have recently...

  14. Actuarial risk of isolated CNS involvement in Ewing's sarcoma following prophylactic cranial irradiation and intrathecal methotrexate

    International Nuclear Information System (INIS)

    Records of 154 patients with Ewing's sarcoma treated at the National Cancer Institute were reviewed to assess the incidence and risk of developing isolated central nervous system (CNS) Ewing's sarcoma. Sixty-two of the 154 patients had received CNS irradiation and intrathecal (i.t.) methotrexate as part of their initial therapy to prevent the occurrence of isolated CNS Ewing's sarcoma. The risk of developing isolated CNS Ewing's sarcoma was greatest within the first two years after diagnosis and was approximately 10%. The overall risk of CNS recurrence in the group of patients receiving CNS treatment was similar to that in the group receiving no therapy directed to the CNS. The occurrence of isolated CNS involvement was not prevented by the use of CNS irradiation and i.t. methotrexate. Because of the lack of efficacy of the CNS irradiation regimen, current treatment regimens do not include therapy directed to the CNS

  15. 77 FR 12577 - Department of Defense (DoD) Board of Actuaries; Federal Advisory Committee Meeting

    Science.gov (United States)

    2012-03-01

    ... presentation or submit a written statement for consideration at the meeting must notify Kathleen Ludwig at (571) 372-1993, or Kathleen.Ludwig@osd.pentagon.mil , by June 15. For further information contact Ms. Ludwig... contact Kathleen Ludwig at 571-372-1993 no later than June 15, 2012. Failure to make the...

  16. 78 FR 9890 - DoD Board of Actuaries; Notice of Federal Advisory Committee Meeting

    Science.gov (United States)

    2013-02-12

    ... Room 4, Level B1, Alexandria, VA 22350. FOR FURTHER INFORMATION CONTACT: Kathleen Ludwig at the Defense... 22350-4000. Phone: (571) 372-1993, Email: Kathleen.Ludwig@osd.pentagon.mil . SUPPLEMENTARY INFORMATION... Pentagon. Those without a valid DoD Common Access Card must contact Kathleen Ludwig at 571-372-1993...

  17. Identifying Pedophiles "Eligible" for Community Notification under Megan's Law: A Multivariate Model for Actuarially Anchored Decisions.

    Science.gov (United States)

    Pallone, Nathaniel J.; Hennessy, James J.; Voelbel, Gerald T.

    1998-01-01

    A scientifically sound methodology for identifying offenders about whose presence the community should be notified is demonstrated. A stepwise multiple regression was calculated among incarcerated pedophiles (N=52) including both psychological and legal data; a precision-weighted equation produced 90.4% "true positives." This methodology can be…

  18. Cost Modeling and Original Price Analysis

    Institute of Scientific and Technical Information of China (English)

    王涛; 袁建新

    2015-01-01

    In light of the processing characteristics of automotive sheet metal parts, and on the basis of a homogeneous activity-based cost pool, a mathematical model of the original price of sheet metal products is constructed. By applying the model, the original-price analysis and actuarial work for sheet metal products are carried out, cost control points are identified, and ultimately procurement costs are saved and purchase prices are reduced.

  19. The Validity and Utility of the California Family Risk Assessment under Practice Conditions in the Field: A Prospective Study

    Science.gov (United States)

    Johnson, Will L.

    2011-01-01

    Objective: Analysis of the validity and implementation of a child maltreatment actuarial risk assessment model, the California Family Risk Assessment (CFRA). Questions addressed: (1) Is there evidence of the validity of the CFRA under field operating conditions? (2) Do actuarial risk assessment results influence child welfare workers' service…

  20. CLUSTER ANALYSIS OF NATURAL DISASTER LOSSES IN POLISH AGRICULTURE

    Directory of Open Access Journals (Sweden)

    Grzegorz STRUPCZEWSKI

    2015-04-01

    Agricultural production risk is of a special nature due to the great number of hazards, the relative weakness of production entities on the market, and uncertainty higher than in industrial production. Natural disasters occur very frequently and, given the simultaneously low percentage of insured farmers, cause damage of such size that the state is forced to organise ad hoc financial aid (for instance in the form of preferential natural disaster loans). This aid is usually not sufficient. On the other hand, the regional diversity of the risk level does not favour the development of insurance. From the perspective of insurance companies and policymakers it becomes highly important to investigate the spatial structure of losses in agriculture caused by natural disasters. The purpose of the research is to classify the 16 Polish voivodeships into clusters in order to show differences between them according to the criterion of the level of damage to agricultural farms caused by natural disasters. On the basis of the cluster analysis it was demonstrated that 11 voivodeships form quite a homogeneous group in terms of the size of damage to agriculture (the value of damage to cultivations and the acreage of destroyed cultivations are the two most important factors determining affiliation to the cluster); however, the loss profile of the other five voivodeships follows a very individual course and requires separate handling in the actuarial sense. It was also shown that a high absolute value of agricultural losses in a given voivodeship does not necessarily imply high vulnerability of that voivodeship's farms to natural risks.
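Cluster analyses of this kind are often run with k-means or hierarchical methods on a few standardized loss features per region. The study does not publish its algorithm or data, so both the method choice and the points below are illustrative assumptions:

```python
import random

# Generic k-means sketch: grouping regions by two standardized loss
# features (value of crop damage, destroyed acreage). Both the algorithm
# choice and the data points are illustrative assumptions, not the
# Polish voivodeship figures.

def kmeans(points, k, rng, iters=100):
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: sum(
                (a - b) ** 2 for a, b in zip(p, centers[j])))
            clusters[nearest].append(p)
        new_centers = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)]
        if new_centers == centers:
            break  # assignments stable: converged
        centers = new_centers
    return centers, clusters

# Hypothetical standardized (damage value, damaged acreage) per region:
points = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25),  # low-loss group
          (0.80, 0.90), (0.90, 0.85), (0.85, 0.95),  # high-loss group
          (0.50, 0.10)]                               # atypical profile
rng = random.Random(0)
centers, clusters = kmeans(points, k=3, rng=rng)
for c, cl in zip(centers, clusters):
    print(f"center {tuple(round(x, 2) for x in c)} -> {len(cl)} region(s)")
```

Regions whose loss profile lands in a cluster of one, like the atypical point here, are exactly the cases the study flags as needing separate actuarial treatment.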

  1. Clinical outcome of patients with primary gliosarcoma treated with concomitant and adjuvant temozolomide: A single institutional analysis of 27 cases

    Directory of Open Access Journals (Sweden)

    G K Rath

    2015-01-01

    CONTEXT AND AIM: The prognosis of primary gliosarcoma (PGS) remains dismal with current treatment modalities. We analyzed the outcome of PGS patients treated with concurrent and adjuvant temozolomide (TMZ). SETTINGS AND DESIGN: Retrospective single-institutional analysis. MATERIALS AND METHODS: We retrospectively evaluated 27 patients with PGS treated with radiotherapy (RT) and TMZ during 2007-2012. STATISTICAL ANALYSIS USED: Overall survival (OS) was estimated using the Kaplan-Meier method, and toxicities were evaluated using the Common Terminology Criteria for Adverse Events, version 2.0 (National Cancer Institute, USA). RESULTS: Median age at presentation and Karnofsky performance status were 45 years and 90, respectively, and the male:female ratio was 20:7. Patients received adjuvant RT to a total dose of 60 Gy at 2 Gy/fraction. All patients except 5 received adjuvant TMZ, with a median of 6 cycles. Grade 2 and 3 hematological toxicity was seen in 8% and 4% of patients, respectively, during concurrent RT. During adjuvant chemotherapy, 13.6% had Grade 3 thrombocytopenia and 9.5% had Grade 3 neutropenia. Median OS was 16.7 months (1- and 2-year actuarial OS was 70.8% and 32.6%, respectively). Adjuvant TMZ was associated with better survival (median survival 21.21 vs. 11.93 months; P = 0.0046) on univariate analysis and also on multivariate analysis (hazard ratio 1.82, 95% confidence interval: 1.503-25.58; P = 0.012). CONCLUSIONS: The results of our study, the largest series of patients with PGS treated with concurrent and adjuvant TMZ, show an impressive survival with acceptable toxicity. We suggest TMZ be included in the "standard of care" for this tumor.

  2. Manifestation Pattern of Early-Late Vaginal Morbidity After Definitive Radiation (Chemo)Therapy and Image-Guided Adaptive Brachytherapy for Locally Advanced Cervical Cancer: An Analysis From the EMBRACE Study

    Energy Technology Data Exchange (ETDEWEB)

    Kirchheiner, Kathrin, E-mail: kathrin.kirchheiner@meduniwien.ac.at [Department of Radiation Oncology, Comprehensive Cancer Center, Medical University of Vienna/General Hospital of Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna (Austria); Nout, Remi A. [Department of Clinical Oncology, Leiden University Medical Center (Netherlands); Tanderup, Kari; Lindegaard, Jacob C. [Department of Oncology, Aarhus University Hospital (Denmark); Westerveld, Henrike [Department of Radiotherapy, Academic Medical Centre, University of Amsterdam (Netherlands); Haie-Meder, Christine [Department of Radiotherapy, Gustave-Roussy, Villejuif (France); Petrič, Primož [Department of Radiotherapy, Institute of Oncology Ljubljana (Slovenia); Department of Radiotherapy, National Center for Cancer Care and Research, Doha (Qatar); Mahantshetty, Umesh [Department of Radiation Oncology, Tata Memorial Hospital, Mumbai (India); Dörr, Wolfgang; Pötter, Richard [Department of Radiation Oncology, Comprehensive Cancer Center, Medical University of Vienna/General Hospital of Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna (Austria)

    2014-05-01

    Background and Purpose: Brachytherapy in the treatment of locally advanced cervical cancer has changed substantially because of the introduction of combined intracavitary/interstitial applicators and an adaptive target concept, which is the focus of the prospective, multi-institutional EMBRACE study (www.embracestudy.dk) on image-guided adaptive brachytherapy (IGABT). So far, little has been reported about the development of early to late vaginal morbidity in the frame of IGABT. Therefore, the aim of the present EMBRACE analysis was to evaluate the manifestation pattern of vaginal morbidity during the first 2 years of follow-up. Methods and Materials: In total, 588 patients with a median follow-up time of 15 months and information on vaginal morbidity were included. Morbidity was prospectively assessed at baseline, every 3 months during the first year, and every 6 months in the second year according to the Common Terminology Criteria for Adverse Events, version 3, regarding vaginal stenosis, dryness, mucositis, bleeding, fistula, and other symptoms. Crude incidence rates, actuarial probabilities, and prevalence rates were analyzed. Results: At 2 years, the actuarial probability of severe vaginal morbidity (grade ≥3) was 3.6%. However, mild and moderate vaginal symptoms were still pronounced (grade ≥1, 89%; grade ≥2, 29%), of which the majority developed within 6 months. Stenosis was most frequently observed, followed by vaginal dryness. Vaginal bleeding and mucositis were mainly mild and infrequently reported. Conclusion: Severe vaginal morbidity within the first 2 years after definitive radiation (chemo)therapy including IGABT with intracavitary/interstitial techniques for locally advanced cervical cancer is limited and is significantly less than has been reported from earlier studies. Thus, the new adaptive target concept seems to be a safe treatment with regard to the vagina being an organ at risk. However, mild to moderate vaginal morbidity

  3. Construction and Validation of Risk Assessments in a Six-Year Follow-Up of Forensic Patients: A Tridimensional Analysis.

    Science.gov (United States)

    Menzies, Robert; Webster, Christopher D.

    1995-01-01

    Evaluations of the risk of recurrent violence were conducted for 162 Canadian mentally disordered criminal defendants through the assembly of actuarial data, scores from special-to-purpose psychometric instruments, and scaled global predictions of dangerousness to others. Professional clinicians were not more accurate than nonclinical raters. (JPS)

  4. Risk Assessments by Female Victims of Intimate Partner Violence: Predictors of Risk Perceptions and Comparison to an Actuarial Measure

    Science.gov (United States)

    Connor-Smith, Jennifer K.; Henning, Kris; Moore, Stephanie; Holdford, Robert

    2011-01-01

    Recent studies support the validity of both structured risk assessment tools and victim perceptions as predictors of risk for repeat intimate partner violence (IPV). Combining structured risk assessments and victim risk assessments leads to better predictions of repeat violence than either alone, suggesting that the two forms of assessment provide…

  5. 26 CFR 1.412(c)(2)-1 - Valuation of plan assets; reasonable actuarial valuation methods.

    Science.gov (United States)

    2010-04-01

    ...) produce a “smoothing” effect. Thus, investment performance, including appreciation or depreciation in the... including appreciation and depreciation experienced by the plan during that period. However, the method... year, in addition to any subsequent reports. (4) Effect of change of asset valuation method. A...

  6. 76 FR 49569 - Use of Actuarial Tables in Valuing Annuities, Interests for Life or Terms of Years, and Remainder...

    Science.gov (United States)

    2011-08-10

    ..., Sec. 20.2031-7A) and on the assumption that the property depreciates on a straight-line basis over its...: Background On May 7, 2009, the IRS published in the Federal Register (74 FR 21438 and 74 FR 21519) final and... property for contributions made after July 31, 1969. * * * * * (b) * * * (2) Computation of...

  7. 76 FR 18649 - Technical Revisions to Actuarial Information on Form 5500 Annual Return/Report for Pension Plans...

    Science.gov (United States)

    2011-04-05

    ... Form 5500 Annual Return/Report for Pension Plans Electing Funding Alternatives Under Pension Relief Act... defined benefit pension plans under the Preservation of Access to Care for Medicare Beneficiaries and Pension Relief Act of 2010 (Pension Relief Act). The information that would be required either by way...

  8. 77 FR 12577 - Department of Defense (DoD) Medicare-Eligible Retiree Health Care Board of Actuaries; Federal...

    Science.gov (United States)

    2012-03-01

    ... written statement for consideration at the meeting, must notify Kathleen Ludwig at (571) 372-1993, or Kathleen.Ludwig@osd.pentagon.mil , by June 29, 2012. For further information contact Ms. Ludwig at the... Kathleen Ludwig at 571-372-1993 no later than June 29, 2012. Failure to make the necessary...

  9. 78 FR 9890 - DoD Medicare-Eligible Retiree Health Care Board of Actuaries; Notice of Federal Advisory...

    Science.gov (United States)

    2013-02-12

    ... INFORMATION CONTACT: For further information contact Kathleen Ludwig at the Defense Human Resource Activity...: (571) 372-1993, Email: Kathleen.Ludwig@osd.pentagon.mil . SUPPLEMENTARY INFORMATION: ] Purpose of the... of the Pentagon. Those without a valid DoD Common Access Card must contact Kathleen Ludwig at...

  10. 5 CFR 839.1118 - Will my annuity be actuarially reduced because I had Government contributions in my TSP account?

    Science.gov (United States)

    2010-01-01

    ... Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) CORRECTION OF... FERCCA are allowed to keep the Government contributions, and earnings on the Government contributions...

  11. The Largest Known Survival Analysis of Patients with Brain Metastasis from Thyroid Cancer Based on Prognostic Groups.

    Directory of Open Access Journals (Sweden)

    Jinhyun Choi

    Full Text Available To analyze the clinical features and prognostic factors associated with the survival of patients with a very rare occurrence of brain metastasis (BM) from differentiated thyroid cancer (DTC). A total of 37 patients with DTC who were diagnosed with BM between 1995 and 2014 were included. We reviewed the clinical characteristics, treatment modalities, and image findings of BM. Factors associated with survival were evaluated, and the patients were divided into three prognostic groups (Groups A, B, and C) for comparative analysis. The median age at BM was 63 years, and the median time from initial thyroid cancer diagnosis to BM was 3.8 years. The median survival and the 1-year actuarial survival rate after BM were 8.8 months and 47%, respectively. According to univariate and multivariate analyses, four good prognostic factors (GPFs) were identified, including age ≤ 60 years, PS ≤ ECOG 2, ≤ 3 BM sites, and no extracranial metastasis prior to BM. Three prognostic groups were designed based on age and number of remaining GPFs: patients ≤ 60 years of age with at least 2 GPFs (Group A) had the most favorable prognosis, with a median survival of 32.8 months; patients ≤ 60 years of age with fewer than 2 GPFs and those > 60 years of age with at least 2 GPFs (Group B) had an intermediate prognosis, with a median survival of 9.4 months; and patients > 60 years of age with fewer than 2 GPFs (Group C) had the least favorable prognosis, with a median survival of 1.5 months. The survival of patients with BM from DTC differed among the prognostic groups based on the total number of good prognostic factors.

  12. Whole lot of parts: stress in extreme environments.

    Science.gov (United States)

    Steel, G Daniel

    2005-06-01

    Stress has been a central interest for researchers of human behavior in extreme and unusual environments and also for those who are responsible for planning and carrying out expeditions involving such environments. This paper compares the actuarial and case study methods for predicting reactions to stress. Actuarial studies are useful, but do not tap enough variables to allow us to predict how a specific individual will cope with the rigors of an individual mission. Case histories provide a wealth of detail, but few investigators understand the challenges of properly applying this method. This study reviews some of the strengths and weaknesses of the actuarial and case history methods, and presents a four celled taxonomy of stress based on method (actuarial and case history) and effects (distress and eustress). For both research and operational purposes, the person, the setting, and time should not be considered independently; rather, it is an amalgam of these variables that provides the proper basis of analysis.

  13. The Influence of Total Nodes Examined, Number of Positive Nodes, and Lymph Node Ratio on Survival After Surgical Resection and Adjuvant Chemoradiation for Pancreatic Cancer: A Secondary Analysis of RTOG 9704

    Energy Technology Data Exchange (ETDEWEB)

    Showalter, Timothy N. [Department of Radiation Oncology, Jefferson Medical College, Thomas Jefferson University, Philadelphia, PA (United States); Winter, Kathryn A. [Radiation Therapy Oncology Group, RTOG Statistical Center, Philadelphia, PA (United States); Berger, Adam C., E-mail: adam.berger@jefferson.edu [Department of Surgery, Jefferson Medical College, Thomas Jefferson University, Philadelphia, PA (United States); Regine, William F. [Department of Radiation Oncology, University of Maryland Medical Center, Baltimore, MD (United States); Abrams, Ross A. [Department of Radiation Oncology, Rush University Medical Center, Chicago, IL (United States); Safran, Howard [Department of Medicine, Miriam Hospital, Brown University Oncology Group, Providence, RI (United States); Hoffman, John P. [Department of Surgical Oncology, Fox Chase Cancer Center, Philadelphia, PA (United States); Benson, Al B. [Division of Hematology-Oncology, Northwestern University, Chicago, IL (United States); MacDonald, John S. [St. Vincent' s Cancer Care Center, New York, NY (United States); Willett, Christopher G. [Department of Radiation Oncology, Duke University Medical Center, Durham, NC (United States)

    2011-12-01

    Purpose: Lymph node status is an important predictor of survival in pancreatic cancer. We performed a secondary analysis of Radiation Therapy Oncology Group (RTOG) 9704, an adjuvant chemotherapy and chemoradiation trial, to determine the influence of lymph node factors (number of positive nodes [NPN], total nodes examined [TNE], and lymph node ratio [LNR], the ratio of NPN to TNE) on overall survival (OS) and disease-free survival (DFS). Patients and Methods: Eligible patients from RTOG 9704 form the basis of this secondary analysis of lymph node parameters. Actuarial estimates for OS and DFS were calculated using Kaplan-Meier methods. Cox proportional hazards models were performed to evaluate associations of NPN, TNE, and LNR with OS and DFS. Multivariate Cox proportional hazards models were also performed. Results: There were 538 patients enrolled in the RTOG 9704 trial. Of these, 445 patients were eligible with lymph nodes removed. Overall median NPN was 1 (min-max, 0-18). Increased NPN was associated with worse OS (HR = 1.06, p = 0.001) and DFS (HR = 1.05, p = 0.01). In multivariate analyses, both NPN and TNE were associated with OS and DFS. TNE >12 and >15 were associated with increased OS for all patients, but not for node-negative patients (n = 142). Increased LNR was associated with worse OS (HR = 1.01, p < 0.0001) and DFS (HR = 1.006, p = 0.002). Conclusion: In patients who undergo surgical resection followed by adjuvant chemoradiation, TNE, NPN, and LNR are associated with OS and DFS. This secondary analysis of a prospective, cooperative group trial supports the influence of these lymph node parameters on outcomes after surgery and adjuvant therapy using contemporary techniques.
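    The actuarial (Kaplan-Meier) survival estimates cited in this record can be illustrated with a minimal product-limit estimator. All numbers below are hypothetical toy data, not values from RTOG 9704:

    ```python
    # Kaplan-Meier ("actuarial") survival estimate on hypothetical toy data.
    times  = [2, 4, 4, 7, 9, 12, 15]   # months to event or censoring
    events = [1, 1, 0, 1, 0, 1, 0]     # 1 = death observed, 0 = censored

    def kaplan_meier(times, events):
        """Return (time, survival probability) pairs at each observed death time."""
        data = sorted(zip(times, events))
        n_at_risk = len(data)
        surv = 1.0
        curve = []
        i = 0
        while i < len(data):
            t = data[i][0]
            deaths = sum(1 for tt, e in data if tt == t and e == 1)
            removed = sum(1 for tt, _ in data if tt == t)
            if deaths:
                surv *= 1 - deaths / n_at_risk   # product-limit update
                curve.append((t, surv))
            n_at_risk -= removed
            i += removed
        return curve

    curve = kaplan_meier(times, events)
    ```

    Censored subjects leave the risk set without triggering a step, which is why the estimator handles incomplete follow-up correctly; production analyses would use a tested library rather than this sketch.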

  14. The influence of total nodes examined, number of positive nodes, and lymph node ratio on survival after surgical resection and adjuvant chemoradiation for pancreatic cancer: A secondary analysis of RTOG 9704

    Science.gov (United States)

    Showalter, Timothy N.; Winter, Kathryn A.; Berger, Adam C.; Regine, William F.; Abrams, Ross A.; Safran, Howard; Hoffman, John P.; Benson, Al B.; MacDonald, John S.; Willett, Christopher G.

    2010-01-01

    Purpose Lymph node status is an important predictor of survival in pancreatic cancer. We performed a secondary analysis of RTOG 9704, an adjuvant chemotherapy and chemoradiation trial, to determine the influence of lymph node factors-number of positive nodes (NPN), total nodes examined (TNE), and lymph node ratio (LNR-ratio of NPN to TNE)-on OS and disease-free survival (DFS). Patient and Methods Eligible patients from RTOG 9704 form the basis of this secondary analysis of lymph node parameters. Actuarial estimates for OS and DFS were calculated using Kaplan-Meier methods. Cox proportional hazards models were performed to evaluate associations of NPN, TNE, and LNR with OS and DFS. Multivariate Cox proportional hazards models were also performed. Results There were 538 patients enrolled in the RTOG 9704 trial. Of these, 445 patients were eligible with lymph nodes removed. Overall median NPN was 1 (min-max, 0-18). Increased NPN was associated with worse OS (HR=1.06, p=0.001) and DFS (HR=1.05, p=0.01). In multivariate analyses, both NPN and TNE were associated with OS and DFS. TNE > 12, and >15, were associated with increased OS for all patients, but not for node-negative patients (n =142). Increased LNR was associated with worse OS (HR=1.01, p<0.0001) and DFS (HR=1.006, p=0.002). Conclusion In patients who undergo surgical resection followed by adjuvant chemoradiation, TNE, NPN, and LNR are associated with OS and DFS. This secondary analysis of a prospective, cooperative group trial supports the influence of these lymph node parameters on outcomes after surgery and adjuvant therapy using contemporary techniques. PMID:20934270

  15. Analysis of Pension Schemes Disclosures of UK - FTSE 100 Companies 2005-2006

    OpenAIRE

    Sun, Yulu

    2007-01-01

    This study aims to examine the level and quality of disclosures with regard to pension obligations, with particular focus on principal actuarial assumptions among the FTSE 100 companies under IFRS and UK GAAP in 2005 and 2006. The increasing risks behind the pension liabilities have highlighted the need for more informative and accurate pension disclosures. Investors need more information on pensions in order to understand the underlying risks and adjust their investment decisions accordi...

  16. Long-term local control achieved after hypofractionated stereotactic body radiotherapy for adrenal gland metastases: A retrospective analysis of 34 patients

    Energy Technology Data Exchange (ETDEWEB)

    Scorsetti, Marta; Alongi, Filippo [Radiotherapy and Radiosurgery Dept., IRCCS Istituto Clinico Humanitas, Humanitas Cancer Center, Rozzano, Milano (Italy)], Email: filippo.alongi@humanitas.it; Filippi, Andrea Riccardo [Radiation Oncology Unit, Dept. of Medical and Surgical Sciences, Univ. of Turin, Turin (Italy)] [and others

    2012-05-15

    Aims and background. To describe the feasibility, tolerability and clinical outcomes of stereotactic body radiation therapy (SBRT) in the treatment of adrenal metastases in 34 consecutive cancer patients. Materials and methods. Between March 2004 and July 2010, a total of 34 consecutive patients, accounting for 36 adrenal metastatic lesions, were treated with SBRT. SBRT treatments were delivered by a Linac Varian 600 with microMLC (3DLine, Elekta, Stockholm, Sweden) and a Linac ELEKTA Precise (Elekta). All 34 patients were clinically and radiologically evaluated during and after completion of SBRT. The following outcomes were taken into account: best clinical response at any time, local control, time to systemic progression, time to local progression, overall survival and toxicity. Survival was estimated by the Kaplan-Meier method, and factors potentially affecting outcomes were analyzed with Cox regression analysis. Results. Total RT doses ranged from 20 Gy in 4 fractions to 45 Gy in 18 fractions (median dose: 32 Gy; median number of fractions: 4). All doses were prescribed to the 95% isodose line. No cases of Grade ≥ 3 toxicity were recorded. At a median follow-up time of 41 months (range, 12-75), 22 patients were alive. Three of 28 lesions (11%) showed complete response, 13/28 (46%) partial response, 10/28 (36%) stable disease and 2/28 (7%) progressed in the treated area. Local failure was observed in 13 cases. Actuarial local control rates at one and two years were 66% and 32%, respectively. Median time to local progression was 19 months. Median survival was 22 months. Conclusion. SBRT for adrenal gland metastases is feasible without significant acute and late toxicities, with a good rate of local control. New SBRT fractionation schemes and the possibility of combining new systemic approaches should be investigated in order to further increase local control and reduce systemic disease progression.

  17. Outcomes for Spine Stereotactic Body Radiation Therapy and an Analysis of Predictors of Local Recurrence

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Andrew J.; Tao, Randa [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Rebueno, Neal C. [Department of Radiation Dosimetry, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Christensen, Eva N.; Allen, Pamela K. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Wang, Xin A. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Amini, Behrang [Department of Diagnostic Radiology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Tannir, Nizar M. [Department of Genitourinary Medical Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Tatsui, Claudio E.; Rhines, Laurence D. [Department of Neurosurgery, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Li, Jing [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Chang, Eric L. [Department of Radiation Oncology, USC Norris Cancer Hospital, Keck School of Medicine of USC, Los Angeles, California (United States); Brown, Paul D. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Ghia, Amol J., E-mail: ajghia@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)

    2015-08-01

    Purpose: To investigate local control, survival outcomes, and predictors of local relapse for patients treated with spine stereotactic body radiation therapy. Methods and Materials: We reviewed the records of 332 spinal metastases consecutively treated with stereotactic body radiation therapy between 2002 and 2012. The median follow-up for all living patients was 33 months (range, 0-111 months). Endpoints were overall survival and local control (LC); recurrences were classified as either in-field or marginal. Results: The 1-year actuarial LC and overall survival rates were 88% and 64%, respectively. Patients with local relapses had poorer dosimetric coverage of the gross tumor volume (GTV) compared with patients without recurrence (minimum dose [Dmin] biologically equivalent dose [BED] 23.9 vs 35.1 Gy, P<.001; D98 BED 41.8 vs 48.1 Gy, P=.001; D95 BED 47.2 vs 50.5 Gy, P=.004). Furthermore, patients with marginal recurrences had poorer prescription coverage of the GTV (86% vs 93%, P=.01) compared with those with in-field recurrences, potentially because of more upfront spinal canal disease (78% vs 24%, P=.001). Using a Cox regression univariate analysis, patients with a GTV BED Dmin ≥33.4 Gy (median dose) (equivalent to 14 Gy in 1 fraction) had a significantly higher 1-year LC rate (94% vs 80%, P=.001) compared with patients with a lower GTV BED Dmin; this factor was the only significant variable on multivariate Cox analysis associated with LC (P=.001, hazard ratio 0.29, 95% confidence interval 0.14-0.60) and also was the only variable significant in a separate competing risk multivariate model (P=.001, hazard ratio 0.30, 95% confidence interval 0.15-0.62). Conclusions: Stereotactic body radiation therapy offers durable control for spinal metastases, but there is a subset of patients that recur locally. Patients with local relapse had significantly poorer tumor coverage, which was likely attributable to treatment planning directives that prioritized the
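    The biologically effective dose (BED) values quoted above come from the linear-quadratic model. The abstract does not state the α/β ratio it used; the sketch below assumes the conventional tumor value of α/β = 10 Gy:

    ```python
    # Linear-quadratic biologically effective dose (BED).
    # alpha_beta = 10 Gy is a conventional tumor value assumed here,
    # not a parameter stated in the abstract.
    def bed(n_fractions, dose_per_fraction, alpha_beta=10.0):
        """BED = n * d * (1 + d / (alpha/beta))."""
        return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

    single_14gy = bed(1, 14.0)   # single 14 Gy fraction
    ```

    Under this assumption a single 14 Gy fraction gives BED = 33.6 Gy, in the same range as the 33.4 Gy median GTV Dmin threshold reported above; the small discrepancy suggests the study used slightly different inputs.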

  18. Instrumental analysis

    International Nuclear Information System (INIS)

    This textbook on instrumental analysis consists of nine chapters. It covers an introduction to analytical chemistry (the process, types, and forms of analysis); electrochemistry (basic theory, potentiometry, and conductometry); electromagnetic radiation and optical components (introduction and applications); ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry (introduction, flame emission spectrometry, and plasma emission spectrometry); and other instrumental methods, including infrared spectrophotometry, X-ray spectrophotometry, mass spectrometry, chromatography, and radiochemical analysis.

  19. Sensitivity analysis

    Science.gov (United States)

    Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ...

  20. Data Analysis

    OpenAIRE

    Gionis, Aristides

    2013-01-01

    The objective of this report is to highlight opportunities for enhancing global research data infrastructures from the point of view of data analysis. We discuss various directions and data-analysis functionalities for supporting such infrastructures.

  1. CSF analysis

    Science.gov (United States)

    Cerebrospinal fluid analysis ... Analysis of CSF can help detect certain conditions and diseases. All of the following can be, but ... An abnormal CSF analysis result may be due to many different causes, ... Encephalitis (such as West Nile and Eastern Equine) Hepatic ...

  2. Activation analysis

    International Nuclear Information System (INIS)

    Neutron activation analysis, although it appears to be approaching the limits of further advance, is the most suitable method for obtaining information on both the principal components and the microcomponents of any solid sample, and instrumental activation analysis can determine a great many elements in a wide variety of samples. Focusing principally on neutron activation analysis, the following topics are covered in a literature survey from 1982 to mid-1984: bibliographies, reviews, and data collections; problems in spectral analysis and measurement; activation analysis with neutrons; charged-particle and photonuclear reactions; chemical separation and isotope-dilution activation analysis; molecular activation analysis; standard materials; biological and related samples; environmental, food, forensic, and archaeological samples; and space and earth sciences. (Mori, K.)

  3. Strategic analysis

    OpenAIRE

    Chládek, Vítězslav

    2012-01-01

    The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech owned limited company, Česky národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides theoretical background and methodology that are used later for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly the PEST analysis has ...

  4. Strategic analysis

    OpenAIRE

    Bartuňková, Alena

    2008-01-01

    The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech owned limited company, Česky národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides theoretical background and methodology that are used later for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly the PEST analysis has ...

  5. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    2010-01-01

    Basic text for graduate and advanced undergraduate deals with search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, more.

  6. Scenario analysis

    NARCIS (Netherlands)

    Li, L.; Braat, L.C.; Lei, G.; Arets, E.J.M.M.; Liu, J.; Jiang, L.; Fan, Z.; Liu, W.; He, H.; Sun, X.

    2014-01-01

    This chapter presents the results of the scenario analysis of China’s ecosystems focusing on forest, grassland, and wetland ecosystems. The analysis was undertaken using Conversion of Land Use Change and its Effects (CLUE) modeling and an ecosystem service matrix (as explained below) complemented by

  7. Recursive analysis

    CERN Document Server

    Goodstein, R L

    2010-01-01

    Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and

  8. Numerical analysis

    CERN Document Server

    Khabaza, I M

    1960-01-01

    Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput

  9. Panel Analysis

    DEFF Research Database (Denmark)

    Brænder, Morten; Andersen, Lotte Bøgh

    2014-01-01

    Based on our 2013-article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present Panel Analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more...... in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage in regard to using panel studies is that data may be difficult to obtain. This is most clearly vivid in regard to the use of panel surveys...

  10. Dimensional Analysis

    CERN Document Server

    Tan, Qingming

    2011-01-01

    Dimensional analysis is an essential scientific method and a powerful tool for solving problems in physics and engineering. This book starts by introducing the Pi Theorem, which is the theoretical foundation of dimensional analysis. It also provides ample and detailed examples of how dimensional analysis is applied to solving problems in various branches of mechanics. The book covers the extensive findings on explosion mechanics and impact dynamics contributed by the author's research group over the past forty years at the Chinese Academy of Sciences. The book is intended for advanced undergra
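    The Pi Theorem's core bookkeeping, checking that the exponents of the base dimensions cancel, can be sketched with simple tuple arithmetic; the Reynolds-number check below is an illustration, not an example taken from the book:

    ```python
    # Represent each physical quantity by its exponents of (M, L, T).
    def dim_mul(a, b):
        """Dimension of a product: add exponents."""
        return tuple(x + y for x, y in zip(a, b))

    def dim_pow(a, k):
        """Dimension of a power: scale exponents."""
        return tuple(x * k for x in a)

    DIMENSIONLESS = (0, 0, 0)
    density   = (1, -3,  0)   # kg m^-3   -> M L^-3
    velocity  = (0,  1, -1)   # m s^-1    -> L T^-1
    length    = (0,  1,  0)   # m         -> L
    viscosity = (1, -1, -1)   # Pa s      -> M L^-1 T^-1

    # Reynolds number Re = rho * v * L / mu must come out dimensionless.
    re_dim = dim_mul(dim_mul(density, velocity),
                     dim_mul(length, dim_pow(viscosity, -1)))
    ```

    Any candidate dimensionless group can be verified the same way before it is used to collapse variables in an experiment.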

  11. Conversation Analysis.

    Science.gov (United States)

    Schiffrin, Deborah

    1990-01-01

    Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…

  12. Biorefinery Analysis

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.

  13. Analysis I

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  14. Analysis II

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  15. Link Analysis

    Science.gov (United States)

    Donoho, Steve

    Link analysis is a collection of techniques that operate on data that can be represented as nodes and links. This chapter surveys a variety of techniques including subgraph matching, finding cliques and K-plexes, maximizing spread of influence, visualization, finding hubs and authorities, and combining with traditional techniques (classification, clustering, etc). It also surveys applications including social network analysis, viral marketing, Internet search, fraud detection, and crime prevention.
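    The "hubs and authorities" technique surveyed above can be sketched as a minimal HITS iteration; the graph and node names below are a hypothetical toy example:

    ```python
    # Minimal HITS ("hubs and authorities") iteration on a toy link graph.
    def hits(adj, iterations=50):
        """adj maps each node to the list of nodes it links to."""
        nodes = sorted(set(adj) | {v for targets in adj.values() for v in targets})
        hub = {n: 1.0 for n in nodes}
        auth = {n: 1.0 for n in nodes}
        for _ in range(iterations):
            # Authority score: total hub score of the pages linking to the node.
            auth = {n: sum(hub[u] for u in nodes if n in adj.get(u, ())) for n in nodes}
            norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
            auth = {n: v / norm for n, v in auth.items()}
            # Hub score: total authority score of the pages the node links to.
            hub = {n: sum(auth[v] for v in adj.get(n, ())) for n in nodes}
            norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
            hub = {n: v / norm for n, v in hub.items()}
        return hub, auth

    # Two pages ("a", "b") both point at "c": c becomes the authority,
    # while a and b become equally good hubs.
    hub, auth = hits({"a": ["c"], "b": ["c"]})
    ```

    The mutual reinforcement between the two score vectors is what distinguishes HITS from simple in-degree counting.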

  16. Business analysis

    CERN Document Server

    Paul, Debra; Cadle, James

    2010-01-01

    Throughout the business world, public, private and not-for-profit organisations face huge challenges. Business analysts must respond by developing practical, creative and financially sound solutions. This excellent guide gives them the necessary tools. It supports everyone wanting to achieve university and industry qualifications in business analysis and information systems. It is particularly beneficial for those studying for ISEB qualifications in Business Analysis. Some important additions since the first edition (2006): the inclusion of new techniques such as Ishikawa diagrams and spaghe

  17. Numerical analysis

    CERN Document Server

    Scott, L Ridgway

    2011-01-01

    Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from m

  18. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  19. Radioactivation analysis

    International Nuclear Information System (INIS)

    Radioactivation analysis is the technique of radioactivation analysis of the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. subject it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis would indicate the nature and quantities of the various elements present in the sample. The reason is that the radiation from a particular radioisotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that there are certain factors creating uncertainties and elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity radioactivation analysis had been principally employed for trace detection and its most extensive use has been in control of semiconductors and very pure metals. An account of the experience gained in the USA was given, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub microgramme and microgramme concentration of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. 
Another paper dealt with radioactivation
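
    The identification step described above relies on each radioisotope emitting radiation of characteristic energy. A minimal sketch of that lookup follows; the gamma-line energies, isotope set, and matching tolerance are illustrative assumptions, not data from the symposium.

```python
# Sketch: matching a measured gamma peak against characteristic lines of
# common neutron-activation products. Energies (keV) are illustrative
# literature values, not calibrated data.
GAMMA_LINES_KEV = {
    "Na-24": 1368.6,   # produced by Na-23(n, gamma)
    "Mn-56": 846.8,    # produced by Mn-55(n, gamma)
    "Cu-64": 511.0,    # annihilation radiation
}

def identify(peak_kev, tolerance=2.0):
    """Return candidate radioisotopes whose line lies within tolerance keV."""
    return [iso for iso, e in GAMMA_LINES_KEV.items()
            if abs(e - peak_kev) <= tolerance]

print(identify(846.5))   # → ['Mn-56']
```

    In practice a spectrum has many peaks and an isotope many lines, so real systems match whole line patterns and weigh half-lives as well.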

  20. A proposed financial-actuarial formulation of a dependency (long-term care) insurance product and an approach to a practical application

    OpenAIRE

    Mª Manuela Segovia; Flor Mª Guerrero; Patricia Herranz

    2005-01-01

    In industrialized countries two demographic phenomena converge, longevity and low birth rates, which cause an ageing of the population and with it a series of social processes that need to be attended to. One of these is the coverage of dependency among the elderly, where dependency is understood as the need for help in performing the basic tasks of daily life. Spain is currently in the midst of a parliamentary debate on regulating, by means of ...

  1. THE SUSTAINABILITY FACTOR: ALTERNATIVE DESIGNS AND A FINANCIAL-ACTUARIAL ASSESSMENT OF ITS EFFECTS ON THE SYSTEM PARAMETERS

    Directory of Open Access Journals (Sweden)

    Robert Meneu Gaya

    2013-05-01

    Full Text Available Law 27/2011, which reforms the Spanish pension system, introduces the sustainability factor, an instrument that automatically adjusts the system's parameters to the evolution of life expectancy from 2027 onwards, with revisions every 5 years, although the recent Organic Law 2/2012 on Budgetary Stability and Financial Sustainability opens the possibility of bringing its entry into force forward if a long-term deficit is projected in the pension system. Since the specific design of the sustainability factor is still pending, it is relevant to analyse how other European Union countries have incorporated similar instruments into their pension systems and, drawing on those experiences, to offer alternative designs for the Spanish case, assessing the effects of each of them on the system's parameters, using the recent life-expectancy projections of the Instituto Nacional de Estadística.
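
    The core of such a sustainability factor can be sketched as a ratio of life expectancies applied to the initial benefit. The numbers below are assumptions for illustration, not the Spanish statutory formula, whose specific design the article notes was still pending.

```python
# Illustrative sketch of a life-expectancy-linked sustainability factor.
# Life expectancies and the benefit amount are invented assumptions.
def sustainability_factor(e_base, e_now):
    """Ratio of life expectancy at retirement in the base year to that in
    the valuation year; values below 1 reduce the initial benefit."""
    return e_base / e_now

initial_pension = 1000.0       # monthly benefit under current rules
e_2027, e_2032 = 21.0, 21.8    # assumed life expectancy at 67, 5 years apart
adjusted = initial_pension * sustainability_factor(e_2027, e_2032)
print(round(adjusted, 2))      # ≈ 963.3
```

    A design choice the article discusses implicitly: applying the factor to new pensions only (as above) versus indexing all pensions, which spreads the adjustment across cohorts differently.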

  2. Disability Income Insurance Actuarial Model with Savings

    Institute of Scientific and Technical Information of China (English)

    陈岱婉

    2008-01-01

    Taking disability income insurance as the main coverage of a standalone insurance product, and applying the equivalence (balance) principle with a decrement-table model, extended in particular with a return-of-premium component, a comprehensive actuarial model for disability income insurance is established. By varying the parameters, an insurer can derive different disability income insurance products. Such insurance combines the dual functions of savings and protection, and offers policyholders one more attractive investment choice.
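
    The equivalence principle named above sets the premium so that the expected present value of premium income equals that of disability benefits plus the return-of-premium (savings) element. A toy sketch over three years follows; the decrement probabilities, interest rate, benefit, and refund fraction are all invented assumptions, not the paper's model.

```python
# Sketch of the equivalence (balance) principle for a disability income
# product with a return-of-premium element, on a toy 3-year decrement
# table. All probabilities, rates and amounts are illustrative.
v = 1 / 1.03                       # annual discount factor at 3% interest
p_active = [1.00, 0.97, 0.94]      # prob. still active at start of year t
q_disable = [0.010, 0.012, 0.015]  # prob. of becoming disabled in year t
benefit = 10_000                   # disability benefit paid at end of year
refund_rate = 0.5                  # fraction of total premiums returned at maturity

# Expected present value of disability benefits
pv_benefits = sum(p_active[t] * q_disable[t] * benefit * v ** (t + 1)
                  for t in range(3))
# Annuity factor for level premiums paid at the start of each active year
annuity = sum(p_active[t] * v ** t for t in range(3))
# Equivalence: P * annuity = pv_benefits + refund_rate * 3P * v**3
P = pv_benefits / (annuity - refund_rate * 3 * v ** 3)
print(round(P, 2))   # ≈ 230.81
```

    Raising `refund_rate` shrinks the denominator, so the savings feature is financed through a visibly higher premium, which is the trade-off the abstract describes.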

  3. Actuarial Prediction of Juvenile Recidivism: The Static Variables of the Juvenile Sex Offender Assessment Protocol-II (J-SOAP-II)

    Directory of Open Access Journals (Sweden)

    Amanda B. Powers-Sawyer

    2009-12-01

    Full Text Available Sexual offending among youth receives a great deal of attention from public policy decision-makers in the juvenile justice system despite the low rate at which juveniles sexually reoffend. Within the judicial system, evaluators are faced with the challenge of answering questions from the court regarding sexual recidivism potential and corresponding treatment placements. Empirical investigations across several studies link empirically supported and promising risk factors to sexual reoffending among juveniles. However, there is little cumulative evidence for strong predictive strength among risk assessment tools for juvenile sexual recidivism. We found the static variables of the Juvenile Sex Offender Assessment Protocol-II to have strong predictive accuracy for sexual recidivism (AUC = .75) among 7 sexual recidivists in a sample of 96 juvenile sex offenders. The J-SOAP-II was a poor predictor of non-sexual violent recidivism and non-sexual general recidivism. Additionally, we found that the predictive accuracy of the J-SOAP-II is stronger for the sex drive/preoccupation scale (AUC = .72) than for the impulsive/anti-social behavior scale (AUC = .64). These data provide preliminary evidence for the predictive validity of the J-SOAP-II and indicate that much of the predictive power is due to the measure of sexual drive rather than impulsivity.
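
    The AUC figures quoted are rank statistics: the probability that a randomly chosen recidivist scores above a randomly chosen non-recidivist, with ties counting one half. A small sketch on made-up scores (not the study's data) shows the computation.

```python
# Sketch: AUC as the Mann-Whitney probability that a positive case
# outranks a negative one, ties counting 0.5. Scores are invented.
def auc(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

recidivists     = [8, 7, 9, 6]      # hypothetical J-SOAP-II-style totals
non_recidivists = [5, 6, 4, 7, 3]
print(auc(recidivists, non_recidivists))   # → 0.9
```

    With only 7 recidivists, as in the study, confidence intervals around such an AUC are wide, which is why the authors call the evidence preliminary.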

  4. The Actuarial Model of Pension Fund Balance

    Institute of Scientific and Technical Information of China (English)

    汤志浩

    2015-01-01

    In 2000 China entered the ranks of ageing societies, and since then the ageing trend of Chinese society has become increasingly serious. The national condition of "getting old before getting rich" and the intensifying ageing of the population have had a huge impact on China's current old-age insurance system. In reforming the pension insurance system, the balance of pension insurance fund income and expenditure must be ensured first; otherwise the pension insurance system will ultimately collapse. Studying pension income and expenditure and the relevant factors against the background of population ageing is therefore of great significance.
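
    The balance requirement discussed above can be illustrated with a one-period pay-as-you-go sketch; all figures below are invented for illustration, not taken from the article.

```python
# One-period pay-as-you-go balance sketch: contribution income from
# active workers vs. benefit outgo to retirees. Figures are invented.
def fund_balance(workers, retirees, avg_wage, contrib_rate, avg_pension):
    income = workers * avg_wage * contrib_rate
    outgo = retirees * avg_pension
    return income - outgo

# As ageing lowers the worker/retiree ratio, the balance shrinks to zero.
print(fund_balance(3.0, 1.0, 60_000, 0.20, 24_000))   # → 12000.0
print(fund_balance(2.0, 1.0, 60_000, 0.20, 24_000))   # → 0.0
```

    A full actuarial model would project these quantities over decades with mortality, wage growth and indexation assumptions, but the sign of this simple balance already captures the pressure ageing exerts.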

  5. Real analysis

    CERN Document Server

    DiBenedetto, Emmanuele

    2016-01-01

    The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...

  6. Numerical analysis

    CERN Document Server

    Rao, G Shanker

    2006-01-01

    About the Book: This book provides an introduction to Numerical Analysis for students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of the Universities of Andhra Pradesh and also the syllabus prescribed in most Indian universities. Salient features: approximate and numerical solutions of algebraic and transcendental equations; interpolation of functions; numerical differentiation and integration; and numerical solution of ordinary differential equations. The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students. This helps in a better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...

  7. Real analysis

    CERN Document Server

    Loeb, Peter A

    2016-01-01

    This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....

  8. Numerical analysis

    CERN Document Server

    Jacques, Ian

    1987-01-01

    This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...

  9. Outlier analysis

    CERN Document Server

    Aggarwal, Charu C

    2013-01-01

    With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions- the data can be of any type, structured or unstructured, and may be extremely large.Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and
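
    As a concrete instance of the techniques such a book surveys, here is a robust univariate outlier test based on the median absolute deviation; the 3.5 threshold is a common rule of thumb, an assumption rather than something taken from this text.

```python
# Sketch: robust outlier detection via the modified z-score on the
# median absolute deviation (MAD). Threshold 3.5 is a conventional
# rule of thumb; data are invented.
import statistics

def mad_outliers(data, threshold=3.5):
    med = statistics.median(data)
    mad = statistics.median(abs(x - med) for x in data)
    # 0.6745 scales the MAD to the normal standard deviation
    return [x for x in data
            if mad and abs(0.6745 * (x - med) / mad) > threshold]

print(mad_outliers([10, 11, 12, 10, 11, 95]))   # → [95]
```

    Unlike mean/standard-deviation rules, the median-based statistic is not dragged toward the outlier it is trying to flag, which matters in the large, messy datasets the book emphasizes.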

  10. Nonlinear analysis

    CERN Document Server

    Nanda, Sudarsan

    2013-01-01

    "Nonlinear analysis" presents recent developments in calculus in Banach space, convex sets, convex functions, best approximation, fixed point theorems, nonlinear operators, variational inequality, complementary problem and semi-inner-product spaces. Nonlinear Analysis has become important and useful in the present days because many real world problems are nonlinear, nonconvex and nonsmooth in nature. Although basic concepts have been presented here but many results presented have not appeared in any book till now. The book could be used as a text for graduate students and also it will be useful for researchers working in this field.

  11. Elementary analysis

    CERN Document Server

    Snell, K S; Langford, W J; Maxwell, E A

    1966-01-01

    Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is

  12. Diophantine analysis

    CERN Document Server

    Steuding, Jorn

    2005-01-01

    While its roots reach back to the third century, diophantine analysis continues to be an extremely active and powerful area of number theory. Many diophantine problems have simple formulations, they can be extremely difficult to attack, and many open problems and conjectures remain. Diophantine Analysis examines the theory of diophantine approximations and the theory of diophantine equations, with emphasis on interactions between these subjects. Beginning with the basic principles, the author develops his treatment around the theory of continued fractions and examines the classic theory, inclu

  13. Risk analysis

    International Nuclear Information System (INIS)

    This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of Risk Analysis, with specific orientations to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researches who want to have an insight in the risk analysis field, as a tool to solution several problems frequently found in the engineering and applied sciences field, as well as for the academic teachers who want to keep up to date, including the new developments and improvements continuously arising in this field

  14. Numerical analysis

    CERN Document Server

    Brezinski, C

    2012-01-01

    Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics.

  15. Image Analysis

    DEFF Research Database (Denmark)

    The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...

  16. Cluster analysis

    CERN Document Server

    Everitt, Brian S; Leese, Morven; Stahl, Daniel

    2011-01-01

    Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics.This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data.Real life examples are used throughout to demons

  17. Learner Analysis

    Institute of Scientific and Technical Information of China (English)

    Song Xuexia

    2005-01-01

    In the past, more attempts were made to explore ways for teachers to teach English than ways for learners to learn the language. Learner analysis is the analysis of what the learner is, including age, attitude, motivation, intelligence, aptitude, personality, etc., with the purpose of realizing the transition from "teacher-centered" to "learner-oriented" teaching.

  18. Regression Analysis

    CERN Document Server

    Freund, Rudolf J; Sa, Ping

    2006-01-01

    The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow the student to determine, at least to some degree, the correct type of statistical analyses to be performed in a given situation, and have some appreciation of what constitutes good experimental design

  19. Inclusion Analysis

    CERN Document Server

    Colver, David

    2010-01-01

    Inclusion analysis is the name given by Operis to a black box testing technique that it has found to make the checking of key financial ratios calculated by spreadsheet models quicker, easier and more likely to find omission errors than code inspection.

  20. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
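
    A representative nonparametric technique for censored data of the kind this book covers is the Kaplan-Meier product-limit estimator. A minimal sketch on invented observations (the data and the per-record handling of ties are assumptions of this sketch):

```python
# Sketch of the Kaplan-Meier product-limit estimator for right-censored
# survival data. Times and censoring flags below are invented.
def kaplan_meier(times, events):
    """times: observation times; events: 1 = death observed, 0 = censored.
    Returns [(t, S(t))] at each observed event time."""
    data = sorted(zip(times, events))
    at_risk, surv, curve = len(data), 1.0, []
    for t, d in data:
        if d:                 # event: multiply in the factor (1 - d_i/n_i)
            surv *= 1 - 1 / at_risk
            curve.append((t, surv))
        at_risk -= 1          # events and censorings both leave the risk set
    return curve

print(kaplan_meier([2, 3, 4, 5, 8], [1, 0, 1, 1, 0]))
```

    Note how the censored observation at t = 3 contributes to the risk set before that time without forcing a drop in the survival curve, which is exactly the information a naive complete-case analysis would throw away.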

  1. Mediation Analysis

    OpenAIRE

    David P. MacKinnon; Fairchild, Amanda J.; Fritz, Matthew S.

    2007-01-01

    Mediating variables are prominent in psychological theory and research. A mediating variable transmits the effect of an independent variable on a dependent variable. Differences between mediating variables and confounders, moderators, and covariates are outlined. Statistical methods to assess mediation and modern comprehensive approaches are described. Future directions for mediation analysis are discussed.
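
    A common statistical formulation of the mediated effect described above is the product of coefficients a*b from two regressions: X predicting M (path a), and M predicting Y controlling for X (path b). A pure-Python sketch on toy data; the data and the specific estimator are illustrative assumptions, not taken from the article.

```python
# Sketch of the product-of-coefficients approach to mediation.
# Toy data; covariance-based OLS slopes, no inference/standard errors.
def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

def indirect_effect(x, m, y):
    a = cov(x, m) / cov(x, x)                        # path a: X -> M
    # path b: partial slope of M in the regression of Y on (M, X)
    b = ((cov(m, y) * cov(x, x) - cov(x, y) * cov(x, m))
         / (cov(m, m) * cov(x, x) - cov(x, m) ** 2))
    return a * b

x = [1, 2, 3, 4, 5]                 # independent variable
m = [2.1, 3.9, 6.2, 7.8, 10.1]      # mediator, roughly m = 2x
y = [3.0, 5.2, 7.9, 10.1, 13.2]     # outcome, roughly y = 1.3m
print(round(indirect_effect(x, m, y), 2))   # ≈ 2.32
```

    Modern comprehensive approaches add standard errors for a*b (e.g. bootstrap), since the product of two estimates is not normally distributed in small samples.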

  2. Frontier Analysis

    DEFF Research Database (Denmark)

    Assaf, A. George; Josiassen, Alexander

    2016-01-01

    and macro applications of these approaches, summarizing and critically reviewing the characteristics of the existing studies. We also conduct a meta-analysis to create an overview of the efficiency results of frontier applications. This allows for an investigation of the impact of frontier methodology...

  3. Exergy analysis

    DEFF Research Database (Denmark)

    Dovjak, M.; Simone, Angela; Kolarik, Jakub;

    2011-01-01

    Exergy analysis enables us to make connections among processes inside the human body and processes in a building. So far, only the effect of different combinations of air temperatures and mean radiant temperatures have been studied, with constant relative humidity in experimental conditions...

  4. Genetic analysis

    NARCIS (Netherlands)

    Koornneef, M.; Alonso-Blanco, C.; Stam, P.

    2006-01-01

    The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance relation

  5. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of "Classical Complex Analysis" is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and, as a particular highlight, the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  6. Wavelet analysis

    CERN Document Server

    Cheng, Lizhi; Luo, Yong; Chen, Bo

    2014-01-01

    This book could be divided into two parts i.e. fundamental wavelet transform theory and method and some important applications of wavelet transform. In the first part, as preliminary knowledge, the Fourier analysis, inner product space, the characteristics of Haar functions, and concepts of multi-resolution analysis, are introduced followed by a description on how to construct wavelet functions both multi-band and multi wavelets, and finally introduces the design of integer wavelets via lifting schemes and its application to integer transform algorithm. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz system. The book is intended for senior undergraduate stude...

  7. Mathematical analysis

    Science.gov (United States)

    Donaldson, J. A.

    1984-01-01

    Simple continuum models used in the design, analysis, and control of large space structures are examined. Particular emphasis is placed on boundary value problems associated with the Load Correction Method and control problems involving partial differential equations for the large space structure models. Partial differential equations will be used to model a large space structure, base the design of an optimal controller on this model, approximate the resulting optimal control model, and compare the results with data from other methods.

  8. Elastodynamic Analysis

    DEFF Research Database (Denmark)

    Andersen, Lars

    This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different ... Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads...

  9. Sensitivity analysis

    International Nuclear Information System (INIS)

    General remarks on sensitivity analysis, the study of changes in a model output produced by varying model inputs, are made first. Sampling methods are discussed, and three sensitivity measures: partial rank correlation, derivative or response surface, and partial variance are described. Some sample results for a 16-input, 13-output hydrodynamics model are given. Both agreement and disagreement were found among the sensitivity measures. 4 figures
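
    Of the three sensitivity measures listed, a rank correlation between each sampled input and the model output is the simplest to sketch. The toy model, input ranges, and sample size below are assumptions; a real study would sample the actual model's inputs.

```python
# Sketch of sampling-based sensitivity analysis: Spearman rank
# correlation of each sampled input with the model output.
# Model and ranges are invented; no ties are handled.
import random

def rank(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(u, v):
    ru, rv = rank(u), rank(v)
    mu = (len(u) - 1) / 2          # mean of ranks 0..n-1
    num = sum((a - mu) * (b - mu) for a, b in zip(ru, rv))
    den = sum((a - mu) ** 2 for a in ru)   # same for any tie-free ranking
    return num / den

random.seed(0)
x1 = [random.uniform(0, 1) for _ in range(200)]   # influential input
x2 = [random.uniform(0, 1) for _ in range(200)]   # weak input
y = [10 * a + 0.1 * b for a, b in zip(x1, x2)]    # toy model
print(spearman(x1, y) > spearman(x2, y))          # → True
```

    Because it works on ranks, the measure is insensitive to monotone nonlinearity in the model, one reason partial rank correlation is favored over raw correlation in such studies.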

  10. Economic analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-06-01

    The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.

  11. An analysis of clinical and treatment-related prognostic factors on outcome using biochemical control as an end-point in patients with prostate cancer treated with external beam irradiation

    International Nuclear Information System (INIS)

    Purpose: We reviewed our institution's experience in treating patients with clinically localized prostate cancer with external beam irradiation (RT) to determine if previously analyzed clinical and treatment-related prognostic factors affected outcome when biochemical control was used as an end-point to evaluate results. Materials and methods: Between 1 January 1987 and 31 December 1991, 470 patients with clinically localized prostate cancer were treated with external beam RT using localized prostate fields at William Beaumont Hospital. Biochemical control was defined as PSA nadir ≤1.5 ng/ml within 1 year of treatment. After achieving nadir, if two consecutive increases of PSA were noted, the patient was scored a failure at the time of the first increase. Prognostic factors, including the total number of days in treatment, the method of diagnosis, a history of any pretreatment transurethral resection of the prostate (TURP) and the type of boost, were analyzed. Results: Median follow-up was 48 months. No statistically significant differences in rates of biochemical control were noted for treatment time, overall time (date of biopsy to completion of RT), history of any pretreatment TURP, history of diagnosis by TURP, or boost techniques. Patients diagnosed by TURP had a significant improvement in the overall rate of biochemical control (P < 0.03) compared to transrectal/transperineal biopsy. The 5-year actuarial rates were 58 versus 39%, respectively. This improvement was not evident when pretreatment PSA, T stage, or Gleason score were controlled for. On multivariate analysis, no variable was associated with outcome. When analysis was limited to a more favorable group of patients (T1/T2 tumors, pretreatment PSA ≤20 ng/ml and Gleason score <7), none of these variables were significantly predictive of biochemical control when controlling for pretreatment PSA, T stage and Gleason score. Conclusions: No significant effect of treatment time, overall time, pretreatment

  12. Evaluation of Current Consensus Statement Recommendations for Accelerated Partial Breast Irradiation: A Pooled Analysis of William Beaumont Hospital and American Society of Breast Surgeon MammoSite Registry Trial Data

    International Nuclear Information System (INIS)

    Purpose: To determine whether the American Society for Radiation Oncology (ASTRO) Consensus Statement (CS) recommendations for accelerated partial breast irradiation (APBI) are associated with significantly different outcomes in a pooled analysis from William Beaumont Hospital (WBH) and the American Society of Breast Surgeons (ASBrS) MammoSite® Registry Trial. Methods and Materials: APBI was used to treat 2127 cases of early-stage breast cancer (WBH, n=678; ASBrS, n=1449). Three forms of APBI were used at WBH (interstitial, n=221; balloon-based, n=255; or 3-dimensional conformal radiation therapy, n=206), whereas all Registry Trial patients received balloon-based brachytherapy. Patients were divided according to the ASTRO CS into suitable (n=661, 36.5%), cautionary (n=850, 46.9%), and unsuitable (n=302, 16.7%) categories. Tumor characteristics and clinical outcomes were analyzed according to CS group. Results: The median age was 65 years (range, 32-94 years), and the median tumor size was 10.0 mm (range, 0-45 mm). The median follow-up time was 60.6 months. The WBH cohort had more node-positive disease (6.9% vs 2.6%, P<.01) and cautionary patients (49.5% vs 41.8%, P=.06). The 5-year actuarial ipsilateral breast tumor recurrence (IBTR), regional nodal failure (RNF), and distant metastasis (DM) for the whole cohort were 2.8%, 0.6%, 1.6%. The rate of IBTR was not statistically higher between suitable (2.5%), cautionary (3.3%), or unsuitable (4.6%) patients (P=.20). The nonsignificant increase in IBTR for the cautionary and unsuitable categories was due to increased elsewhere failures and new primaries (P=.04), not tumor bed recurrence (P=.93). Conclusions: Excellent outcomes after breast-conserving surgery and APBI were seen in our pooled analysis. The current ASTRO CS guidelines did not adequately differentiate patients at an increased risk of IBTR or tumor bed failure in this large patient cohort

  13. Strategic analysis

    OpenAIRE

    Popovová, Šárka

    2015-01-01

    The aim of this bachelor thesis is to define the basic methods used in the preparation of a business strategy and to apply those methods to a real situation. The theoretical part describes the methodology of external and internal analysis. The practical part then applies individual methods such as PEST, VRIO, Porter's five forces and the value chain in order to define the competitive advantages of the Dr. Popov company. The end of the bachelor thesis will be an assessment of the current situation and s...

  14. Vector analysis

    CERN Document Server

    Newell, Homer E

    2006-01-01

    When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e

  15. Poetic Analysis

    DEFF Research Database (Denmark)

    Nielsen, Kirsten

    2010-01-01

    The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonance and alliteration, parallelism and the wide use of imagery create coherence in the psalm but at the same time ambiguity. According to the heading, the psalm is a song of ascents, but it can be read both as a psalm for pilgrimage and as a psalm of trust. The metaphors can be understood both as metaphors...

  16. Vector analysis

    CERN Document Server

    Brand, Louis

    2006-01-01

    The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou

  17. Grammar Analysis

    Institute of Scientific and Technical Information of China (English)

    PEI Yuan-yuan

    2013-01-01

      With the development of global communication, English has become the international language of that communication, and speaking as a skill in English language learning has become more and more important. Spoken English has specific features, so data analysis is helpful for gaining an explicit understanding of those features; it is also useful for language teachers who want to develop their teaching to satisfy learners' needs. Grammatical and phonological features are two remarkable aspects of spoken language, so examining elements of these aspects seems helpful for teaching spoken English.

  18. Sequential analysis

    CERN Document Server

    Wald, Abraham

    2013-01-01

    In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio
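
The decision rule Wald devised, the sequential probability ratio test (SPRT), is simple enough to sketch. The following is an illustrative implementation for Bernoulli observations (not code from the book); the stopping boundaries use Wald's approximate thresholds log(beta/(1-alpha)) and log((1-beta)/alpha).

```python
import math

def sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for H0: p = p0
    versus H1: p = p1 on a stream of Bernoulli observations.
    Returns (decision, number of observations used)."""
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        # log-likelihood-ratio increment for one Bernoulli trial
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr <= lower:
            return "accept H0", n
        if llr >= upper:
            return "accept H1", n
    return "continue sampling", len(observations)
```

Because the boundaries are crossed as soon as the accumulated evidence suffices, a clearly one-sided stream terminates after only a handful of observations, which is the economy over fixed-sample tests that the abstract describes.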

  19. Understanding analysis

    CERN Document Server

    Abbott, Stephen

    2015-01-01

    This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...

  20. Permanent interstitial low-dose-rate brachytherapy for patients with low risk prostate cancer. An interim analysis of 312 cases

    Energy Technology Data Exchange (ETDEWEB)

    Badakhshi, Harun; Graf, Reinhold; Budach, Volker; Wust, Peter [University Hospital Berlin, Department for Radiation Oncology of Charite School of Medicine, Berlin (Germany)

    2015-04-01

    The biochemical relapse-free survival (bRFS) rate after treatment with permanent iodine-125 seed implantation (PSI) or combined seeds and external beam radiotherapy (COMB) for clinical stage T1-T2 localized prostate cancer is a clinically relevant endpoint. The goal of this work was to evaluate the influence of relevant patient- and treatment-related factors. The study population comprised 312 consecutive patients treated with permanent seed implantation. All patients were evaluable for analysis of overall survival (OS) and disease-specific survival (DSS), 230 for bRFS, of which 192 were in the PSI group and 38 in the COMB group. The prescribed minimum peripheral dose was 145 Gy for PSI; for COMB, a 110-Gy implant plus external beam radiotherapy of 45 Gy. The median follow-up time was 33 months (range 8-66 months). bRFS was defined as a serum prostate-specific antigen (PSA) level ≤ 0.2 ng/ml at last follow-up. Overall, the actuarial bRFS at 50 months was 88.4%. The 50-month bRFS rate for PSI and COMB was 90.9% and 77.2%, respectively. In the univariate analysis, age in the categories ≤ 63 and > 63 years (p < 0.00), PSA nadir (≤ 0.5 ng/ml and > 0.5 ng/ml) and PSA bounce (yes/no) were the significant predicting factors for bRFS. None of the other patient and treatment variables (treatment modality, stage, PSA, Gleason score, risk group, number of risk factors, D90 and various other dose parameters) were found to be a statistically significant predictor of 50-month bRFS. The biochemical failure rates were low in this study. As a proof of principle, our large monocentric analysis shows that low-dose-rate brachytherapy is an effective and safe procedure for patients with early stage prostate cancer. (orig.) [German] The biochemical relapse-free survival (bRFS) after brachytherapy with permanent iodine-125 seed implantation (PSI), or in combination with external radiotherapy (COMB), is for patients with early (T1/T2) prostate cancer a relevant

  1. 501 men irradiated for clinically localized prostate cancer (1987 - 1995): preliminary analysis of the experience at UCSF and affiliated facilities

    International Nuclear Information System (INIS)

    NHT on FPF. Results: As has been previously demonstrated by others, the P-PSA was the single most important prognostic factor. The actuarial incidence of freedom from FPF at 4 years was 90% for patients with a PSA 6 and a PSA 71.5 Gy compared to 71.5 Gy, the use of NHT was associated with an improvement in the FPF at 4 years (70% vs 50%, p 71.5 Gy. Conclusions: This preliminary analysis suggests that the use of NHT is associated with an improved FPF and some subsets of patients may benefit from being treated to > 71.5 Gy. Longer follow-up will be required to assess the impact of these factors on survival

  2. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
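
As a point of reference for the sampling methods compared above, a minimal Latin hypercube sampler on the unit hypercube can be written in a few lines. This is a generic illustration of the method, not the implementation evaluated in the report: each dimension is split into n equal-probability strata, one point is drawn per stratum, and the strata are independently permuted across dimensions.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample of n_samples points in [0, 1)^n_dims:
    every dimension has exactly one point in each of n_samples strata."""
    rng = random.Random(seed)
    points = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)                 # decouple the dimensions
        for i, s in enumerate(strata):
            # jittered point inside stratum [s/n, (s+1)/n)
            points[i][d] = (s + rng.random()) / n_samples
    return points
```

The stratification guarantees full marginal coverage with few runs, but as the abstract notes, extracting per-parameter uncertainty information from such a design requires additional steps (e.g., rank transformation and stepwise regression).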

  3. Matrix analysis

    CERN Document Server

    Bhatia, Rajendra

    1997-01-01

    A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...

  4. Pericardial Fluid Analysis

    Science.gov (United States)

    Formal name: Pericardial Fluid Analysis. Related tests: Pleural Fluid Analysis, Peritoneal Fluid Analysis, ...

  5. Peritoneal Fluid Analysis

    Science.gov (United States)

    Formal name: Peritoneal Fluid Analysis. Related tests: Pleural Fluid Analysis, Pericardial Fluid Analysis, ...

  6. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  7. System analysis and design

    International Nuclear Information System (INIS)

    This book deals with information technology and business processes, information system architecture, methods of system development, system development planning (problem analysis and feasibility analysis), cases of system development, comprehension of the analysis of user demands, analysis of user demands using traditional methods, analysis of user demands using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.

  8. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu

  9. [Allergen analysis].

    Science.gov (United States)

    Röder, Martin; Weber, Wolfgang

    2016-07-01

    The fundamental requirement when testing for and ensuring compliance with legally required labelling regulations is the reliable analysis of food allergens. This can be carried out by means of either DNA (deoxyribonucleic acid) or protein detection. Protein detection has the advantage of directly detecting the allergenic component and can currently be carried out using immunological (enzyme-linked immunosorbent assay [ELISA])/lateral flow devices [LFD]) or mass spectrometry-based techniques. DNA detection is indirect, but allows the presence of food allergens to be validated through the use of another marker. Each method has its pros and cons, which have to be considered on a case-by-case basis. ELISA is quantitative, quick and easy to carry out and has high sensitivity. LFD testing is ideal for industrial applications, as the tests can be carried out on-site. Both antibody-based tests may have problems with processed foods and false positive results. Mass-spectrometric techniques show a lot of promise, but are currently still time-consuming and complex to carry out. They also run into problems with processed foods and their degree of sensitivity is matrix and parameter dependent. For these reasons, this technique is only occasionally used. Polymerase chain reaction (PCR) provides the highest specificity and, depending on the target sequence, a very good to good level of sensitivity. Despite the high stability of DNA, PCR is still subject to the influence of processing and matrix related factors. Due to natural variation and production-related changes in the structures relevant in the process of detection, all methods exhibit a relatively high level of uncertainty of measurement. At present, there is no method which provides the absolute correct quantification. However, by means of laboratory-based analyses it is possible to calibrate for the allergen in question and thus be able to make reliable measurements using methods that are already available. PMID

  10. Forensic activation analysis

    International Nuclear Information System (INIS)

    Basic principles of neutron activation analysis are outlined. Examples of its use in police science include analysis for gunshot residues, toxic element determinations and multielement comparisons. Advantages of neutron activation analysis over other techniques are described. (R.L.)

  11. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  12. Effectiveness of brachytherapy in treating carcinoma of the vulva

    International Nuclear Information System (INIS)

    Purpose: Radical radiotherapeutic management of vulvar cancer often incorporates brachytherapy as a portion of the treatment regimen. However, few studies using this modality alone to manage vulvar cancer have been published. Methods and Materials: Thirty-four patients were treated with iridium-192 (192Ir) brachytherapy for vulvar cancer between 1975 and 1993 at Centre Alexis Vautrin. Twenty-one patients were treated at first presentation when surgery was contraindicated or declined. Of these patients, 12 had International Federation of Gynecology and Obstetrics Classification Stage III or IV disease, 8 were Stage II, 1 was Stage I, and 1 was Stage 0. Thirteen patients were treated for recurrent disease. Paris system rules for implantation and dose prescription were followed. The median reference dose was 60 Gy (range 53 to 88 Gy). At the time of analysis, 10 of 34 patients were alive. Median follow-up in these 10 patients was 31 months (range: 21 months to 107 months). Fourteen of the 24 deaths were from causes other than vulvar cancer. Results: Kaplan-Meier actuarial 5-year local control was 47% (95% confidence interval (CI) = 23 to 73%) and 5-year actuarial loco-regional control was 45% (95% CI = 21 to 70%). Kaplan-Meier actuarial 5-year disease-specific survival was 56% (95% CI = 33 to 76%) and actuarial 5-year survival was 29% (95% CI = 15 to 49%). Median time to death was 14 months. Subset analysis revealed a higher actuarial 5-year local control in patients treated at first presentation than those treated for recurrence (80 vs. 19%, log rank, p = 0.04). Similarly, actuarial 5-year loco-regional control was higher in patients treated at first presentation (80 vs. 16%, log rank, p < 0.01). The two groups did not differ significantly in disease-specific or overall survival. The actuarial 5-year disease specific survival of 56% is somewhat less than the expected 5-year disease-specific survival after surgery in a group having a similar proportion of early stage
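
The Kaplan-Meier actuarial rates quoted in records like this one come from a product-limit calculation. As an illustration, here is a minimal sketch of the estimator with Greenwood's variance formula for the confidence intervals; the toy data in the usage test are hypothetical, not the study's patient records.

```python
import math

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (time, S(t), lower 95% CI, upper 95% CI)
    at each distinct event time."""
    data = sorted(zip(times, events))
    s, green, out = 1.0, 0.0, []
    for t in sorted({tt for tt, e in data if e}):
        n = sum(1 for tt, _ in data if tt >= t)        # at risk just before t
        d = sum(1 for tt, e in data if tt == t and e)  # events at t
        s *= 1 - d / n
        if n > d:
            green += d / (n * (n - d))                 # Greenwood's formula
        se = s * math.sqrt(green)
        out.append((t, s, max(s - 1.96 * se, 0.0), min(s + 1.96 * se, 1.0)))
    return out
```

Censored follow-up (event = 0) reduces the number at risk without stepping the curve down, which is why actuarial rates can be reported even when many patients are still alive at analysis.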

  13. Shape analysis in medical image analysis

    CERN Document Server

    Tavares, João

    2014-01-01

    This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as, anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...

  14. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi

  15. Critical Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    杜梅香

    2006-01-01

    This paper is about critical discourse analysis: it illustrates an approach to analysing critical discourses, including discourses about the educational situation in China. It also includes condensed theoretical support for critical discourse analysis and an analysis of a sample of the discourse between an illiterate person and a literate one.

  16. EDA noise analysis

    International Nuclear Information System (INIS)

    Using OrCAD PSpice EDA software, circuit simulation and analyses such as transient analysis, noise analysis, and temperature analysis are carried out for a charge-sensitive preamplifier. By calculation and comparison, the results show how the circuit noise responds to temperature changes. (authors)

  17. Total skin electron irradiation for mycosis fungoides: failure analysis and prognostic factors

    International Nuclear Information System (INIS)

    From 1970 to 1980, 106 patients with mycosis fungoides received total skin electron irradiation to full tolerance. The majority received 30 Gy of 3 MeV electrons in 12 treatments over three weeks. Eighty-eight patients had received prior therapy. At five years, actuarial survival is 66.7% and disease-free survival 21.4%. The median time to relapse is 12 months; prolonged survival is seen only with complete response. In advanced stages, complete response is more likely with doses over 25 Gy (80 vs 50%). First recurrences were predominantly in sites of previous involvement. Death resulted mainly from extracutaneous dissemination or failure to induce remission. Complications of the therapy are discussed

  18. Total skin electron irradiation for mycosis fungoides: failure analysis and prognostic factors

    Energy Technology Data Exchange (ETDEWEB)

    Tadros, A.A.M.; Tepperman, B.S.; Hryniuk, W.M.; Peters, V.G.; Rosenthal, D.; Roberts, J.T.; Path, C.B.; Figueredo, A.T.

    1983-09-01

    From 1970 to 1980, 106 patients with mycosis fungoides received total skin electron irradiation to full tolerance. The majority received 30 Gy of 3 MeV electrons in 12 treatments over three weeks. Eighty-eight patients had received prior therapy. At five years, actuarial survival is 66.7% and disease-free survival 21.4%. The median time to relapse is 12 months; prolonged survival is seen only with complete response. In advanced stages, complete response is more likely with doses over 25 Gy (80 vs 50%). First recurrences were predominantly in sites of previous involvement. Death resulted mainly from extracutaneous dissemination or failure to induce remission. Complications of the therapy are discussed.

  19. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts and meaning of analytical chemistry and SI units, chemical equilibrium, basic preparations for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experiment examples, chelate titration, oxidation-reduction titration (with an introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.

  20. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  1. Prone breast radiotherapy in early-stage breast cancer: a preliminary analysis

    International Nuclear Information System (INIS)

    Purpose: Women with large breasts have marked dose inhomogeneity and often an inferior cosmetic outcome when treated with breast conservation compared to smaller-sized patients. We designed a prone breast board, which minimizes both breast separation and the irradiated lung and heart volume. We report feasibility, cosmesis, and preliminary local control and survival for selected women with Stage 0-II breast cancer. Materials and Methods: Fifty-six patients with clinical Stage 0-II breast cancer were treated with lumpectomy and breast irradiation utilizing a prototype prone breast board. A total of 59 breasts were treated. Indications for treatment in the prone position were large or pendulous breast size (n = 57), or a history of cardiopulmonary disease (n = 2). The median bra size was 41D (range, 34D-44EE). Cosmesis was evaluated on a 1-10 (worst-to-best) scale. Results: Acute toxicity included skin erythema (80% of patients experienced Grade I or Grade II erythema), breast edema (72% of patients experienced mild edema), pruritus (20% of patients), and fatigue (20% of patients reported mild fatigue). One patient required a treatment break. The only late toxicity was related to long-term cosmesis. The mean overall cosmesis score for 53 patients was 9.37 (range, 8-10). Actuarial 3- and 5-year local control rates are 98%. Actuarial overall survival at 3 and 5 years is 98% and 94%, respectively. Conclusion: Our data indicate that treating selected women with prone breast radiotherapy is feasible and well tolerated. The approach results in excellent cosmesis, and short-term outcome is comparable to traditional treatment techniques. This technique offers an innovative alternative to women who might not otherwise be considered candidates for breast conservation

  2. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
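
The abstract is truncated, but the underlying idea, comparing the observed differences between duplicate results with the variation predicted by their stated uncertainties, can be illustrated. One common form of such a statistic (an assumption here, not necessarily the exact T of the paper) sums the normalized squared differences over the duplicate pairs and refers the total to a chi-square distribution:

```python
def duplicate_T(pairs):
    """pairs: list of ((y1, s1), (y2, s2)) duplicate results with
    their estimated standard deviations. If the stated precision
    fully accounts for the observed variation, T is approximately
    chi-square distributed with len(pairs) degrees of freedom."""
    return sum((y1 - y2) ** 2 / (s1 ** 2 + s2 ** 2)
               for (y1, s1), (y2, s2) in pairs)
```

A T value far above the chi-square expectation (its degrees of freedom) signals that the analytical method is less precise than its stated uncertainties claim.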

  3. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  4. Cluster analysis for applications

    CERN Document Server

    Anderberg, Michael R

    1973-01-01

    Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o

  5. Circuit analysis for dummies

    CERN Document Server

    Santiago, John

    2013-01-01

    Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help

  6. Time complexity analysis of genetic- fuzzy system for disease diagnosis

    Directory of Open Access Journals (Sweden)

    Ephzibah.E.P

    2011-08-01

    Full Text Available A new generation of tools and techniques is needed for finding interesting patterns in data and discovering useful knowledge. Medical knowledge, especially, consists of a combination of structural information about known biological facts and probabilistic or actuarial information about exposures to hazards and recovery rates. Probabilistic information is especially difficult to use, as it requires constant maintenance and usually comes in the form of study results which are not ideally suited for making individual predictions. Patterns summarizing mutual associations between class decisions and attribute values in a pre-classified database provide insight into the significance of attributes and are also useful in classificatory knowledge. The proposed work is an efficient method to extract significant attributes from a database. Reducing the features or attributes enhances the quality of the knowledge extracted and also the speed of computation. In this paper, the design of a hybrid algorithm for heart disease diagnosis using an effective and efficient genetic algorithm and fuzzy logic is implemented. The proposed work analyses the time complexity of the genetic-fuzzy system.
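
The attribute-reduction step described in the abstract can be sketched as a small genetic algorithm over bit-mask feature subsets. This is a generic illustration under assumed details (truncation selection, one-point crossover, point mutation, and a made-up toy fitness), not the authors' algorithm:

```python
import random

def ga_feature_select(n_features, fitness, pop=20, gens=40, seed=0):
    """Tiny genetic algorithm over bit-mask feature subsets.
    fitness(mask) -> float to maximize; mask is a tuple of 0/1."""
    rng = random.Random(seed)
    population = [tuple(rng.randint(0, 1) for _ in range(n_features))
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop // 2]         # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_features)    # one-point crossover
            child = list(a[:cut] + b[cut:])
            i = rng.randrange(n_features)         # point mutation
            child[i] ^= 1
            children.append(tuple(child))
        population = survivors + children
    return max(population, key=fitness)

# Toy fitness (hypothetical): features 0-2 are "informative",
# extra selected features are penalized to reward small subsets.
def toy_fitness(mask):
    return sum(mask[:3]) - 0.2 * sum(mask[3:])
```

In the paper's setting the fitness would instead score a fuzzy classifier built on the selected attributes, and the per-generation time complexity is dominated by the pop fitness evaluations.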

  7. Endometrial carcinoma: stage I. A retrospective analysis of 262 patients.

    Science.gov (United States)

    De Palo, G; Kenda, R; Andreola, S; Luciani, L; Musumeci, R; Rilke, F

    1982-08-01

    From 1969 to 1977, 420 patients with endometrial carcinoma were observed and treated at the National Tumor Institute of Milan. Total abdominal hysterectomy and bilateral salpingo-oophorectomy were performed in 351. After careful clinical and pathologic review, 262 patients were classified as having stage I disease. Further treatment included post-operative radium therapy to the vaginal vault. There were 247 cases with adenocarcinoma, 10 with adenoacanthoma, and 5 with adenosquamous or clear cell carcinoma. Of 257 cases with adenocarcinoma or adenoacanthoma, 63 were grade 1, 161 grade 2, and 33 grade 3. Of the total series, only 41 cases had disease limited to the mucosal surface. The 5-year actuarial survival was 91.4% and the recurrence-free survival was 93.4%. The case material was evaluated according to the risk factors, and results were 1) premenopausal patients had a better prognosis (100% recurrence-free survival versus 92.8% for postmenopausal women, P = .003); 2) length of the uterine cavity was not a significant prognostic factor; 3) myometrial invasion alone was not prognostic but correlated with grade of tumor; 4) the grade of the tumor was an important determinant of recurrence (grade 1 98% recurrence-free survival, grade 2 95%, grade 3 79%). With the described therapy, vaginal recurrences were absent. The recurrences were distant in 20% and local with or without distant metastases in 80%.

  8. Event history analysis: overview

    DEFF Research Database (Denmark)

    Keiding, Niels

    2001-01-01

    Survival analysis, Multi-state models, Counting processes, Aalen-Johansen estimator, Markov processes

  9. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, one decides whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  10. Synovial fluid analysis

    Science.gov (United States)

    Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...

  11. Buildings Sector Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hostick, Donna J.; Nicholls, Andrew K.; McDonald, Sean C.; Hollomon, Jonathan B.

    2005-08-01

    A joint NREL, ORNL, and PNNL team conducted market analysis to help inform DOE/EERE's Weatherization and Intergovernmental Program planning and management decisions. This chapter presents the results of the market analysis for the Buildings sector.

  12. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  13. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  14. Biological sequence analysis

    DEFF Research Database (Denmark)

    Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose;

    This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogenetic analysis.

  15. Analysis of Business Environment

    OpenAIRE

    Horáková, Eva

    2012-01-01

This bachelor's thesis deals with an analysis of the entrepreneurial environment of the company ALBO okna - dveře, s.r.o. With the help of SWOT analysis, passportisation and Porter's analysis of five rival companies, an analysis of the entrepreneurial environment in which the company ALBO okna - dveře, s.r.o. is situated will be carried out. The thesis also comprises an evaluation of the opportunities and risks resulting from the given entrepreneurial environment.

  16. SOFAS: Software Analysis Services

    OpenAIRE

    Ghezzi, G

    2010-01-01

    We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...

  17. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  18. Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, William R.

    1968-01-01

    In activation analysis, a sample of an unknown material is first irradiated (activated) with nuclear particles. In practice these nuclear particles are almost always neutrons. The success of activation analysis depends upon nuclear reactions which are completely independent of an atom's chemical associations. The value of activation analysis as a research tool was recognized almost immediately upon the discovery of artificial radioactivity. This book discusses activation analysis experiments, applications and technical considerations.
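The activation step described above follows the standard buildup law A = Nσφ(1 − e^(−λt)): induced activity grows toward the saturation value Nσφ as the irradiation time increases. A minimal sketch of that arithmetic (the atom count, cross-section, flux, and half-life below are illustrative assumptions, not data from this book):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Activation buildup: A = N * sigma * phi * (1 - exp(-lambda * t))."""
    lam = math.log(2) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative (not element-specific) numbers: 1e20 target atoms,
# a 1-barn capture cross-section, a typical reactor flux of 1e13 n/cm^2/s,
# and irradiation for exactly one half-life (so activity is half of saturation).
A = induced_activity(1e20, 1e-24, 1e13, half_life_s=3600, t_irr_s=3600)
print(A)  # disintegrations per second; here 5e8, half of the 1e9 saturation value
```

Irradiating for one half-life reaches half of saturation; further irradiation gives diminishing returns, which is why practical activation times are a few half-lives at most.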

  19. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  20. Discourse analysis and Foucault's

    Directory of Open Access Journals (Sweden)

    Jansen I.

    2008-01-01

Discourse analysis is a method that until now has received little recognition in nursing science, although nursing scientists have recently begun to discover it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.

  1. Extending Scalasca's analysis features

    OpenAIRE

    Lorenz, Daniel; Böhme, David; Mohr, Bernd; Strube, Alexandre; Szebenyi, Zoltan

    2013-01-01

Scalasca is a performance analysis tool, which parses the trace of an application run for certain patterns that indicate performance inefficiencies. In this paper, we present recently developed new features in Scalasca. In particular, we describe two newly implemented analysis methods: the root cause analysis, which tries to identify the cause of a delay, and the critical path analysis, which analyses the path of execution that determines the application runtime. Furthermore, we present time-s...

  2. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...

  3. Data analysis for chemistry

    CERN Document Server

    Hibbert, DBrynn

    2005-01-01

Based on D Brynn Hibbert's lectures on data analysis to undergraduates and graduate students, this book covers topics including measurements, means and confidence intervals, hypothesis testing, analysis of variance, and calibration models. It is meant as an entry-level book targeted at learning and teaching undergraduate data analysis.

  4. Practical data analysis

    CERN Document Server

    Cuesta, Hector

    2013-01-01

Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.

  5. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
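The mediation effect discussed here is the product of two regression paths, a (X to M) and b (M to Y). A rough, self-contained illustration on simulated data (the normal-approximation posteriors under flat priors are an assumption of this sketch, not the authors' model, and the b-path regression omits X for brevity):

```python
import random
import statistics

random.seed(0)
n = 500
# Simulate a simple mediation chain X -> M -> Y with true paths a=0.5, b=0.7.
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.5 * x + random.gauss(0, 1) for x in X]
Y = [0.7 * m + random.gauss(0, 1) for m in M]

def ols_slope(x, y):
    """Least-squares slope and its standard error."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    sse = sum((yi - my - b * (xi - mx)) ** 2 for xi, yi in zip(x, y))
    return b, (sse / (len(x) - 2) / sxx) ** 0.5

a_hat, a_se = ols_slope(X, M)  # path X -> M
b_hat, b_se = ols_slope(M, Y)  # path M -> Y (a full model would adjust for X)

# Under flat priors the posteriors of a and b are approximately Normal(hat, se);
# the mediated effect a*b is then summarized by Monte Carlo draws.
draws = sorted(random.gauss(a_hat, a_se) * random.gauss(b_hat, b_se)
               for _ in range(5000))
print(statistics.fmean(draws), draws[125], draws[-126])  # mean and ~95% interval
```

The exact posterior interval for a product of coefficients is what the Bayesian treatment delivers directly, in contrast to the normal-theory approximations used in frequentist mediation tests.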

  6. Strategic Analysis Overview

    Science.gov (United States)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  7. Foundations of mathematical analysis

    CERN Document Server

    Johnsonbaugh, Richard

    2010-01-01

    This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss

  8. Mathematical analysis fundamentals

    CERN Document Server

    Bashirov, Agamirza

    2014-01-01

The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curriculum of all mathematics (pure or applied) and physics programs includes a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, harmonic analysis, etc. For non-math-major students requiring math beyond calculus, this is a more friendly approach than many math-centric o...

  9. Gabriel Data Analysis (GDA): from data analysis to food analysis

    OpenAIRE

    OLIVE, Gilles

    2011-01-01

    GDA is a software belonging to the Gabriel package and is devoted to data analysis. Year after year some new features have been introduced and the latest introductions are more dedicated to food. GDA is built around modules and we describe here the most widely used in food chemistry. GDA can be obtained free of charge upon request.

  10. Multivariate analysis with LISREL

    CERN Document Server

    Jöreskog, Karl G; Y Wallentin, Fan

    2016-01-01

    This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.

  11. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and, for example, serve as a means of supporting the interpretation of interview data...
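The quantitative side of http log analysis of the kind combined here with qualitative methods can be sketched in a few lines: parse each request line, then count who accessed what. The log lines, field layout, and user names below are hypothetical (real WIS log formats vary):

```python
import re
from collections import Counter

# Hypothetical entries in Common Log Format.
log_lines = [
    '10.0.0.1 - alice [12/Mar/2003:10:02:11 +0100] "GET /project/doc1 HTTP/1.1" 200 5120',
    '10.0.0.2 - bob   [12/Mar/2003:10:05:42 +0100] "POST /project/doc1 HTTP/1.1" 200 312',
    '10.0.0.1 - alice [12/Mar/2003:11:17:03 +0100] "GET /project/doc2 HTTP/1.1" 404 0',
]

# host, ident, user, [timestamp], "method path ..."
pattern = re.compile(r'\S+ \S+ (?P<user>\S+)\s+\[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)')

hits = Counter()
for line in log_lines:
    m = pattern.match(line)
    if m:
        hits[(m["user"], m["path"])] += 1

print(hits)  # who touched which shared document, and how often
```

Aggregates like these are exactly the kind of quantitative evidence that can then be triangulated with interviews and observation.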

  12. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  13. Markov processes for stochastic modeling

    CERN Document Server

    Ibe, Oliver

    2008-01-01

    Markov processes are used to model systems with limited memory. They are used in many areas including communications systems, transportation networks, image segmentation and analysis, biological systems and DNA sequence analysis, random atomic motion and diffusion in physics, social mobility, population studies, epidemiology, animal and insect migration, queueing systems, resource management, dams, financial engineering, actuarial science, and decision systems. This book, which is written for upper level undergraduate and graduate students, and researchers, presents a unified presentat
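As a small illustration of the kind of model the book treats, the long-run behaviour of a finite Markov chain is its stationary distribution, which can be approximated by power iteration. The two-state transition probabilities below are made up for the example:

```python
def stationary(P, iters=200):
    """Approximate the stationary distribution of a row-stochastic
    transition matrix P by repeatedly applying pi <- pi * P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# A toy two-state weather chain: state 0 = dry, state 1 = wet.
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(stationary(P))  # converges to [5/6, 1/6]
```

Solving pi = pi*P by hand confirms the result: 0.1*pi0 = 0.5*pi1 with pi0 + pi1 = 1 gives pi = (5/6, 1/6).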

  14. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  15. Computational Music Analysis

    DEFF Research Database (Denmark)

This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering … music analysis; the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  16. Monju plant dynamics analysis

    International Nuclear Information System (INIS)

The heat transport systems of MONJU comprise three main heat transport loops, each consisting of a primary loop, a secondary loop and a water-steam system, plus an auxiliary cooling system. These systems influence one another during plant transients, so it is important to evaluate the flow and heat-transfer characteristics of the heat transport systems when calculating plant transients. We developed the plant dynamics analysis codes of MONJU to calculate plant transients and to evaluate the plant characteristics under disturbances during on-power operation, as well as the performance of the plant control systems. In this paper, one of the main plant dynamics simulation codes of MONJU, the calculation conditions for the analysis, the plant safety analysis, the plant stability analysis and the plant thermal transient analysis are described. (author)

  17. 3RDWHALE STRATEGIC ANALYSIS

    OpenAIRE

    Johnson, Andrew

    2009-01-01

This essay provides a detailed strategic analysis of 3rdWhale, a Vancouver-based start-up in the sustainability sector, along with an analysis of the smartphone applications industry. Porter's five forces model is used to perform an industry analysis of the smartphone application industry and identify key success factors for application developers. Using the identified factors, 3rdWhale is compared to its indirect competitors to identify opportunities and threats and produce a range of strate...

  18. Marketing Mix Analysis

    OpenAIRE

    Procházková, Gabriela

    2014-01-01

The Bachelor thesis focuses on an analysis of the marketing mix as applied to the company Billa Ltd. The theoretical part focuses on a general description of the marketing mix and its individual elements. The practical part covers the history and present of the company Billa Ltd., a specific analysis of the individual elements of the company's marketing mix, and an analysis of the results of an electronic questionnaire that focuses on br...

  19. Stochastic Analysis 2010

    CERN Document Server

    Crisan, Dan

    2011-01-01

    "Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa

  20. Mastering Clojure data analysis

    CERN Document Server

    Rochester, Eric

    2014-01-01

    This book consists of a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently.This book is great for those who have experience with Clojure and who need to use it to perform data analysis. This book will also be hugely beneficial for readers with basic experience in data analysis and statistics.

  1. LULU analysis program

    International Nuclear Information System (INIS)

    Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday

  2. Grammar flow analysis

    OpenAIRE

    Möncke, Ulrich R.

    1986-01-01

This paper specifies the theoretical basis for the implementation of different generators of the OPTRAN system. Grammar flow analysis transports the techniques of data flow analysis to the meta level of compiler construction. The analogue of the states in data flow analysis are the syntax trees together with some information, which is associated with trees by propagation functions. One example is the association of characteristic graphs, another example the association of sets of matching tre...

  3. Deep Survival Analysis

    OpenAIRE

    Ranganath, Rajesh; Perotte, Adler; Elhadad, Noémie; Blei, David

    2016-01-01

    The electronic health record (EHR) provides an unprecedented opportunity to build actionable tools to support physicians at the point of care. In this paper, we investigate survival analysis in the context of EHR data. We introduce deep survival analysis, a hierarchical generative approach to survival analysis. It departs from previous approaches in two primary ways: (1) all observations, including covariates, are modeled jointly conditioned on a rich latent structure; and (2) the observation...

  4. Sparse discriminant analysis

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder; Hastie, Trevor; Witten, Daniela;

    2011-01-01

…commonplace in biological and medical applications. In this setting, a traditional approach involves performing feature selection before classification. We propose sparse discriminant analysis, a method for performing linear discriminant analysis with a sparseness criterion imposed such that classification and feature selection are performed simultaneously. Sparse discriminant analysis is based on the optimal scoring interpretation of linear discriminant analysis, and can be extended to perform sparse discrimination via mixtures of Gaussians if boundaries between classes are nonlinear or if subgroups are present within each class. Our proposal also provides low-dimensional views of the discriminative directions. © 2011 American Statistical Association and the American Society for Quality.

  5. Chemical Security Analysis Center

    Data.gov (United States)

    Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...

  6. Textile Technology Analysis Lab

    Data.gov (United States)

Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...

  7. Circuit analysis with Multisim

    CERN Document Server

    Baez-Lopez, David

    2011-01-01

This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo...

  8. Chemical Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...

  9. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  10. Finite element analysis

    CERN Document Server

    2010-01-01

    Finite element analysis is an engineering method for the numerical analysis of complex structures. This book provides a bird's eye view on this very broad matter through 27 original and innovative research studies exhibiting various investigation directions. Through its chapters the reader will have access to works related to Biomedical Engineering, Materials Engineering, Process Analysis and Civil Engineering. The text is addressed not only to researchers, but also to professional engineers, engineering lecturers and students seeking to gain a better understanding of where Finite Element Analysis stands today.

  11. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  12. Space Weather Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Space Weather Analysis archives are model output of ionospheric, thermospheric and magnetospheric particle populations, energies and electrodynamics

  13. International Market Analysis

    DEFF Research Database (Denmark)

    Sørensen, Olav Jull

    2009-01-01

The review presents the book International Market Analysis: Theories and Methods, written by John Kuada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem: it approaches market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and discusses how the way we understand business reality influences our choice of methodology for market analysis.

  14. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GRAMS, W.H.

    2000-12-28

The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process, and consists of two major, interrelated data sets: (1) Hazard Analysis Database: data from the results of the hazard evaluations, and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  15. Power system analysis

    CERN Document Server

    Murty, PSR

    2007-01-01

Power system analysis is a prerequisite course for electrical engineering students. This book introduces the concepts of a power system, the network model, faults and analysis, and the primitive network stability. It also deals with graph theory relevant to various incidence matrices, the building of network matrices, and power flow studies. It further discusses short circuit analysis, unbalanced fault analysis and power system stability problems, such as steady-state stability, transient stability and dynamic stability. Salient features: a number of worked examples follow the explanation of theory.

  16. HIRENASD analysis Information Package

    Data.gov (United States)

National Aeronautics and Space Administration — Updated November 2, 2011. Contains summary information and analysis condition details for the Aeroelastic Prediction Workshop. Information plotted in this package is...

  17. Thermogravimetric Analysis Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....

  18. Risk Analysis of Marine Structures

    DEFF Research Database (Denmark)

    Hansen, Peter Friis

    1998-01-01

Basic concepts of risk analysis are introduced. Formulation and analysis of fault and event trees are treated.
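For independent basic events, the fault-tree formulation mentioned here reduces to simple gate algebra: an AND gate multiplies the input probabilities, while an OR gate takes the complement of the product of complements. A toy sketch (the tree shape and event probabilities are invented for illustration):

```python
def and_gate(*probs):
    """All inputs must fail (independent events): multiply probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Any input failing triggers the gate: 1 minus the product of complements."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical tree: top event = (pump fails AND backup pump fails) OR control fault.
p_top = or_gate(and_gate(0.01, 0.1), 0.002)
print(p_top)  # ~0.003
```

Composing the two gate functions mirrors composing the tree itself, so larger trees are evaluated by nesting calls in the same way.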

  19. Application of SWOT analysis.

    Science.gov (United States)

    Casebeer, A

    SWOT analysis is an effective and simple planning technique which addresses one aspect of many strategic planning processes. Given the complex nature of modern health care systems, the ability to use this type of technique can enable health professionals to participate more fully in the analysis and implementation of health care improvement.

  20. Advanced Analysis Environments - Summary

    International Nuclear Information System (INIS)

    This is a summary of the panel discussion on Advanced Analysis Environments. Rene Brun, Tony Johnson, and Lassi Tuura shared their insights about the trends and challenges in analysis environments. This paper contains the initial questions, a summary of the speakers' presentation, and the questions asked by the audience

  1. Structural analysis for Diagnosis

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Blanke, M.

    2001-01-01

Aiming at the design of algorithms for fault diagnosis, structural analysis of systems offers a concise yet easy overall analysis. Graph-based matching, which is the essential technique for obtaining redundant information for diagnosis, is re-considered in this paper. Matching is re-formulated as a problem...

  2. Laboratory analysis of stardust.

    Science.gov (United States)

    Zinner, Ernst

    2013-02-01

    Tiny dust grains extracted from primitive meteorites are identified to have originated in the atmospheres of stars on the basis of their anomalous isotopic compositions. Although isotopic analysis with the ion microprobe plays a major role in the laboratory analysis of these stardust grains, many other microanalytical techniques are applied to extract the maximum amount of information.

  3. Economic Analysis of Law

    OpenAIRE

    Louis Kaplow; Steven Shavell

    1999-01-01

    This entry for the forthcoming The New Palgrave Dictionary of Economics (Second Edition) surveys the economic analysis of five primary fields of law: property law; liability for accidents; contract law; litigation; and public enforcement and criminal law. It also briefly considers some criticisms of the economic analysis of law.

  4. FOOD RISK ANALYSIS

    Science.gov (United States)

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  5. Proximate Analysis of Coal

    Science.gov (United States)

    Donahue, Craig J.; Rais, Elizabeth A.

    2009-01-01

    This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
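    The proximate-analysis arithmetic behind the exercise in this record can be sketched from TGA mass plateaus. All mass readings below are invented for illustration; only the moisture / volatile matter / fixed carbon / ash bookkeeping follows the standard method:

```python
# Proximate analysis from hypothetical TGA mass plateaus (values are
# illustrative, not from the article): mass after each stage, in mg.
m_initial = 10.00   # as-received sample
m_dry     = 9.20    # after heating to ~110 C in N2 (moisture driven off)
m_char    = 6.10    # after ~950 C in N2 (volatile matter driven off)
m_ash     = 0.90    # after switching to air (fixed carbon burned off)

moisture        = 100.0 * (m_initial - m_dry)  / m_initial
volatile_matter = 100.0 * (m_dry     - m_char) / m_initial
fixed_carbon    = 100.0 * (m_char    - m_ash)  / m_initial
ash             = 100.0 * m_ash / m_initial

print(f"moisture {moisture:.1f}%  VM {volatile_matter:.1f}%  "
      f"FC {fixed_carbon:.1f}%  ash {ash:.1f}%")
# The four fractions are all referred to the initial mass, so they sum to 100%.
assert abs(moisture + volatile_matter + fixed_carbon + ash - 100.0) < 1e-9
```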

  6. Static Analysis of IMC

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik

    2012-01-01

    -type synchronisation – in particular, on our variant of IMC with a more permissive syntax, i.e. with a possibility to start a bounded number of new processes. We prove that the defined Pathway Analysis captures all the properties of the systems, i.e. is precise. The results of the Pathway Analysis can be therefore...

  7. NOTATIONAL ANALYSIS OF SPORT

    Directory of Open Access Journals (Sweden)

    Ian M. Franks

    2004-06-01

    Full Text Available This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analyzes of data, effective coaching using notational performance analysis and modeling sport behaviors. It updates and improves the 1997 edition

  8. Northern blotting analysis

    DEFF Research Database (Denmark)

    Josefsen, Knud; Nielsen, Henrik

    2011-01-01

    Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membran...

  9. Educational Cost Analysis.

    Science.gov (United States)

    Flynn, Donald L.

    Traditional approaches to the cost analysis of educational programs involve examining annual budgets. Such approaches do not properly consider the cost of either new capital expenditures or the current value of previously purchased items. This paper presents the methodology for a new approach to educational cost analysis that identifies the actual…

  10. Analysis of Design Documentation

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp

    1998-01-01

    has been established where we seek to identify useful design work patterns by retrospective analyses of documentation created during design projects. This paper describes the analysis method, a tentatively defined metric to evaluate identified work patterns, and presents results from the first analysis accomplished....

  11. Electric field analysis

    CERN Document Server

    Chakravorti, Sivaji

    2015-01-01

    This book prepares newcomers to dive into the realm of electric field analysis. The book details why one should perform electric field analysis and what are its practical implications. It emphasizes both the fundamentals and modern computational methods of electric machines. The book covers practical applications of the numerical methods in high voltage equipment, including transmission lines, power transformers, cables, and gas insulated systems.

  12. Spool assembly support analysis

    International Nuclear Information System (INIS)

    This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC, AISC, and load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met

  13. Analysis of Industrial Wastewaters.

    Science.gov (United States)

    Mancy, K. H.; Weber, W. J., Jr.

    A comprehensive, documented discussion of certain operating principles useful as guidelines for the analysis of industrial wastewaters is presented. Intended primarily for the chemist, engineer, or other professional person concerned with all aspects of industrial wastewater analysis, it is not to be considered as a substitute for standard manuals…

  14. Analysis in usability evaluations

    DEFF Research Database (Denmark)

    Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper Anders Søren

    2010-01-01

    While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations...

  15. Dynamic Drawing Analysis

    OpenAIRE

    Liberdová, I.

    2015-01-01

    This article focuses on dynamic drawing analysis. It deals with temporal segmentation methods for hand-drawn pictures. The automatic vectorization of segmentation results is considered as well. Dynamic drawing analysis may significantly improve the utilization of tracing drawing tests in clinical physiology trials.

  16. Application of SWOT analysis.

    Science.gov (United States)

    Casebeer, A

    SWOT analysis is an effective and simple planning technique which addresses one aspect of many strategic planning processes. Given the complex nature of modern health care systems, the ability to use this type of technique can enable health professionals to participate more fully in the analysis and implementation of health care improvement. PMID:8472105

  17. Time series analysis.

    NARCIS (Netherlands)

    2013-01-01

    Time series analysis can be used to quantitatively monitor, describe, explain, and predict road safety developments. Time series analysis techniques offer the possibility of quantitatively modelling road safety developments in such a way that the dependencies between the observations of time series

  18. Zen and Behavior Analysis

    Science.gov (United States)

    Bass, Roger

    2010-01-01

    Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless--a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to…

  19. Northern blotting analysis

    DEFF Research Database (Denmark)

    Josefsen, Knud; Nielsen, Henrik

    2011-01-01

    Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membrane...... closing the gap to the more laborious nuclease protection experiments....

  20. Per Object statistical analysis

    DEFF Research Database (Denmark)

    Groom, Geoffrey Brian

    2008-01-01

    This RS code is to do Object-by-Object analysis of each Object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics, such as percentiles (so-called "quartiles") are derived by the process, but the return of that can only be a Scene Variable, not an Obje...

  1. Numerical Limit Analysis:

    DEFF Research Database (Denmark)

    Damkilde, Lars

    2007-01-01

    Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysi...

  2. Gait analysis: clinical facts.

    Science.gov (United States)

    Baker, Richard; Esquenazi, Alberto; Benedetti, Maria G; Desloovere, Kaat

    2016-08-01

    Gait analysis is a well-established tool for the quantitative assessment of gait disturbances providing functional diagnosis, assessment for treatment planning, and monitoring of disease progress. There is a large volume of literature on the research use of gait analysis, but evidence on its clinical routine use supports a favorable cost-benefit ratio in a limited number of conditions. Initially gait analysis was introduced to clinical practice to improve the management of children with cerebral palsy. However, there is good evidence to extend its use to patients with various upper motor neuron diseases, and to lower limb amputation. Thereby, the methodology for properly conducting and interpreting the exam is of paramount relevance. Appropriateness of gait analysis prescription and reliability of data obtained are required in the clinical environment. This paper provides an overview on guidelines for managing a clinical gait analysis service and on the principal clinical domains of its application: cerebral palsy, stroke, traumatic brain injury and lower limb amputation.

  3. IAC - INTEGRATED ANALYSIS CAPABILITY

    Science.gov (United States)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 
6) Special purpose modules are included, such as MIMIC (Model

  4. K Basin safety analysis

    International Nuclear Information System (INIS)

    The purpose of this accident safety analysis is to document in detail, analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall

  5. K Basin safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Porten, D.R.; Crowe, R.D.

    1994-12-16

    The purpose of this accident safety analysis is to document in detail, analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall.

  6. Robust Sparse Analysis Regularization

    CERN Document Server

    Vaiter, Samuel; Dossal, Charles; Fadili, Jalal

    2011-01-01

    This paper studies the properties of L1-analysis regularization for the resolution of linear inverse problems. Most previous works consider sparse synthesis priors where the sparsity is measured as the L1 norm of the coefficients that synthesize the signal in a given dictionary. In contrast, the more general analysis regularization minimizes the L1 norm of the correlations between the signal and the atoms in the dictionary. The corresponding variational problem includes several well-known regularizations such as the discrete total variation and the fused lasso. We first prove that a solution of analysis regularization is a piecewise affine function of the observations. Similarly, it is a piecewise affine function of the regularization parameter. This allows us to compute the degrees of freedom associated to sparse analysis estimators. Another contribution gives a sufficient condition to ensure that a signal is the unique solution of the analysis regularization when there is no noise in the observations. The s...
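    The analysis regularizer discussed in this record, the L1 norm of the correlations between the signal and the dictionary atoms, specializes to the discrete total variation when the analysis operator is the finite-difference operator. A minimal sketch (the signal values are arbitrary):

```python
# Discrete total variation as an analysis-L1 regularizer: the analysis
# coefficients are the finite differences D x, and the regularizer is their
# L1 norm. Signals below are arbitrary illustrations.

def total_variation(x):
    """L1 norm of the analysis coefficients D x (pairwise differences)."""
    return sum(abs(b - a) for a, b in zip(x, x[1:]))

piecewise_constant = [0, 0, 0, 2, 2, 2, 2, 1, 1]
oscillating        = [0, 2, 0, 2, 0, 2, 0, 2, 0]

print(total_variation(piecewise_constant))  # prints 3 (few jumps: sparse D x)
print(total_variation(oscillating))         # prints 16
assert total_variation(piecewise_constant) < total_variation(oscillating)
```

    Piecewise-constant signals have sparse analysis coefficients, which is exactly the structure the total-variation prior favors.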

  7. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  8. Functional data analysis

    CERN Document Server

    Ramsay, J O

    1997-01-01

    Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...

  9. The Pension Fund passes important milestones

    CERN Multimedia

    2012-01-01

    In this column, the Chairman of the Pension Fund Governing Board (PFGB) presents the Board's latest main decisions, initiatives and accomplishments to the Fund's members and beneficiaries.   Since my last report in October, the PFGB has passed several milestones in actuarial, technical and investment matters. The PFGB has completed an analysis of a request by the European Organisation for Astronomical Research in the Southern Hemisphere (ESO) to reduce the increased cost of pension insurance for new ESO recruits that has been caused by the increased CHF/€ exchange ratio. Currently the staff of ESO are admitted to the CERN Pension Fund, pursuant to a co-operation agreement between CERN and ESO dating back to 1968. This analysis assessed the actuarial, financial, administrative and legal implications, and is scheduled to be presented to the CERN Council and the Finance Committee in December. After an open tendering process the PFGB has selected Buck Consultants Limited...

  10. Statistical Analysis of Thermal Analysis Margin

    Science.gov (United States)

    Garrison, Matthew B.

    2011-01-01

    NASA Goddard Space Flight Center requires that each project demonstrate a minimum of 5 C margin between temperature predictions and hot and cold flight operational limits. The bounding temperature predictions include worst-case environment and thermal optical properties. The purpose of this work is to: assess how current missions are performing against their pre-launch bounding temperature predictions and suggest any possible changes to the thermal analysis margin rules

  11. Indicators of prognosis after liver transplantation in Chinese hepatocellular carcinoma patients

    Institute of Scientific and Technical Information of China (English)

    Jin Li; Lu-Nan Yan; Jian Yang; Zhe-Yu Chen; Bo Li; Yong Zeng; Tian-Fu Wen; Ji-Chun Zhao; Wen-Tao Wang; Jia-Yin Yang; Ming-Qing Xu; Yu-Kui Ma

    2009-01-01

    AIM: To identify prognostic factors of patients with hepatocellular carcinoma (HCC) who were treated by orthotopic liver transplantation (OLT). METHODS: From January 2000 to October 2006, 165 patients with HCC underwent OLT. Various clinicopathological risk factors for actuarial and recurrence-free survival were identified using the Kaplan-Meier method with the log-rank test. The Cox proportional hazards model was used to identify independently predictive factors for actuarial and recurrence-free survival, which were used to propose new selection criteria. We compared the outcome of the subgroup patients meeting different criteria. Survival analysis was performed using the Kaplan-Meier method with the log-rank test. RESULTS: The median follow-up was 13.0 mo (2.8-69.5 mo). Overall, 1-, 2-, 3- and 5-year actuarial survival was 73.3%, 45.6%, 35.4% and 32.1%, respectively. One-, 2-, 3- and 5-year overall recurrence-free survival was 67.0%, 44.3%, 34.5% and 34.5%, respectively. In univariate analysis, number of tumors, total tumor size, lobar distribution, differentiation, macrovascular invasion, microvascular invasion, capsulation of the tumor, and lymph node metastasis were found to be associated significantly with actuarial and tumor-free survival. Using the multivariate Cox proportional hazards model, total tumor size and macrovascular invasion were found to be independent predictors of actuarial and tumor-free survival. When the selection criteria were expanded into the proposed criteria, there was no significant difference in 1-, 2-, 3- and 5-year actuarial and tumor-free survival of the 49 patients who met the proposed criteria (97.6%, 82.8%, 82.8% and 82.8%, and 90.7%, 82.8%, 68.8% and 68.8%, respectively) compared with that of patients who met the Milan or University of California, San Francisco (UCSF) criteria. CONCLUSION: Macrovascular invasion and total tumor diameter are the strongest prognostic factors. The proposed criteria do not adversely affect the
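    The survival machinery named in this record, the Kaplan-Meier estimator with right-censoring, can be sketched by hand. The follow-up data below are invented for illustration, not the study's 165 patients:

```python
# Minimal Kaplan-Meier estimator on toy follow-up data (months, event flag).
times  = [3, 5, 5, 8, 12, 13, 18, 24]
events = [1, 1, 0, 1,  0,  1,  0,  1]   # 1 = death/recurrence, 0 = censored

def kaplan_meier(times, events):
    """Return (t, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)       # events at time t
        n_t = sum(1 for tt, _ in data if tt >= t)     # subjects at risk at t
        if d > 0:
            s *= 1 - d / n_t                          # product-limit step
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)      # skip all ties at t
    return curve

curve = kaplan_meier(times, events)
print(curve)
# The survival curve must be non-increasing.
assert all(s1 >= s2 for (_, s1), (_, s2) in zip(curve, curve[1:]))
```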

  12. Meaning from curriculum analysis

    Science.gov (United States)

    Finegold, Menahem; Mackeracher, Dorothy

    This paper reports on the analysis of science curricula carried out across Canada within the framework of the Second International Science Study (SISS). The organization of Canadian education in twelve autonomous educational jurisdictions is briefly described and problems are noted in relation to the analysis of curricula on a national scale. The international design for curriculum analysis is discussed and an alternative design, more suited to the diversity of science education in Canada, is introduced. The analysis of curriculum documents is described and three patterns which emerge from this analysis are identified. These derive from the concepts of commonality, specificity and prescriptiveness. Commonality relates to topics listed in curriculum guideline documents by a number of jurisdictions. Specificity refers to the richness of curriculum documents. Prescriptiveness is a measure of the extent to which jurisdictions do or do not make provision for local options in curriculum design. The Canadian analysis, using the concepts of the common curriculum, specificity and prescriptiveness, is described and research procedures are exemplified. Outcomes of curriculum analysis are presented in graphical form.

  13. From analysis to surface

    DEFF Research Database (Denmark)

    Bemman, Brian; Meredith, David

    In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...

  14. The Life Annuity Actuarial Present Value Models of Annuity Portfolio Insurance Based on Stochastic Interest Rate

    Institute of Scientific and Technical Information of China (English)

    李长林; 陈敏; 周勇

    2007-01-01

    A correlation analysis of stochastic interest rates is carried out first. Then, under the assumption that several mutually independent stochastic investment interest rates exist in the market, a generalized stochastic interest rate model is obtained by means of time series. Finally, based on portfolio theory, actuarial present value models for portfolios of several life annuities in enterprise annuity insurance are derived.
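    The record above derives actuarial present value models for life annuities under a time-series interest rate model. The expectation involved can be sketched by Monte Carlo; the AR(1) interest parameters and the toy survival law below are illustrative assumptions, not the paper's model:

```python
import random

# Monte Carlo sketch of the actuarial present value (APV) of an n-year
# temporary life annuity-due of 1 per year when the yearly interest rate
# follows an AR(1) time-series model. All parameters are illustrative.

def simulate_apv(age=40, n=10, n_paths=20000, seed=1):
    random.seed(seed)
    px = lambda x: 1 - 0.0005 * 1.09 ** (x - 30)    # toy one-year survival
    total = 0.0
    for _ in range(n_paths):
        i_t, pv, disc, surv = 0.04, 0.0, 1.0, 1.0
        for k in range(n):
            pv += disc * surv                       # expected payment if alive
            # AR(1): mean-reverting to 2% with autocorrelation 0.5
            i_t = 0.02 + 0.5 * (i_t - 0.02) + random.gauss(0.0, 0.005)
            disc /= 1 + max(i_t, 0.0)
            surv *= px(age + k)
        total += pv
    return total / n_paths

apv = simulate_apv()
print(f"estimated APV at age 40: {apv:.3f}")
assert 0 < apv < 10     # bounded by n = 10 payments of 1
```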

  15. Study on the Life Annuity Actuarial Present Value Models of Annuity Portfolio Insurance

    Institute of Scientific and Technical Information of China (English)

    李长林

    2007-01-01

    Enterprise life annuities are an important form of supplementary pension and have been studied by many scholars. First, building on existing results, a generalized interest rate model is obtained by means of time series. Then, for a market in which several mutually independent stochastic investment interest rates exist, actuarial present value models for portfolios of several life annuities in enterprise annuity insurance are derived from portfolio theory.

  16. Determination and Calculation of the Rate of Waiver of Premium for Disability in Life Insurance Actuarial Calculations

    Institute of Scientific and Technical Information of China (English)

    惠军

    2001-01-01

    This paper studies the difference and the connection between the disability rate of a policyholder (premium payer) and the probability that renewal premiums are waived because of disability, defines a precise concept of the disability waiver rate, and gives formulas for computing it. On this basis, the common discrete-time method of computing the present value of the net premium for periodic payments in life insurance actuarial work is discussed, and a practical example is given. The example shows the difference between the general disability rate and the disability waiver rate.
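    The distinction in this record, between the raw disability rate and the probability that a renewal premium is actually waived, affects the expected present value of collectible premiums. A toy discrete-time sketch; all rates and the flat-rate survival model are illustrative assumptions:

```python
# A premium due at time k is collected only while the insured is alive and
# not disabled; once disabled, renewal premiums are waived. Toy rates only.

v = 1 / 1.05                # annual discount factor
n = 5                       # premiums due at times 0..4
q_death = 0.01              # yearly death rate among active lives (toy)
q_dis = 0.02                # yearly disability incidence rate (toy)

pv_with_waiver = 0.0        # expected PV of premiums actually collected
p_active = 1.0              # alive, not disabled, still premium-paying
for k in range(n):
    pv_with_waiver += v ** k * p_active
    p_active *= (1 - q_death) * (1 - q_dis)

# Ignoring the waiver (counting every survivor as a payer) overstates the PV.
pv_no_waiver = sum(v ** k * (1 - q_death) ** k for k in range(n))
print(f"with waiver {pv_with_waiver:.4f}, ignoring waiver {pv_no_waiver:.4f}")
assert pv_with_waiver < pv_no_waiver
```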

  17. Gross Premium Actuarial Models Including Inflation and Surrender

    Institute of Scientific and Technical Information of China (English)

    贺明田; 王传玉; 安琪

    2011-01-01

    In order to enable life insurance companies to adjust premium prices promptly and reasonably according to the actual situation, this paper, building on an improved traditional gross premium actuarial method, presents a method of pricing the net premium under stochastic mortality and stochastic interest rates. Taking inflation and surrender into account, the expenses incurred by the life insurance company are, as far as possible, included in the gross premium calculation as independent random variables, and a formula for the gross premium is given. A numerical example demonstrates the feasibility of the method.
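    The equivalence-principle bookkeeping behind a gross premium with expense inflation and surrender can be sketched deterministically. Every rate and amount below is an illustrative assumption (the paper itself works with random variables for these quantities):

```python
# Toy gross premium: equate the PV of premiums to the PV of benefits plus
# inflated expenses, with lapses (surrender) thinning the in-force count.

v = 1 / 1.04                      # discount factor
n = 10                            # policy term, annual premiums
q = 0.005                         # flat yearly mortality (toy)
w = 0.03                          # yearly surrender (lapse) rate (toy)
infl = 0.025                      # expense inflation rate (toy)
expense0 = 40.0                   # first-year per-policy expense (toy)
benefit = 10000.0                 # death benefit, paid end of year of death

inforce = 1.0
pv_benefits = pv_expenses = pv_annuity = 0.0
for k in range(n):
    pv_annuity  += v ** k * inforce                       # premium of 1 due
    pv_expenses += v ** k * inforce * expense0 * (1 + infl) ** k
    pv_benefits += v ** (k + 1) * inforce * q * benefit
    inforce *= (1 - q) * (1 - w)                          # deaths and lapses

net_premium = pv_benefits / pv_annuity
gross_premium = (pv_benefits + pv_expenses) / pv_annuity  # equivalence principle
print(f"net premium {net_premium:.2f}  gross premium {gross_premium:.2f}")
assert gross_premium > net_premium > 0
```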

  18. An Improvement to the Additional Premium Actuarial Models of Life Insurance

    Institute of Scientific and Technical Information of China (English)

    梁来存; 皮友静

    2006-01-01

    Three additional-premium (loading) actuarial models are currently in common use by life insurance companies: the fixed-ratio model, the variable-ratio model and the three-element model, each with its own advantages and disadvantages. The new additional-premium actuarial model constructed here includes the fixed expense of each policy, distinguishes between the types of additional expenses, and takes into account the effect of inflation on future operating and management expenses, so that the additional premiums computed by life insurance companies can be fair, reasonable and appropriate.
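    Two of the loading models named in this record can be sketched on a toy net premium. All figures and loading rates below are illustrative assumptions:

```python
# Fixed-ratio vs three-element loading on the same toy policy.

net_premium = 120.0      # yearly net premium per policy (toy)
sum_insured = 10000.0

# Fixed-ratio model: the loading is one flat percentage of the net premium.
gross_fixed_ratio = net_premium * (1 + 0.25)

# Three-element model: separate loadings for acquisition (per sum insured),
# collection (per gross premium) and administration (flat, per policy).
acq_rate, coll_rate, admin_flat = 0.003, 0.05, 15.0
# G = N + acq*S + coll*G + admin  =>  G = (N + acq*S + admin) / (1 - coll)
gross_three_element = (net_premium + acq_rate * sum_insured + admin_flat) / (1 - coll_rate)

print(f"fixed-ratio gross:   {gross_fixed_ratio:.2f}")
print(f"three-element gross: {gross_three_element:.2f}")
assert gross_fixed_ratio > net_premium and gross_three_element > net_premium
```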

  19. Benefit Reserve Actuarial Model for a Portfolio of Semi-continuous Life Insurance Policies

    Institute of Scientific and Technical Information of China (English)

    东明; 郭亚军; 杨怀东

    2004-01-01

    For a homogeneous portfolio of semi-continuous life insurance policies, benefit reserve actuarial models are established under both a deterministic and a stochastic interest rate. Comparative analysis shows that increasing the number of policies reduces the mortality risk but does not reduce the interest rate risk. For the approximate average future loss, general expressions for its first two moments are given.
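    The record's comparative finding, that more policies reduce mortality risk but not interest rate risk, can be checked with exact variance formulas for a toy one-year portfolio. All parameters are illustrative assumptions:

```python
# Average loss per policy: independent Bernoulli deaths, one random discount
# factor V common to the whole portfolio. Loss_i = V * B_i with V independent
# of the B_i. For M = mean(B_i): Var(V*M) = E[V^2]*Var(M) + Var(V)*E[M]^2.

def loss_variance(n_policies, q=0.01, benefit=1.0, v_mean=0.95, v_sd=0.02):
    e_v2 = v_sd ** 2 + v_mean ** 2
    var_mortality = q * (1 - q) / n_policies   # diversifiable: shrinks with n
    var_interest = v_sd ** 2 * q ** 2          # systematic: independent of n
    return benefit ** 2 * (e_v2 * var_mortality + var_interest)

for n in (1, 100, 10000):
    print(n, loss_variance(n))
# The mortality term vanishes as n grows; the interest term remains.
assert loss_variance(1) > loss_variance(100) > loss_variance(10000)
```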

  20. The Group's Life Annuity Actuarial Model under a Random Interest Rate

    Institute of Scientific and Technical Information of China (English)

    张莉

    2011-01-01

    This paper is concerned with a homogeneous group of life insurance policies. Under a random interest rate modelled jointly by a Wiener process and a Poisson process, we establish the models, give two numerical examples, and obtain the single (lump-sum) net premiums and the risks of a term life annuity. We find that the single net premium of the term life annuity decreases gradually with increasing age and that, at the same age, it decreases as the force of interest increases.
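    The record's two monotonicity findings can be checked deterministically with a toy Gompertz-style survival law and a constant force of interest (both are illustrative simplifications; the paper works under a random interest rate):

```python
import math

def single_premium(age, n=10, delta=0.04):
    """Single net premium of an n-year temporary life annuity-due of 1/yr,
    under a toy survival law p_x = exp(-0.0001 * exp(0.09 x))."""
    v = math.exp(-delta)                       # discount per year
    px = lambda x: math.exp(-0.0001 * math.exp(0.09 * x))
    apv, surv = 0.0, 1.0
    for k in range(n):
        apv += v ** k * surv                   # payment of 1 if still alive
        surv *= px(age + k)
    return apv

# Premium decreases with age, and with the force of interest delta.
assert single_premium(40) > single_premium(55) > single_premium(70)
assert single_premium(40, delta=0.03) > single_premium(40, delta=0.06)
print(round(single_premium(40), 3))
```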

  1. Strategic analysis of the company

    OpenAIRE

    Matoušková, Irena

    2012-01-01

    Strategic analysis of the company. In my thesis I developed a strategic analysis of the company Pacovské strojírny a.s. In the theoretical part I describe the various methods of internal and external strategic analysis, and in the practical part I apply them. In the internal strategic analysis, I focused on the identification of internal resources and capabilities, the financial analysis and the value chain. The external strategic analysis includes a PEST analysis, Porter's fiv...

  2. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  3. NASA Enterprise Visual Analysis

    Science.gov (United States)

    Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck

    2007-01-01

    NASA Enterprise Visual Analysis (NEVA) is a computer program undergoing development as a successor to Launch Services Analysis Tool (LSAT), formerly known as Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g. pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can take account of such diverse considerations as those of weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.

  4. Static analysis for blinding

    DEFF Research Database (Denmark)

    Nielsen, Christoffer Rosenkilde; Nielson, Hanne Riis

    2006-01-01

    operation blinding. In this paper we study the theoretical foundations for one of the successful approaches to validating cryptographic protocols and we extend it to handle the blinding primitive. Our static analysis approach is based on Flow Logic; this gives us a clean separation between the specification of the analysis and its realisation in an automatic tool. We concentrate on the former in the present paper and provide the semantic foundation for our analysis of protocols using blinding, also in the presence of malicious attackers.

  5. Observations on risk analysis

    International Nuclear Information System (INIS)

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested

  6. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis because writing a program is analogous to teaching-it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  7. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  8. Finite Discrete Gabor Analysis

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel

    2007-01-01

    Gabor analysis is a method for analyzing signals through the use of a set of basic building blocks. The building blocks consist of a certain function (the window) that is shifted in time and frequency. The Gabor expansion of a signal contains information on the behavior of the signal in certain...... discrete time/frequency and Gabor analysis. It is intended to be both an educational and a computational tool. The toolbox was developed as part of this Ph.D. project to provide a solid foundation for the field of computational Gabor analysis....

  9. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  10. Android malware and analysis

    CERN Document Server

    Dunham, Ken

    2014-01-01

    The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals that understand how to best approach the subject of Android malware threats and analysis.In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static.This tact

  11. Flow Injection Analysis

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    2004-01-01

    This chapter provides an introduction to automated chemical analysis, which essentially can be divided into two groups: batch assays, where the solution is stationary while the container is moved through a number of stations where various unit operations are performed; and continuous-flow procedures......, where the system is stationary while the solution moves through a set of conduits in which all required manipulations are performed. Emphasis is placed on flow injection analysis (FIA) and its further developments, that is, sequential injection analysis (SIA) and the Lab-on-Valve (LOV) approach. Since...

  12. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up-to-date presentation of how to understand, define and

  13. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  14. Wood Products Analysis

    Science.gov (United States)

    1990-01-01

    Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.

  15. Stereotactic body radiotherapy for centrally located stage I NSCLC. A multicenter analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schanne, Daniel H.; Nestle, Ursula; Grosu, Anca L. [Universitaetsklinik Freiburg, Klinik fuer Strahlenheilkunde, Freiburg (Germany); Allgaeuer, Michael [Barmherzige Brueder, Klinik fuer Strahlentherapie, Regensburg (Germany); Andratschke, Nicolaus; Molls, Michael [TU Muenchen, Klinik und Poliklinik fuer Strahlentherapie und Radiologische Onkologie, Muenchen (Germany); Appold, Steffen [Universitaetsklinikum Dresden, Klinik und Poliklinik fuer Strahlentherapie und Radioonkologie, Dresden (Germany); Dieckmann, Ute [Allgemeines Krankenhaus Wien, Univ. Klinik fuer Strahlentherapie, Wien (Austria); Ernst, Iris [Universitaetsklinikum Muenster, Klinik fuer Strahlentherapie, Muenster (Germany); Ganswindt, Ute [LMU Muenchen, Klinik und Poliklinik fuer Strahlentherapie und Radioonkologie, Muenchen (Germany); Holy, Richard [Universitaetsklinikum Aachen, Klinik fuer Strahlentherapie, Aachen (Germany); Nevinny-Stickel, Meinhard [Medizinischen Universitaet Innsbruck, Univ. Klinik fuer Strahlentherapie und Radioonkologie, Innsbruck (Austria); Semrau, Sabine [Universitaetsklinikum Erlangen, Strahlenklinik Erlangen, Erlangen (Germany); Sterzing, Florian [Universitaetsklinikum Heidelberg, Klinik fuer Radioonkologie und Strahlentherapie, Heidelberg (Germany); Wittig, Andrea [Philipps-Universitaet Marburg, Klinik fuer Strahlentherapie und Radioonkologie, Marburg (Germany); Guckenberger, Matthias [Universitaet Wuerzburg, Klinik und Poliklinik fuer Strahlentherapie, Wuerzburg (Germany)

    2014-08-27

    The purpose of this work is to analyze patterns of care and outcome after stereotactic body radiotherapy (SBRT) for centrally located, early-stage, non-small cell lung cancer (NSCLC) and to address the question of potential risk for increased toxicity in this entity. A total of 90 patients with centrally located NSCLC were identified among 613 cases in a database of 13 German and Austrian academic radiotherapy centers. The outcome of centrally located NSCLC was compared to that of cases with peripheral tumor location from the same database. Patients with central tumors most commonly presented with UICC stage IB (50 %), while the majority of peripheral lesions were stage IA (56 %). Average tumor diameters were 3.3 cm (central) and 2.8 cm (peripheral). Staging PET/CT was available for 73 and 74 % of peripheral and central tumors, respectively. Biopsy was performed in 84 % (peripheral) and 88 % (central) of cases. Doses varied significantly between central and peripheral lesions with a median BED10 of 72 Gy and 84 Gy, respectively (p < 0.001). Fractionation differed as well with medians of 5 (central) and 3 (peripheral) fractions (p < 0.001). In the Kaplan-Meier analysis, 3-year actuarial overall survival was 29 % (central) and 51 % (peripheral; p = 0.004) and freedom from local progression was 52 % (central) and 84 % (peripheral; p < 0.001). Toxicity after treatment of central tumors was low with no grade III/IV and one grade V event. Mortality rates were 0 and 1 % after 30 and 60 days, respectively. Local tumor control in patients treated with SBRT for centrally located, early-stage NSCLC was favorable, provided ablative radiation doses were prescribed. This was, however, not the case in the majority of patients, possibly due to concerns about treatment-related toxicity. Reported toxicity was low, but prospective trials are needed to resolve the existing uncertainties and to establish safe high-dose regimens for this cohort of patients. (orig.)
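The BED10 values quoted in the record follow from the standard linear-quadratic model, BED = n * d * (1 + d / (alpha/beta)) with alpha/beta = 10 Gy. A minimal sketch; the per-fraction doses below are illustrative back-calculations that reproduce the reported median BED10 values, not doses stated in the record:

```python
def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float = 10.0) -> float:
    """Biologically effective dose under the linear-quadratic model."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# 5 fractions of 8 Gy reproduces the reported median central BED10 of 72 Gy.
print(bed(5, 8.0))    # 72.0
# 3 fractions of 12.5 Gy gives a BED10 close to the reported peripheral median of 84 Gy.
print(bed(3, 12.5))   # 84.375
```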

  16. The Political Economy of Public Pensions: Pension Funding, Governance, and Fiscal Stress

    OpenAIRE

    Olivia Mitchell; Ping-Lung Hsin

    1994-01-01

    The purpose of this paper is to describe and evaluate how public sector defined benefit pension plans are managed, and to assess possible implications of different pension management styles for promised pension benefits. The authors explore the actuarial and economic assumptions employed by public pension managers when they set funding targets, using a new survey of state and local pension plans in the United States. The analysis shows that key assumptions under the control of public pension ...

  17. A study of task uncertainty associated with public accounting firm services

    OpenAIRE

    Burkette, Gary D.

    1994-01-01

    Relative levels of task uncertainty associated with various CPA firm services were examined in this study. Additionally, tests to determine whether systematic variation occurs at the office or at the firm level were conducted. Multiple measures of task uncertainty were developed. Multiple analysis of variance techniques were used to analyze data drawn from audit, tax, actuarial and benefits consulting, and general business consulting engagements. Data was drawn from two office ...

  18. Pesticide Instrumental Analysis

    International Nuclear Information System (INIS)

    This workshop evaluated the impact of pesticides on a vegetable matrix, with the purpose of determining them by GC/MS analysis. The working materials were a lettuce matrix, chard and a mix of green leaves, together with the pesticides.

  19. Principles of Fourier analysis

    CERN Document Server

    Howell, Kenneth B

    2001-01-01

    Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...

  20. Russian River Analysis

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This document is an analysis and summary of progress toward achieving the interim management objectives for the Russian River during the 1979 season. Additionally,...

  1. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  2. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  3. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables...... is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator...... for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
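The idea of replacing linear correlation by mutual information can be illustrated on toy data: project each variable set onto a direction and keep the pair of directions whose one-dimensional projections share the most mutual information. The sketch below uses a crude histogram (plug-in) estimator and a grid search over angles rather than the fast kernel density estimator of the paper; all data and parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)  # shared signal present in the first column of each set
X = np.column_stack([z + 0.1 * rng.normal(size=n), rng.normal(size=n)])
Y = np.column_stack([z + 0.1 * rng.normal(size=n), rng.normal(size=n)])

def mutual_info(a, b, bins=16):
    """Plug-in histogram estimate of mutual information (in nats)."""
    p_ab, _, _ = np.histogram2d(a, b, bins=bins)
    p_ab = p_ab / p_ab.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    mask = p_ab > 0
    return float((p_ab[mask] * np.log(p_ab[mask] / (p_a @ p_b)[mask])).sum())

# Grid-search projection angles for each two-column set.
angles = np.linspace(0, np.pi, 36, endpoint=False)
best = max(
    ((ta, tb) for ta in angles for tb in angles),
    key=lambda t: mutual_info(X @ [np.cos(t[0]), np.sin(t[0])],
                              Y @ [np.cos(t[1]), np.sin(t[1])]),
)
# The winning directions load on the shared first columns (angles near 0 or pi).
```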

  4. Energy Sector Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.

    2006-10-01

    This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.

  5. Design and Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...

  6. Semen Analysis Test

    Science.gov (United States)

    ... may experience difficulties. Several factors can affect the sperm count or other semen analysis values, including use of alcohol, tobacco, ...

  7. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  8. Coal - proximate analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-06-14

    This Standard establishes a practice for the proximate analysis of coal, that is, the coal is analysed for the content of moisture, ash and volatile matter; fixed carbon is calculated. The standard provides a basis for the comparison of coals.
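Fixed carbon is obtained by difference: it is the mass fraction left over once moisture, ash and volatile matter are accounted for. A minimal sketch of that calculation; the sample values are illustrative, not taken from the Standard:

```python
def fixed_carbon(moisture: float, ash: float, volatile_matter: float) -> float:
    """Fixed carbon (wt%) by difference, as in proximate analysis of coal."""
    return 100.0 - (moisture + ash + volatile_matter)

# Illustrative coal sample: 5 % moisture, 10 % ash, 30 % volatile matter.
print(fixed_carbon(5.0, 10.0, 30.0))  # 55.0
```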

  9. Particle Size Analysis.

    Science.gov (United States)

    Barth, Howard G.; Sun, Shao-Tang

    1989-01-01

    Presents a review of research focusing on scattering, elution techniques, electrozone sensing, filtration, centrifugation, comparison of techniques, data analysis, and particle size standards. The review covers the period 1986-1988. (MVL)

  10. Amino acid analysis.

    Science.gov (United States)

    Crabb, J W; West, K A; Dodson, W S; Hulmes, J D

    2001-05-01

    Amino acid analysis (AAA) is one of the best methods to quantify peptides and proteins. Two general approaches to quantitative AAA exist, namely, classical postcolumn derivatization following ion-exchange chromatography and precolumn derivatization followed by reversed-phase HPLC (RP-HPLC). Excellent instrumentation and several specific methodologies are available for both approaches, and both have advantages and disadvantages. This unit focuses on picomole-level AAA of peptides and proteins using the most popular precolumn-derivatization method, namely, phenylthiocarbamyl amino acid analysis (PTC-AAA). It is directed primarily toward those interested in establishing the technology with a modest budget. PTC derivatization and analysis conditions are described, and support and alternate protocols describe additional techniques necessary or useful for most any AAA method--e.g., sample preparation, hydrolysis, instrument calibration, data interpretation, and analysis of difficult or unusual residues such as cysteine, tryptophan, phosphoamino acids, and hydroxyproline. PMID:18429107

  11. Longitudinal categorical data analysis

    CERN Document Server

    Sutradhar, Brajendra C

    2014-01-01

    This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...

  12. Pathway analysis of IMC

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik

    2009-01-01

    We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present a particular stochastic calculus that we have chosen for our modeling - the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself together with several theoretical results that we have proved for it.

  13. Gabor Analysis for Imaging

    DEFF Research Database (Denmark)

    Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan

    2015-01-01

    In contrast to classical Fourier analysis, time–frequency analysis is concerned with localized Fourier transforms. Gabor analysis is an important branch of time–frequency analysis. Although significantly different, it shares with the wavelet transform methods the ability to describe the smoothness...... of a given function in a location-dependent way. The main tool is the sliding window Fourier transform or short-time Fourier transform (STFT) in the context of audio signals. It describes the correlation of a signal with the time–frequency shifted copies of a fixed function (or window or atom). Thus......, it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase space domain introduces an enormous amount of data...
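The sliding-window Fourier transform described above can be sketched in a few lines: correlate the signal with time-shifted copies of a fixed window by taking one FFT per windowed frame. The window length, hop size and test signal below are illustrative choices:

```python
import numpy as np

def stft(x, win_len=256, hop=128):
    """Short-time Fourier transform with a Hann window: one FFT per windowed frame."""
    window = np.hanning(win_len)
    frames = [x[i:i + win_len] * window
              for i in range(0, len(x) - win_len + 1, hop)]
    return np.array([np.fft.rfft(f) for f in frames])  # (n_frames, win_len//2 + 1)

# A 1 kHz sine sampled at 8 kHz: energy concentrates in the matching frequency bin.
fs = 8000
t = np.arange(fs) / fs
S = stft(np.sin(2 * np.pi * 1000 * t))
peak_bin = np.abs(S[0]).argmax()
print(peak_bin * fs / 256)  # 1000.0 (bin resolution is fs / win_len = 31.25 Hz)
```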

  14. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  15. Analysis and logic

    CERN Document Server

    Henson, C Ward; Kechris, Alexander S; Odell, Edward; Finet, Catherine; Michaux, Christian; Cassels, J W S

    2003-01-01

    This volume comprises articles from four outstanding researchers who work at the cusp of analysis and logic. The emphasis is on active research topics; many results are presented that have not been published before and open problems are formulated.

  16. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate...... commands, our application is select-and-click-driven. It allows to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model-fits, the user may...... choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  17. 13. seminar 'Activation analysis'

    International Nuclear Information System (INIS)

    Collection of the abstracts of contributions to the seminar covering broad ranges of application of activation analysis and improvements of systems and process steps. Most of them have been prepared separately for the energy data bases. (RB)

  18. Unsupervised Linear Discriminant Analysis

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An algorithm for unsupervised linear discriminant analysis is presented. Optimal unsupervised discriminant vectors are obtained by maximizing the covariance of all samples while minimizing the covariance of local k-nearest-neighbor samples. The experimental results show that our algorithm is effective.
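A rough reading of the stated objective, maximize the covariance of all samples while minimizing the covariance among each point's k nearest neighbors, leads to a generalized eigenproblem. The sketch below is one plausible interpretation of that idea on synthetic data, not the authors' algorithm; every parameter is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two thin, well-separated clouds: samples spread along x, clusters split in y.
A = rng.normal(size=(100, 2)) * [2.0, 0.02] + [0.0, 5.0]
B = rng.normal(size=(100, 2)) * [2.0, 0.02] + [0.0, -5.0]
X = np.vstack([A, B])
Xc = X - X.mean(axis=0)

St = Xc.T @ Xc / len(X)              # scatter (covariance) of all samples

# Local scatter: covariance of offsets to each point's k nearest neighbors.
k = 5
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
Sl = np.zeros((2, 2))
for i in range(len(X)):
    nn = np.argsort(D[i])[1:k + 1]   # skip the point itself
    diffs = X[nn] - X[i]
    Sl += diffs.T @ diffs
Sl /= len(X) * k

# Directions that maximize total scatter relative to local scatter.
evals, evecs = np.linalg.eig(np.linalg.inv(Sl) @ St)
w = np.real(evecs[:, np.argmax(np.real(evals))])
# w points (up to sign) along y, the axis that separates the two clusters.
```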

  19. Multidimensional nonlinear descriptive analysis

    CERN Document Server

    Nishisato, Shizuhiko

    2006-01-01

    Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...

  20. Perspectives in shape analysis

    CERN Document Server

    Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie

    2016-01-01

    This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...

  1. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions, and by stimulating discussions, feedback, and coordination of key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented.
Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  2. Analysis of marketing mix

    OpenAIRE

    Hartmanová, Dominika

    2013-01-01

    The Bachelor Thesis Analysis of the marketing mix describes the marketing mix of the company Lego Tradings, s. r. o. The theoretical part includes specification of basic concepts, such as marketing, the marketing mix, tools of the marketing mix, product, price, place and promotion. The second part is devoted to custom solutions. The introduction of the Lego company comes first. There is also an analysis of the tools of the marketing mix. In this part the results are described for a marketing research, namely a quest...

  3. Theoretical numerical analysis

    CERN Document Server

    Wendroff, Burton

    2014-01-01

    Theoretical Numerical Analysis focuses on the presentation of numerical analysis as a legitimate branch of mathematics. The publication first elaborates on interpolation and quadrature and approximation. Discussions focus on the degree of approximation by polynomials, Chebyshev approximation, orthogonal polynomials and Gaussian quadrature, approximation by interpolation, nonanalytic interpolation and associated quadrature, and Hermite interpolation. The text then ponders on ordinary differential equations and solutions of equations. Topics include iterative methods for nonlinear systems, matri

  4. Analysis of educational blogs.

    OpenAIRE

    Christenová, Jindřiška

    2014-01-01

    Abstract The bachelor's thesis Analysis of educational blogs is focused on a thematic content analysis of educational blogs written in Czech by teachers from the Czech Republic. A further aim of the thesis was to identify and describe the structure of themes and their attributes. The research used the method of thematic content analysis with elements of the inductive analysis method. The main research sample was fifteen posts from ten educational blogs written by teachers and educationali...

  5. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the

  6. Economic Analysis of Constitutions

    OpenAIRE

    Roger B. Myerson

    2000-01-01

    This paper is a preliminary draft of an article to appear in Chicago Law Review (2000), as part of a symposium reviewing two new books on economic analysis of constitutions: Dennis Mueller's Constitutional Democracy and Robert Cooter's Strategic Constitution. Some of the basic questions of constitutional analysis are introduced, and the importance of work in this area is shown as one of the major new developments in social theory. The methods of economic theory are then shown to be particular...

  7. MEAD retrospective analysis report

    OpenAIRE

    Hasager, Charlotte Bay; CARSTENSEN J.; Frohn, L. M.; Gustafson, B.; Brandt, J.; Conley, D.; Geernaert, G.; Henriksen, P.; C. A. Skjøth; Johnsson, M.

    2003-01-01

    The retrospective analysis investigates links between atmospheric nitrogen deposition and algal bloom development in the Kattegat Sea from April to September 1989-1999. The analysis is based on atmospheric deposition model results from the ACDEP model, hydrodynamic deep-water flux results, phytoplankton abundance observations from Danish and Swedish marine monitoring stations and optical satellite data. Approximately 70 % of the atmospheric deposition consists of wet deposition of highly episod...

  8. Bayesian Group Factor Analysis

    OpenAIRE

    Virtanen, Seppo; Klami, Arto; Khan, Suleiman A; Kaski, Samuel

    2011-01-01

    We introduce a factor analysis model that summarizes the dependencies between observed variable groups, instead of dependencies between individual variables as standard factor analysis does. A group may correspond to one view of the same set of objects, one of many data sets tied by co-occurrence, or a set of alternative variables collected from statistics tables to measure one property of interest. We show that by assuming group-wise sparse factors, active in a subset of the sets, the variat...

  9. Group Factor Analysis

    OpenAIRE

    Klami, Arto; Virtanen, Seppo; Leppäaho, Eemeli; Kaski, Samuel

    2014-01-01

    Factor analysis provides linear factors that describe relationships between individual variables of a data set. We extend this classical formulation into linear factors that describe relationships between groups of variables, where each group represents either a set of related variables or a data set. The model also naturally extends canonical correlation analysis to more than two sets, in a way that is more flexible than previous extensions. Our solution is formulated as variational inferenc...

  10. Hedging in Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    XIAO Xin

    2015-01-01

    In this article an attempt to enhance the awareness of hedging use in discourse analysis and academic writing is made by analyzing hedges employed in two comparable texts. The discourse analysis is conducted in terms of “content-oriented” hedges and “reader-oriented” hedges. The article suggests that hedging can dampen utterances and statements, weaken the force of what one says and show politeness to the listeners or readers, and that its use varies across the discourse styles of various genres.

  11. Sleep EEG analysis

    OpenAIRE

    Vávrová, Eva

    2014-01-01

    This thesis deals with the analysis of EEG during various sleep stages, performed by calculating selected parameters from the time and frequency domains. These parameters are calculated from individual segments of the EEG signal that correspond to the various sleep stages. Based on the analysis, the thesis determines which EEG parameters are appropriate for automatic detection of the stages and which method is more suitable for evaluating the data in a hypnogram. The programme MATLAB was used for the ...
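
    The abstract names "parameters from the time and frequency domain" without listing them; a common frequency-domain choice is spectral band power. The sketch below is my illustration, not the thesis code, and the sampling rate, segment length, and band edges are assumptions; it computes the relative delta-band power of one synthetic EEG segment with NumPy:

```python
import numpy as np

def band_power(segment, fs, f_lo, f_hi):
    """Power of `segment` within [f_lo, f_hi) Hz, from the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(segment)) ** 2
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

fs = 256                                  # hypothetical sampling rate (Hz)
t = np.arange(fs * 4) / fs                # one 4-second segment
segment = np.sin(2 * np.pi * 2.0 * t)     # synthetic 2 Hz wave: pure delta band
delta = band_power(segment, fs, 0.5, 4.0)
total = band_power(segment, fs, 0.5, 30.0)
print(delta / total)                      # ~1.0 for a pure delta-band signal
```

    A real pipeline would apply this per segment and compare band ratios across sleep stages; the band edges (delta 0.5-4 Hz, etc.) are conventional but not taken from the thesis.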

  12. DART system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Boggs, Paul T.; Althsuler, Alan (Exagrid Engineering); Larzelere, Alex R. (Exagrid Engineering); Walsh, Edward J.; Clay, Ruuobert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
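
    The trade-off among once-through time, backward-iteration probability, and rework fraction can be made concrete with a minimal geometric-iteration model. This is my construction for illustration only; the report's actual community model is more detailed. Under this assumption, a step with once-through time t, backward-iteration probability p, and rework fraction r has expected time t(1 + r·p/(1−p)), since the expected number of backward iterations is p/(1−p):

```python
def expected_step_time(t_once, p_iter, rework_frac):
    """Expected time for a step that iterates backward with probability p_iter,
    each repeat costing rework_frac * t_once (geometric iteration model)."""
    repeats = p_iter / (1.0 - p_iter)   # expected number of backward iterations
    return t_once * (1.0 + rework_frac * repeats)

# Hypothetical numbers: 10 h once-through, 30% iteration chance, 50% rework
base = expected_step_time(10.0, 0.3, 0.5)
faster = expected_step_time(8.0, 0.3, 0.5)    # cut once-through time by 20%
fewer = expected_step_time(10.0, 0.24, 0.5)   # cut iteration probability by 20%
print(base, faster, fewer)
```

    Comparing `faster` and `fewer` against `base` shows, under these assumed numbers, how reducing once-through time and reducing iteration probability both shorten the expected step time, echoing the "equal opportunity" finding in the abstract.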

  13. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  14. Submarine hydraulic control analysis

    OpenAIRE

    Bower, Michael J.

    1980-01-01

    Approved for public release; distribution unlimited A mathematical model was developed to include line effects in the submarine hydraulic system dynamic performance analysis. The project was undertaken in an effort to demonstrate the necessity of coupling the entire hydraulic power network for an accurate analysis of any of the subsystems rather than the current practice of treating a component loop as an isolated system. It was intended that the line model could be co...

  15. Fractal Symbolic Analysis

    OpenAIRE

    Mateev, Nikolay; Menon, Vijay; Pingali, Keshav

    2000-01-01

    Restructuring compilers use dependence analysis to prove that the meaning of a program is not changed by a transformation. A well-known limitation of dependence analysis is that it examines only the memory locations read and written by a statement, and does not assume any particular interpretation for the operations in that statement. Exploiting the semantics of these operations enables a wider set of transformations to be used, and is critical for optimizing important codes such as LU factor...

  16. Cuckoo malware analysis

    CERN Document Server

    Oktavianto, Digit

    2013-01-01

    This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format.Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.

  17. Zen and Behavior Analysis

    OpenAIRE

    Bass, Roger

    2010-01-01

    Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless—a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to Enlightenment and Samādhi. The concept of stimulus singularity is introduced to account for why, within Zen's frame of reference, its methods can be stu...

  18. Temperature reconstruction analysis

    CERN Document Server

    Scafetta, N; Grigolini, P; Roberts, J; Scafetta, Nicola; Imholt, Tim; Grigolini, Paolo; Roberts, Jim

    2002-01-01

    This paper presents a wavelet multiresolution analysis of a time series dataset to study the correlation between the real temperature data and three temperature model reconstructions at different scales. We show that the Mann et al. model reconstructs the temperature better at all temporal resolutions. We show and discuss the wavelet multiresolution analysis of Mann's temperature reconstruction for the period from 1400 to 2000 A.D.
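
    As an illustration of the method (not the authors' code; the Haar wavelet is an assumption, chosen because it needs no external library), a minimal multiresolution decomposition splits a series into a coarse approximation plus detail coefficients at successively coarser scales:

```python
import numpy as np

def haar_mra(x, levels):
    """Haar multiresolution analysis: return the coarse approximation and the
    detail coefficients at each of `levels` successively coarser scales.
    Requires len(x) to be divisible by 2**levels."""
    approx = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0))
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    return approx, details

series = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
approx, details = haar_mra(series, 3)
# The transform is orthonormal, so total energy is preserved across scales
energy = approx @ approx + sum(d @ d for d in details)
print(energy, series @ series)
```

    Scale-by-scale correlation between two series (e.g. observed temperatures and a reconstruction) can then be computed on the detail coefficients of each level separately, which is the kind of comparison the abstract describes.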

  19. CMS analysis operations

    International Nuclear Information System (INIS)

    During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.

  20. PEST Analysis of Serbia

    OpenAIRE

    Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic

    2012-01-01

    The main purpose of this paper is to examine the impact of the current Serbian macro-environment on the businesses through the implementation of PEST analysis as a framework for assessing general or macro environment in which companies are operating. The authors argue the elements in presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses with few opportunities and strengths. Consequently, there is a strong need for faste...

  1. Provenance as Dependency Analysis

    OpenAIRE

    Cheney, James; Ahmed, Amal; Acar, Umut,

    2007-01-01

    Provenance is information recording the source, derivation, or history of some information. Provenance tracking has been studied in a variety of settings; however, although many design points have been explored, the mathematical or semantic foundations of data provenance have received comparatively little attention. In this paper, we argue that dependency analysis techniques familiar from program analysis and program slicing provide a formal foundation for forms of provenance that are intende...

  2. Activation analysis in forensic studies

    International Nuclear Information System (INIS)

    Application of neutron activation analysis in forensics are grouped into 3 categories: firearms-discharge applications, elemental analysis of other nonbiological evidence materials (paint, other), and elemental analysis of biological evidence materials (multielemental analysis of hair, analysis of hair for As and Hg). 18 refs

  3. Software reliability analysis in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Probabilistic Risk Analysis (PRA) is a tool which can reveal shortcomings of the NPP design in general. PRA analysts have not had sufficient guiding principles in modelling particular digital components malfunctions. Digital I and C systems are mostly analysed simply and the software reliability estimates are engineering judgments often lacking a proper justification. The OECD/NEA Working Group RISK's task DIGREL develops a taxonomy of failure modes of digital I and C systems. The EU FP7 project HARMONICS develops software reliability estimation method based on an analytic approach and Bayesian belief network. (author)

  4. Activation analysis in Greece

    International Nuclear Information System (INIS)

    A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center ''Demokritos''. Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors have been worked out. A rather extended charged-particle activation analysis program is carried out for the last 10 years, including particle induced X-ray emission (PIXE) analysis, particle induced prompt gamma-ray emission analysis (PIGE), other nuclear reactions and proton activation analysis. A special neutron activation method, the delayed fission neutron counting method is used for the analysis of fissionable elements, as U, Th, Pu, in samples of the whole nuclear fuel cycle including geological, enriched and nuclear safeguards samples

  5. Distributed analysis in ATLAS

    Science.gov (United States)

    Dewhurst, A.; Legger, F.

    2015-12-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.

  6. Distributed analysis in ATLAS

    CERN Document Server

    Legger, Federica; The ATLAS collaboration

    2015-01-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...

  7. A PROOF Analysis Framework

    CERN Document Server

    Gonzalez Caballero, Isidro

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF cluster on computing facilities with spare CPUs. However using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analysis by uniformly exposing the PROOF related configurations across technologies and by taking care of ...

  8. Failure Analysis for Improved Reliability

    Science.gov (United States)

    Sood, Bhanu

    2016-01-01

    Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Non destructive analysis techniques, 2. Destructive Analysis, 3. Materials Characterization). Section 4 - Summary and Closure

  9. Neutron multiplicity analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Scott L [Los Alamos National Laboratory

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This
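
    EXCOM's exact formulas are not given in the abstract. As a hedged illustration of the first processing steps it mentions, the textbook non-paralyzable dead-time model (an assumption on my part, not necessarily the model EXCOM uses) corrects the observed count rate before background subtraction:

```python
def deadtime_correct(observed_rate, tau):
    """Non-paralyzable dead-time model: recover the true rate n from the
    observed rate m via n = m / (1 - m * tau), tau = dead time per event (s)."""
    return observed_rate / (1.0 - observed_rate * tau)

def background_subtract(corrected_rate, background_rate):
    """Remove an independently measured background count rate."""
    return corrected_rate - background_rate

m = 5.0e4      # observed counts per second (hypothetical)
tau = 1.0e-6   # 1 microsecond dead time (hypothetical)
n = deadtime_correct(m, tau)
net = background_subtract(n, 2.0e2)
print(round(n, 1), round(net, 1))
```

    At these assumed rates the dead-time correction already amounts to about 5%, which is why it is applied before any calibration-curve or multiplicity analysis.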

  10. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components, the internal environment that is made from specific variables within the organization and the external environment that is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  11. Harmonic and geometric analysis

    CERN Document Server

    Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao

    2015-01-01

    This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights.  The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...

  12. Forensic Activation Analysis

    International Nuclear Information System (INIS)

    The high sensitivity of high-flux (reactor) thermal-neutron activation analysis (NAA) for the detection and quantitative measurement of a large number of elements has led, in recent years, to a considerable degree of application of the method in the area of scientific crime investigation (criminalistics). Thus, in a Forensic Activation Analysis Bibliography recently compiled by the author, some 135 publications in this field are listed - and more are appearing quite rapidly. The nondestructive character of the purely-instrumental form of the method is an added advantage in forensic work, since evidence samples involved in actual criminal cases are not destroyed during analysis, but are preserved intact for possible presentation in court. Quite aside from, or in addition to, use in court, NAA results can be very helpful in the investigative stage of particular criminal cases. The ultra sensitivity of the method often enables one to analyze evidence specimens that are too tiny for meaningful analysis by more conventional elemental analysis methods. Also, this high sensitivity often enables one to characterize, or individualize, evidence specimens as to the possibility of common origin - via the principle of multi-element trace-constituent characterization

  13. Pitch Analysis of Ukulele

    Directory of Open Access Journals (Sweden)

    Suphattharachai Chomphan

    2012-01-01

    Problem statement: The ukulele is a trendy instrument in the present day. It is a member of the guitar family of instruments which employs four nylon or gut strings or four courses of strings. However, a statistical analysis of the pitch of this instrument has not been conducted, and the pitch, or fundamental frequency, of its main chords should be analyzed in an appropriate way. Such an analysis supports effective sound synthesis, which is an important issue for the future. Approach: An efficient technique for the analysis of the fundamental frequency (F0) of human speech was applied to the analysis of the main chords of the ukulele. The autocorrelation-based technique was applied to the signal waveform to extract the optimal period, or pitch, for the corresponding analyzed frame in the time domain. The corresponding fundamental frequency was then calculated in the frequency domain. Results: The 21 main chords were chosen for the study. Each chord was found to contain from one to three fundamental frequency values, ranging from 65.42 Hz to 329.93 Hz. Conclusion: Using this analysis technique for the fundamental frequency of human speech, the output frequencies of all main chords can be extracted. It can be empirically seen that their values are distinct from each other.
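
    The autocorrelation-based period search described in the abstract can be sketched as follows. This is an illustration, not the study's code; the sampling rate, search band, and test tone are assumptions:

```python
import numpy as np

def estimate_f0(signal, sample_rate, f_min=60.0, f_max=400.0):
    """Estimate the fundamental frequency as the lag of the autocorrelation
    peak within the plausible period range [1/f_max, 1/f_min]."""
    signal = signal - np.mean(signal)
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]          # keep non-negative lags only
    lag_min = int(sample_rate / f_max)    # shortest period considered
    lag_max = int(sample_rate / f_min)    # longest period considered
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / best_lag

sr = 8000
t = np.arange(sr) / sr                    # one second of samples
tone = np.sin(2 * np.pi * 100.0 * t)      # synthetic 100 Hz test tone
print(estimate_f0(tone, sr))              # close to 100 Hz
```

    For a plucked-string signal with several simultaneous strings, each frame can yield a different dominant period, which is consistent with the one-to-three F0 values per chord reported above.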

  14. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  15. Mathematical analysis II

    CERN Document Server

    Canuto, Claudio

    2015-01-01

    The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...

  16. The data analysis handbook

    CERN Document Server

    Frank, IE

    1994-01-01

    Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...

  17. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...

  18. Principles of harmonic analysis

    CERN Document Server

    Deitmar, Anton

    2014-01-01

    This book offers a complete and streamlined treatment of the central principles of abelian harmonic analysis: Pontryagin duality, the Plancherel theorem and the Poisson summation formula, as well as their respective generalizations to non-abelian groups, including the Selberg trace formula. The principles are then applied to spectral analysis of Heisenberg manifolds and Riemann surfaces. This new edition contains a new chapter on p-adic and adelic groups, as well as a complementary section on direct and projective limits. Many of the supporting proofs have been revised and refined. The book is an excellent resource for graduate students who wish to learn and understand harmonic analysis and for researchers seeking to apply it.
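The Poisson summation formula mentioned in this abstract can be checked numerically. A minimal sketch, assuming the Fourier-transform convention in which a Gaussian f(x) = exp(-pi*a*x^2) transforms to (1/sqrt(a))*exp(-pi*xi^2/a); the width a and the truncation range are illustrative choices, not from the book:

```python
import numpy as np

# Poisson summation: sum_n f(n) = sum_k fhat(k).
# For the Gaussian f(x) = exp(-pi*a*x^2), fhat(xi) = exp(-pi*xi^2/a)/sqrt(a).
a = 0.5
n = np.arange(-50, 51)  # truncated lattice sum; terms decay very fast

lhs = np.sum(np.exp(-np.pi * a * n**2))               # sum over f(n)
rhs = np.sum(np.exp(-np.pi * n**2 / a)) / np.sqrt(a)  # sum over fhat(k)
print(lhs, rhs)  # the two lattice sums agree to machine precision
```

The Gaussian is the standard test case here because it is its own transform up to rescaling, so both sides are rapidly convergent sums.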

  19. Mathematical analysis I

    CERN Document Server

    Zorich, Vladimir A

    2015-01-01

    VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences . This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...

  20. Digital Fourier analysis fundamentals

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations.  These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
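The rotating-vector picture described in this abstract can be verified directly: a real sine is the sum of a counter-clockwise and a clockwise rotating complex vector. A minimal sketch; the 5 Hz frequency and sample grid are illustrative:

```python
import numpy as np

# sin(wt) = (e^{iwt} - e^{-iwt}) / (2i): two counter-rotating vectors.
t = np.linspace(0.0, 1.0, 1000)
w = 2 * np.pi * 5.0                 # 5 Hz, illustrative

ccw = np.exp(1j * w * t) / 2j       # counter-clockwise rotating vector
cw = -np.exp(-1j * w * t) / 2j      # clockwise rotating vector

# Their sum is purely real and equals the sine signal.
assert np.allclose(ccw + cw, np.sin(w * t))
```

The imaginary parts of the two vectors cancel at every instant, which is exactly what the book's animated applet illustrates.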

  1. Real analysis on intervals

    CERN Document Server

    Choudary, A D R

    2014-01-01

    The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...

  2. Analysis of Waves

    DEFF Research Database (Denmark)

    Frigaard, Peter; Andersen, Thomas Lykke

The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there also exists an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have also contributed to the note. Their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe analysis of waves in the time and frequency domains. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra, which also...

  3. Proximate analysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Donahue, C.J.; Rais, E.A. [University of Michigan, Dearborn, MI (USA)

    2009-02-15

This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises in which students interpret previously recorded scans. The weight percent moisture, volatile matter, fixed carbon, and ash content are determined for each sample and comparisons are made. Proximate analysis is performed on a coal sample from a local electric utility. From the weight percent sulfur found in the coal (determined by a separate procedure, the Eschka method) and the ash content, students calculate the quantity of sulfur dioxide emissions and ash produced annually by a large coal-fired electric power plant.
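The closing calculation in this abstract is simple stoichiometry. A minimal sketch with hypothetical plant figures (the coal tonnage, sulfur content, and ash content below are illustrative, not taken from the experiment):

```python
# Annual SO2 and ash from a coal-fired plant, given coal composition.
# Assumes all sulfur in the coal is oxidised to SO2 on combustion.
M_S, M_SO2 = 32.07, 64.07  # molar masses, g/mol

def annual_emissions(coal_tonnes, wt_pct_sulfur, wt_pct_ash):
    """Return (SO2 tonnes, ash tonnes) produced per year."""
    sulfur = coal_tonnes * wt_pct_sulfur / 100.0
    so2 = sulfur * (M_SO2 / M_S)   # S + O2 -> SO2 roughly doubles the mass
    ash = coal_tonnes * wt_pct_ash / 100.0
    return so2, ash

# Hypothetical plant: 3 million tonnes of coal/yr, 2.5 wt% S, 10 wt% ash.
so2, ash = annual_emissions(3.0e6, 2.5, 10.0)
print(f"SO2: {so2:,.0f} t/yr, ash: {ash:,.0f} t/yr")
```

The factor M_SO2/M_S (about 2) is the whole trick: each tonne of sulfur burned yields roughly two tonnes of sulfur dioxide.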

  4. Real mathematical analysis

    CERN Document Server

    Pugh, Charles C

    2015-01-01

    Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable, than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...

  5. Pitfalls of exergy analysis

    CERN Document Server

    Vágner, Petr; Maršík, František

    2016-01-01

    The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.

  6. Residual Component Analysis

    CERN Document Server

    Kalaitzis, Alfredo A

    2011-01-01

    Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, Sigma = (sigma^2)*I. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. covariates of interest, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalized eigenvalue problem, which we call residual component analysis (RCA). We show that canonical covariates analysis (CCA) is a special case of our algorithm and explore a range of new algorithms that arise from the framework. We illustrate the ideas on a gene expression time series data set and the recovery of human pose from silhouette.
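The eigenvalue problem mentioned in this abstract has a closed form (Tipping and Bishop's maximum-likelihood PPCA solution). A minimal sketch on toy data; the data, latent dimension, and variable names are illustrative, and this shows plain PPCA rather than the paper's RCA extension:

```python
import numpy as np

# Maximum-likelihood PPCA via eigendecomposition of the sample covariance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # toy data, D = 5
q = 2                                    # latent dimension

S = np.cov(X, rowvar=False)              # sample covariance, D x D
evals, evecs = np.linalg.eigh(S)         # eigh returns ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]

sigma2 = evals[q:].mean()                # ML noise variance: mean of the
                                         # discarded eigenvalues
W = evecs[:, :q] * np.sqrt(evals[:q] - sigma2)  # ML loading matrix
```

A quick sanity check on the fit: the model covariance W W^T + sigma2*I preserves the total variance, i.e. its trace equals trace(S) exactly.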

  7. Exercises in analysis

    CERN Document Server

    Gasiński, Leszek

    2016-01-01

    This second of two Exercises in Analysis volumes covers problems in five core topics of mathematical analysis: Function Spaces, Nonlinear and Multivalued Maps, Smooth and Nonsmooth Calculus, Degree Theory and Fixed Point Theory, and Variational and Topological Methods. Each of five topics corresponds to a different chapter with inclusion of the basic theory and accompanying main definitions and results, followed by suitable comments and remarks for better understanding of the material. Exercises/problems are presented for each topic, with solutions available at the end of each chapter. The entire collection of exercises offers a balanced and useful picture for the application surrounding each topic. This nearly encyclopedic coverage of exercises in mathematical analysis is the first of its kind and is accessible to a wide readership. Graduate students will find the collection of problems valuable in preparation for their preliminary or qualifying exams as well as for testing their deeper understanding of the ...

  8. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.

  9. Neutron signal transfer analysis

    CERN Document Server

    Pleinert, H; Lehmann, E

    1999-01-01

    A new method called neutron signal transfer analysis has been developed for quantitative determination of hydrogenous distributions from neutron radiographic measurements. The technique is based on a model which describes the detector signal obtained in the measurement as a result of the action of three different mechanisms expressed by signal transfer functions. The explicit forms of the signal transfer functions are determined by Monte Carlo computer simulations and contain only the distribution as a variable. Therefore an unknown distribution can be determined from the detector signal by recursive iteration. This technique provides a simple and efficient tool for analysis of this type while also taking into account complex effects due to the energy dependency of neutron interaction and single and multiple scattering. Therefore this method provides an efficient tool for precise quantitative analysis using neutron radiography, as for example quantitative determination of moisture distributions in porous buil...

  10. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  11. Foundations of VISAR analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H.

    2006-06-01

    The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.

  12. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  13. Basic real analysis

    CERN Document Server

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....

  14. Fuzzy data analysis

    CERN Document Server

    Bandemer, Hans

    1992-01-01

    Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.

  15. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical techniques...

  16. UCF WP TIPOVER ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Z. Ceylan

    1998-04-28

    The purpose of this analysis is to determine the structural response of the 21 pressurized water reactor (PWR) uncanistered fuel (UCF) waste package (WP) to a tipover design basis event (DBE) dynamic load; the results will be reported in terms of stress magnitudes. Finite-element solution was performed by making use of the commercially available ANSYS finite-element code. A finite-element model of the waste package was developed and analyzed for a tipover DBE dynamic load. The results of this analysis were provided in tables and were also plotted in terms of the maximum stress contours to determine their locations.

  17. Introduction to real analysis

    CERN Document Server

    Schramm, Michael J

    2008-01-01

    This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl

  18. Sensitivity analysis of SPURR

    Energy Technology Data Exchange (ETDEWEB)

    Witholder, R.E.

    1980-04-01

    The Solar Energy Research Institute has conducted a limited sensitivity analysis on a System for Projecting the Utilization of Renewable Resources (SPURR). The study utilized the Domestic Policy Review scenario for SPURR agricultural and industrial process heat and utility market sectors. This sensitivity analysis determines whether variations in solar system capital cost, operation and maintenance cost, and fuel cost (biomass only) correlate with intuitive expectations. The results of this effort contribute to a much larger issue: validation of SPURR. Such a study has practical applications for engineering improvements in solar technologies and is useful as a planning tool in the R and D allocation process.

  19. Harmonic analysis and applications

    CERN Document Server

    Heil, Christopher

    2007-01-01

    This self-contained volume in honor of John J. Benedetto covers a wide range of topics in harmonic analysis and related areas. These include weighted-norm inequalities, frame theory, wavelet theory, time-frequency analysis, and sampling theory. The chapters are clustered by topic to provide authoritative expositions that will be of lasting interest. The original papers collected are written by prominent researchers and professionals in the field. The book pays tribute to John J. Benedetto's achievements and expresses an appreciation for the mathematical and personal inspiration he has given to

  20. Analysis of maintenance strategies

    International Nuclear Information System (INIS)

The main topics of the presentation include: (1) an analysis model and methods to evaluate maintenance action programs and to support decisions to change them, and (2) understanding maintenance strategies in a systems perspective as a basis for future developments. The subproject showed how systematic models for maintenance analysis and decision support, utilising computerised and statistical tool packages, can be put to use for the evaluation and optimisation of maintenance of active systems from the safety and economic point of view

  1. Concise vector analysis

    CERN Document Server

    Eliezer, C J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Concise Vector Analysis is a five-chapter introductory account of the methods and techniques of vector analysis. These methods are indispensable tools in mathematics, physics, and engineering. The book is based on lectures given by the author in the University of Ceylon.The first two chapters deal with vector algebra. These chapters particularly present the addition, representation, and resolution of vectors. The next two chapters examine the various aspects and specificities of vector calculus. The last chapter looks into some standard applications of vector algebra and calculus.This book wil

  2. Introduction to complex analysis

    CERN Document Server

    Priestley, H A

    2003-01-01

    Complex analysis is a classic and central area of mathematics, which is studied and exploited in a range of important fields, from number theory to engineering. Introduction to Complex Analysis was first published in 1985, and for this much awaited second edition the text has been considerably expanded, while retaining the style of the original. More detailed presentation is given of elementary topics, to reflect the knowledge base of current students. Exercise sets have beensubstantially revised and enlarged, with carefully graded exercises at the end of each chapter.This is the latest additi

  3. Electronic circuit analysis

    CERN Document Server

    Kishore, K Lal

    2008-01-01

    Second Edition of the book Electronic Circuit Analysis is brought out with certain new Topics and reorganization of text matter into eight units. With addition of new topics, syllabi of many universities in this subject can be covered. Besides this, the book can also meet the requirements of M.Sc (Electronics), AMIETE, AMIE (Electronics) courses. Text matter is improved thoroughly. New topics like frequency effects in multistage amplifiers, amplifier circuit analysis, design of high frequency amplifiers, switching regulators, voltage multipliers, Uninterrupted Power Supplies (UPS), and Switchi

  4. Environmental analysis support

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.L.

    1996-06-01

    Activities in environmental analysis support included assistance to the Morgantown and Pittsburgh Energy Technology Centers (METC and PETC) in reviewing and preparing documents required by the National Environmental Policy Act (NEPA) for projects selected for the Clean Coal Technology (CCT) Program. An important activity was the preparation for METC of a final Environmental Assessment (EA) for the proposed Externally Fired Combined Cycle (EFCC) Project in Warren, Pennsylvania. In addition, a post-project environmental analysis was prepared for PETC to evaluate the Demonstration of Advanced Combustion Techniques for a Tangentially-Fired Boiler in Lynn Haven, Florida.

  5. Provenance as Dependency Analysis

    CERN Document Server

    Cheney, James; Acar, Umut

    2007-01-01

    Provenance is information recording the source, derivation, or history of some information. Provenance tracking has been studied in a variety of settings; however, although many design points have been explored, the mathematical or semantic foundations of data provenance have received comparatively little attention. In this paper, we argue that dependency analysis techniques familiar from program analysis and program slicing provide a formal foundation for forms of provenance that are intended to show how (part of) the output of a query depends on (parts of) its input. We introduce a semantic characterization of such dependency provenance, show that this form of provenance is not computable, and provide dynamic and static approximation techniques.

  6. Strictness and Totality Analysis

    DEFF Research Database (Denmark)

    Solberg, K. L.; Nielson, Hanne Riis; Nielson, Flemming

    1998-01-01

We define a novel inference system for strictness and totality analysis for the simply-typed lazy lambda-calculus with constants and fixpoints. Strictness information identifies those terms that definitely denote bottom (i.e. do not evaluate to WHNF), whereas totality information identifies those terms that definitely do not denote bottom (i.e. do evaluate to WHNF). The analysis is presented as an annotated type system allowing conjunctions at "top-level" only. We give examples of its use and prove the correctness with respect to a natural-style operational semantics.

  7. Towards Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan

    2005-01-01

Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent component analysis is relevant for representing semantics, not only in text, but also in dynamic text (chat), images, and combinations of text and images. Here we further expand on the relevance of the ICA model for representing context, including two new analyses of abstract data: social networks and musical features.

  8. Signal flow analysis

    CERN Document Server

    Abrahams, J R; Hiller, N

    1965-01-01

    Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther

  9. Analysis in Euclidean space

    CERN Document Server

    Hoffman, Kenneth

    2007-01-01

    Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory.Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq

  10. Fourier Analysis on Groups

    CERN Document Server

    Rudin, Walter

    2011-01-01

    In the late 1950s, many of the more refined aspects of Fourier analysis were transferred from their original settings (the unit circle, the integers, the real line) to arbitrary locally compact abelian (LCA) groups. Rudin's book, published in 1962, was the first to give a systematic account of these developments and has come to be regarded as a classic in the field. The basic facts concerning Fourier analysis and the structure of LCA groups are proved in the opening chapters, in order to make the treatment relatively self-contained.

  11. Associative Analysis in Statistics

    Directory of Open Access Journals (Sweden)

    Mihaela Muntean

    2015-03-01

In recent years, interest in technologies such as in-memory analytics and associative search has increased. This paper explores how in-memory analytics and an associative model can be used in statistics. The word “associative” puts the emphasis on understanding how datasets relate to one another. The paper presents the main characteristics of the associative data model and shows how to design an associative model for labor market indicator analysis, with the EU Labor Force Survey as the source. The paper also shows how to perform associative analysis.

  12. Data analysis using SAS

    CERN Document Server

    Peng, Chao-Ying Joanne

    2008-01-01

"Peng provides an excellent overview of data analysis using the powerful statistical software package SAS. This book is quite appropriate as a self-paced tutorial for researchers, as well as a textbook or supplemental workbook for data analysis courses such as statistics or research methods. Peng provides detailed coverage of SAS capabilities using step-by-step procedures and includes numerous comprehensive graphics and figures, as well as SAS printouts. Readers do not need a background in computer science or programming." Includes numerous examples in education, health sciences, and business.

  13. Similar component analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hong; WANG Xin; LI Junwei; CAO Xianguang

    2006-01-01

A new unsupervised feature extraction method called similar component analysis (SCA) is proposed in this paper. SCA has a self-aggregation property: data objects move towards each other to form clusters, which can reveal the inherent pattern of similarity hidden in the dataset. The inputs to SCA are just the pairwise similarities of the dataset, which makes it well suited to time series analysis, where series may have variable lengths. Our experimental results on many problems have verified the effectiveness of SCA in several engineering applications.

  14. Gait Analysis Laboratory

    Science.gov (United States)

    1976-01-01

    Complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data is collected by placing tiny electrical sensors over muscle groups of child's legs and inserting step-sensing switches in soles of shoes. Miniature radio transmitters send signals to receiver for continuous recording of abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to reduce size and weight of telemetry system further as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.

  15. Spacecraft Radiation Analysis

    Science.gov (United States)

    Harris, D. W.

    1972-01-01

    The radiation interface in spacecraft using radioisotope thermoelectric generators (RTGs) is studied. A Monte Carlo analysis of the radiation field that includes scattered radiation effects produced neutron and gamma photon isoflux contours as functions of distance from the RTG center line. It is shown that the photon flux is significantly depressed in the RTG axial direction because of self-shielding. Total flux values are determined by converting the uncollided flux values into an equivalent RTG surface source and then performing a Monte Carlo analysis for each specific dose point. Energy distributions of the particle spectra completely define the radiation interface for a spacecraft model.

  16. Advanced Economic Analysis

    Science.gov (United States)

    Greenberg, Marc W.; Laing, William

    2013-01-01

    An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.

  17. Inverse correspondence analysis

    NARCIS (Netherlands)

    Groenen, PJF; van de Velden, M

    2004-01-01

    In correspondence analysis (CA), rows and columns of a data matrix are depicted as points in low-dimensional space. The row and column profiles are approximated by minimizing the so-called weighted chi-squared distance between the original profiles and their approximations, see for example, [Theory
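    The weighted chi-squared distance mentioned in this record can be computed directly from a contingency table. The snippet below is a generic correspondence-analysis preliminary on toy counts (hypothetical data, not from the paper):

```python
import numpy as np

# Toy contingency table (hypothetical counts): rows and columns are
# categories of two cross-classified variables.
N = np.array([[20.0, 10.0,  5.0],
              [10.0, 25.0, 10.0],
              [ 5.0, 10.0, 30.0]])

P = N / N.sum()        # correspondence matrix
r = P.sum(axis=1)      # row masses
c = P.sum(axis=0)      # column masses
R = P / r[:, None]     # row profiles: each row sums to 1

def chi2_dist(i, k):
    """Weighted chi-squared distance between row profiles i and k;
    each coordinate is weighted by the inverse of its column mass."""
    d = R[i] - R[k]
    return float(np.sqrt(np.sum(d ** 2 / c)))
```

    In CA the profiles are then mapped to a low-dimensional space so that displayed inter-point distances approximate these chi-squared distances; the inverse problem studied in the paper runs that construction backwards.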

  18. Retrospective landscape analysis

    DEFF Research Database (Denmark)

    Fritzbøger, Bo

    2011-01-01

    On the basis of maps from the 18th and 19th centuries, a retrospective analysis was carried out of documentary settlement and landscape data extending back to the Middle Ages with the intention of identifying and dating general structural and dynamic features of the cultural landscape in a selected...

  19. Haskell data analysis cookbook

    CERN Document Server

    Shukla, Nishant

    2014-01-01

    Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.

  20. Introductory real analysis

    CERN Document Server

    Kolmogorov, A N; Silverman, Richard A

    1975-01-01

    Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.

  1. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill-set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  2. Writing proofs in analysis

    CERN Document Server

    Kane, Jonathan M

    2016-01-01

    This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...

  3. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
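    The paper's specific derivation (the Jacobian transformation) is cut off in this record and is not reproduced here. As a generic, minimal sketch of Bayesian logistic regression itself, the posterior of a single slope parameter can be approximated on a grid (data, prior, and fixed zero intercept are all hypothetical choices):

```python
import numpy as np

# Hypothetical data: binary outcomes y at predictor values x.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = np.array([0, 0, 0, 1, 1, 1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Grid approximation of the posterior over the slope beta, with a
# diffuse Gaussian prior N(0, 10^2) and the intercept fixed at 0.
betas = np.linspace(-10, 10, 2001)
log_post = np.empty_like(betas)
for i, b in enumerate(betas):
    p = np.clip(sigmoid(b * x), 1e-12, 1 - 1e-12)  # avoid log(0)
    log_lik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    log_prior = -0.5 * (b / 10.0) ** 2
    log_post[i] = log_lik + log_prior

post = np.exp(log_post - log_post.max())
post /= post.sum()                # normalised posterior weights
beta_mean = np.sum(betas * post)  # posterior mean of the slope
```

    Any posterior summary — for instance the probability of the event at a new x, as the paper discusses — can then be averaged over this grid instead of plugging in a point estimate.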

  4. Manual for subject analysis

    International Nuclear Information System (INIS)

    This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)

  5. CMS analysis school model

    International Nuclear Information System (INIS)

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrade work, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  6. Senegal : Country Environmental Analysis

    OpenAIRE

    World Bank

    2008-01-01

    The main objective of the Senegal Country Environmental Analysis (CEA) is to reinforce the ongoing dialogue on environmental issues between the World Bank and the Government of Senegal. The CEA also aims to support the ongoing Government implementation of a strategic results-based planning process at the Environment Ministry (MEPNBRLA). The main goal is to enable Senegal to have the necess...

  7. Thermal Analysis of Plastics

    Science.gov (United States)

    D'Amico, Teresa; Donahue, Craig J.; Rais, Elizabeth A.

    2008-01-01

    This lab experiment illustrates the use of differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA) in the measurement of polymer properties. A total of seven exercises are described. These are dry exercises: students interpret previously recorded scans. They do not perform the experiments. DSC was used to determine the…

  8. Naive Analysis of Variance

    Science.gov (United States)

    Braun, W. John

    2012-01-01

    The Analysis of Variance is often taught in introductory statistics courses, but it is not clear that students really understand the method. This is because the derivation of the test statistic and p-value requires a relatively sophisticated mathematical background which may not be well-remembered or understood. Thus, the essential concept behind…
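    The article's premise — that the ANOVA test statistic can be understood from elementary variance calculations — can be illustrated with a naive, from-first-principles computation of the one-way F statistic (the data values are hypothetical):

```python
import numpy as np

def anova_f(groups):
    """One-way ANOVA F statistic computed from first principles:
    between-group mean square over within-group mean square."""
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    k = len(groups)                     # number of groups
    n = len(all_data)                   # total sample size
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Three hypothetical groups with clearly different means.
groups = [np.array([4.0, 5.0, 6.0]),
          np.array([7.0, 8.0, 9.0]),
          np.array([1.0, 2.0, 3.0])]
print(anova_f(groups))  # → 27.0
```

    The large F arises because the group means spread far apart relative to the spread within each group, which is exactly the intuition the article argues students should carry away.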

  9. Elementary functional analysis

    CERN Document Server

    Shilov, Georgi E

    1996-01-01

    Introductory text covers basic structures of mathematical analysis (linear spaces, metric spaces, normed linear spaces, etc.), differential equations, orthogonal expansions, Fourier transforms - including problems in the complex domain, especially involving the Laplace transform - and more. Each chapter includes a set of problems, with hints and answers. Bibliography. 1974 edition.

  10. Introductory complex analysis

    CERN Document Server

    Silverman, Richard A

    1984-01-01

    A shorter version of A. I. Markushevich's masterly three-volume Theory of Functions of a Complex Variable, this edition is appropriate for advanced undergraduate and graduate courses in complex analysis. Numerous worked-out examples and more than 300 problems, some with hints and answers, make it suitable for independent study. 1967 edition.

  11. Operando (micro) XAFS analysis

    OpenAIRE

    Arčon, Iztok; Dominko, Robert; Vogel-Mikuš, Katarina

    2016-01-01

    In the talk the principles of XAS methods were presented with practical examples which illustrate the possibilities and advanced approaches for their use in structural analysis of different types of materials. The emphasis will be on to the use of XAS spectroscopy in operando mode and in combination with X-ray microscopy.

  12. Digital Systems Analysis

    Science.gov (United States)

    Martin, Vance S.

    2009-01-01

    There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…

  13. Multiphasic analysis of growth.

    NARCIS (Netherlands)

    Koops, W.J.

    1989-01-01

    The central theme of this thesis is the mathematical analysis of growth in animals, based on the theory of multiphasic growth. Growth in biological terms is related to increase in size and shape. This increase is determined by internal (genetical) and external (environmental) factors. Well known mat

  14. Russian Language Analysis Project

    Science.gov (United States)

    Serianni, Barbara; Rethwisch, Carolyn

    2011-01-01

    This paper is the result of a language analysis research project focused on the Russian Language. The study included a diverse literature review that included published materials as well as online sources in addition to an interview with a native Russian speaker residing in the United States. Areas of study include the origin and history of the…

  15. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.
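    As a small illustration of one topic the book covers — iterative solution of nonlinear equations — a minimal Newton iteration might look like the following (not code from the book):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method for a nonlinear equation f(x) = 0:
    repeatedly replace x by x - f(x)/f'(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Solve x^2 - 2 = 0 starting from x0 = 1: converges to sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

    The quadratic convergence near a simple root (the number of correct digits roughly doubling per step) is one of the properties whose careful analysis the text emphasises.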

  16. Safeguards system analysis, (1)

    International Nuclear Information System (INIS)

    A system analysis of an implementing safeguards system based on traditional materials accountancy was carried out. This report describes the verification methods applied to operators' measurement data, the MUF evaluation method, theories on deciding PIT frequency, and the design of inspection plans. (author)

  17. Grey component analysis

    NARCIS (Netherlands)

    Westerhuis, J.A.; Derks, E.P.P.A.; Hoefsloot, H.C.J.; Smilde, A.K.

    2007-01-01

    The interpretation of principal component analysis (PCA) models of complex biological or chemical data can be cumbersome because in PCA the decomposition is performed without any knowledge of the system at hand. Prior information of the system is not used to improve the interpretation. In this paper
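    For contrast with the prior-informed decomposition described above, plain (prior-free) PCA via the SVD of mean-centred data looks like this; the constrained "grey" decomposition proposed in the paper is not reproduced:

```python
import numpy as np

# Hypothetical data matrix: 50 samples, 4 variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))

Xc = X - X.mean(axis=0)                  # centre each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                           # component scores
loadings = Vt.T                          # component loadings
explained = s ** 2 / np.sum(s ** 2)      # variance fraction per component
```

    Because the decomposition `Xc = scores @ loadings.T` is driven purely by variance, the loadings need not align with known chemical or biological structure — which is precisely the interpretability gap the paper's approach targets.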

  18. Advanced biomedical image analysis

    CERN Document Server

    Haidekker, Mark A

    2010-01-01

    "This book covers the four major areas of image processing: Image enhancement and restoration, image segmentation, image quantification and classification, and image visualization. Image registration, storage, and compression are also covered. The text focuses on recently developed image processing and analysis operators and covers topical research"--Provided by publisher.

  19. Learning: An Evolutionary Analysis

    Science.gov (United States)

    Swann, Joanna

    2009-01-01

    This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…

  20. Developing Word Analysis Skills.

    Science.gov (United States)

    Heilman, Arthur W.

    The importance of word analysis skills to reading ability is discussed, and methodologies for teaching such skills are examined. It is stated that a child cannot become proficient in reading if he does not master the skill of associating printed letter symbols with the sounds they represent. Instructional procedures which augment the alphabet with…

  1. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid;

    2016-01-01

    automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address...

  2. Learning Haskell data analysis

    CERN Document Server

    Church, James

    2015-01-01

    If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.

  3. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  4. Structural analysis of DAEs

    DEFF Research Database (Denmark)

    Poulsen, Mikael Zebbelin

    2002-01-01

    , by the implementation of the Simpy tool box. This is an object oriented system implemented in the Python language. It can be used for analysis of DAEs, ODEs and non-linear equation and uses e.g. symbolic representations of expressions and equations. The presentations of theory and algorithms for structural index...

  5. Shifted Independent Component Analysis

    DEFF Research Database (Denmark)

    Mørup, Morten; Madsen, Kristoffer Hougaard; Hansen, Lars Kai

    2007-01-01

    Delayed mixing is a problem of theoretical interest and practical importance, e.g., in speech processing, bio-medical signal analysis and financial data modelling. Most previous analyses have been based on models with integer shifts, i.e., shifts by a number of samples, and have often been carried...

  6. Spatial Data Analysis.

    Science.gov (United States)

    Banerjee, Sudipto

    2016-01-01

    With increasing accessibility to geographic information systems (GIS) software, statisticians and data analysts routinely encounter scientific data sets with geocoded locations. This has generated considerable interest in statistical modeling for location-referenced spatial data. In public health, spatial data routinely arise as aggregates over regions, such as counts or rates over counties, census tracts, or some other administrative delineation. Such data are often referred to as areal data. This review article provides a brief overview of statistical models that account for spatial dependence in areal data. It does so in the context of two applications: disease mapping and spatial survival analysis. Disease maps are used to highlight geographic areas with high and low prevalence, incidence, or mortality rates of a specific disease and the variability of such rates over a spatial domain. They can also be used to detect hot spots or spatial clusters that may arise owing to common environmental, demographic, or cultural effects shared by neighboring regions. Spatial survival analysis refers to the modeling and analysis for geographically referenced time-to-event data, where a subject is followed up to an event (e.g., death or onset of a disease) or is censored, whichever comes first. Spatial survival analysis is used to analyze clustered survival data when the clustering arises from geographical regions or strata. Illustrations are provided in these application domains.
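    The review surveys model families rather than code. As one small, standard computation on areal data — not taken from the article — Moran's I measures spatial autocorrelation of region-level values over an adjacency matrix:

```python
import numpy as np

def morans_i(x, W):
    """Moran's I for values x observed on areal units, with binary
    adjacency matrix W (W[i, j] = 1 when units i and j are neighbours).
    Positive I: neighbouring regions tend to have similar values."""
    n = len(x)
    z = x - x.mean()
    num = n * np.sum(W * np.outer(z, z))
    den = W.sum() * np.sum(z ** 2)
    return num / den

# Four regions on a line (1-2-3-4 adjacency) with values trending
# upward, so neighbours are similar and I should be positive.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
x = np.array([1.0, 2.0, 3.0, 4.0])
print(morans_i(x, W))
```

    Detecting such dependence is the starting point for the conditionally specified spatial models (e.g., for disease mapping) that the review goes on to discuss.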

  7. Environmental risk analysis

    International Nuclear Information System (INIS)

    Conventional Risk Analysis (RA) usually relates a certain undesired event frequency to its consequences. This technique is used nowadays in Brazil to analyze accidents and their consequences strictly under the human approach, valuing loss of human equipment, human structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm was developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In reality, the Brazilian professionals responsible today for licensing, auditing and inspecting environmental aspects of human activities face huge difficulties in making technical specifications and procedures leading to acceptable levels of impact, especially considering the intrinsic difficulty of defining those levels. Therefore, in Brazil the RA technique is a weak tool for licensing for many reasons, among them its short scope (only accident considerations) and a wrong paradigm (only direct human damages). A paper by the author about the former was already proposed to the 7th International Conference on Environmetrics, past July '96, USP-SP. This one discusses the extension of the risk analysis concept to take into account environmental consequences, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)

  8. Spatial Data Analysis.

    Science.gov (United States)

    Banerjee, Sudipto

    2016-01-01

    With increasing accessibility to geographic information systems (GIS) software, statisticians and data analysts routinely encounter scientific data sets with geocoded locations. This has generated considerable interest in statistical modeling for location-referenced spatial data. In public health, spatial data routinely arise as aggregates over regions, such as counts or rates over counties, census tracts, or some other administrative delineation. Such data are often referred to as areal data. This review article provides a brief overview of statistical models that account for spatial dependence in areal data. It does so in the context of two applications: disease mapping and spatial survival analysis. Disease maps are used to highlight geographic areas with high and low prevalence, incidence, or mortality rates of a specific disease and the variability of such rates over a spatial domain. They can also be used to detect hot spots or spatial clusters that may arise owing to common environmental, demographic, or cultural effects shared by neighboring regions. Spatial survival analysis refers to the modeling and analysis for geographically referenced time-to-event data, where a subject is followed up to an event (e.g., death or onset of a disease) or is censored, whichever comes first. Spatial survival analysis is used to analyze clustered survival data when the clustering arises from geographical regions or strata. Illustrations are provided in these application domains. PMID:26789381

  9. An Analysis of Anaphora

    Institute of Scientific and Technical Information of China (English)

    于昌利

    2008-01-01

    This paper is based on Chinese and English examples, presenting a systematic research and analysis of zero anaphora, pronominal anaphora and NP anaphora. It is found that semantic features, pragmatic elements, contexts and syntactic structures play important roles in our choice and interpretation of them.

  10. Online Paper Review Analysis

    Directory of Open Access Journals (Sweden)

    Doaa Mohey El-Din

    2015-09-01

    Sentiment analysis or opinion mining is used to automate the detection of subjective information such as opinions, attitudes, emotions, and feelings. Hundreds of thousands of researchers care about scientific research and take a long time to select suitable papers for their work. Online reviews of papers are an essential source of help for them: the reviews save reading time and save paper costs. This paper proposes a new technique to analyze online reviews, called sentiment analysis of online papers (SAOOP). SAOOP is a new technique for enhancing the bag-of-words model, improving accuracy and performance. SAOOP is useful in increasing the understanding rate of a review's sentences through higher coverage of language cases. SAOOP introduces solutions for some sentiment analysis challenges and uses them to achieve higher accuracy. This paper also presents a measure of topic-domain attributes, which provides a ranking of the total judgment of each text review for assessing and comparing results across different sentiment techniques for a given text review. Finally, the efficiency of the proposed approach is shown by comparing the proposed technique with two other sentiment analysis techniques. The comparison is based on measuring accuracy, performance and the understanding rate of sentences.
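    SAOOP itself is not reproduced in this record. As a baseline sketch of the plain bag-of-words sentiment scoring it aims to improve on (the word lists here are hypothetical):

```python
# Hypothetical sentiment lexicons for paper reviews.
POSITIVE = {"good", "excellent", "novel", "clear", "strong"}
NEGATIVE = {"bad", "weak", "unclear", "poor", "flawed"}

def sentiment(review):
    """Bag-of-words sentiment: count positive and negative words and
    label the review from the sign of their difference."""
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Clear motivation and strong results"))  # → positive
print(sentiment("The evaluation section is weak"))       # → negative
```

    Such a baseline ignores negation, intensifiers, and domain context — exactly the coverage gaps the paper's enhancements address.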

  11. Analysis of Progestagens

    Science.gov (United States)

    Wood, P. J.; Gower, D. B.

    This chapter covers the analysis of steroids with progesterone-like activity, classified as "progestagens". Steroids in this group include the naturally occurring C21 steroids, progesterone (4-pregnene-3,20-dione) and its metabolites, together with synthetic steroids, such as norgestrel, norethisterone (NE), and medroxyprogesterone acetate, which also have progestational activity.

  12. Anisotropic generalized Procrustes analysis

    NARCIS (Netherlands)

    Bennani Dosse, Mohammed; Kiers, Henk A.L.; Ten Berge, Jos M.F.

    2011-01-01

    Generalized Procrustes analysis is a popular method for matching several configurations by translations, rotations/reflections and scaling constants. It aims at producing a group average from these Euclidean similarity transformations followed by bi-linear approximation of this group average for gra
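    Generalized Procrustes analysis iterates matching over several configurations; its classic two-configuration building block — the orthogonal rotation/reflection best matching one configuration to another — has a closed-form SVD solution. The sketch below shows that standard isotropic step, not the anisotropic extension studied in the paper:

```python
import numpy as np

def procrustes_rotation(A, B):
    """Rotation/reflection matrix R minimising ||A - B @ R||_F
    (classic orthogonal Procrustes solution via SVD of B^T A)."""
    U, _, Vt = np.linalg.svd(B.T @ A)
    return U @ Vt

# B is A rotated by 90 degrees; the recovered R should undo that.
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
theta = np.pi / 2
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
B = A @ rot
R = procrustes_rotation(A, B)
```

    In the generalized (multi-configuration) setting, each configuration is repeatedly matched to the current group average with such transformations (plus translation and scaling) until the average stabilises.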

  13. Social network analysis

    NARCIS (Netherlands)

    W. de Nooy

    2009-01-01

    Social network analysis (SNA) focuses on the structure of ties within a set of social actors, e.g., persons, groups, organizations, and nations, or the products of human activity or cognition such as web sites, semantic concepts, and so on. It is linked to structuralism in sociology stressing the si

  14. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  15. Transactional Analysis in Management.

    Science.gov (United States)

    Hewson, Julie; Turner, Colin

    Although Transactional Analysis (TA) has heavily influenced psychotherapy, little has been written to parallel that influence in areas of organization theory, organization behavior, or management studies. This book is intended primarily for people working in management roles. In part one, personal experiences are drawn upon to describe a fictional…

  16. Statistical Analysis Plan

    DEFF Research Database (Denmark)

    Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi;

    2015-01-01

    This is the analysis plan for the multicentre randomised control study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected....

  17. Public Expenditure Analysis

    OpenAIRE

    Shah, Anwar

    2005-01-01

    This book provides tools of analysis for discovering equity in tax burdens as well as in public spending and judging government performance in its role in safeguarding the interests of the poor and those otherwise disadvantaged members of society, such as women, children, and minorities. The book further provides a framework for a rights-based approach to citizen empowerment-in other words, ...

  18. Python data analysis

    CERN Document Server

    Idris, Ivan

    2014-01-01

    This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.

  19. Doxing: a conceptual analysis

    NARCIS (Netherlands)

    Douglas, David M.

    2016-01-01

    Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it d

  20. Head space analysis

    NARCIS (Netherlands)

    Stekelenburg, G.J. van; Koorevaar, G.

    1971-01-01

    Additional analytical information is given about the method of head space analysis. From the data presented it can be concluded that this technique may be advantageous for enzyme kinetic studies in turbid solutions, provided a volatile organic substance is involved in the chemical reaction. Also som

  1. Safety analysis for "Fugen"

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    The improvement of safety in nuclear power stations is an important proposition. As to safety evaluation, too, it is important to carry it out comprehensively and systematically, referring to operational experience and to new knowledge important for safety throughout the period of use, as well as before the construction and start of operation of nuclear power stations. In this report, the results of a safety analysis for "Fugen" carried out by referring to the newest technical knowledge are described. As a result, it was confirmed that the safety of "Fugen" is secured by its inherent safety and by the facilities designed for securing safety. The basic way of thinking on the safety analysis, including the guidelines to be conformed to, is mentioned. As to abnormal transient changes in operation and accidents, their definition, the events to be evaluated and the standards for judgement are reported. The matters taken into consideration at the time of the analysis are shown. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal transient changes in operation and accidents are reported in terms of causes, countermeasures, protective functions and results. (K.I.)

  2. Multilevel component analysis

    NARCIS (Netherlands)

    Timmerman, M.E.

    2006-01-01

    A general framework for the exploratory component analysis of multilevel data (MLCA) is proposed. In this framework, a separate component model is specified for each group of objects at a certain level. The similarities between the groups of objects at a given level can be expressed by imposing cons

  3. Proteoglycan isolation and analysis

    DEFF Research Database (Denmark)

    Woods, A; Couchman, J R

    2001-01-01

    Proteoglycans can be difficult molecules to isolate and analyze due to large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell surface, and cytoskeleton-linked proteoglycans. Methods for analysis...

  4. CMS Analysis School Model

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrade work, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  5. Critical Analysis of Multimodal Discourse

    DEFF Research Database (Denmark)

    van Leeuwen, Theo

    2013-01-01

    This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends by reviewing a few examples of recent work in the critical analysis of multimodal discourse.

  6. Introduction to Food Analysis

    Science.gov (United States)

    Nielsen, S. Suzanne

    Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analyses are performed on problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods to ensure that they meet

  7. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
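    A sensitivity analysis of the kind described can be sketched as a simple one-at-a-time parameter sweep. The toy cost model, parameter names, and values below are invented for illustration and are not taken from the Energetics study.

```python
# One-at-a-time sensitivity sweep for a hypothetical hydrogen-production
# pathway. The model and all numbers are illustrative assumptions.

def levelized_cost(capital, efficiency, feedstock_price):
    """Toy levelized cost in $/kg: amortized capital plus feedstock."""
    annual_kg = 1_000_000 * efficiency            # kg H2 produced per year
    amortized = capital * 0.1                     # 10% capital recovery factor
    feedstock = feedstock_price * annual_kg / efficiency  # $/year
    return (amortized + feedstock) / annual_kg

baseline = dict(capital=50e6, efficiency=0.6, feedstock_price=0.05)
base_cost = levelized_cost(**baseline)

# Vary each parameter by +/-20% and report the resulting cost change.
for name in baseline:
    for factor in (0.8, 1.2):
        varied = dict(baseline, **{name: baseline[name] * factor})
        delta = levelized_cost(**varied) - base_cost
        print(f"{name} x{factor}: cost changes by {delta:+.3f} $/kg")
```

The parameters with the largest cost swings are the ones whose uncertainty most affects a project's viability, which is the information such a sweep is meant to surface.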

  8. Complementing Gender Analysis Methods.

    Science.gov (United States)

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start from the premise that men and women are equal and should be treated equally. These frameworks emphasize the equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach that not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concepts of social capital, equity, and doing gender into gender analysis. It is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing equality theory, the author offers equity theory to resolve the gender conflict by using the concepts of social and psychological capital. PMID:25941756

  9. Ceramic tubesheet design analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mallett, R.H.; Swindeman, R.W.

    1996-06-01

    A transport combustor is being commissioned at the Southern Services facility in Wilsonville, Alabama to provide a gaseous product for the assessment of hot-gas filtering systems. One of the barrier filters incorporates a ceramic tubesheet to support candle filters. The ceramic tubesheet, designed and manufactured by the Industrial Filter and Pump Manufacturing Company (IF&PM), is unique and offers distinct advantages over metallic systems in terms of density, resistance to corrosion, and resistance to creep at operating temperatures above 815°C (1500°F). Nevertheless, the operational requirements of the ceramic tubesheet are severe. The tubesheet is almost 1.5 m (55 in.) in diameter, has many penetrations, and must support the weight of the ceramic filters, coal ash accumulation, and a pressure drop of one atmosphere. Further, thermal stresses related to steady-state and transient conditions will occur. To gain a better understanding of the structural performance limitations, a contract was placed with Mallett Technology, Inc. to perform a thermal and structural analysis of the tubesheet design. The design analysis specification and a preliminary design analysis were completed in the early part of 1995. The analyses indicated that modifications to the design were necessary to reduce thermal stress, and the redesign had to be completed before the final thermal/mechanical analysis could be undertaken. The preliminary analysis also identified the need to confirm that the physical and mechanical properties data used in the design were representative of the material in the tubesheet. Subsequently, a few exploratory tests were performed at ORNL to evaluate the ceramic structural material.

  10. Analysis framework for GLORIA

    Science.gov (United States)

    Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian

    2012-05-01

    GLORIA stands for "GLObal Robotic-telescopes Intelligent Array". GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes and/or by analyzing data that other users have acquired with GLORIA or obtained from other free-access databases, such as the European Virtual Observatory. The GLORIA project will define free standards, protocols, and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for International Linear Collider (ILC) data analysis. HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure; any additional output it creates is added to that collection. The advantage of such a modular approach is to keep things as simple as possible. Every single step of the full analysis chain, going e.g. from raw images to light curves, can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
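    The Marlin-style processor chain described above can be sketched in a few lines: each processor reads from and appends to a shared data collection, so every intermediate result remains available to later steps. The class names, collection keys, and the dark-subtraction/light-curve steps below are illustrative assumptions, not the actual Luiza or Marlin API.

```python
# Minimal sketch of a modular processor chain: each processor only adds
# to the shared collection, so every step's output stays available.

class Processor:
    def process(self, data: dict) -> None:
        raise NotImplementedError

class DarkSubtraction(Processor):
    def process(self, data):
        # pretend "raw" is a list of pixel values and "dark" a constant offset
        data["calibrated"] = [v - data["dark"] for v in data["raw"]]

class LightCurve(Processor):
    def process(self, data):
        # summarize the calibrated frame into one photometric point
        data["lightcurve"] = sum(data["calibrated"]) / len(data["calibrated"])

def run_chain(processors, data):
    # run each processor in order over the same collection
    for p in processors:
        p.process(data)
    return data

result = run_chain([DarkSubtraction(), LightCurve()],
                   {"raw": [10, 12, 14], "dark": 2})
print(result["lightcurve"])  # mean of the calibrated values [8, 10, 12]
```

Because no processor mutates its inputs, any step of the chain can be rerun or swapped out without disturbing the others, which is the self-consistency property the abstract emphasizes.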

  12. Strategic analysis of Czech Airlines

    OpenAIRE

    Moiseeva, Polina

    2016-01-01

    The thesis, titled Strategic Analysis of Czech Airlines, comprehensively analyses the current situation of the company. It presents the theoretical basis for such an analysis and subsequently offers a situational analysis, which covers the external environment, the internal environment, and suggestions for improvement. The thesis includes a complete SWOT analysis of the company and applies Porter's five forces framework. It also includes recommendations and suggestions for th...

  13. Analysis of Muji's Business Strategy

    Institute of Scientific and Technical Information of China (English)

    范晶

    2011-01-01

    This article is a report of an analysis of Muji's business strategy. First, the vision and mission are introduced. Second, the current strategy is identified. Then the industry analysis, industry driving forces, key success factors, value chain analysis, competitive advantage, and the competitive power of that advantage are examined. Finally, on the basis of the foregoing analysis, a SWOT analysis is worked out.

  14. Trends in BWR transient analysis

    International Nuclear Information System (INIS)

    While boiling water reactor (BWR) analysis methods for transient and loss-of-coolant accident analysis are well established, refinements and improvements continue to be made. This evolution of BWR analysis methods is driven by new applications. This paper discusses some examples of these trends, specifically time-domain stability analysis and analysis of the simplified BWR (SBWR), General Electric's design approach involving a shift from active to passive safety systems and the elimination or simplification of systems for improved operation and maintenance.

  15. Exploratory data analysis with Matlab

    CERN Document Server

    Martinez, Wendy L; Solka, Jeffrey

    2010-01-01

    Since the publication of the bestselling first edition, many advances have been made in exploratory data analysis (EDA). Covering innovative approaches for dimensionality reduction, clustering, and visualization, Exploratory Data Analysis with MATLAB®, Second Edition uses numerous examples and applications to show how the methods are used in practice. New to the Second Edition: discussions of nonnegative matrix factorization, linear discriminant analysis, curvilinear component analysis, independent component analysis, and smoothing splines; an expanded set of methods for estimating the intrinsic di

  16. Workbook on data analysis

    International Nuclear Information System (INIS)

    As a consequence of various IAEA programmes to sample airborne particulate matter and determine its elemental composition, the participating research groups are accumulating data on the composition of the atmospheric aerosol. It is necessary to consider ways in which these data can be utilized in order to be certain that the data obtained are correct and that the information then being transmitted to others who may make decisions based on such information is as representative and correct as possible. In order to both examine the validity of those data and extract appropriate information from them, it is necessary to utilize a variety of data analysis methods. The objective of this workbook is to provide a guide with examples of utilizing data analysis on airborne particle composition data using a spreadsheet program (EXCEL) and a personal computer based statistical package (StatGraphics)

  17. Waveform analysis of sound

    CERN Document Server

    Tohyama, Mikio

    2015-01-01

    What is this sound? What does that sound indicate? These are two questions frequently heard in daily conversation. Sound results from the vibrations of elastic media and in daily life provides informative signals of events happening in the surrounding environment. In interpreting auditory sensations, the human ear seems particularly good at extracting the signal signatures from sound waves. Although exploring auditory processing schemes may be beyond our capabilities, source signature analysis is a very attractive area in which signal-processing schemes can be developed using mathematical expressions. This book is inspired by such processing schemes and is oriented to signature analysis of waveforms. Most of the examples in the book are taken from data of sound and vibrations; however, the methods and theories are mostly formulated using mathematical expressions rather than by acoustical interpretation. This book might therefore be attractive and informative for scientists, engineers, researchers, and graduat...

  18. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides a very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...
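    The closed-pattern idea mentioned above can be illustrated on a toy pitch string: of all repeated patterns, only those whose every extension occurs strictly fewer times are kept, which preserves the full occurrence information in far fewer patterns. The function names and the melody below are invented for illustration and are not from the paper's system.

```python
# Toy closed-pattern selection for motivic analysis: enumerate all
# repeated substrings of a pitch string, then keep only the "closed"
# ones, i.e. those not contained in a longer repeated pattern with
# the same occurrence count.

from collections import Counter

def repeated_patterns(seq, min_len=2):
    # count every substring of length >= min_len, keep those repeated
    counts = Counter(seq[i:i + n]
                     for n in range(min_len, len(seq))
                     for i in range(len(seq) - n + 1))
    return {p: c for p, c in counts.items() if c >= 2}

def closed_patterns(seq, min_len=2):
    pats = repeated_patterns(seq, min_len)
    closed = {}
    for p, c in pats.items():
        # p is redundant if some strictly longer repeated pattern
        # contains it and occurs just as often
        if not any(p in q and q != p and pats[q] == c for q in pats):
            closed[p] = c
    return closed

melody = "CDECDECDE"
print(closed_patterns(melody))  # → {'CDE': 3, 'CDECDE': 2}
```

On this melody, twelve distinct repeated patterns collapse to just two closed ones, and every other repeated pattern's occurrence count can be recovered from them, which is the lossless-compression property the abstract refers to.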

  19. Intracochlear microprobe analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bone, R.C.; Ryan, A.F.

    1982-04-01

    Energy dispersive x-ray analysis (EDXA) or microprobe analysis provides cochlear physiologists with a means of accurately assessing relative ionic concentrations in selected portions of the auditory mechanism. Rapid freezing followed by lyophilization allows the recovery of fluid samples in crystalline form not only from the perilymphatic and endolymphatic spaces, but also from much smaller subregions of the cochlea. Because samples are examined in a solid state, there is no risk of diffusion into surrounding or juxtaposed fluids. Samples of cochlear tissues may also be evaluated without the danger of intercellular ionic diffusion. During direct visualization by scanning electron microscopy, the biochemical makeup of the material being examined can be determined simultaneously, assuring the source of the data collected. Other potential advantages and disadvantages of EDXA are reviewed. Initial findings as they relate to endolymph, perilymph, the stria vascularis, and the undersurface of the tectorial membrane are presented.

  20. Exascale Data Analysis

    CERN Document Server

    CERN. Geneva; Fitch, Blake

    2011-01-01

    Traditionally, the primary role of supercomputers was to create data, primarily for simulation applications. Due to usage and technology trends, supercomputers are increasingly also used for data analysis. Some of this data comes from simulations, but there is also a rapidly increasing amount of real-world science and business data to be analyzed. We briefly overview Blue Gene and other current supercomputer architectures. We outline future architectures, up to the Exascale supercomputers expected in the 2020 time frame. We focus on the data analysis challenges and opportunities, especially those concerning Flash and other up-and-coming storage-class memory. About the speakers: Blake G. Fitch has been with IBM Research, Yorktown Heights, NY since 1987, mainly pursuing interests in parallel systems. He joined the Scalable Parallel Systems Group in 1990, contributing to research and development that culminated in the IBM scalable parallel system (SP*) product. His research interests have focused on applicatio...