Generalized estimating equations
Hardin, James W
2002-01-01
Although powerful and flexible, the method of generalized linear models (GLM) is limited in its ability to deal accurately with longitudinal and clustered data. Developed specifically to accommodate these data types, the method of Generalized Estimating Equations (GEE) extends the GLM algorithm to the correlated data encountered in health research, social science, biology, and other related fields. Generalized Estimating Equations provides the first complete treatment of GEE methodology in all of its variations. After introducing the subject and reviewing GLM, the authors examine the ...
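The GEE fitting loop sketched in the abstract above is concrete enough to illustrate. Below is a minimal, hypothetical implementation for a Gaussian outcome with identity link and an exchangeable working correlation; the moment estimators for the dispersion and the common within-cluster correlation are one standard choice, not code from the book.

```python
import numpy as np

def gee_gaussian_exchangeable(X_groups, y_groups, tol=1e-8, max_iter=50):
    """Minimal GEE sketch: Gaussian outcome, identity link,
    exchangeable working correlation (one illustrative variant of the
    general algorithm, not the book's implementation)."""
    X = np.vstack(X_groups)
    y = np.concatenate(y_groups)
    p = X.shape[1]
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start from OLS
    alpha = 0.0
    for _ in range(max_iter):
        resid = [yi - Xi @ beta for Xi, yi in zip(X_groups, y_groups)]
        N = sum(len(r) for r in resid)
        sigma2 = sum(r @ r for r in resid) / (N - p)  # moment dispersion estimate
        # moment estimate of the common within-cluster correlation
        num, cnt = 0.0, 0
        for r in resid:
            s = r.sum()
            num += (s * s - r @ r) / 2.0      # sum over pairs r_j * r_k, j < k
            cnt += len(r) * (len(r) - 1) // 2
        alpha = num / (sigma2 * cnt)
        # generalized least-squares step with working correlation R
        A = np.zeros((p, p))
        b = np.zeros(p)
        for Xi, yi in zip(X_groups, y_groups):
            ni = len(yi)
            R = (1 - alpha) * np.eye(ni) + alpha * np.ones((ni, ni))
            Ri = np.linalg.inv(R)
            A += Xi.T @ Ri @ Xi
            b += Xi.T @ Ri @ yi
        beta_new = np.linalg.solve(A, b)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new, alpha
        beta = beta_new
    return beta, alpha
```

On simulated clustered data (a shared random intercept per cluster), the routine recovers both the regression slope and the induced intra-cluster correlation.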
Variance estimation for generalized Cavalieri estimators
Johanna Ziegel; Eva B. Vedel Jensen; Karl-Anton Dorph-Petersen
2011-01-01
The precision of stereological estimators based on systematic sampling is of great practical importance. This paper presents methods of data-based variance estimation for generalized Cavalieri estimators where errors in sampling positions may occur. Variance estimators are derived under perturbed systematic sampling, systematic sampling with cumulative errors and systematic sampling with random dropouts. Copyright 2011, Oxford University Press.
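For orientation, the classical (unperturbed) Cavalieri estimator and one widely used data-based variance approximation can be sketched as follows. The 1/240 coefficient is the Gundersen-Jensen-type choice for smooth section-area functions; the paper's estimators for perturbed positions, cumulative errors, and dropouts generalize this setting, and are not reproduced here.

```python
import numpy as np

def cavalieri(areas, t):
    """Cavalieri volume estimate from section areas a_i at spacing t,
    with a classical variance approximation for smooth area functions
    (Gundersen-Jensen-type coefficient 1/240; a sketch, not the
    paper's perturbed-sampling estimators)."""
    a = np.asarray(areas, dtype=float)
    vol = t * a.sum()
    c0 = np.sum(a * a)           # lag-0 sum of products
    c1 = np.sum(a[:-1] * a[1:])  # lag-1
    c2 = np.sum(a[:-2] * a[2:])  # lag-2
    var = t * t * (3.0 * c0 - 4.0 * c1 + c2) / 240.0
    return vol, var
```

Applied to systematic sections through a unit ball, the volume estimate lands very close to 4π/3.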
Project Cost Estimation for Planning
2010-02-26
For Nevada Department of Transportation (NDOT), there are far too many projects that ultimately cost much more than initially planned. Because project nominations are linked to estimates of future funding and the analysis of system needs, the inaccur...
Generalized Centroid Estimators in Bioinformatics
Hamada, Michiaki; Kiryu, Hisanori; Iwasaki, Wataru; Asai, Kiyoshi
2011-01-01
In a number of estimation problems in bioinformatics, accuracy measures of the target problem are usually given, and it is important to design estimators that are suited to those accuracy measures. However, there is often a discrepancy between the estimator employed and the accuracy measure given for the problem. In this study, we introduce a general class of efficient estimators for estimation problems on high-dimensional binary spaces, which represent many fundamental problems in bioinformatics. Theoretical analysis reveals that the proposed estimators generally fit commonly used accuracy measures (e.g. sensitivity, PPV, MCC and F-score), can be computed efficiently in many cases, and cover a wide range of problems in bioinformatics from the viewpoint of the principle of maximum expected accuracy (MEA). It is also shown that some important algorithms in bioinformatics can be interpreted in a unified manner. Not only does the concept presented in this paper give a useful framework for designing MEA-based estimators, but it is also highly extendable and sheds new light on many problems in bioinformatics. PMID:21365017
Project schedule and cost estimate report
International Nuclear Information System (INIS)
1988-03-01
All cost tables represent obligation dollars, at both a constant FY 1987 level and an estimated escalation level, and are based on the FY 1989 DOE Congressional Budget submittal of December 1987. The cost tables display the total UMTRA Project estimated costs, which include both Federal and state funding. The Total Estimated Cost (TEC) for the UMTRA Project is approximately $992.5 million (in 1987 escalated dollars). Project schedules have been developed that provide for Project completion by September 1994, subject to Congressional approval extending DOE's authorization under Public Law 95-604. The report contains site-specific demographic data, conceptual design assumptions, preliminary cost estimates, and site schedules. A general project overview is also presented, which includes a discussion of the basis for the schedule and cost estimates, contingency assumptions, work breakdown structure, and potential project risks. The schedules and cost estimates will be revised as necessary to reflect appropriate decisions relating to relocation of certain tailings piles, or other special design considerations or circumstances (such as revised EPA groundwater standards), and changes in the Project mission. 27 figs, 97 tabs
General risks for tunnelling projects: An overview
Siang, Lee Yong; Ghazali, Farid E. Mohamed; Zainun, Noor Yasmin; Ali, Roslinda
2017-10-01
Tunnels are indispensable when installing new infrastructure as well as when enhancing the quality of existing urban living, due to their unique characteristics and potential applications. Over the past few decades, there has been a significant increase in the building of tunnels world-wide. Tunnelling projects are complex endeavours, and risk assessment for tunnelling projects is likewise a complex process. Risk events are often interrelated: occurrence of a technical risk usually carries cost and schedule consequences, and schedule risks typically drive cost escalation and project overhead. One must carefully consider the likelihood of a risk's occurrence and its impact in the context of a specific set of project conditions and circumstances; a project's goals, organization, and environment shape that context. Some projects are primarily schedule driven; other projects are primarily cost or quality driven. Whether a specific risk event is perceived fundamentally as a cost risk or a schedule risk is governed by the project-specific context. Many researchers have pointed out the significance of recognizing and controlling the complexity and risks of tunnelling projects. Although general information on a project, such as estimated duration, estimated cost, and stakeholders, can be obtained, it is still quite difficult to accurately understand, predict and control the overall situation and development trends of the project, which gives rise to the risks of tunnelling projects. This paper reviews the key risks for tunnelling projects identified in several case studies carried out by other researchers. As a result, the current risk management plan in tunnelling projects can be enhanced by including these reviewed risks as key information.
COST ESTIMATING RELATIONSHIPS IN ONSHORE DRILLING PROJECTS
Directory of Open Access Journals (Sweden)
Ricardo de Melo e Silva Accioly
2017-03-01
Cost estimating relationships (CERs) are very important tools in the planning phases of an upstream project. CERs are, in general, multiple regression models developed to estimate the cost of a particular item or scope of a project. They are based on historical data that should pass through a normalization process before a model is fitted. In the early phases they are the primary tool for cost estimating; in later phases they are usually used as an estimate-validation tool and sometimes for benchmarking purposes. As in any other modeling methodology, there are a number of important steps in building a model. In this paper, the process of building a CER to estimate the drilling cost of onshore wells is addressed.
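The simplest form of CER described above is a one-driver power law fitted by regression in log space. The sketch below uses synthetic well depths and costs (the driver, the exponent, and the scale are all hypothetical illustrations, not the paper's data or model).

```python
import numpy as np

def fit_power_law_cer(driver, cost):
    """Fit a one-driver CER of the form cost = a * driver^b by ordinary
    least squares in log space (a sketch; real CERs are multi-driver
    and are fitted after normalizing historical data)."""
    X = np.column_stack([np.ones(len(driver)), np.log(driver)])
    coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
    a, b = np.exp(coef[0]), coef[1]
    return a, b

def predict_cost(a, b, driver):
    """Apply the fitted CER to a new driver value."""
    return a * np.asarray(driver, dtype=float) ** b
```

On log-normally perturbed synthetic history, the exponent of the generating power law is recovered closely.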
2013-01-01
Background: Administrative databases are widely available and have been extensively used to provide estimates of chronic disease prevalence for the purpose of surveillance of both geographical and temporal trends. There are, however, other sources of data available, such as medical records from primary care and national surveys. In this paper we compare disease prevalence estimates obtained from these three different data sources. Methods: Data from general practitioners (GP) and administrative transactions for health services were collected from five Italian regions (Veneto, Emilia Romagna, Tuscany, Marche and Sicily) belonging to all three macroareas of the country (North, Center, South). Crude prevalence estimates were calculated by data source and region for diabetes, ischaemic heart disease, heart failure and chronic obstructive pulmonary disease (COPD). For diabetes and COPD, prevalence estimates were also obtained from a national health survey. When necessary, estimates were adjusted for completeness of data ascertainment. Results: Crude prevalence estimates of diabetes in administrative databases (range: 4.8% to 7.1%) were lower than the corresponding GP (6.2%-8.5%) and survey-based estimates (5.1%-7.5%). Geographical trends were similar in the three sources and estimates based on treatment were the same, while estimates adjusted for completeness of ascertainment (6.1%-8.8%) were slightly higher. For ischaemic heart disease, administrative and GP data sources were fairly consistent, with prevalence ranging from 3.7% to 4.7% and from 3.3% to 4.9%, respectively. In the case of heart failure, administrative estimates were consistently higher than GPs' estimates in all five regions, the highest difference being 1.4% vs 1.1%. For COPD the estimates from administrative data, ranging from 3.1% to 5.2%, fell into the confidence interval of the survey estimates in four regions, but failed to detect the higher prevalence in the most southern region (4.0% in
General presentation of projects mechanisms
International Nuclear Information System (INIS)
2003-01-01
This guide provides recommendations and tools to implement projects mechanisms, in the framework of the kyoto protocol. It precises the place of the projects mechanisms in the display of tools involved in the climatic change fight policies, at the national as international scale. It recalls the main characteristics and the rules of utilization and illustrates the corresponding interests. (A.L.B.)
Estimating software development project size, using probabilistic ...
African Journals Online (AJOL)
Estimating software development project size, using probabilistic techniques. ... of managing the size of software development projects by Purchasers (Clients) and Vendors (Development ...
General aspects of project organization
International Nuclear Information System (INIS)
Staebler, K.
1975-01-01
For power plant construction and erection, the Federal Republic of Germany has a highly experienced industry with remarkable results in the past. The 'bottleneck' is not so much design and manufacturing, nor the sub-suppliers, but rather the personnel for erection and commissioning of the plant. The report touches only briefly on the various problems of project organization from the point of view of the utility. (orig./FW) [de
Accuracy of hazardous waste project estimates
International Nuclear Information System (INIS)
Hackney, J.W.
1989-01-01
The HAZRATE system has been developed to appraise the current state of definition of hazardous waste remedial projects, which is shown to be highly correlated with the financial risk of such projects. The method employs a weighted checklist indicating the current degree of definition of some 150 significant project elements. It is based on the author's experience with a similar system for establishing the risk characteristics of process plant projects (Hackney, 1965; 1985; 1989). In this paper, definition ratings for 15 hazardous waste remedial projects are correlated with the excesses of their actual costs over their base estimates, excluding any allowances for contingencies. Based on this study, equations are presented for computing the contingency allowance needed and the estimate accuracy possible at a given stage of project development.
Thresholding projection estimators in functional linear models
Cardot, Hervé; Johannes, Jan
2010-01-01
We consider the problem of estimating the regression function in functional linear regression models by proposing a new type of projection estimator which combines dimension reduction and thresholding. The introduction of a threshold rule allows consistency to be obtained under broad assumptions, as well as minimax rates of convergence under additional regularity hypotheses. We also consider the particular case of Sobolev spaces generated by the trigonometric basis, which permits easy computation of mean squ...
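The combination of projection and thresholding described above can be sketched for the scalar-on-function model once the covariate is expanded in a known (e.g. trigonometric) basis. The threshold rule below, of order sqrt(log(n)/n) on the empirical cross-covariance, is an illustrative choice and not the paper's exact rule.

```python
import numpy as np

def threshold_projection_estimator(scores, y, tau=1.0):
    """Sketch of a thresholded projection estimator for the functional
    linear model y = sum_j b_j xi_j + noise, where `scores` holds the
    basis coefficients xi_j of each functional covariate (n x J).
    Terms whose empirical cross-covariance falls below the threshold
    are set to zero (illustrative rule, not the paper's)."""
    n, J = scores.shape
    cross = scores.T @ y / n        # empirical cross-covariance, one per basis term
    lam = scores.var(axis=0)        # empirical variance of each score
    thr = tau * np.sqrt(np.log(n) / n)
    return np.where(np.abs(cross) > thr, cross / lam, 0.0)
```

With a sparse true coefficient sequence and decaying score variances, the estimator recovers the leading coefficients and zeroes out the high-frequency tail.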
Variance in parametric images: direct estimation from parametric projections
International Nuclear Information System (INIS)
Maguire, R.P.; Leenders, K.L.; Spyrou, N.M.
2000-01-01
Recent work has shown that it is possible to apply linear kinetic models to dynamic projection data in PET in order to calculate parameter projections. These can subsequently be back-projected to form parametric images - maps of parameters of physiological interest. Critical to the application of these maps, to test for significant changes between normal and pathophysiology, is an assessment of the statistical uncertainty. In this context, parametric images also include simple integral images from, e.g., [O-15]-water used to calculate statistical parametric maps (SPMs). This paper revisits the concept of parameter projections and presents a more general formulation of the parameter projection derivation as well as a method to estimate parameter variance in projection space, showing which analysis methods (models) can be used. Using simulated pharmacokinetic image data we show that a method based on an analysis in projection space inherently calculates the mathematically rigorous pixel variance. This results in an estimation which is as accurate as either estimating variance in image space during model fitting, or estimation by comparison across sets of parametric images - as might be done between individuals in a group pharmacokinetic PET study. The method based on projections has, however, a higher computational efficiency, and is also shown to be more precise, as reflected in smooth variance distribution images when compared to the other methods. (author)
A General Model for Estimating Macroevolutionary Landscapes.
Boucher, Florian C; Démery, Vincent; Conti, Elena; Harmon, Luke J; Uyeda, Josef
2018-03-01
The evolution of quantitative characters over long timescales is often studied using stochastic diffusion models. The current toolbox available to students of macroevolution is however limited to two main models: Brownian motion and the Ornstein-Uhlenbeck process, plus some of their extensions. Here, we present a very general model for inferring the dynamics of quantitative characters evolving under both random diffusion and deterministic forces of any possible shape and strength, which can accommodate interesting evolutionary scenarios like directional trends, disruptive selection, or macroevolutionary landscapes with multiple peaks. This model is based on a general partial differential equation widely used in statistical mechanics: the Fokker-Planck equation, also known in population genetics as the Kolmogorov forward equation. We thus call the model FPK, for Fokker-Planck-Kolmogorov. We first explain how this model can be used to describe macroevolutionary landscapes over which quantitative traits evolve and, more importantly, we detail how it can be fitted to empirical data. Using simulations, we show that the model has good behavior both in terms of discrimination from alternative models and in terms of parameter inference. We provide R code to fit the model to empirical data using either maximum-likelihood or Bayesian estimation, and illustrate the use of this code with two empirical examples of body mass evolution in mammals. FPK should greatly expand the set of macroevolutionary scenarios that can be studied since it opens the way to estimating macroevolutionary landscapes of any conceivable shape. [Adaptation; bounds; diffusion; FPK model; macroevolution; maximum-likelihood estimation; MCMC methods; phylogenetic comparative data; selection.].
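The core idea of FPK can be illustrated numerically. Under one common parameterization, a trait on landscape V evolves as dX = (sigma^2/2) V'(X) dt + sigma dW, whose stationary density is proportional to exp(V(x)); the double-peak landscape below is a hypothetical example, and this sketch is not the authors' R implementation.

```python
import numpy as np

def simulate_fpk(V_prime, sigma, x0, dt, n_steps, n_lineages, rng):
    """Euler-Maruyama simulation of independent lineages evolving on a
    macroevolutionary landscape V(x): dX = (sigma^2/2) V'(X) dt + sigma dW.
    With this parameterization the stationary density is proportional
    to exp(V(x)) (a numerical sketch of the FPK idea)."""
    x = np.full(n_lineages, x0, dtype=float)
    for _ in range(n_steps):
        drift = 0.5 * sigma**2 * V_prime(x)
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n_lineages)
    return x
```

With a symmetric two-peak landscape V(x) = -(x^2 - 1)^2, long-run simulations populate both adaptive peaks, matching the exp(V) stationary density qualitatively.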
Wideband DOA Estimation through Projection Matrix Interpolation
Selva, J.
2017-01-01
This paper presents a method to reduce the complexity of the deterministic maximum likelihood (DML) estimator in the wideband direction-of-arrival (WDOA) problem, which is based on interpolating the array projection matrix in the temporal frequency variable. It is shown that an accurate interpolator like Chebyshev's is able to produce DML cost functions comprising just a few narrowband-like summands. Actually, the number of such summands is far smaller (roughly by a factor of ten in the numerical ...
Laser projection using generalized phase contrast
DEFF Research Database (Denmark)
Glückstad, Jesper; Palima, Darwin; Rodrigo, Peter John
2007-01-01
We demonstrate experimental laser projection of a gray-level photographic image with 74% light efficiency using the generalized phase contrast (GPC) method. In contrast with a previously proposed technique [Alonzo et al., New J. Phys. 9, 132 (2007)], a new approach to image construction via GPC is introduced. An arbitrary phase shift filter eliminates the need for high-frequency modulation and conjugate phase encoding. This lowers device performance requirements and allows practical implementation with currently available dynamic spatial light modulators. (c) 2007 Optical Society of America.
Generalized Jackknife Estimators of Weighted Average Derivatives
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael
With the aim of improving the quality of asymptotic distributional approximations for nonlinear functionals of nonparametric estimators, this paper revisits the large-sample properties of an important member of that class, namely a kernel-based weighted average derivative estimator. Asymptotic...
International Nuclear Information System (INIS)
Lang, Corey; Siler, Matthew
2013-01-01
Energy efficiency upgrades have been gaining widespread attention across global channels as a cost-effective approach to addressing energy challenges. The cost-effectiveness of these projects is generally predicted using engineering estimates pre-implementation, often with little ex post analysis of project success. In this paper, for a suite of energy efficiency projects, we directly compare ex ante engineering estimates of energy savings to ex post econometric estimates that use 15-min interval, building-level energy consumption data. In contrast to most prior literature, our econometric results confirm the engineering estimates, even suggesting the engineering estimates were too modest. Further, we find heterogeneous efficiency impacts by time of day, suggesting select efficiency projects can be useful in reducing peak load. - Highlights: • Regression discontinuity used to estimate energy savings from efficiency projects. • Ex post econometric estimates validate ex ante engineering estimates of energy savings. • Select efficiency projects shown to reduce peak load
Generalized shrunken type-GM estimator and its application
International Nuclear Information System (INIS)
Ma, C Z; Du, Y L
2014-01-01
The parameter estimation problem in the linear model is considered when multicollinearity and outliers exist simultaneously. A class of new robust biased estimators, the generalized shrunken type-GM estimators, together with methods for computing them, is established by combining GM estimators with biased estimators, including the ridge, principal components, and Liu estimates. A numerical example shows that the most attractive advantage of these new estimators is that they not only overcome multicollinearity of the coefficient matrix and outliers but also control the influence of leverage points.
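The ridge estimate is the simplest of the biased estimators that the abstract above combines with GM estimation, and its benefit under multicollinearity is easy to demonstrate. This is a sketch of the ridge component only; the shrunken type-GM estimator itself additionally downweights outliers and leverage points.

```python
import numpy as np

def ridge(X, y, k):
    """Ridge estimate beta = (X'X + k I)^(-1) X'y: shrinks the noisy,
    nearly-collinear directions of the design at the price of a small
    bias (the biased-estimator ingredient of the paper's method)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```

On a nearly collinear design, ridge beats OLS in average estimation error across replications, illustrating the bias-variance trade the paper exploits.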
A Generalized Autocovariance Least-Squares Method for Covariance Estimation
DEFF Research Database (Denmark)
Åkesson, Bernt Magnus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad
2007-01-01
A generalization of the autocovariance least-squares method for estimating noise covariances is presented. The method can estimate mutually correlated system and sensor noise and can be used with both the predicting and the filtering form of the Kalman filter.
Estimation of Externalities for Juragua Nuclear Project
International Nuclear Information System (INIS)
Mora, H. R.; Carbonell, L. T.
2002-01-01
Estimation of externalities allows environmental impacts of any activity to be taken into account in total cost calculations. In the present work, the external costs of electricity generation from nuclear energy were calculated considering three scenarios: normal operation (routine releases), an accident situation, and solid waste disposal. A comparison between these results and those obtained for electricity generation from fossil fuels was made. IAEA proposals for simplified methodologies were used for the externality calculations. The Juragua project was selected as the study case; it is based on two power units, both PWR of VVER 440/318 type, with a capacity of 417 MWe each. Four impact pathways were considered for all scenarios: (1) inhalation of radionuclides in the air, (2) external irradiation from radionuclides immersed in clouds, (3) external irradiation from deposited radionuclides, and (4) ingestion of radionuclides in agricultural products. In addition, two impact categories (local and regional) were considered for all scenarios. The total cost of externalities was 0.01425 c/kWh, a value smaller than that obtained for electricity generation from fossil fuel (0.256 c/kWh). For the normal operation scenario, the external cost calculated was 0.00112 c/kWh; for the accident situation, 0.01103 c/kWh; and for the solid waste management scenario, 0.0021 c/kWh. The high value obtained for the solid waste disposal scenario is due to repository placement features. (author)
Predicting Software Projects Cost Estimation Based on Mining Historical Data
Najadat, Hassan; Alsmadi, Izzat; Shboul, Yazan
2012-01-01
In this research, a hybrid cost estimation model is proposed to produce a realistic prediction model that takes into consideration software project, product, process, and environmental elements. A cost estimation dataset is built from a large number of open source projects. Those projects are divided into three domains: communication, finance, and game projects. Several data mining techniques are used to classify software projects in terms of their development complexity. Data mining techniqu...
Crash data modeling with a generalized estimator.
Ye, Zhirui; Xu, Yueru; Lord, Dominique
2018-05-11
The investigation of relationships between traffic crashes and relevant factors is important in traffic safety management. Various methods have been developed for modeling crash data. In real-world scenarios, crash data often display the characteristics of over-dispersion. On occasion, however, some crash datasets have exhibited under-dispersion, especially in cases where the data are conditioned upon the mean. The commonly used models (such as the Poisson and the NB regression models) have associated limitations in coping with various degrees of dispersion. In light of this, a generalized event count (GEC) model, which can be used to handle over-, equi-, and under-dispersed data, is proposed in this study. This model was first applied to case studies using data from Toronto, characterized by over-dispersion, and then to crash data from railway-highway crossings in Korea, characterized by under-dispersion. The results from the GEC model were compared with those from the negative binomial and the hyper-Poisson models. The case studies show that the proposed model provides good performance for crash data characterized by over- and under-dispersion. Moreover, the proposed model simplifies the modeling process and the prediction of crash data. Copyright © 2018 Elsevier Ltd. All rights reserved.
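The over-/under-dispersion diagnosis that motivates the GEC model can be sketched with a plain Poisson log-linear fit and the Pearson dispersion statistic. This is only the diagnostic step, not the GEC model itself, which generalizes the count distribution.

```python
import numpy as np

def poisson_irls(X, y, max_iter=50, tol=1e-8):
    """Poisson log-linear regression via IRLS, returning the fitted
    coefficients and the Pearson dispersion statistic: values near 1
    indicate equi-dispersion, above 1 over-dispersion, below 1
    under-dispersion (a diagnostic sketch)."""
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        mu = np.exp(X @ beta)
        W = mu                              # Poisson IRLS weights
        z = X @ beta + (y - mu) / mu        # working response
        beta_new = np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (W * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    mu = np.exp(X @ beta)
    pearson = np.sum((y - mu) ** 2 / mu)
    return beta, pearson / (len(y) - X.shape[1])
```

On simulated equi-dispersed data the statistic sits near 1; gamma-mixing the mean (NB-like data) pushes it well above 1, the situation where Poisson regression understates uncertainty.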
Highway project cost estimating and management.
2009-02-01
"This report provides detailed information about the project objectives, deliverables, and findings. The project team : thoroughly reviewed the Montana Department of Transportation (MDT) structure, operations, and current procedures as : related to M...
Parameter Estimation for a Computable General Equilibrium Model
DEFF Research Database (Denmark)
Arndt, Channing; Robinson, Sherman; Tarp, Finn
2002-01-01
We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of non-linear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...
Project cost estimation techniques used by most emerging building ...
African Journals Online (AJOL)
Keywords: Cost estimation, estimation methods, emerging contractors, tender. Dr Solly Matshonisa ... historical cost data (data from cost accounting records and/ ... emerging contractors in tendering.
On the projective curvature tensor of generalized Sasakian-space ...
African Journals Online (AJOL)
space-forms under some conditions regarding projective curvature tensor. All the results obtained in this paper are in the form of necessary and sufficient conditions. Keywords: Generalized Sasakian-space-forms; projectively flat; ...
The generalized back projection theorem for cone beam reconstruction
International Nuclear Information System (INIS)
Peyrin, F.C.
1985-01-01
The use of cone beam scanners raises the problem of three-dimensional reconstruction from divergent projections. After a survey of two-dimensional analytical reconstruction methods, we examine their application to the 3D problem. Finally, it is shown that the back projection theorem can be generalized to cone beam projections. This allows a new inversion formula to be stated, suitable for both the 4π parallel and divergent geometries, and leads to a generalization of the 'rho-filtered back projection' algorithm, which is outlined.
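The 2D parallel-beam analogue of rho-filtered back projection, which the paper generalizes to cone beams, can be sketched in a few lines: forward-project by rotate-and-sum, apply the ramp ("rho") filter in frequency space, then smear each filtered projection back and accumulate. This is an illustrative 2D sketch, not the paper's cone beam formula.

```python
import numpy as np
from scipy.ndimage import rotate

def fbp_2d(image, angles_deg):
    """Minimal 2D rho-filtered back projection: simulate parallel
    projections of `image`, ramp-filter each one, and back-project."""
    n = image.shape[0]
    ramp = np.abs(np.fft.fftfreq(n))           # the 'rho' filter
    recon = np.zeros_like(image, dtype=float)
    for theta in angles_deg:
        # forward projection at angle theta: rotate, then sum columns
        proj = rotate(image, theta, reshape=False, order=1).sum(axis=0)
        filt = np.real(np.fft.ifft(np.fft.fft(proj) * ramp))
        # back-project: smear the filtered profile and rotate back
        smear = np.tile(filt, (n, 1))
        recon += rotate(smear, -theta, reshape=False, order=1)
    return recon * np.pi / (2 * len(angles_deg))
```

Reconstructing a simple disk phantom from 60 projections recovers the object up to the usual discretization artifacts.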
How Agile Methods Conquers General Project Management - The Project Half Double Initiative
DEFF Research Database (Denmark)
Tordrup Heeager, Lise; Svejvig, Per; Schlichter, Bjarne Rerup
2016-01-01
Increased complexity in projects has forced new project management initiatives. In software development several agile methods have emerged, and methods such as Scrum are today widely implemented in practice. General project management practice has been inspired by agile software development...
Estimating design costs for first-of-a-kind projects
International Nuclear Information System (INIS)
Banerjee, Bakul; Fermilab
2006-01-01
Modern scientific facilities are often outcomes of projects that are first-of-a-kind, that is, minimal historical data are available for project costs and schedules. However, at Fermilab, there was an opportunity to execute two similar projects consecutively. In this paper, a comparative study of the design costs for these two projects is presented using earned value methodology. This study provides some insights into how to estimate the cost of a replicated project
ESTIMATION CRITERIA OF INVESTMENT INNOVATIVE PROJECTS IN HEALTHCARE SPHERE
Vadim A. Lomazov; Elena V. Nesterova
2013-01-01
The problem of evaluating investment projects in the health sector, implemented through public-private partnerships, is considered. As part of the procedure for constructing estimates, it is suggested that the medical, social, economic, scientific, and innovative components of the project be identified as separate components and used as subcriteria in the analytic hierarchy process.
Cost estimating for large nuclear projects
International Nuclear Information System (INIS)
Duggal, A.; Hunt, M.
2004-01-01
In today's market, the generation of electricity is a very competitive business, constantly under the watchful eye of the media and the public. Nuclear power faces strong competition from other sources such as hydro, coal and gas. Controlling costs, monitoring costs, feedback, industry knowledge and up-to-date cost estimating tools are essential for a nuclear company to compete on a long-term basis. This paper reviews the terminology and estimating principles used for the construction of new nuclear plants, lifetime operating costs, and the costs associated with refurbishment work. (author)
The DECADES project - Outline and general overview
International Nuclear Information System (INIS)
1995-01-01
The environmental and health-related impacts of different energy systems, including those associated with the production of electricity, are emerging as significant issues for the coming decades. This interest is highlighted by the current debate about health effects of pollution, environmental damages due to acidification of forests and lakes, concerns about the safety of nuclear power plants and radioactive waste management, and the potential risks of global climate change induced by increasing atmospheric concentrations of carbon dioxide and other greenhouse gases. All fuel chains within the electricity generation system involve some health risks and lead to some environmental impacts. This fact, together with the emerging needs of many countries to define their energy programmes for the coming decades, has provided the basis for a growing interest in the application of improved data, tools and techniques for comparative assessment of different electricity generating systems, particularly from an environmental and human health viewpoint. The need to design and implement sustainable strategies in the electricity sector has been stressed in many international fora such as the Senior Expert Symposium on Electricity and the Environment (Helsinki, 1991), the United Nations Conference on Environment and Development (Rio de Janeiro, 1992) and the 15th Conference of the World Energy Council (Madrid, 1992). The essential goal of sustainable strategies is to provide the energy services required for supporting economic growth and improving quality of life, especially in developing countries, while minimising the health and environmental impacts of human activities. The inter-agency joint project on data bases and methodologies for comparative assessment of different energy sources for electricity generation [DECADES] has been established with the objective of enhancing capabilities for incorporating health and environmental issues in the comparative assessment of different
AegeanMarTech project: General Introduction
Psarra, S.; Zervakis, V.; Karageorgis, A. P.
2017-10-01
This issue of "Continental Shelf Research" is dedicated to the study of processes potentially responsible for the relatively high productivity of the North Aegean Sea in comparison to other regions of the Eastern Mediterranean. This region, one of the most important fishing grounds in the eastern Mediterranean, is characterized by: i) the inflow of mesotrophic waters of Black Sea (BSW) origin into the North Aegean and their interaction with the more saline Levantine waters (LW); and ii) the wind-generated coastal upwelling occurring every summer in the eastern Aegean. The study of these two natural fertilization mechanisms has been the major aim of the AegeanMarTech project ("Technological and oceanographic cooperation Network for the Study of mechanisms fertilizing the North-East Aegean Sea").
Estimation of volatility of selected oil production projects
International Nuclear Information System (INIS)
Costa Lima, Gabriel A.; Suslick, Saul B.
2006-01-01
In oil project valuation and investment decision-making, volatility is a key parameter, but it is difficult to estimate. From a traditional investment viewpoint, volatility reduces project value because it increases the discount rate via a higher risk premium. Contrarily, according to real-option pricing theory, volatility may add value to the project, since the downside potential is limited whereas the upside is theoretically unbounded. However, estimating project volatility is complicated because no historical series of project values exists. In such cases, many analysts assume that oil price volatility equals that of the project. To overcome these problems, this paper proposes an alternative numerical method, based on the present value of future cash flows and Monte Carlo simulation, to estimate the volatility of projects. This method is applied to estimate the volatility of 12 deep-water offshore oil projects, assuming the oil price evolves according to one of two stochastic processes: Geometric Brownian Motion or Mean-Reverting Motion. Results indicate that commodity price volatility usually understates project volatility. For the set of offshore projects analyzed in this paper, project volatility is at least 79% higher than that of oil prices and increases dramatically in cases of high capital expenditures and low prices. (author)
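The numerical method described, simulating future cash flows and measuring the dispersion of simulated project values, can be sketched in a few lines. The sketch below is a toy version under stated assumptions (a single GBM-driven oil price, a fixed per-unit cost, annual cash flows over a fixed horizon); all parameter values are hypothetical, not taken from the paper's 12 offshore projects.

```python
import math
import random
import statistics

def project_volatility(p0=60.0, unit_cost=10.0, mu=0.02, sigma=0.35,
                       r=0.10, q=1.0, years=10, n_sims=20000, seed=42):
    """Monte Carlo estimate of project volatility: std. dev. of ln(V1/V0).

    Cash flow in year t is q*(P_t - unit_cost); the oil price follows a
    geometric Brownian motion.  V1 is the year-1 present value along each
    simulated price path; V0 uses the expected price path.
    """
    rng = random.Random(seed)
    disc = [(1 + r) ** -t for t in range(years)]
    # PV multipliers for the price-linked revenue and the fixed cost
    a = sum(q * math.exp(mu * t) * d for t, d in enumerate(disc))
    b = sum(unit_cost * q * d for d in disc)
    v0 = (a * p0 * math.exp(mu) - b) / (1 + r)
    logs = []
    for _ in range(n_sims):
        z = max(-3.0, min(3.0, rng.gauss(0.0, 1.0)))  # clamp extreme draws
        p1 = p0 * math.exp(mu - 0.5 * sigma ** 2 + sigma * z)
        v1 = a * p1 - b  # PV at t=1 of the remaining cash flows
        logs.append(math.log(v1 / v0))
    return statistics.pstdev(logs)
```

Because the fixed cost leverages the cash flows, the estimated project volatility exceeds the input price volatility, which is the qualitative effect the abstract reports.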
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang
2010-10-01
Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general purpose approach to statistical inferences, the bootstrap has found wide applications in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to the inference approach based on the asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
Explicit estimating equations for semiparametric generalized linear latent variable models
Ma, Yanyuan
2010-07-05
We study generalized linear latent variable models without requiring a distributional assumption of the latent variables. Using a geometric approach, we derive consistent semiparametric estimators. We demonstrate that these models have a property which is similar to that of a sufficient complete statistic, which enables us to simplify the estimating procedure and explicitly to formulate the semiparametric estimating equations. We further show that the explicit estimators have the usual root n consistency and asymptotic normality. We explain the computational implementation of our method and illustrate the numerical performance of the estimators in finite sample situations via extensive simulation studies. The advantage of our estimators over the existing likelihood approach is also shown via numerical comparison. We employ the method to analyse a real data example from economics. © 2010 Royal Statistical Society.
Estimating pediatric general anesthesia exposure: Quantifying duration and risk.
Bartels, Devan Darby; McCann, Mary Ellen; Davidson, Andrew J; Polaner, David M; Whitlock, Elizabeth L; Bateman, Brian T
2018-05-02
Understanding the duration of pediatric general anesthesia exposure in contemporary practice is important for identifying groups at risk for long general anesthesia exposures and for designing trials examining associations between general anesthesia exposure and neurodevelopmental outcomes. We performed a retrospective cohort analysis to estimate pediatric general anesthesia exposure duration during 2010-2015 using the National Anesthesia Clinical Outcomes Registry. A total of 1 548 021 pediatric general anesthetics were included. Median general anesthesia duration was 57 minutes (IQR: 28-86), with a 90th percentile of 145 minutes; a small subset of children had exposures exceeding 3 hours. High ASA physical status and care at a university hospital were associated with longer exposure times. While the vast majority (94%) of children undergoing general anesthesia have relatively short exposures, certain groups are at risk for longer exposures. These findings may help guide the design of future trials aimed at understanding the neurodevelopmental impact of prolonged exposure in these high-risk groups. © 2018 John Wiley & Sons Ltd.
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
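The resampling scheme the paper justifies can be sketched minimally. The toy below assumes the estimator is simply the grand mean of clustered observations (the paper treats general GEE regression estimates); whole clusters are resampled so within-cluster dependence is preserved.

```python
import random
import statistics

def cluster_bootstrap_se(clusters, n_boot=2000, seed=1):
    """Standard error of the grand mean via the cluster bootstrap.

    `clusters` is a list of lists (one inner list of observations per
    subject/cluster).  Whole clusters are resampled with replacement so
    that the dependence within each cluster is preserved.
    """
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        sample = [rng.choice(clusters) for _ in clusters]  # resample clusters
        flat = [x for cl in sample for x in cl]
        estimates.append(statistics.fmean(flat))
    return statistics.stdev(estimates)
```

With positively correlated clusters, this standard error is noticeably larger than the naive i.i.d. formula, which is exactly why resampling individual observations would understate uncertainty.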
Cost Estimation for Cross-organizational ERP Projects: Research Perspectives
Daneva, Maia; Bieman, J.; Wieringa, Roelf J.
There are many methods for estimating size, effort, schedule and other cost aspects of IS projects, but only one specifically developed for Enterprise Resource Planning (ERP) [67] and none for simultaneous, interdependent ERP projects in a cross-organizational context. The objective of this paper is
Generalized projective synchronization of a unified chaotic system
International Nuclear Information System (INIS)
Yan Jianping; Li Changpin
2005-01-01
In the present paper, a simple but efficient control technique of the generalized projective synchronization is applied to a unified chaotic system. Numerical simulations show that this method works very well, which can also be applied to other chaotic systems
Estimation and variable selection for generalized additive partial linear models
Wang, Li
2011-08-01
We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration. © Institute of Mathematical Statistics, 2011.
Mizell, Carolyn; Malone, Linda
2007-01-01
It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
Robust estimators based on generalization of trimmed mean
Czech Academy of Sciences Publication Activity Database
Adam, Lukáš; Bejda, P.
(2018) ISSN 0361-0918. Keywords: breakdown point; estimators; geometric median; location; trimmed mean. Subject: BA - General Mathematics. http://library.utia.cas.cz/separaty/2017/MTR/adam-0481224.pdf
Disease prevalence estimations based on contact registrations in general practice
Hoogenveen, Rudolf; Westert, Gert; Dijkgraaf, Marcel; Schellevis, François; de Bakker, Dinny
2002-01-01
This paper describes how to estimate the prevalence of chronic diseases in a population using data from contact registrations in general practice with a limited time length. Instead of using only total numbers of observed patients adjusted for the length of the observation period, we propose the use
Estimating the greenhouse gas benefits of forestry projects: A Costa Rican Case Study
Energy Technology Data Exchange (ETDEWEB)
Busch, Christopher; Sathaye, Jayant; Sanchez Azofeifa, G. Arturo
2000-09-01
If the Clean Development Mechanism proposed under the Kyoto Protocol is to serve as an effective means for combating global climate change, it will depend upon reliable estimates of greenhouse gas benefits. This paper sketches the theoretical basis for estimating the greenhouse gas benefits of forestry projects and suggests lessons learned based on a case study of Costa Rica's Protected Areas Project, a 500,000 hectare effort to reduce deforestation and enhance reforestation. The Protected Areas Project in many senses advances the state of the art for Clean Development Mechanism-type forestry projects, as does the third-party verification work of SGS International Certification Services on the project. Nonetheless, sensitivity analysis shows that carbon benefit estimates for the project vary widely with the imputed deforestation rate in the baseline scenario, i.e. the deforestation rate expected if the project were not implemented. This, along with a newly available national dataset that confirms other research showing a slower rate of deforestation in Costa Rica, suggests that the use of the 1979-1992 forest cover data originally used as the basis for estimating carbon savings should be reconsidered. When the newly available data are substituted, carbon savings amount to 8.9 Mt (million tonnes) of carbon, down from the original estimate of 15.7 Mt. The primary general conclusion is that project developers should give more attention to forecasting the land use and land cover change scenarios underlying estimates of greenhouse gas benefits.
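The sensitivity the abstract describes is easy to see in a linear toy model of avoided deforestation: the carbon benefit scales directly with the assumed baseline deforestation rate. All figures below are illustrative, not the Protected Areas Project numbers.

```python
def carbon_savings(area_ha, baseline_defor_rate, project_defor_rate,
                   years, carbon_t_per_ha):
    """Avoided-deforestation carbon benefit in tonnes of carbon.

    The avoided deforested area is the difference between the baseline and
    project annual deforestation rates, applied to the project area over
    the crediting period, times the forest's carbon density.  A linear toy
    model: halving the assumed baseline rate halves the credited benefit.
    """
    avoided_ha = (baseline_defor_rate - project_defor_rate) * area_ha * years
    return avoided_ha * carbon_t_per_ha
```

For a hypothetical 500,000 ha project with zero deforestation inside the project, moving the baseline rate from 2%/yr to 1%/yr cuts the estimated benefit in half, mirroring the 15.7 Mt to 8.9 Mt revision discussed above in kind (not in the exact numbers).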
Generalized projective synchronization of chaotic systems via adaptive learning control
International Nuclear Information System (INIS)
Yun-Ping, Sun; Jun-Min, Li; Hui-Lin, Wang; Jiang-An, Wang
2010-01-01
In this paper, a learning control approach is applied to the generalized projective synchronization (GPS) of different chaotic systems with unknown periodically time-varying parameters. Using the Lyapunov-Krasovskii functional stability theory, a differential-difference mixed parametric learning law and an adaptive learning control law are constructed to make the states of two different chaotic systems asymptotically synchronized. The scheme is successfully applied to the generalized projective synchronization between the Lorenz system and the Chen system. Moreover, numerical simulation results are used to verify the effectiveness of the proposed scheme. (general)
IMPROVING PROJECT SCHEDULE ESTIMATES USING HISTORICAL DATA AND SIMULATION
Directory of Open Access Journals (Sweden)
P.H. Meyer
2012-01-01
Full Text Available
ENGLISH ABSTRACT: Many projects are not completed on time or within the original budget. This is caused by uncertainty in project variables as well as the occurrence of risk events. A study was done to determine ways of measuring the risk in development projects executed by a mining company in South Africa. The main objective of the study was to determine whether historical project data would provide a more accurate means of estimating the total project duration. Original estimates and actual completion times for tasks of a number of projects were analysed and compared. The results of the study indicated that a more accurate total duration for a project could be obtained by making use of historical project data. The accuracy of estimates could be improved further by building a comprehensive project schedule database within a specific industry.
AFRIKAANSE OPSOMMING (translated): Many projects are not completed within the original schedule or budget. This is often caused by uncertainty about project variables and the occurrence of risks. A study was conducted to develop a method for measuring risk in development projects of a mining company in South Africa. The main aim of the study was to determine whether historical project data could be used to estimate a more accurate project duration. The estimated durations of tasks for a number of projects were analysed and compared with the actual durations. The results of the study showed that a more accurate total project duration could be obtained by using historical project data. The accuracy can be improved further by developing and maintaining a database of project schedules for a specific industry.
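The approach of re-using historical actual-versus-planned outcomes lends itself to a short Monte Carlo sketch. The example below is hypothetical, not the mining company's model: each task's duration is scaled by a ratio drawn from an empirical sample of historical actual/planned ratios, and tasks are assumed sequential.

```python
import random

def simulate_schedule(planned, ratios, n_sims=10000, seed=0):
    """Monte Carlo total-duration estimate from historical ratios.

    `planned` holds planned task durations; `ratios` is a historical sample
    of actual/planned ratios (an empirical distribution).  Each simulation
    draws one ratio per task and sums the scaled durations.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        totals.append(sum(d * rng.choice(ratios) for d in planned))
    totals.sort()
    return {
        "planned": sum(planned),
        "p50": totals[len(totals) // 2],
        "p80": totals[int(0.8 * len(totals))],  # a common commitment level
    }
```

With historically optimistic plans (ratios mostly above 1), the simulated median exceeds the planned total, which is the study's central observation.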
The Process to Estimate Economical Benefits of Six Sigma Projects
Directory of Open Access Journals (Sweden)
Jan Kosina
2013-07-01
Full Text Available This paper seeks to define a process for the continuous evaluation of financial benefits during the lifetime of a Six Sigma project. The financial criteria are critical success factors of a Six Sigma project. The process has been developed as part of Six Sigma project monitoring in order to estimate the proper allocation of resources, taking into account the expected project benefits as well as the evaluation of real achievements. The evaluation of financial benefits based on quality costs alone is not sufficient in practice and has to be complemented with key financial performance indicators of the business to visualize the results. Evaluation based on savings proves too difficult, especially for green belts. The early involvement of the finance department in the project definition, as well as in ongoing evaluation, is key. The defined process has been applied to a real business environment.
Bayes estimation of the general hazard rate model
International Nuclear Information System (INIS)
Sarhan, A.
1999-01-01
In reliability theory and life testing models, lifetime distributions are often specified by choosing a relevant hazard rate function. Here a general hazard rate function h(t) = a + b·t^(c-1), where a, b, c are constants greater than zero, is considered. The parameter c is assumed to be known. The Bayes estimators of (a,b) based on data from type-II/item-censored testing without replacement are obtained. A large simulation study using the Monte Carlo method is done to compare the performance of the Bayes estimators with regression estimators of (a,b). The comparison criterion is the Bayes risk associated with the respective estimator. Also, the influence of the number of failed items on the accuracy of the estimators (Bayes and regression) is investigated. Estimates for the parameters (a,b) of the linearly increasing hazard rate model h(t) = a + b·t, where a and b are greater than zero, are obtained as the special case c = 2.
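For simulation studies of this model, lifetimes can be drawn by inverting the cumulative hazard H(t) = a·t + (b/c)·t^c, since H(T) is standard exponential. A sketch under that parametrization (the bisection solver is a generic choice, not taken from the paper):

```python
import math
import random

def sample_lifetime(a, b, c, rng):
    """Draw one lifetime from the hazard h(t) = a + b*t**(c-1), a, b, c > 0,
    by inverting the cumulative hazard H(t) = a*t + (b/c)*t**c with
    bisection.  For c = 2 this reduces to the linearly increasing hazard
    h(t) = a + b*t.
    """
    target = -math.log(1.0 - rng.random())  # H(T) ~ Exponential(1)
    H = lambda t: a * t + (b / c) * t ** c
    lo, hi = 0.0, 1.0
    while H(hi) < target:        # bracket the root
        hi *= 2.0
    for _ in range(80):          # bisection to high accuracy
        mid = 0.5 * (lo + hi)
        if H(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Samples generated this way, truncated after the first r failures, give exactly the type-II censored data the abstract's simulation study requires.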
Kinetic parameter estimation from SPECT cone-beam projection measurements
International Nuclear Information System (INIS)
Huesman, Ronald H.; Reutter, Bryan W.; Zeng, G. Larry; Gullberg, Grant T.
1998-01-01
Kinetic parameters are commonly estimated from dynamically acquired nuclear medicine data by first reconstructing a dynamic sequence of images and subsequently fitting the parameters to time-activity curves generated from regions of interest overlaid upon the image sequence. Biased estimates can result from images reconstructed using inconsistent projections of a time-varying distribution of radiopharmaceutical acquired by a rotating SPECT system. If the SPECT data are acquired using cone-beam collimators wherein the gantry rotates so that the focal point of the collimators always remains in a plane, additional biases can arise from images reconstructed using insufficient, as well as truncated, projection samples. To overcome these problems we have investigated the estimation of kinetic parameters directly from SPECT cone-beam projection data by modelling the data acquisition process. To accomplish this it was necessary to parametrize the spatial and temporal distribution of the radiopharmaceutical within the SPECT field of view. In a simulated chest image volume, kinetic parameters were estimated for simple one-compartment models for four myocardial regions of interest. Myocardial uptake and washout parameters estimated by conventional analysis of noiseless simulated cone-beam data had biases ranging between 3-26% and 0-28%, respectively. Parameters estimated directly from the noiseless projection data were unbiased as expected, since the model used for fitting was faithful to the simulation. Statistical uncertainties of parameter estimates for 10 000 000 events ranged between 0.2-9% for the uptake parameters and between 0.3-6% for the washout parameters. (author)
Kinetic parameter estimation from attenuated SPECT projection measurements
International Nuclear Information System (INIS)
Reutter, B.W.; Gullberg, G.T.
1998-01-01
Conventional analysis of dynamically acquired nuclear medicine data involves fitting kinetic models to time-activity curves generated from regions of interest defined on a temporal sequence of reconstructed images. However, images reconstructed from the inconsistent projections of a time-varying distribution of radiopharmaceutical acquired by a rotating SPECT system can contain artifacts that lead to biases in the estimated kinetic parameters. To overcome this problem the authors investigated the estimation of kinetic parameters directly from projection data by modeling the data acquisition process. To accomplish this it was necessary to parametrize the spatial and temporal distribution of the radiopharmaceutical within the SPECT field of view. In a simulated transverse slice, kinetic parameters were estimated for simple one compartment models for three myocardial regions of interest, as well as for the liver. Myocardial uptake and washout parameters estimated by conventional analysis of noiseless simulated data had biases ranging between 1--63%. Parameters estimated directly from the noiseless projection data were unbiased as expected, since the model used for fitting was faithful to the simulation. Predicted uncertainties (standard deviations) of the parameters obtained for 500,000 detected events ranged between 2--31% for the myocardial uptake parameters and 2--23% for the myocardial washout parameters
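The central idea of the two preceding abstracts, fitting kinetic parameters directly to projection data rather than to reconstructed images, can be illustrated with a deliberately simplified model: two regions with mono-exponential washout, a fixed 2x2 system matrix standing in for the SPECT projection model, and a grid search over washout rates with amplitudes recovered by linear least squares. Everything here is a toy stand-in for the papers' full acquisition models.

```python
import numpy as np

def fit_washout_from_projections(proj, times, M, k_grid):
    """Estimate per-region washout rates directly from projection data.

    Toy model: region activities A_j(t) = amp_j * exp(-k_j * t), and the
    measured projections are proj[t] = M @ A(t) for a fixed system matrix
    M.  For each candidate pair of washout rates the amplitudes enter
    linearly and are recovered by least squares; the pair with the
    smallest residual wins (separable least squares over a grid).
    """
    best_resid, best_fit = np.inf, None
    for k1 in k_grid:
        for k2 in k_grid:
            # design matrix: column j is M[:, j] modulated by exp(-k_j * t)
            X = np.zeros((len(times) * M.shape[0], 2))
            for j, k in enumerate((k1, k2)):
                X[:, j] = np.outer(np.exp(-k * times), M[:, j]).ravel()
            amp, *_ = np.linalg.lstsq(X, proj.ravel(), rcond=None)
            resid = float(np.sum((X @ amp - proj.ravel()) ** 2))
            if resid < best_resid:
                best_resid, best_fit = resid, (k1, k2, amp)
    return best_fit
```

On noiseless data generated from the same model the true rates are recovered exactly, which mirrors the papers' observation that direct fitting is unbiased when the fitted model is faithful to the acquisition.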
Structures of generalized 3-circular projections for symmetric norms
Indian Academy of Sciences (India)
Generalized bi-circular projection has been studied by many authors (see the subse- quent paragraph and references at the end of this paper). In particular, Botelho [4] and. Botelho and Jamison [5–8] extensively investigated the structures of GBPs for different. Banach spaces whose isometry group has concrete description ...
General projective relativity and the vector-tensor gravitational field
International Nuclear Information System (INIS)
Arcidiacono, G.
1986-01-01
In general projective relativity, the induced 4-dimensional metric is symmetric in three cases, and we obtain the vector-tensor, the scalar-tensor, and the scalar-vector-tensor theories of gravitation. In this work we examine the vector-tensor theory, similar to Veblen's theory but with a different physical interpretation
Controlling general projective synchronization of fractional order Rossler systems
International Nuclear Information System (INIS)
Shao Shiquan
2009-01-01
This paper proposes a method to achieve general projective synchronization of two fractional order Rossler systems. First, we construct the integer order system that approximates the fractional order Rossler system. Then, a control method based on a partially linear decomposition and negative feedback of state errors is applied to the integer order system. Numerical simulations show the effectiveness of the proposed method.
General criteria for the project of nuclear fuel reprocessing plants
International Nuclear Information System (INIS)
1979-01-01
Recommendations are presented establishing the general criteria for the project of nuclear fuel reprocessing plants to be licensed according to the legislation in effect. They apply to all the plant's systems, components and structures which are important to operation safety and to the public's health and safety. (F.E.) [pt
Development and Assessment of Service Learning Projects in General Biology
Felzien, Lisa; Salem, Laura
2008-01-01
Service learning involves providing service to the community while requiring students to meet learning goals in a specific course. A service learning project was implemented in a general biology course at Rockhurst University to involve students in promoting scientific education in conjunction with community partner educators. Students were…
Generalized projective synchronization between Lorenz system and Chen's system
International Nuclear Information System (INIS)
Li Guohui
2007-01-01
On the basis of active backstepping design, this paper presents the generalized projective synchronization between two different chaotic systems: Lorenz system and Chen's system. The proposed method combines backstepping methods and active control without having to calculate the Lyapunov exponents and the eigenvalues of the Jacobian matrix, which makes it simple and convenient. Numerical simulations show that this method works very well
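The flavor of such a scheme can be sketched numerically. The code below uses a plain active-control law rather than the paper's backstepping design (a simplification): the control cancels the response (Chen) dynamics and imposes linear error dynamics on e = y − αx, so the response state converges to α times the Lorenz drive state. The Lorenz and Chen parameter values are the standard ones; the gain k and scaling α are arbitrary choices.

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def chen(s, a=35.0, b=3.0, c=28.0):
    x, y, z = s
    return np.array([a * (y - x), (c - a) * x - x * z + c * y, x * y - b * z])

def synchronize(alpha=2.0, k=10.0, dt=1e-3, steps=20000):
    """Generalized projective synchronization of a Chen response system
    onto a Lorenz drive via active control: the input cancels the response
    dynamics and imposes de/dt = -k e on the error e = y - alpha*x.
    Returns the final error norm after Euler integration.
    """
    x = np.array([1.0, 1.0, 1.0])    # drive (Lorenz) state
    y = np.array([-3.0, 2.0, 5.0])   # response (Chen) state
    for _ in range(steps):
        fx, gy = lorenz(x), chen(y)
        e = y - alpha * x
        u = alpha * fx - gy - k * e  # active control law
        x = x + dt * fx
        y = y + dt * (gy + u)        # error then contracts by (1 - k*dt)
    return np.linalg.norm(y - alpha * x)
```

With this control the error shrinks geometrically regardless of the chaotic trajectories, which is why active-control designs need no Lyapunov exponent or eigenvalue computations.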
Developing a generalized allometric equation for aboveground biomass estimation
Xu, Q.; Balamuta, J. J.; Greenberg, J. A.; Li, B.; Man, A.; Xu, Z.
2015-12-01
A key potential uncertainty in estimating carbon stocks across multiple scales stems from the use of empirically calibrated allometric equations, which estimate aboveground biomass (AGB) from plant characteristics such as diameter at breast height (DBH) and/or height (H). The equations themselves contain significant and, at times, poorly characterized errors. Species-specific equations may be missing. Plant responses to their local biophysical environment may lead to spatially varying allometric relationships. The structural predictor may be difficult or impossible to measure accurately, particularly when derived from remote sensing data. All of these issues may lead to significant and spatially varying uncertainties in the estimation of AGB that are unexplored in the literature. We sought to quantify the errors in predicting AGB at the tree and plot level for vegetation plots in California. To accomplish this, we derived a generalized allometric equation (GAE) which we used to model the AGB on a full set of tree information such as DBH, H, taxonomy, and biophysical environment. The GAE was derived using published allometric equations in the GlobAllomeTree database. The equations were sparse in details about the error since authors provide only the coefficient of determination (R2) and the sample size. A more realistic simulation of tree AGB should also contain the noise that was not captured by the allometric equation. We derived an empirically corrected variance estimate for the amount of noise to represent the errors in the real biomass. Also, we accounted for the hierarchical relationship between different species by treating each taxonomic level as a covariate nested within a higher taxonomic level (e.g. species within genus), and assessed the contribution of each covariate in estimating the AGB of trees. Lastly, we applied the GAE to an existing vegetation plot database - Forest Inventory and Analysis database - to derive per-tree and per-plot AGB estimations, their errors, and how
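The core of a GAE of this kind, pooling DBH and height into one log-log regression and explicitly carrying the residual variance forward, can be sketched as follows. The functional form and covariates here are illustrative; the actual GAE also includes taxonomy and biophysical-environment terms.

```python
import numpy as np

def fit_allometric(dbh, height, agb):
    """Fit a simple generalized allometric equation
        ln(AGB) = b0 + b1*ln(DBH) + b2*ln(H) + error
    by ordinary least squares, returning the coefficients together with
    the residual variance -- the 'noise not captured by the equation' that
    the abstract argues must be carried into AGB simulations.
    """
    X = np.column_stack([np.ones_like(dbh), np.log(dbh), np.log(height)])
    y = np.log(agb)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = float(resid @ resid) / (len(y) - X.shape[1])  # error variance
    return beta, sigma2

def predict_agb(beta, sigma2, dbh, height):
    """Back-transform to the AGB scale; exp(sigma2/2) corrects the
    lognormal retransformation bias."""
    return np.exp(beta[0] + beta[1] * np.log(dbh)
                  + beta[2] * np.log(height) + sigma2 / 2.0)
```

Keeping sigma2 alongside the coefficients is what allows simulated per-tree AGB values to include realistic equation error rather than only the fitted mean.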
Estimating the parameters of a generalized lambda distribution
International Nuclear Information System (INIS)
Fournier, B.; Rupin, N.; Najjar, D.; Iost, A.; Bigerelle, M.; Wilcox, R.
2007-01-01
The method of moments is a popular technique for estimating the parameters of a generalized lambda distribution (GLD), but published results suggest that the percentile method gives superior results. However, the percentile method cannot be implemented in an automatic fashion, and automatic methods, like the starship method, can lead to prohibitive execution time with large sample sizes. A new estimation method is proposed that is automatic (it does not require the use of special tables or graphs), and it reduces the computational time. Based partly on the usual percentile method, this new method also requires choosing which quantile u to use when fitting a GLD to data. The choice for u is studied and it is found that the best choice depends on the final goal of the modeling process. The sampling distribution of the new estimator is studied and compared to the sampling distributions of previously proposed estimators. Naturally, all estimators are biased, and here it is found that the bias becomes negligible with sample sizes n ≥ 2×10^3. The .025 and .975 quantiles of the sampling distribution are investigated, and the difference between these quantiles is found to decrease proportionally to 1/√n. The same results hold for the moment and percentile estimates. Finally, the influence of the sample size is studied when a normal distribution is modeled by a GLD. Both bounded and unbounded GLDs are used, and the bounded GLD turns out to be the most accurate. Indeed it is shown that, up to n = 10^6, bounded GLD modeling cannot be rejected by usual goodness-of-fit tests. (authors)
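For concreteness, here is a sketch of the ingredients that percentile-style GLD methods work with: the Ramberg-Schmeiser quantile function and the four sample statistics matched to it, with the tuning quantile u left as a parameter as in the paper. The full fitting step (solving for the lambdas) is omitted; the quantile lookup is a coarse nearest-grid approximation for illustration.

```python
import random
import statistics

def gld_quantile(u, l1, l2, l3, l4):
    """Ramberg-Schmeiser GLD quantile function:
    Q(u) = l1 + (u**l3 - (1-u)**l4) / l2."""
    return l1 + (u ** l3 - (1.0 - u) ** l4) / l2

def percentile_stats(data, u=0.1):
    """The four sample statistics matched by percentile-style methods;
    u is the tuning quantile whose choice the paper studies."""
    grid = statistics.quantiles(data, n=10 ** 4, method="inclusive")
    def Q(p):                      # nearest-grid quantile lookup (sketch)
        return grid[int(p * 10 ** 4) - 1]
    rho1 = Q(0.5)                                   # location: median
    rho2 = Q(1 - u) - Q(u)                          # scale
    rho3 = (Q(0.5) - Q(u)) / (Q(1 - u) - Q(0.5))    # skewness (1 if symmetric)
    rho4 = (Q(0.75) - Q(0.25)) / rho2               # tail weight
    return rho1, rho2, rho3, rho4
```

Fitting then amounts to choosing the lambdas so that the GLD's theoretical versions of these four quantities match the sample values.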
Working covariance model selection for generalized estimating equations.
Carey, Vincent J; Wang, You-Gan
2011-11-20
We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice. Copyright © 2011 John Wiley & Sons, Ltd.
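The Gaussian pseudolikelihood criterion is easy to state in code: score each candidate working correlation matrix by the multivariate normal log density of the standardized residuals and keep the structure with the larger value. A sketch under simplifying assumptions (known unit variances, equal cluster sizes); the paper's setting is more general.

```python
import numpy as np

def gaussian_pseudolik(resid_clusters, R):
    """Gaussian pseudolikelihood of standardized residuals under a working
    correlation matrix R (one term per cluster); larger is better.  Used
    only to rank candidate working structures."""
    _, logdet = np.linalg.slogdet(R)
    Rinv = np.linalg.inv(R)
    ll = 0.0
    for e in resid_clusters:          # e: one cluster's residual vector
        ll += -0.5 * (logdet + e @ Rinv @ e)
    return ll

def exchangeable(m, rho):
    """m x m exchangeable (compound-symmetric) working correlation."""
    return (1.0 - rho) * np.eye(m) + rho * np.ones((m, m))
```

When the data really are exchangeable, the exchangeable working model scores higher than working independence, which is the kind of sensitivity the simulation study reports.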
Estimating and Forecasting Generalized Fractional Long Memory Stochastic Volatility Models
Directory of Open Access Journals (Sweden)
Shelton Peiris
2017-12-01
Full Text Available This paper considers a flexible class of time series models generated by Gegenbauer polynomials incorporating the long memory in stochastic volatility (SV components in order to develop the General Long Memory SV (GLMSV model. We examine the corresponding statistical properties of this model, discuss the spectral likelihood estimation and investigate the finite sample properties via Monte Carlo experiments. We provide empirical evidence by applying the GLMSV model to three exchange rate return series and conjecture that the results of out-of-sample forecasts adequately confirm the use of GLMSV model in certain financial applications.
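The Gegenbauer component can be made concrete: the long-memory filter (1 − 2uB + B²)^(−d) expands with weights given by the Gegenbauer polynomials C_j^(d)(u), computed by the standard three-term recursion. This is background machinery the model builds on, not code from the paper.

```python
def gegenbauer_weights(d, u, n):
    """First n coefficients psi_j in the expansion
        (1 - 2*u*B + B**2)**(-d) = sum_j psi_j * B**j,
    i.e. the MA filter that injects Gegenbauer long memory into the
    volatility component.  psi_j = C_j^{(d)}(u), via the standard
    three-term recursion with psi_0 = 1 and psi_1 = 2*d*u.
    """
    psi = [1.0, 2.0 * d * u]
    for j in range(2, n):
        psi.append((2.0 * u * (j + d - 1.0) * psi[-1]
                    - (j + 2.0 * d - 2.0) * psi[-2]) / j)
    return psi[:n]
```

For u = 1 the filter reduces to the familiar fractional-difference filter (1 − B)^(−2d); for |u| < 1 the weights oscillate at the Gegenbauer frequency, producing the seasonal/cyclical long memory the GLMSV model exploits.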
Project cost estimation techniques used by most emerging building ...
African Journals Online (AJOL)
consisted of five distinct types of contractors, general builders, civil engineers, electricians ... Mmboswobeni Watson Ladzani, Department of Business Management, University of South .... final cost of a proposed project for a given work scope. Thus, by .... Since the test statistic does not exceed the critical region, the null.
Emergency general surgery: definition and estimated burden of disease.
Shafi, Shahid; Aboutanos, Michel B; Agarwal, Suresh; Brown, Carlos V R; Crandall, Marie; Feliciano, David V; Guillamondegui, Oscar; Haider, Adil; Inaba, Kenji; Osler, Turner M; Ross, Steven; Rozycki, Grace S; Tominaga, Gail T
2013-04-01
Acute care surgery encompasses trauma, surgical critical care, and emergency general surgery (EGS). While the first two components are well defined, the scope of EGS practice remains unclear. This article describes the work of the American Association for the Surgery of Trauma to define EGS. A total of 621 unique International Classification of Diseases-9th Rev. (ICD-9) diagnosis codes were identified using billing data (calendar year 2011) from seven large academic medical centers that practice EGS. A modified Delphi methodology was used by the American Association for the Surgery of Trauma Committee on Severity Assessment and Patient Outcomes to review these codes and achieve consensus on the definition of primary EGS diagnosis codes. National Inpatient Sample data from 2009 were used to develop a national estimate of EGS burden of disease. Several unique ICD-9 codes were identified as primary EGS diagnoses. These encompass a wide spectrum of general surgery practice, including upper and lower gastrointestinal tract, hepatobiliary and pancreatic disease, soft tissue infections, and hernias. National Inpatient Sample estimates revealed over 4 million inpatient encounters nationally in 2009 for EGS diseases. This article provides the first list of ICD-9 diagnoses codes that define the scope of EGS based on current clinical practices. These findings have wide implications for EGS workforce training, access to care, and research.
Discriminating Projections for Estimating Face Age in Wild Images
Energy Technology Data Exchange (ETDEWEB)
Tokola, Ryan A [ORNL; Bolme, David S [ORNL; Ricanek, Karl [ORNL; Barstow, Del R [ORNL; Boehnen, Chris Bensing [ORNL
2014-01-01
We introduce a novel approach to estimating the age of a human from a single uncontrolled image. Current face age estimation algorithms work well in highly controlled images, and some are robust to changes in illumination, but it is usually assumed that images are close to frontal. This bias is clearly seen in the datasets that are commonly used to evaluate age estimation, which either entirely or mostly consist of frontal images. Using pose-specific projections, our algorithm maps image features into a pose-insensitive latent space that is discriminative with respect to age. Age estimation is then performed using a multi-class SVM. We show that our approach outperforms other published results on the Images of Groups dataset, which is the only age-related dataset with a non-trivial number of off-axis face images, and that we are competitive with recent age estimation algorithms on the mostly-frontal FG-NET dataset. We also experimentally demonstrate that our feature projections introduce insensitivity to pose.
A General Model for Cost Estimation in an Exchange
Directory of Open Access Journals (Sweden)
Benzion Barlev
2014-03-01
Current Generally Accepted Accounting Principles (GAAP) state that the cost of an asset acquired for cash is the fair value (FV) of the amount surrendered, and that of an asset acquired in a non-monetary exchange is the FV of the asset surrendered or, if it is more “clearly evident,” the FV of the acquired asset. The measurement method prescribed for a non-monetary exchange ignores valuable information about the “less clearly evident” asset. Thus, we suggest that the FV in any exchange be measured by the weighted average of the exchanged assets’ FV estimations, where the weights are the inverses of the variances of those estimations. This alternative valuation process accounts for the uncertainty involved in estimating the FV of each of the assets in the exchange. The proposed method suits all types of exchanges: monetary and non-monetary. In a monetary transaction, the weighted average equals the cash paid because the variance of its FV is nil.
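The inverse-variance weighting the abstract proposes can be sketched directly; the function name and the appraisal figures below are illustrative assumptions, not from the paper:

```python
def weighted_fair_value(fv_estimates, variances):
    """Combine fair-value estimates using inverse-variance weights.

    Each asset's FV estimate is weighted by 1/variance, so the more
    certain estimate dominates. For a cash payment (variance near 0),
    the weighted average collapses toward the cash amount, matching
    the paper's observation about monetary transactions.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * fv for w, fv in zip(weights, fv_estimates)) / total

# Asset A appraised at 100 (variance 4), asset B at 110 (variance 16):
# the combined estimate lies closer to the more certain appraisal.
fv = weighted_fair_value([100.0, 110.0], [4.0, 16.0])
print(round(fv, 2))  # 102.0
```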
Estimating large complex projects
Directory of Open Access Journals (Sweden)
Cliff Schexnayder
2007-08-01
Managing large capital construction projects requires the coordination of a multitude of human, organizational, technical, and natural resources. Quite often, the engineering and construction complexities of such projects are overshadowed by economic, societal, and political challenges. The ramifications and effects which result from differences between early project cost estimates and the bid price or the final project cost are significant. Over the time span between the initiation of a project and the completion of construction, many factors influence a project's final costs. This time span is normally several years in duration, but for highly complex and technologically challenging projects, project duration can easily exceed a decade. Over that period, changes to the project scope often occur. The subject here is a presentation of strategies that support realistic cost estimating. Through literature review and interviews with transportation agencies in the U.S. and internationally, the authors developed a database of the factors that are the root causes of cost estimation problems.
Projected metastable Markov processes and their estimation with observable operator models
International Nuclear Information System (INIS)
Wu, Hao; Prinz, Jan-Hendrik; Noé, Frank
2015-01-01
The determination of kinetics of high-dimensional dynamical systems, such as macromolecules, polymers, or spin systems, is a difficult and generally unsolved problem — both in simulation, where the optimal reaction coordinate(s) are generally unknown and are difficult to compute, and in experimental measurements, where only specific coordinates are observable. Markov models, or Markov state models, are widely used but suffer from the fact that the dynamics on a coarsely discretized state space are no longer Markovian, even if the dynamics in the full phase space are. The recently proposed projected Markov models (PMMs) are a formulation that provides a description of the kinetics on a low-dimensional projection without making the Markovianity assumption. However, as yet no general way of estimating PMMs from data has been available. Here, we show that the observed dynamics of a PMM can be exactly described by an observable operator model (OOM) and derive a PMM estimator based on the OOM learning
Methods for cost estimation in software project management
Briciu, C. V.; Filip, I.; Indries, I. I.
2016-02-01
The speed at which processes in the software development field change makes forecasting the overall cost of a software project very difficult. Many researchers have considered this task unachievable, but a group of scientists holds that it can be solved using known mathematical methods (e.g. multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building a cost estimation model for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. The first part of the paper summarizes the major achievements in the search for a model that estimates overall project costs, together with a description of existing software development process models. The last part proposes a basic mathematical model based on genetic programming, including a description of the chosen fitness function and chromosome representation. The model described is linked to the current reality of software development, taking the software product life cycle as a basis along with current challenges and innovations in the software development area. Based on the authors' experience and an analysis of existing models and product life cycles, it is concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
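The genetic-algorithm idea can be illustrated with a tiny parameter search for a power-law effort model; the chromosome layout, the synthetic project data, and all constants below are invented for illustration, and the paper's actual work uses genetic programming over whole model expressions on PROMISE data:

```python
import random

def fit_effort_model(data, generations=60, pop_size=40, seed=3):
    """Fit Effort = a * KLOC**b to (kloc, effort) pairs with a tiny GA.

    Chromosome: (a, b). Fitness: negative sum of squared errors.
    Truncation selection keeps the best quarter as elite parents;
    children are averaged crossovers with Gaussian mutation.
    """
    rng = random.Random(seed)

    def sse(ch):
        a, b = ch
        return sum((e - a * k ** b) ** 2 for k, e in data)

    pop = [(rng.uniform(0.5, 5.0), rng.uniform(0.8, 1.5))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sse)
        parents = pop[: pop_size // 4]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            children.append(((p1[0] + p2[0]) / 2 + rng.gauss(0, 0.05),
                             (p1[1] + p2[1]) / 2 + rng.gauss(0, 0.02)))
        pop = parents + children
    return min(pop, key=sse)

# Synthetic projects generated from a = 2.8, b = 1.1 (illustrative):
projects = [(k, 2.8 * k ** 1.1) for k in (5, 10, 20, 40, 80)]
a, b = fit_effort_model(projects)
```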
Detection-Guided Fast Affine Projection Channel Estimator for Speech Applications
Directory of Open Access Journals (Sweden)
Yan Wu Jennifer
2007-04-01
In various adaptive estimation applications, such as acoustic echo cancellation within teleconferencing systems, the input signal is highly correlated speech. This, in general, leads to extremely slow convergence of the NLMS adaptive FIR estimator. As a result, for such applications, the affine projection algorithm (APA) or its low-complexity version, the fast affine projection (FAP) algorithm, is commonly employed instead of the NLMS algorithm. In such applications, the signal propagation channel may have a relatively low-dimensional impulse response structure, that is, the number m of active or significant taps within the (discrete-time) modelled channel impulse response is much less than the overall tap length n of the channel impulse response. For such cases, we investigate the inclusion of an active-parameter detection-guided concept within the fast affine projection FIR channel estimator. Simulation results indicate that the proposed detection-guided fast affine projection channel estimator has improved convergence speed and better steady-state performance than the standard fast affine projection channel estimator, especially in the important case of highly correlated speech input signals.
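For context, the NLMS baseline whose slow convergence under correlated input motivates APA/FAP can be sketched as follows. This is not the paper's detection-guided FAP algorithm; the sparse channel taps and signal length are made-up values, and white noise (not speech) is used so the sketch converges quickly:

```python
import random

def nlms_identify(x, d, n_taps, mu=0.5, eps=1e-8):
    """Identify an FIR channel with the NLMS algorithm.

    x : input samples, d : observed channel output (desired signal).
    Returns the estimated tap weights after one pass over the data.
    """
    w = [0.0] * n_taps
    buf = [0.0] * n_taps
    for k in range(len(x)):
        buf = [x[k]] + buf[:-1]           # most recent sample first
        y = sum(wi * bi for wi, bi in zip(w, buf))
        e = d[k] - y                      # a-priori estimation error
        norm = sum(b * b for b in buf) + eps
        w = [wi + mu * e * bi / norm for wi, bi in zip(w, buf)]
    return w

# Identify a short sparse channel h from white-noise input.
random.seed(0)
h = [0.8, 0.0, -0.4]
x = [random.gauss(0, 1) for _ in range(2000)]
d = [sum(h[j] * x[k - j] for j in range(len(h)) if k - j >= 0)
     for k in range(len(x))]
w = nlms_identify(x, d, n_taps=3)
```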
Software project effort estimation foundations and best practice guidelines for success
Trendowicz, Adam
2014-01-01
Software effort estimation is one of the oldest and most important problems in software project management, and thus today there are a large number of models, each with its own unique strengths and weaknesses in general, and even more importantly, in relation to the environment and context in which it is to be applied.Trendowicz and Jeffery present a comprehensive look at the principles of software effort estimation and support software practitioners in systematically selecting and applying the most suitable effort estimation approach. Their book not only presents what approach to take and how
Probabilistic cost estimating of nuclear power plant construction projects
International Nuclear Information System (INIS)
Finch, W.C.; Perry, L.W.; Postula, F.D.
1978-01-01
This paper shows how to identify and isolate cost accounts by developing probability trees down to component levels as justified by value and cost uncertainty. Examples are given of the procedure for assessing uncertainty in all areas contributing to cost: design, factory equipment pricing, and field labor and materials. The method of combining these individual uncertainties is presented so that the cost risk can be developed for components, systems and the total plant construction project. Formats which enable management to use the probabilistic cost estimate information for business planning and risk control are illustrated. Topics considered include code estimate performance, cost allocation, uncertainty encoding, probabilistic cost distributions, and interpretation. Effective cost control of nuclear power plant construction projects requires insight into areas of greatest cost uncertainty and a knowledge of the factors which can cause costs to vary from the single-value estimates. It is concluded that probabilistic cost estimating can provide the necessary assessment of uncertainties both as to the cause and the consequences.
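A roll-up of component-level cost uncertainty of the kind described can be sketched with a simple Monte Carlo simulation; the triangular distributions and the example cost accounts are assumptions for illustration, not the paper's probability-tree encoding:

```python
import random

def simulate_total_cost(components, n=20000, seed=1):
    """Roll component-level cost uncertainty up to a project total.

    components : list of (low, mode, high) triangular estimates, one
    per cost account. Returns the mean and an approximate 90th
    percentile of the simulated total-cost distribution.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in components))
    totals.sort()
    return sum(totals) / n, totals[int(0.9 * n)]

# Three hypothetical cost accounts (design, equipment, field labor):
accounts = [(8, 10, 15), (40, 50, 70), (20, 25, 40)]
mean, p90 = simulate_total_cost(accounts)
```

The spread between the mean and the 90th percentile is exactly the "cost risk" information a single-value estimate hides.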
Generalized estimators of avian abundance from count survey data
Directory of Open Access Journals (Sweden)
Royle, J. A.
2004-01-01
I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture-recapture, multiple observer, removal sampling and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data, and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be considered, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, and unsampled locations. Two brief examples are given, the first involving simple point counts, and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
Estimate of spin polarization for PEP using generalized transformation matrices
International Nuclear Information System (INIS)
Chao, A.W.
1978-04-01
The spin polarization for PEP has been estimated before by using simplified models. The main difficulty in the previous estimates is that the strength of depolarization effects caused by various electromagnetic field errors could not be specified accurately. To overcome this difficulty, a matrix formalism for depolarization calculation was developed recently. One basic ingredient of this theory is to represent an electron by an 8-dimensional state vector, X = (x,x',y,y',z,δ,α,β), where the first six coordinates are the usual transverse and longitudinal canonical coordinates, while α and β are the two components of the electron's spin vector perpendicular to the equilibrium direction of polarization n̂. The degree of depolarization is specified by (α² + β²)/2. The state vector X will be transformed by an 8 x 8 matrix as the electron passes through a beam-line element such as a bending magnet or an rf cavity. From any position s, one multiplies successively the 8 x 8 matrices around one revolution of the storage ring to obtain the total transformation T(s). Any impulse perturbation ΔX to the electron's state vector occurring at s will be transformed repeatedly by T(s) as the electron circulates around the storage ring. Another basic ingredient is to decompose ΔX into 8 eigenstate components with eigenvectors determined from T(s). Six of these eigenstate components corresponding to the space states will be damped out by the usual radiation damping. The projections of ΔX onto the remaining two spin eigenstates are directly related to the loss of polarization due to the impulse perturbation ΔX. Depolarization effects can thus be calculated directly once all perturbations are specified. 7 refs., 4 figs
Explicit estimating equations for semiparametric generalized linear latent variable models
Ma, Yanyuan; Genton, Marc G.
2010-01-01
which is similar to that of a sufficient complete statistic, which enables us to simplify the estimating procedure and explicitly to formulate the semiparametric estimating equations. We further show that the explicit estimators have the usual root n
Project management under uncertainty beyond beta: The generalized bicubic distribution
Directory of Open Access Journals (Sweden)
José García Pérez
2016-01-01
The beta distribution has traditionally been employed in the PERT methodology and is generally used for modeling bounded continuous random variables based on experts' judgment. The impossibility of estimating four parameters from the three values provided by the expert when the beta distribution is assumed to be the underlying distribution has been widely debated. This paper presents the generalized bicubic distribution as a good alternative to the beta distribution since, when the variance depends on the mode, the generalized bicubic distribution approximates the kurtosis of the Gaussian distribution better than the beta distribution. In addition, this distribution presents good properties in the PERT methodology in relation to moderation and conservatism criteria. Two empirical applications are presented to demonstrate the adequacy of this new distribution.
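For reference, the classical three-point PERT estimate that the beta distribution underlies looks like this; these are the textbook beta-PERT formulas, not the paper's generalized bicubic alternative, and the duration values are invented:

```python
def pert_estimate(a, m, b):
    """Classical PERT mean and standard deviation from a three-point
    (optimistic a, most-likely m, pessimistic b) expert judgment.

    The paper's point is that alternatives such as the generalized
    bicubic distribution can fit the same three expert values with
    better tail behavior than the beta distribution assumed here.
    """
    mean = (a + 4 * m + b) / 6.0
    std = (b - a) / 6.0
    return mean, std

# Task judged at 4 days optimistic, 7 most likely, 16 pessimistic:
mean, std = pert_estimate(4, 7, 16)
print(mean, std)  # 8.0 2.0
```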
Some Convergence Strategies for the Alternating Generalized Projection Method
Directory of Open Access Journals (Sweden)
Maricarmen Andrade
2013-11-01
In this paper we extend the application of the alternating projection algorithm to solve the problem of finding a point in the intersection of $n$ sets ($n\geq2$), not all of which are convex. Here we term such a method the alternating generalized projection (AGP) method. In particular, we are interested in addressing the problem of avoiding the so-called trap points, which may prevent an algorithm from obtaining a feasible solution when two or more of the sets are not convex. Some strategies that allow us to reach the feasible solution are established and conjectured. Finally, we present simple numerical results that illustrate the efficiency of the iterative methods considered.
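A minimal sketch of alternating projections between two sets, one of them non-convex (the unit circle), illustrates the basic iteration the AGP method generalizes; the projection operators here are textbook examples chosen for this sketch, not the paper's:

```python
import math

def proj_circle(p):
    """Project onto the unit circle x^2 + y^2 = 1 (a non-convex set)."""
    r = math.hypot(p[0], p[1])
    if r == 0:
        return (1.0, 0.0)          # any boundary point is a projection
    return (p[0] / r, p[1] / r)

def proj_line(p):
    """Project onto the line y = x (a convex set)."""
    t = (p[0] + p[1]) / 2.0
    return (t, t)

def alternating_projection(p, iters=50):
    """Alternate the two projections until the iterate settles."""
    for _ in range(iters):
        p = proj_line(proj_circle(p))
    return p

# From (3, 1) the iterates converge to the intersection point
# (1/sqrt(2), 1/sqrt(2)) of the circle and the line.
x, y = alternating_projection((3.0, 1.0))
```

A trap point in the paper's sense would be an iterate that the two projections keep exchanging without reaching the intersection; the circle/line pair here is benign.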
Project Parameter Estimation on the Basis of an ERP Database
Directory of Open Access Journals (Sweden)
Relich Marcin
2013-12-01
Nowadays, more and more enterprises are using Enterprise Resource Planning (ERP) systems that can also be used to plan and control the development of new products. In order to obtain a project schedule, certain parameters (e.g. duration) have to be specified in an ERP system. These parameters can be defined by the employees according to their knowledge, or can be estimated on the basis of data from previously completed projects. This paper investigates using an ERP database to identify those variables that have a significant influence on the duration of a project phase. In the paper, a model of knowledge discovery from an ERP database is proposed. The presented method contains four stages of the knowledge discovery process: data selection, data transformation, data mining and interpretation of patterns in the context of new product development. Among data mining techniques, a fuzzy neural system is chosen to seek relationships on the basis of data from completed projects stored in an ERP system.
Feasibility of estimating generalized extreme-value distribution of floods
International Nuclear Information System (INIS)
Ferreira de Queiroz, Manoel Moises
2004-01-01
Flood frequency analysis by generalized extreme-value probability distribution (GEV) has found increased application in recent years, given its flexibility in dealing with the three asymptotic forms of extreme distribution derived from different initial probability distributions. Estimation of higher quantiles of floods is usually accomplished by extrapolating one of the three inverse forms of GEV distribution fitted to the experimental data for return periods much higher than those actually observed. This paper studies the feasibility of fitting GEV distribution by moments of linear combinations of higher order statistics (LH moments) using synthetic annual flood series with varying characteristics and lengths. As the hydrologic events in nature such as daily discharge occur with finite values, their annual maximums are expected to follow the asymptotic form of the limited GEV distribution. Synthetic annual flood series were thus obtained from the stochastic sequences of 365 daily discharges generated by Monte Carlo simulation on the basis of the limited probability distribution underlying the limited GEV distribution. The results show that parameter estimation by LH moments of this distribution, fitted to annual flood samples of less than 100 years in length derived from the initial limited distribution, may indicate any form of extreme-value distribution, not just the limited form as expected, and with large uncertainty in the fitted parameters. A frequency analysis, on the basis of GEV distribution and LH moments, of annual flood series of lengths varying between 13 and 73 years observed at 88 gauge stations on the Paraná River in Brazil, indicated all three forms of GEV distribution.
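Once GEV parameters are fitted (by LH moments or otherwise), higher quantiles are read off the inverse distribution. A sketch of that extrapolation step, with hypothetical parameter values; the shape sign convention follows the common mu/sigma/xi parameterization:

```python
import math

def gev_quantile(T, mu, sigma, xi):
    """Return level (T-year flood) from a fitted GEV(mu, sigma, xi).

    xi < 0 gives the bounded (limited/Weibull) form discussed in the
    paper, xi = 0 the Gumbel form, and xi > 0 the Frechet form.
    """
    p = 1.0 - 1.0 / T                  # non-exceedance probability
    y = -math.log(p)                   # Gumbel reduced variate
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(y)
    return mu + sigma * (y ** (-xi) - 1.0) / xi

# Hypothetical fitted parameters: the 100-year flood estimate.
q100 = gev_quantile(100, mu=500.0, sigma=120.0, xi=-0.1)
```

The paper's warning applies precisely here: with short samples, the fitted xi (and hence the extrapolated q100) carries large uncertainty.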
Religious affiliation at time of death - Global estimates and projections.
Skirbekk, Vegard; Todd, Megan; Stonawski, Marcin
2018-03-01
Religious affiliation influences societal practices regarding death and dying, including palliative care, religiously acceptable health service procedures, funeral rites and beliefs about an afterlife. We aimed to estimate and project religious affiliation at the time of death globally, as this information has been lacking. We compiled data on demographic information and religious affiliation from more than 2500 surveys, registers and censuses covering 198 nations/territories. We present estimates of religious affiliation at the time of death as of 2010, with projections up to and including 2060, taking into account trends in mortality, religious conversion, intergenerational transmission of religion, differential fertility, and gross migration flows, by age and sex. We find that Christianity continues to be the most common religion at death, although its share will fall from 37% to 31% of global deaths between 2010 and 2060. The share of individuals identifying as Muslim at the time of death increases from 21% to 24%. The share of religiously unaffiliated will peak at 17% in 2035, followed by a slight decline thereafter. In specific regions, such as Europe, the unaffiliated share will continue to rise, from 14% to 21% over the period. Religious affiliation at the time of death is changing globally, with distinct regional patterns. This could affect spatial variation in healthcare and social customs relating to death and dying.
International Nuclear Information System (INIS)
Mukherjee, Payel; Banerjee, Santo
2010-01-01
In this work, in the first phase, we study the phenomenon of projective synchronization in the Lorenz-Stenflo system. Synchronization is then investigated for the same system with unknown parameters. We show analytically that synchronization is possible for some proper choice of the nonlinear controller by using a suitable Lyapunov function. With the help of this result, it is also possible to estimate the values of the unknown system parameters. In the second phase as an extension of our analysis, we investigate the new hybrid projective synchronization for the same system. All our analyses are well supported with numerical evidence.
Improved measurements of RNA structure conservation with generalized centroid estimators
Directory of Open Access Journals (Sweden)
Yohei Okada
2011-08-01
Identification of non-protein-coding RNAs (ncRNAs) in genomes is a crucial task for not only molecular cell biology but also bioinformatics. Secondary structures of ncRNAs are employed as a key feature of ncRNA analysis since biological functions of ncRNAs are deeply related to their secondary structures. Although the minimum free energy (MFE) structure of an RNA sequence is regarded as the most stable structure, MFE alone could not be an appropriate measure for identifying ncRNAs since the free energy is heavily biased by the nucleotide composition. Therefore, instead of MFE itself, several alternative measures for identifying ncRNAs have been proposed, such as the structure conservation index (SCI) and the base pair distance (BPD), both of which employ MFE structures. However, these measurements are unfortunately not suitable for identifying ncRNAs in some cases, including the genome-wide search, and incur a high false discovery rate. In this study, we propose improved measurements based on SCI and BPD, applying generalized centroid estimators to incorporate robustness against low-quality multiple alignments. Our experiments show that our proposed methods achieve higher accuracy than the original SCI and BPD for not only human-curated structural alignments but also low-quality alignments produced by CLUSTAL W. Furthermore, the centroid-based SCI on CLUSTAL W alignments is more accurate than or comparable with that of the original SCI on structural alignments generated with RAF, a high-quality structural aligner, for which two-fold more computational time is required on average. We conclude that our methods are more suitable than the original SCI and BPD for genome-wide alignments, which are of low quality from the point of view of secondary structures.
Directory of Open Access Journals (Sweden)
Yoonseok Shin
2015-01-01
Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong boundaries in terms of its generalization performance. However, the boosting approach has yet to be used in regression problems within the construction domain, including cost estimation, although it has been actively utilized in other domains. Therefore, a boosting regression tree (BRT) is applied to cost estimation at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, its performance was compared with that of a neural network (NN) model, which has been proven to have high performance in cost estimation domains. The BRT model has shown results similar to those of the NN model using 234 actual cost datasets of a building construction project. In addition, the BRT model can provide additional information such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability to preliminary cost estimation in a building construction project.
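A from-scratch least-squares boosting of regression stumps illustrates the idea behind BRT; the toy floor-area/cost data are invented, and a real BRT would use full trees, shrinkage tuning and proper validation:

```python
def fit_stump(x, r):
    """Best single-split regression stump on 1-D inputs x for target r."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for k in range(1, len(x)):
        thr = (x[order[k - 1]] + x[order[k]]) / 2.0
        left = [r[i] for i in range(len(x)) if x[i] <= thr]
        right = [r[i] for i in range(len(x)) if x[i] > thr]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((v - lm) ** 2 for v in left)
               + sum((v - rm) ** 2 for v in right))
        if best is None or sse < best[0]:
            best = (sse, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda v: lm if v <= thr else rm

def boost(x, y, rounds=50, lr=0.3):
    """Least-squares gradient boosting: fit stumps to residuals."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(x, resid)
        stumps.append(s)
        pred = [pi + lr * s(xi) for pi, xi in zip(pred, x)]
    return lambda v: sum(lr * s(v) for s in stumps)

# Toy "floor area -> cost" data; the ensemble fits the step pattern.
area = [50, 60, 70, 80, 90, 100, 110, 120]
cost = [100, 110, 115, 130, 180, 190, 200, 215]
model = boost(area, cost)
```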
Abelian projection on the torus for general gauge groups
International Nuclear Information System (INIS)
Ford, C.; Tok, T.; Wipf, A.
1999-01-01
We consider Yang-Mills theories with general gauge groups G and twists of the four-torus. We find consistent boundary conditions for gauge fields in all instanton sectors. An extended abelian projection with respect to the Polyakov loop operator is presented, where A₀ is independent of time and in the Cartan subalgebra. Fundamental domains for the gauge-fixed A₀ are constructed for arbitrary gauge groups. In the sectors with non-vanishing instanton number such gauge fixings are necessarily singular. The singularities can be restricted to Dirac strings joining magnetically charged defects. The magnetic charges of these monopoles take their values in the co-root lattice of the gauge group. We relate the magnetic charges of the defects and the windings of suitable Higgs fields about these defects to the instanton number
Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information
Butts, Glenn
2007-01-01
Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
Developing milk industry estimates for dose reconstruction projects
International Nuclear Information System (INIS)
Beck, D.M.; Darwin, R.F.
1991-01-01
One of the most important contributors to radiation doses from Hanford during the 1944-1947 period was radioactive iodine. Consumption of milk from cows that ate vegetation contaminated with iodine is likely the dominant pathway of human exposure. To estimate the doses people could have received from this pathway, it is necessary to reconstruct the amount of milk consumed by people living near Hanford, the source of the milk, and the type of feed that the milk cows ate. This task is challenging because the dairy industry has undergone radical changes since the end of World War II, and records that document the impact of these changes on the study area are scarce. Similar problems are faced by researchers on most dose reconstruction efforts. The purpose of this work is to document and evaluate the methods used on the Hanford Environmental Dose Reconstruction (HEDR) Project to reconstruct the milk industry and to present preliminary results
Prediction of RNA secondary structure using generalized centroid estimators.
Hamada, Michiaki; Kiryu, Hisanori; Sato, Kengo; Mituyama, Toutai; Asai, Kiyoshi
2009-02-15
Recent studies have shown that the methods for predicting secondary structures of RNAs on the basis of posterior decoding of the base-pairing probabilities have an advantage with respect to prediction accuracy over the conventionally utilized minimum free energy methods. However, there is room for improvement in the objective functions presented in previous studies, which are maximized in the posterior decoding with respect to the accuracy measures for secondary structures. We propose novel estimators which improve the accuracy of secondary structure prediction of RNAs. The proposed estimators maximize an objective function which is the weighted sum of the expected number of the true positives and that of the true negatives of the base pairs. The proposed estimators are also improved versions of the ones used in previous works, namely CONTRAfold for secondary structure prediction from a single RNA sequence and McCaskill-MEA for common secondary structure prediction from multiple alignments of RNA sequences. We clarify the relations between the proposed estimators and the estimators presented in previous works, and theoretically show that the previous estimators include additional unnecessary terms in the evaluation measures with respect to the accuracy. Furthermore, computational experiments confirm the theoretical analysis by indicating improvement in the empirical accuracy. The proposed estimators represent extensions of the centroid estimators proposed in Ding et al. and Carvalho and Lawrence, and are applicable to a wide variety of problems in bioinformatics. Supporting information and the CentroidFold software are available online at: http://www.ncrna.org/software/centroidfold/.
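The centroid-style threshold behind such estimators (include a base pair only when its posterior probability exceeds 1/(γ+1)) can be illustrated with a greedy decoder over a toy probability matrix; this is a deliberate simplification, since the actual estimators decode with dynamic programming, and the matrix below is invented:

```python
def centroid_pairs(p, gamma=2.0):
    """Greedy centroid-style decoding of base pairs.

    p[i][j] is a base-pairing probability matrix (upper triangle).
    A pair (i, j) is only eligible when p[i][j] > 1/(gamma+1), the
    gamma-centroid threshold; candidates are taken in decreasing
    probability, skipping any pair that reuses a base or crosses an
    already accepted pair (to keep the structure nested).
    """
    n = len(p)
    thr = 1.0 / (gamma + 1.0)
    cands = sorted(((p[i][j], i, j)
                    for i in range(n) for j in range(i + 1, n)
                    if p[i][j] > thr), reverse=True)
    used, pairs = set(), []
    for prob, i, j in cands:
        if i in used or j in used:
            continue
        if any(a < i < b < j or i < a < j < b for a, b in pairs):
            continue  # would cross an accepted pair
        pairs.append((i, j))
        used.update((i, j))
    return sorted(pairs)

# Toy 5-base example: a stem of two nested pairs survives decoding.
p = [[0.0] * 5 for _ in range(5)]
p[0][4], p[1][3], p[0][2] = 0.9, 0.8, 0.7
structure = centroid_pairs(p, gamma=2.0)
```

Raising γ lowers the threshold and admits more pairs (favoring sensitivity); lowering γ prunes pairs (favoring PPV), which is the accuracy trade-off the estimators formalize.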
Wegner estimate for sparse and other generalized alloy type potentials
Indian Academy of Sciences (India)
The Wegner estimate is also related to the integrated density of states (IDS).
General aspects of the project organization - the suppliers' view
International Nuclear Information System (INIS)
Bogen, J.
1975-01-01
One of the major aims in the economic management of a nuclear power project is the completion of the project on schedule, at predetermined cost, and with technical performance in accordance with pre-established goals. The achievement of these goals depends, to a major extent, on the adequacy and accuracy of the project plan. The greatest opportunities to save money, minimize delivery time, and secure good performance occur during project planning. During project implementation, project management should direct its activities toward realizing the pre-elaborated project plan. (orig./FW) [de]
Parametric Estimation by Generalized Moment Methods for Extremes
Czech Academy of Sciences Publication Activity Database
Fabián, Zdeněk
2008-01-01
Vol. 11, No. 4 (2008), pp. 26-35. ISSN 1450-7196. R&D Projects: GA MŠk ME 949. Institutional research plan: CEZ:AV0Z10300504. Keywords: heavy tails; transformation-based score. Subject RIV: BB - Applied Statistics, Operational Research
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang; Huang, Jianhua Z.
2010-01-01
, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS
Directory of Open Access Journals (Sweden)
M. Pauline
2013-04-01
The authors propose a model that first captures the fundamentals of software metrics in phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, obtained by grouping the adjustment factors to simplify the adjustment process and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used to quantify the quality of requirements and is added as one of the adjustment factors; thus a fuzzy-based approach to the Enhanced General System Characteristics for estimating the effort of software projects using productivity is obtained. Phase 3 takes the calculated function point and gives it as input to the static single-variable model (i.e. to Intermediate COCOMO and COCOMO II) for cost estimation. The authors tailor the cost factors in Intermediate COCOMO, and both cost and scale factors in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators (project duration, schedule predictability, requirements completion ratio and post-release defect density) are also measured for the software projects in this work. A comparative study of effort, performance measurement and cost estimation of software projects is made between the existing model and the authors' proposed work. Thus this work analyzes the interactional process through which the estimation tasks were collectively accomplished.
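For orientation, the classic basic COCOMO relations that the intermediate model refines can be sketched as follows; the coefficients are Boehm's published basic-model values, and the paper's fuzzy function-point adjustments and tailored cost drivers are not reproduced here:

```python
def cocomo_basic(kloc, mode="organic"):
    """Basic COCOMO effort (person-months) and duration (months).

    Effort = a * KLOC**b, Duration = c * Effort**d, with the classic
    coefficient table for the three project modes.
    """
    a, b, c, d = {
        "organic":       (2.4, 1.05, 2.5, 0.38),
        "semi-detached": (3.0, 1.12, 2.5, 0.35),
        "embedded":      (3.6, 1.20, 2.5, 0.32),
    }[mode]
    effort = a * kloc ** b               # person-months
    duration = c * effort ** d           # calendar months
    return effort, duration

# A 32 KLOC organic-mode project:
effort, months = cocomo_basic(32, "organic")
```

Intermediate COCOMO multiplies this nominal effort by cost drivers, which is the layer the paper tailors to the local development environment.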
Executive summary and general conclusions of the rock sealing project
International Nuclear Information System (INIS)
Pusch, R.
1992-06-01
The Stripa Rock Sealing Project logically followed the first two Stripa research phases dealing with canister embedment and plugging of excavations in repositories. The major activities in the third phase were: * Literature review and interviews establishing the state of the art of rock fracture sealing. * Pilot field and lab testing applying a new effective 'dynamic' grouting technique. * Development of a general grout flow theory. * Investigation of physical properties and longevity of major candidate grouts. * Performance of 4 large-scale tests. The literature study showed that longevity aspects limited the number of potentially useful grout materials to smectitic clay and cement. The pilot testing showed that fine-grained grouts can be effectively injected into relatively fine fractures. The theoretical work led to a general grout flow theory valid both for grouting at a constant, static pressure with non-Newtonian material properties, and for 'dynamic' injection with superimposed oscillations, yielding Newtonian material behavior. The investigation of physical properties of candidate grouts with respect to hydraulic conductivity, shear strength, sensitivity to mechanical strain, as well as to chemical stability, showed that effective sealing is offered, and that any rock can have its bulk conductivity reduced to about 10⁻¹⁰ m/s. The field tests comprised investigation of excavation-induced disturbance and attempts to seal disturbed rock, and, in separate tests, grouting of deposition holes and a natural fine-fracture zone. Considerable disturbance of nearfield rock by blasting and stress changes, yielding increases in axial hydraulic conductivity by 3 and 1 orders of magnitude, respectively, was documented, but various factors, primarily debris in the fractures, made grouting of blasted rock ineffective. Narrow fractures in deposition holes and in a natural fracture zone were sealed rather effectively. (au)
A general predictive model for estimating monthly ecosystem evapotranspiration
Ge Sun; Karrin Alstad; Jiquan Chen; Shiping Chen; Chelcy R. Ford; et al.
2011-01-01
Accurately quantifying evapotranspiration (ET) is essential for modelling regional-scale ecosystem water balances. This study assembled an ET data set estimated from eddy flux and sapflow measurements for 13 ecosystems across a large climatic and management gradient from the United States, China, and Australia. Our objectives were to determine the relationships among...
Stability estimates for hp spectral element methods for general ...
Indian Academy of Sciences (India)
We establish basic stability estimates for a non-conforming h-p spectral element method which allows for simultaneous mesh refinement and variable polynomial degree. The spectral element functions are non-conforming if the boundary conditions are Dirichlet. For problems with mixed boundary conditions they are ...
Solid Waste Operations Complex W-113: Project cost estimate. Preliminary design report. Volume IV
International Nuclear Information System (INIS)
1995-01-01
This document contains Volume IV of the Preliminary Design Report for the Solid Waste Operations Complex W-113, which comprises the project cost estimate and construction schedule. The estimate was developed from Title 1 material take-offs, budgetary equipment quotes and Raytheon historical in-house data. The W-113 project cost estimate and project construction schedule were integrated to provide a resource-loaded project network.
Directory of Open Access Journals (Sweden)
Shangli Zhang
2009-01-01
Full Text Available By using the methods of linear algebra and matrix inequality theory, we obtain the characterization of admissible estimators in the general multivariate linear model with respect to an inequality-restricted parameter set. In the classes of homogeneous and general linear estimators, the necessary and sufficient conditions under which the estimators of the regression coefficient function are admissible are established.
Methodology for cost estimate in projects for nuclear power plants decommissioning
International Nuclear Information System (INIS)
Salij, L.M.
2008-01-01
The conceptual approaches to cost estimation for nuclear power plant unit decommissioning projects were determined. The international experience and the national legislative and regulatory basis were analyzed. A possible classification of decommissioning project costs was given. The role of the project costs of nuclear power plant unit decommissioning as the most important criterion for the main project decisions was shown. The technical and economic estimation of deductions to the common-branch fund for financing decommissioning projects was substantiated.
Penalized Estimation in Large-Scale Generalized Linear Array Models
DEFF Research Database (Denmark)
Lund, Adam; Vincent, Martin; Hansen, Niels Richard
2017-01-01
Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
Learning Theory Estimates with Observations from General Stationary Stochastic Processes.
Hang, Hanyuan; Feng, Yunlong; Steinwart, Ingo; Suykens, Johan A K
2016-12-01
This letter investigates the supervised learning problem with observations drawn from certain general stationary stochastic processes. Here, by general, we mean that many stationary stochastic processes can be included. We show that when the stochastic processes satisfy a generalized Bernstein-type inequality, a unified treatment on analyzing the learning schemes with various mixing processes can be conducted and a sharp oracle inequality for generic regularized empirical risk minimization schemes can be established. The obtained oracle inequality is then applied to derive convergence rates for several learning schemes such as empirical risk minimization (ERM), least squares support vector machines (LS-SVMs) using given generic kernels, and SVMs using Gaussian kernels for both least squares and quantile regression. It turns out that for independent and identically distributed (i.i.d.) processes, our learning rates for ERM recover the optimal rates. For non-i.i.d. processes, including geometrically [Formula: see text]-mixing Markov processes, geometrically [Formula: see text]-mixing processes with restricted decay, [Formula: see text]-mixing processes, and (time-reversed) geometrically [Formula: see text]-mixing processes, our learning rates for SVMs with Gaussian kernels match, up to some arbitrarily small extra term in the exponent, the optimal rates. For the remaining cases, our rates are at least close to the optimal rates. As a by-product, the assumed generalized Bernstein-type inequality also provides an interpretation of the so-called effective number of observations for various mixing processes.
The financial burden of emergency general surgery: National estimates 2010 to 2060.
Ogola, Gerald O; Gale, Stephen C; Haider, Adil; Shafi, Shahid
2015-09-01
Adoption of the acute care surgery model has led to increasing volumes of emergency general surgery (EGS) patients at trauma centers. However, the financial burden of EGS services on trauma centers is unknown. This study estimates the current and future costs associated with EGS hospitalization nationwide. We applied the American Association for the Surgery of Trauma's DRG International Classification of Diseases-9th Rev. criteria for defining EGS to the 2010 National Inpatient Sample (NIS) data and identified adult EGS patients. Cost of hospitalization was obtained by converting reported charges to cost using the 2010 all-payer inpatient cost-to-charge ratio for all hospitals in the NIS database. Cost was modeled via a log-gamma model in a generalized linear mixed model to account for potential correlation in cost within states and hospitals in the NIS database. Patients' characteristics and hospital factors were included in the model as fixed effects, while state and hospital were included as random effects. The national incidence of EGS was calculated from NIS data, and the US Census Bureau population projections were used to estimate incidence for 2010 to 2060. Nationwide costs were obtained by multiplying projected incidences by estimated costs and are reported in 2010 US dollars. Nationwide, there were 2,640,725 adult EGS hospitalizations in 2010. The national average adjusted cost per EGS hospitalization was $10,744 (95% confidence interval [CI], $10,615-$10,874); applying these cost data to the national EGS hospitalizations gave a total estimated cost of $28.37 billion (95% CI, $28.03-$28.72 billion). Older age groups accounted for greater proportions of the cost ($8.03 billion for age ≥ 75 years, compared with $1.08 billion for age 18-24 years). As the US population continues to both grow and age, EGS costs are projected to increase by 45% to $41.20 billion (95% CI, $40.70-$41.7 billion) by 2060. EGS constitutes a significant portion of US health
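The nationwide figure quoted above is simply the incidence multiplied by the adjusted mean cost per hospitalization. A sketch reproducing the 2010 total from the abstract's own numbers (confidence intervals and the mixed-model adjustment are omitted):

```python
def total_cost(incidence, mean_cost):
    """Nationwide cost as incidence times mean adjusted cost per case."""
    return incidence * mean_cost

# Figures from the abstract: 2,640,725 hospitalizations at $10,744 each.
cost_2010 = total_cost(2_640_725, 10_744)   # ~ $28.37 billion
```

Projecting to 2060 follows the same arithmetic with the Census-projected incidence substituted in.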
Cheng, Guang; Zhou, Lan; Huang, Jianhua Z.
2014-01-01
We consider efficient estimation of the Euclidean parameters in a generalized partially linear additive model for longitudinal/clustered data when multiple covariates need to be modeled nonparametrically, and propose an estimation procedure based
Study on Top-Down Estimation Method of Software Project Planning
Institute of Scientific and Technical Information of China (English)
ZHANG Jun-guang; LÜ Ting-jie; ZHAO Yu-mei
2006-01-01
This paper studies a new software project planning method using actual project data in order to make software project plans more effective. From the perspective of system theory, our new method regards a software project plan as an associative unit for study. During a top-down estimation of a software project, the Program Evaluation and Review Technique (PERT) method and the analogy method are combined to estimate its size; effort estimates and specific schedules are then obtained according to the distributions of the phase effort. This allows a set of practical and feasible planning methods to be constructed. Actual data indicate that this set of methods can lead to effective software project planning.
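The PERT side of such a top-down estimate rests on the classic three-point formulas; a minimal sketch (the module sizes in the example are hypothetical, and the paper's analogy step is not shown):

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic PERT expected value: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6.0

def pert_stdev(optimistic, pessimistic):
    """PERT standard deviation approximation: (P - O) / 6."""
    return (pessimistic - optimistic) / 6.0

# Example: three-point size estimate of one module, in KLOC.
size = pert_estimate(8, 10, 18)      # expected size
spread = pert_stdev(8, 18)           # rough uncertainty
```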
Directory of Open Access Journals (Sweden)
Farhad Habibi
2018-09-01
Full Text Available Among different factors, correct scheduling is one of the vital elements for project management success. There are several ways to schedule projects, including the Critical Path Method (CPM) and the Program Evaluation and Review Technique (PERT). Due to problems in estimating the durations of activities, these methods cannot accurately and completely model actual projects. The use of fuzzy theory is a basic way to improve scheduling and deal with such problems. Fuzzy theory approximates project scheduling models to reality by taking into account uncertainties in decision parameters as well as expert experience and mental models. This paper provides a step-by-step approach for accurate estimation of the time and cost of projects using the Program Evaluation and Review Technique (PERT) and expert views expressed as fuzzy numbers. The proposed method includes several steps. In the first step, the necessary information for project time and cost is estimated using the Critical Path Method (CPM) and PERT. The second step considers the durations and costs of the project activities as trapezoidal fuzzy numbers, and the time and cost of the project are then recalculated. The durations and costs of activities are estimated using questionnaires as well as weighting of expert opinions, averaging, and defuzzification based on a step-by-step algorithm. The calculation procedures for evaluating these methods are applied in a real project, and the obtained results are explained.
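The second step's fuzzy arithmetic can be sketched as follows, assuming component-wise weighted averaging of trapezoidal numbers and centroid defuzzification; the paper's exact algorithm may differ, and the expert opinions and weights below are hypothetical:

```python
def weighted_average(trapezoids, weights):
    """Aggregate expert opinions given as trapezoidal fuzzy numbers
    (a, b, c, d) via a weighted component-wise average."""
    total = sum(weights)
    return tuple(
        sum(w * t[i] for t, w in zip(trapezoids, weights)) / total
        for i in range(4)
    )

def defuzzify(t):
    """Centroid of a trapezoidal fuzzy number (a, b, c, d)."""
    a, b, c, d = t
    denom = 3.0 * (c + d - a - b)
    if denom == 0:                 # degenerate (crisp) number
        return float(a)
    return (c * c + d * d + c * d - a * a - b * b - a * b) / denom

# Two experts' duration estimates, the second weighted twice as heavily.
agg = weighted_average([(4, 5, 6, 8), (2, 4, 7, 9)], [1, 2])
duration = defuzzify(agg)
```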
Efficient lifetime estimation techniques for general multiaxial loading
Papuga, Jan; Halama, Radim; Fusek, Martin; Rojíček, Jaroslav; Fojtík, František; Horák, David; Pecha, Marek; Tomčala, Jiří; Čermák, Martin; Hapla, Václav; Sojka, Radim; Kružík, Jakub
2017-07-01
In this paper, we discuss and present our progress on a project focused on fatigue life prediction under multiaxial loading in the domain of low-cycle fatigue, i.e. cases where plasticity cannot be neglected. First, the elastic-plastic solution in the finite element analysis is enhanced and verified on our own experiments. Second, the method by Jiang, which describes the instantaneous damage increase by analyzing the load history point by point in time, is in the implementation phase. In addition, simplified routines for conversion of elastic stresses and strains to elastic-plastic ones, as proposed by Firat and by Ye et al., are evaluated on the basis of data gathered from external sources. In order to produce high-quality complex analyses that are feasible in an acceptable time, and to expand the period available for subsequent analysis of results, the core of the PragTic fatigue solver used for all fatigue computations is being re-implemented to obtain a fully parallelized, scalable solution.
International Nuclear Information System (INIS)
Wu Xiangjun; Lu Hongtao
2011-01-01
Highlights: → Adaptive generalized function projective lag synchronization (AGFPLS) is proposed. → Two uncertain chaotic systems are lag synchronized up to a scaling function matrix. → The synchronization speed is sensitively influenced by the control gains. → The AGFPLS scheme is robust against noise perturbation. - Abstract: In this paper, a novel projective synchronization scheme called adaptive generalized function projective lag synchronization (AGFPLS) is proposed. In the AGFPLS method, the states of two different chaotic systems with fully uncertain parameters are asymptotically lag synchronized up to a desired scaling function matrix. By means of the Lyapunov stability theory, an adaptive controller with a corresponding parameter update rule is designed for achieving AGFPLS between two diverse chaotic systems and estimating the unknown parameters. This technique is employed to realize AGFPLS between the uncertain Lü chaotic system and the uncertain Liu chaotic system, and between the Chen hyperchaotic system and the Lorenz hyperchaotic system with fully uncertain parameters, respectively. Furthermore, AGFPLS between two different uncertain chaotic systems can still be achieved effectively in the presence of noise perturbation. The corresponding numerical simulations are performed to demonstrate the validity and robustness of the presented synchronization method.
Approaches in estimation of external cost for fuel cycles in the ExternE project
International Nuclear Information System (INIS)
Afanas'ev, A.A.; Maksimenko, B.N.
1998-01-01
The purposes, content and main results of studies realized within the framework of the International Project ExternE, which is the first comprehensive attempt to develop a general approach to estimation of the external costs of different fuel cycles based on utilization of nuclear and fossil fuels, as well as on renewable power sources, are discussed. The external cost of a fuel cycle is treated as the social and environmental expenditures which are not taken into account by energy producers and consumers, i.e. expenditures not currently included in the commercial cost. The conclusion is drawn that the suggested approach is applicable for estimating the population health hazards and environmental impacts (expressed in monetary or some other form) connected with the growth of electric power generation.
A generalized model for estimating the energy density of invertebrates
James, Daniel A.; Csargo, Isak J.; Von Eschen, Aaron; Thul, Megan D.; Baker, James M.; Hayer, Cari-Ann; Howell, Jessica; Krause, Jacob; Letvin, Alex; Chipps, Steven R.
2012-01-01
Invertebrate energy density (ED) values are traditionally measured using bomb calorimetry. However, many researchers rely on a few published literature sources to obtain ED values because of time and sampling constraints on measuring ED with bomb calorimetry. Literature values often do not account for spatial or temporal variability associated with invertebrate ED. Thus, these values can be unreliable for use in models and other ecological applications. We evaluated the generality of the relationship between invertebrate ED and the proportion of dry-to-wet mass (pDM). We then developed and tested a regression model to predict ED from pDM based on a taxonomically, spatially, and temporally diverse sample of invertebrates representing 28 orders in aquatic (freshwater, estuarine, and marine) and terrestrial (temperate and arid) habitats from 4 continents and 2 oceans. Samples included invertebrates collected in all seasons over the last 19 y. Evaluation of these data revealed a significant relationship between ED and pDM (r² = 0.96), and using the model yields cost savings compared to traditional bomb calorimetry approaches. This model should prove useful for a wide range of ecological studies because it is unaffected by taxonomic, seasonal, or spatial variability.
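A model of this kind is a single-predictor regression of ED on pDM. A minimal ordinary-least-squares sketch with hypothetical data; the values below are not the paper's fitted coefficients:

```python
def fit_line(x, y):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical (pDM, ED) pairs for a few invertebrate samples.
pdm = [0.10, 0.15, 0.20, 0.25, 0.30]
ed = [2.1, 3.2, 4.0, 5.1, 6.0]
slope, intercept = fit_line(pdm, ed)
predicted_ed = slope * 0.18 + intercept   # ED for a new sample with pDM = 0.18
```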
Abran, Alain
2015-01-01
Software projects are often late and over budget, and this leads to major problems for software customers. Clearly, there is a serious issue in estimating a realistic software project budget. Furthermore, generic estimation models cannot be trusted to provide credible estimates for projects as complex as software projects. This book presents a number of examples using data collected over the years from various organizations building software. It also presents an overview of the not-for-profit organization which collects data on software projects, the International Software Benchmarking Stan
Poverty Mapping Project: Small Area Estimates of Poverty and Inequality
National Aeronautics and Space Administration — The Small Area Estimates of Poverty and Inequality dataset consists of consumption-based poverty, inequality and related measures for subnational administrative...
Generalized equations for estimating DXA percent fat of diverse young women and men: The Tiger Study
Popular generalized equations for estimating percent body fat (BF%) developed with cross-sectional data are biased when applied to racially/ethnically diverse populations. We developed accurate anthropometric models to estimate dual-energy x-ray absorptiometry BF% (DXA-BF%) that can be generalized t...
An Estimator of Heavy Tail Index through the Generalized Jackknife Methodology
Directory of Open Access Journals (Sweden)
Weiqi Liu
2014-01-01
Full Text Available In practice, sometimes the data can be divided into several blocks but only a few of the largest observations within each block are available to estimate the heavy tail index. To address this problem, we propose a new class of estimators through the Generalized Jackknife methodology based on Qi's estimator (2010). These estimators are proved to be asymptotically normal under suitable conditions. Compared to Hill's estimator and Qi's estimator, our new estimator has better asymptotic efficiency in terms of the minimum mean squared error for a wide range of the second-order shape parameters. For finite samples, our new estimator still compares favorably to Hill's estimator and Qi's estimator, providing stable sample paths as a function of the number of blocks into which the sample is divided, smaller estimation bias, and smaller MSE.
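For context, Hill's estimator, the baseline against which the new class is compared, averages log-ratios of the k largest order statistics. A minimal sketch (the block-based and jackknifed variants of the paper are not shown):

```python
import math

def hill_estimator(sample, k):
    """Hill's estimator of the tail index gamma from the k largest
    order statistics X(1) >= ... >= X(k), relative to X(k+1)."""
    xs = sorted(sample, reverse=True)
    top = xs[: k + 1]                       # k+1 largest values
    return sum(math.log(top[i] / top[k]) for i in range(k)) / k
```

On a sample whose top order statistics decay like e^gamma per rank, the estimator recovers gamma exactly.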
Asymptotic theory of generalized estimating equations based on jack-knife pseudo-observations
DEFF Research Database (Denmark)
Overgaard, Morten; Parner, Erik Thorlund; Pedersen, Jan
2017-01-01
A general asymptotic theory of estimates from estimating functions based on jack-knife pseudo-observations is established by requiring that the underlying estimator can be expressed as a smooth functional of the empirical distribution. Using results in p-variation norms, the theory is applied...
The generalized correlation method for estimation of time delay in power plants
International Nuclear Information System (INIS)
Kostic, Lj.
1981-01-01
A generalized correlation estimator is developed for determining the time delay between signals received at two spatially separated sensors in the presence of uncorrelated noise in a power plant. This estimator can be realized as a pair of receiver prefilters followed by a cross-correlator. The time argument at which the correlator achieves a maximum is the delay estimate. (author)
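The final step, taking the lag at which the cross-correlation peaks, can be sketched directly. This is a plain, unfiltered cross-correlator over integer lags; the generalized method additionally prefilters both channels:

```python
def estimate_delay(x, y):
    """Estimate the lag of y relative to x as the argmax of the
    discrete cross-correlation R_xy(tau) over all integer lags."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-(n - 1), n):
        val = sum(
            x[i] * y[i + lag]
            for i in range(n)
            if 0 <= i + lag < n
        )
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# y is x delayed by 3 samples.
x = [0, 0, 1, 2, 1, 0, 0, 0, 0, 0]
y = [0, 0, 0, 0, 0, 1, 2, 1, 0, 0]
delay = estimate_delay(x, y)   # 3
```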
Penfield, Randall D.; Bergeron, Jennifer M.
2005-01-01
This article applies a weighted maximum likelihood (WML) latent trait estimator to the generalized partial credit model (GPCM). The relevant equations required to obtain the WML estimator using the Newton-Raphson algorithm are presented, and a simulation study is described that compared the properties of the WML estimator to those of the maximum…
Design management of general contractor under nuclear power project EPC mode
International Nuclear Information System (INIS)
Su Shaojian
2013-01-01
Design management has not yet developed into a recognized theoretical system, so general-contractor design managers under the nuclear power project EPC mode lack a clear theoretical basis. This paper discusses design management from the perspective of the general contractor under the nuclear power project EPC mode, clearly defines the concept of design management and, by combining the characteristics of nuclear power projects, gives the specific content and meaning of design management for nuclear power projects. (authors)
Improved Methodology for Benefit Estimation of Preservation Projects
2018-04-01
This research report presents an improved process for evaluating the benefits and economic tradeoffs associated with a variety of highway preservation projects. It includes a summary of results from a comprehensive phone survey concerning the use and...
Cost estimate modeling of transportation management plans for highway projects.
2012-05-01
Highway rehabilitation and reconstruction projects frequently cause road congestion and increase safety concerns while limiting access for road users. State Transportation Agencies (STAs) are challenged to find safer and more efficient ways to renew ...
Procedures and models for estimating preconstruction costs of highway projects.
2012-07-01
This study presents data-driven and component-based PE cost prediction models by utilizing critical factors retrieved from ten years of historical project data obtained from the ODOT roadway division. The study used factor analysis of covariance and corr...
Reliance on and Reliability of the Engineer’s Estimate in Heavy Civil Projects
Directory of Open Access Journals (Sweden)
George Okere
2017-06-01
Full Text Available To the contractor, the engineer's estimate is the target number to aim for, and the basis for a contractor to evaluate the accuracy of their estimate. To the owner, the engineer's estimate is the basis for funding, evaluation of bids, and for predicting project costs. As such, the engineer's estimate is the benchmark. This research sought to investigate the reliance on, and the reliability of, the engineer's estimate in heavy civil cost estimating. The research objective was to characterize the engineer's estimate and allow owners and contractors to re-evaluate or affirm their reliance on the engineer's estimate. A literature review was conducted to understand the reliance on the engineer's estimate, and secondary data from the Washington State Department of Transportation were used to investigate the reliability of the engineer's estimate. The findings show the need for practitioners to re-evaluate their reliance on the engineer's estimate. The empirical data showed that, within various contexts, the engineer's estimate fell outside the expected accuracy range of the low bids or the cost to complete projects. The study recommends direct tracking of costs by project owners while projects are under construction, the use of a second estimate to improve accuracy, and use of the cost estimating practices found in highly reputable construction companies.
Generalized Filtered Back-Projection for Digital Breast Tomosynthesis Reconstruction
Erhard, K.; Grass, M.; Hitziger, S.; Iske, A.; Nielsen, T.
2012-01-01
Filtered back-projection (FBP) has been commonly used as an efficient and robust reconstruction technique in tomographic X-ray imaging during the last decades. For limited-angle tomography acquisitions such as digital breast tomosynthesis, however, standard FBP reconstruction algorithms provide poor
Improvement of image quality using interpolated projection data estimation method in SPECT
International Nuclear Information System (INIS)
Takaki, Akihiro; Soma, Tsutomu; Murase, Kenya; Kojima, Akihiro; Asao, Kimie; Kamada, Shinya; Matsumoto, Masanori
2009-01-01
General data acquisition for single photon emission computed tomography (SPECT) is performed in 90 or 60 directions, with a coarse pitch of approximately 4-6 deg for a rotation of 360 deg or 180 deg, using a gamma camera. No data between adjacent projections are sampled under these circumstances. The aim of the study was to develop a method to improve SPECT image quality by generating the missing projection data through interpolation of data acquired with a coarse pitch such as 6 deg. The projection data set at each individual degree in 360 directions was generated by a weighted-average interpolation method from the projection data acquired with a coarse sampling angle (interpolated projection data estimation processing method, IPDE method). The IPDE method was applied to numerical digital phantom data, actual phantom data and clinical brain data with Tc-99m ethyl cysteinate dimer (ECD). All SPECT images were reconstructed by the filtered back-projection method and compared with the original SPECT images. The results confirmed that streak artifacts decreased, because interpolation apparently increases the sampling number in SPECT, and that the signal-to-noise (S/N) ratio of the root-mean-square uncertainty value improved. Furthermore, the normalized mean square error values, compared with standard images, were similar after interpolation. Moreover, the contrast and concentration ratios improved after interpolation. These results indicate that effective improvement of image quality can be expected with interpolation. Thus, image quality and the ability to depict images can be improved while maintaining the present acquisition time, and this can be achieved more effectively than at present even if the acquisition time is reduced. (author)
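The core of the IPDE idea, filling each missing angle with a weighted average of its two neighboring coarse-pitch projections, can be sketched as follows. This uses 1-D projections for brevity (real SPECT projections are 2-D arrays per angle) and a simple linear weighting, which may differ from the paper's exact weights:

```python
def interpolate_projections(projections, step):
    """Generate one projection per degree by linearly weighting the
    two adjacent coarse-pitch projections, wrapping around 360 deg.

    projections: list of 1-D projections acquired every `step` degrees."""
    out = []
    n = len(projections)
    for deg in range(n * step):
        lo, frac = divmod(deg, step)
        hi = (lo + 1) % n                 # wrap around the full rotation
        w = frac / step
        p_lo, p_hi = projections[lo], projections[hi]
        out.append([(1 - w) * a + w * b for a, b in zip(p_lo, p_hi)])
    return out

# Four coarse projections 90 degrees apart, three detector bins each.
coarse = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]]
fine = interpolate_projections(coarse, 90)   # 360 projections
```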
Estimation of group means when adjusting for covariates in generalized linear models.
Qu, Yongming; Luo, Junxiang
2015-01-01
Generalized linear models are commonly used to analyze categorical data such as binary, count, and ordinal outcomes. Adjusting for important prognostic factors or baseline covariates in generalized linear models may improve estimation efficiency. The model-based mean for a treatment group produced by most software packages estimates the response at the mean covariate, not the mean response for this treatment group in the studied population. Although this is not an issue for linear models, the model-based group mean estimates in generalized linear models can be seriously biased for the true group means. We propose a new method to estimate the group mean consistently, with the corresponding variance estimation. Simulation showed the proposed method produces an unbiased estimator for the group means and provides the correct coverage probability. The proposed method was applied to analyze hypoglycemia data from clinical trials in diabetes. Copyright © 2014 John Wiley & Sons, Ltd.
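The distinction drawn above can be seen in a toy logistic model: the response at the mean covariate differs from the mean response over the covariate distribution. All numbers below are hypothetical:

```python
import math

def inv_logit(z):
    """Inverse logit link: maps a linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted logistic model logit(p) = b0 + b1 * x,
# evaluated for one treatment group with covariate values xs.
b0, b1 = -2.0, 1.5
xs = [0.0, 1.0, 2.0, 3.0]

# Response at the mean covariate (what many packages report):
p_at_mean_x = inv_logit(b0 + b1 * (sum(xs) / len(xs)))

# Mean response over the covariate distribution (the group mean
# the paper targets):
p_group_mean = sum(inv_logit(b0 + b1 * x) for x in xs) / len(xs)

# Because the link is nonlinear, the two generally differ.
```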
Directory of Open Access Journals (Sweden)
Truin Gert-Jan
2011-10-01
Full Text Available Abstract Background Considering the changes in dental healthcare, such as the increasing assertiveness of patients, the introduction of new dental professionals, and regulated competition, it becomes more important that general dental practitioners (GDPs) take patients' views into account. The aim of the study was to compare patients' views on organizational aspects of general dental practices with those of GDPs and with GDPs' estimation of patients' views. Methods In a survey study, patients and GDPs provided their views on organizational aspects of a general dental practice. In a second, separate survey, GDPs were invited to estimate patients' views on 22 organizational aspects of a general dental practice. Results For 4 of the 22 aspects, patients and GDPs had the same views, and GDPs estimated patients' views reasonably well: 'Dutch-speaking GDP', 'guarantee on treatment', 'treatment by the same GDP', and 'reminder of routine oral examination'. For 2 aspects ('quality assessment' and 'accessibility for disabled patients'), patients and GDPs had the same standards, although the GDPs underestimated the patients' standards. Patients had higher standards than GDPs for 7 aspects and lower standards than GDPs for 8 aspects. Conclusion On most aspects GDPs and patients have different views, except for socially desirable aspects. Given the increasing assertiveness of patients, it is startling that the GDPs correctly estimated only half of the patients' views. The findings of the study can assist GDPs in adapting their organizational services to better meet the preferences of their patients and to improve communication towards patients.
Directory of Open Access Journals (Sweden)
Feraru Galina Sergeevna
2014-11-01
Full Text Available The article addresses issues characterizing the features of project management that contribute to projects' competitive advantage, and shows the factors and criteria of project success as well as the main reasons for project failures, which render developers' efforts to create projects unsuccessful.
HFCs contribution to the greenhouse effect. Present and projected estimations
Energy Technology Data Exchange (ETDEWEB)
Libre, J.M.; Elf-Atochem, S.A. [Central Research & Development, Paris (France)
1997-12-31
This paper reviews data that can be used to calculate the hydrofluorocarbon (HFC) contribution to the greenhouse effect and compare it to other trace gas contributions. Projections are made for 2010 and 2100 on the basis of available emission scenarios. Industrial judgement on the likelihood of those scenarios is also developed. Calculations can be made in two different ways: from Global Warming Potential weighted emissions of species or by direct calculation of radiative forcing based on measured and projected atmospheric concentrations of compounds. Results show that HFCs corresponding to commercial uses have a negligible contribution to the greenhouse effect in comparison with other trace gases. The projected contributions are also very small even if very high emission scenarios are maintained for decades. In 2010 this contribution remains below 1%. Longer term emission projections are difficult. However, based on the IPCC scenario IS92a, in spite of huge emissions projected for the year 2100, the HFC contribution remains below 3%. Actually many factors indicate that the real HFC contribution to the greenhouse effect will be even smaller than presented here. Low-emission systems and small charges will likely improve sharply in the future and have drastically improved in the recent past. HFC technology implementation is likely to grow in the future and reach a maximum before the middle of the next century; the market will stabilise, driven by recycling, closing of systems and competitive technologies. This hypothesis is supported by previous analysis of the demand for HFC-type applications, which can be represented by 'S'-type curves, and by recent analysis indicating that the level of substitution of old products by HFCs is growing slowly. On the basis of those data and best industrial judgement, the contribution of HFCs to the greenhouse effect is highly likely to remain below 1% during the next century. 11 refs., 2 figs., 5 tabs.
General consideration of domestic participation in nuclear power projects
International Nuclear Information System (INIS)
Csik, B.J.
1975-01-01
No nuclear power plant can be supplied entirely by outside sources. Some degree of domestic participation is always required and this can be increased if desired. There is no general rule as to what can and should be supplied locally; different countries have different characteristics, conditions and possibilities, and these define the scope of what is to be done. (orig./FW)
DEFF Research Database (Denmark)
Madsen, Henrik; Rosbjerg, Dan
1997-01-01
A regional estimation procedure that combines the index-flood concept with an empirical Bayes method for inferring regional information is introduced. The model is based on the partial duration series approach with generalized Pareto (GP) distributed exceedances. The prior information of the model parameters is inferred from regional data using generalized least squares (GLS) regression. Two different Bayesian T-year event estimators are introduced: a linear estimator that requires only some moments of the prior distributions to be specified and a parametric estimator that is based on specified…
Directory of Open Access Journals (Sweden)
Nicholas Scott Cardell
2013-05-01
Full Text Available Maximum entropy methods of parameter estimation are appealing because they impose no additional structure on the data, other than that explicitly assumed by the analyst. In this paper we prove that the data-constrained GME estimator of the general linear model is consistent and asymptotically normal. The approach we take in establishing the asymptotic properties concomitantly identifies a new computationally efficient method for calculating GME estimates. Formulae are developed to compute asymptotic variances and to perform Wald, likelihood ratio, and Lagrange multiplier statistical tests on model parameters. Monte Carlo simulations are provided to assess the performance of the GME estimator in both large and small sample situations. Furthermore, we extend our results to maximum cross-entropy estimators and indicate a variant of the GME estimator that is unbiased. Finally, we discuss the relationship of GME estimators to Bayesian estimators, pointing out the conditions under which an unbiased GME estimator would be efficient.
Generalized Born-Infeld actions and projective cubic curves
Energy Technology Data Exchange (ETDEWEB)
Ferrara, S. [Department of Physics, CERN Theory Division, CH - 1211 Geneva 23 (Switzerland); INFN - Laboratori Nazionali di Frascati, Via Enrico Fermi 40, I-00044, Frascati (Italy); Porrati, M. [CCPP, Department of Physics, NYU, 4 Washington Pl., New York, NY, 10003 (United States); Sagnotti, A. [Department of Physics, CERN Theory Division, CH - 1211 Geneva 23 (Switzerland); Stora, R. [Department of Physics, CERN Theory Division, CH - 1211 Geneva 23 (Switzerland); Laboratoire d' Annecy-le-Vieux de Physique Theorique (LAPTH), F-74941, Annecy-le-Vieux, Cedex (France); Yeranyan, A. [INFN - Laboratori Nazionali di Frascati, Via Enrico Fermi 40, I-00044, Frascati (Italy); Centro Studi e Ricerche Enrico Fermi, Via Panisperna 89A, 00184, Roma (Italy)
2015-04-01
We investigate U(1){sup n} supersymmetric Born-Infeld Lagrangians with a second non-linearly realized supersymmetry. The resulting non-linear structure is more complex than the square root present in the standard Born-Infeld action, and nonetheless the quadratic constraints determining these models can be solved exactly in all cases containing three vector multiplets. The corresponding models are classified by cubic holomorphic prepotentials. Their symmetry structures are associated to projective cubic varieties. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
General presentation of the EC FP7 SARGEN IV project
International Nuclear Information System (INIS)
Blanc, Daniel
2012-01-01
SARGEN IV project: Innovative reactors with fast spectrum to be built in Europe. The ESNII roadmap includes: • the ASTRID SFR prototype supported by the French Government; • the ALFRED LFR demonstrator; → Romania is interested in hosting the ALFRED prototype. • the ALLEGRO GFR prototype → A Memorandum of Understanding (MOU) was signed in May 2010 between UJV (Czech Republic), AEKI (Hungary) and VUJE (Slovakia) to develop ALLEGRO. • the MYRRHA experimental facility supported by the Belgian government, which is able to: → test the LFR technology; → provide irradiation in the fast spectrum area; → test the Accelerator Driven System (ADS) technology (not retained by the GIF)
10 CFR 603.560 - Estimate of project expenditures.
2010-01-01
... the recipient achieves them, are reliable indicators of the amount of effort the recipient expended. However, the estimate of the required expenditures need not be a precise dollar amount, as illustrated by the example in paragraph (b) of this section, if: (1) The recipient is contributing a substantial...
Zhao, L. W.; Du, J. G.; Yin, J. L.
2018-05-01
This paper proposes a novel secure communication scheme in a chaotic system by applying generalized function projective synchronization of the nonlinear Schrödinger equation. This approach guarantees secure and convenient communication. Our study applies the Melnikov theorem with an active control strategy to suppress chaos in the system. The transmitted information signal is modulated into a parameter of the nonlinear Schrödinger equation in the transmitter, and the corresponding parameter of the receiver system is assumed to be unknown. Based on Lyapunov stability theory and the adaptive control technique, controllers are designed to make two identical nonlinear Schrödinger equations with the unknown parameter asymptotically synchronized. The numerical simulation results confirm the validity, effectiveness and feasibility of the proposed synchronization method and error estimate for secure communication. The chaos-masking signals of the communication scheme further guarantee safer and more secure information transmission via this approach.
Error estimates in projective solutions of the Radon equation
International Nuclear Information System (INIS)
Lubuma, M.S.
1991-04-01
The model Radon equation is the integral equation of the second kind defined by the interior limits of the electrostatic double layer potential relative to a curve with one angular point, and characterized by the non-compactness of the operator with respect to the maximum norm. It is shown that the solution to this equation is decomposable into a regular part and a finite linear combination of intrinsic singular functions. The maximal regularity of the solution and explicit formulae for the coefficients of the singular functions are given. This regularity makes it possible to quantify how slowly the classical projection method converges, while the above-mentioned formulae lead to modified projection methods of the Dual Singular Function Method type, with better approximations for the solution and for the coefficients of the singularities. (author). 23 refs
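The decomposition described in the abstract can be summarized schematically (the notation below is ours, not the paper's):

```latex
% Solution split near the angular point (schematic):
%   u_reg is the regular part, s_j are the intrinsic singular functions,
%   c_j are the explicitly computable coefficients.
u \;=\; u_{\mathrm{reg}} \;+\; \sum_{j=1}^{m} c_j \, s_j .
```

The modified projection methods then approximate the regular part and the coefficients c_j separately, which is why they improve on the slowly converging classical projection method.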
The generalized hedgehog and the projected chiral soliton model
International Nuclear Information System (INIS)
Fiolhais, M.; Kernforschungsanlage Juelich G.m.b.H.; Goeke, K.; Bochum Univ.; Gruemmer, F.; Urbano, J.N.
1988-01-01
The linear chiral soliton model with quark fields and elementary pion and sigma fields is solved in order to describe static properties of the nucleon and the delta resonance. To this end a Fock state of the system is constructed which consists of three valence quarks in a 1s orbit with a generalized hedgehog spin-flavour configuration cos η|u↓⟩ − sin η|d↑⟩. Coherent states are used to provide a quantum description for the mesonic parts of the total wave function. The corresponding classical pion field also exhibits a generalized hedgehog structure. Various nucleon properties are calculated. These include the proton and neutron charge radii, and the magnetic moment of the proton, for which agreement with experiment is obtained. (orig./HSI)
General safety orientations of the Jules Horowitz Reactor Project (JHRP)
International Nuclear Information System (INIS)
Tremodeux, P.; Fiorini, G.L.
2000-01-01
After a brief reminder of the JHR purpose, the document outlines the general safety-related orientations/recommendations used for the design and the safety assessment of the facility. Since the JHR design is new, the safety philosophy adopted for this reactor will be as consistent as possible with that recommended for future (power...) reactors. The general recommendations developed in the paper are: the general nuclear safety approach for the design, operation and analysis with, in particular, the adoption of the Defence in Depth principle; the general safety objectives in terms of radiological consequences; the use of Probabilistic Safety Studies; quality assurance. The 'Defence in Depth' concept, using amongst others the 'Barrier' principle, remains the basis of JHR safety. 'Defence in Depth' is applied both to design and operation. Its adequacy is checked during the safety assessment, and the paper gives the technical recommendations that should allow the designer to implement this concept in the final design. Built mainly for experimental irradiation, the JHR facilities will be operated according to conventional or new operating rules, which could put materials under stress and entail handling errors. Specific recommendations are defined to take into account the corresponding peculiarities; they are discussed in the paper. The safety design of the JHR takes into account the experience accumulated through the CEA experimental irradiation programmes, which represents several dozen reactor-years; the consultation of CEA reactor facility operators is ongoing. The corresponding feedback is briefly described. Recommendations related to maintenance and associated operation are indicated, as well as those regarding the human factor. Details are given on the practical implementation of JHR safety through the CEA/DRN safety approach. Details of the corresponding safety objectives are also discussed. Finally, the designer's position on the role of probabilistic safety
78 FR 61227 - Public Assistance Cost Estimating Format for Large Projects
2013-10-03
... equipment. The base cost (construction costs) plus nonconstruction costs equal the total eligible cost... included the estimated base cost plus the estimated nonconstruction costs. Under the traditional method... total cost of completing the project. This ``forward- pricing'' methodology provides an estimate of the...
Doubly robust estimation of generalized partial linear models for longitudinal data with dropouts.
Lin, Huiming; Fu, Bo; Qin, Guoyou; Zhu, Zhongyi
2017-12-01
We develop a doubly robust estimation of generalized partial linear models for longitudinal data with dropouts. Our method extends the highly efficient aggregate unbiased estimating function approach proposed in Qu et al. (2010) to a doubly robust one in the sense that under missing at random (MAR), our estimator is consistent when either the linear conditional mean condition is satisfied or a model for the dropout process is correctly specified. We begin with a generalized linear model for the marginal mean, and then move forward to a generalized partial linear model, allowing for nonparametric covariate effect by using the regression spline smoothing approximation. We establish the asymptotic theory for the proposed method and use simulation studies to compare its finite sample performance with that of Qu's method, the complete-case generalized estimating equation (GEE) and the inverse-probability weighted GEE. The proposed method is finally illustrated using data from a longitudinal cohort study. © 2017, The International Biometric Society.
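One of the comparison methods named above, the inverse-probability weighted (IPW) approach to dropout, can be sketched in a few lines. This is a generic illustration with invented numbers, not the authors' estimator: observed outcomes are reweighted by the inverse of their modeled probability of remaining in the study, so that subjects prone to dropping out count for more.

```python
# Sketch of inverse-probability weighting (IPW) for dropout under MAR.
# All values below are hypothetical, chosen so that higher outcomes are
# more likely to be missing, which biases the naive complete-case mean.

def ipw_mean(values, observed, p_observed):
    """Weight each observed value by 1/P(observed) to correct for dropout."""
    num = sum(v / p for v, o, p in zip(values, observed, p_observed) if o)
    den = sum(1 / p for o, p in zip(observed, p_observed) if o)
    return num / den

values     = [1.0, 2.0, 3.0, 4.0]
observed   = [True, True, True, False]
p_observed = [0.9, 0.8, 0.6, 0.4]   # modeled retention probabilities

naive = sum(v for v, o in zip(values, observed) if o) / sum(observed)
print(naive, ipw_mean(values, observed, p_observed))
```

The IPW mean exceeds the naive complete-case mean because the hard-to-retain (high-outcome) subjects receive larger weights.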
General concept and present status of the AMS - project
International Nuclear Information System (INIS)
Stan-Sion, C.; Plostinaru, D.; Catana, L.; Radulescu, M.; Marinescu, L.; Dima, R.
1998-01-01
The Institute of Nuclear Physics and Engineering, IFIN-HH, Bucharest started the construction project for an AMS facility in Bucharest in 1996. The project is supported by the German-Romanian scientific cooperation project RUM-013-97. Within the framework of this project, scientific research, construction of electronic and mechanical devices, and calibration and optimization activities were performed. The activities this year are concentrated on experimental tests and optimization of the AMS analysis procedure and on improving the tandem acceleration and transmission capability. These are as follows: - construction and tests of a high-current sputter source, with spherical ionizer and automatic many-sample changer; experimental tests and measurements of emittance and beam profile; - construction of the injector platform; - construction and tests of a Bragg charged-particle detector; - calibration and optimization experiments of the AMS ensemble. The AMS facility is based on the 8 MV FN tandem accelerator and has the following main components: the AMS ion injector, a Wien velocity filter, a 60° bending magnet and the particle detection system. For heavy-particle detection we constructed a Bragg ionization chamber, to be used for standard AMS measurements. A specific part of the AMS facility is the AMS injector deck. Its main components are: the high-current sputter source, the analyzing double-focusing magnet, two optical systems, a four-slit beam-defining system in front of the analyzing magnet and a four-slit aperture with a remotely controlled Faraday cup (for beam current integration of stable isotopes) after the magnet. Finally, the pre-acceleration tube (40 kV) connects the ensemble to the ground potential of the accelerator beam line. For AMS measurements, the sputter ion source is able to provide both high beam current and small beam emittance. These characteristics are necessary in order to achieve high analyzing
A projection and density estimation method for knowledge discovery.
Directory of Open Access Journals (Sweden)
Adam Stanski
Full Text Available A key ingredient of modern data analysis is probability density estimation. However, it is well known that the curse of dimensionality prevents a proper estimation of densities in high dimensions. The problem is typically circumvented by using a fixed set of assumptions about the data, e.g., by assuming partial independence of features, data on a manifold or a customized kernel. These fixed assumptions limit the applicability of a method. In this paper we propose a framework that uses a flexible set of assumptions instead. It allows a model to be tailored to various problems by means of 1d-decompositions. The approach achieves a fast runtime and is not limited by the curse of dimensionality, as all estimations are performed in 1d-space. The wide range of applications is demonstrated on two very different real-world examples. The first is a data-mining software package that allows the fully automatic discovery of patterns; the software is publicly available for evaluation. As a second example, an image segmentation method is realized. It achieves state-of-the-art performance on a benchmark dataset although it uses only a fraction of the training data and very simple features.
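The simplest possible 1d-decomposition, full independence with per-dimension histograms, illustrates why every estimation step in such a framework stays one-dimensional. This is only a toy stand-in for the paper's much more flexible decompositions; all numbers and bin choices are invented.

```python
# Toy 1d-decomposition: estimate a joint density as a product of
# per-dimension histogram densities (full-independence assumption).

def hist_density_1d(samples, bins, lo, hi):
    """Return a 1d histogram density estimator on [lo, hi]."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in samples:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(samples)
    return lambda x: counts[min(int((x - lo) / width), bins - 1)] / (n * width)

def product_density(per_dim_estimators):
    """Combine 1d estimators into a joint density via independence."""
    def density(point):
        p = 1.0
        for f, x in zip(per_dim_estimators, point):
            p *= f(x)
        return p
    return density

data = [(0.1, 0.9), (0.2, 0.8), (0.15, 0.85), (0.8, 0.1)]
fx = hist_density_1d([p[0] for p in data], bins=2, lo=0.0, hi=1.0)
fy = hist_density_1d([p[1] for p in data], bins=2, lo=0.0, hi=1.0)
joint = product_density([fx, fy])
print(joint((0.2, 0.9)))  # large where both 1d marginals have mass
```

Replacing the independence assumption with other 1d-decompositions (e.g. along learned directions) is where the paper's flexibility comes in; the 1d estimation machinery stays unchanged.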
Estimation of the Net Present Value of the Investment Project in the Situation of Fuzzy Initial Data
Directory of Open Access Journals (Sweden)
Kotsyuba Oleksiy S.
2017-03-01
Full Text Available The article investigates the problem of estimating the net present value of an investment project using a methodology based on the theory of fuzzy sets, for the situation in which the initial data are described by fuzzy estimates. In the general case, the fuzzy-multiple estimation of this indicator under a discrete-interval representation of the initial parameters reduces to a set of homogeneous optimization problems. Depending on the characteristics of the fuzzy estimates of the cash flows of the investment project under consideration, the solutions to these problems can often be found directly from the relevant analytical expressions, while other problems require special optimization methods. The work develops the analytical component of fuzzy-multiple modelling of the net present value indicator. First, the general search-and-optimization approach, which provides a predetermined degree of accuracy, is examined, together with the method for approximate determination of the fuzzy estimate of the net present value based on the analytical relationships developed by Chui-Yu Chiu and Chan S. Park. The situation in which fuzzy estimates of the cash flows of the investment project can be interpreted from the perspective of the money-generating-operation concept formulated by O. B. Lozhkin is then analyzed, and a corresponding calculation model is proposed; among other things, this allows a general scheme for determining the fuzzy estimate of the net present value to be developed. The main direction for further development of the problems discussed in the publication is the formation of a holistic methodology for evaluating the effectiveness of real investments that would cover, from unified theoretical positions, types of uncertainty that differ in nature and structural characteristics.
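The analytically tractable case the abstract alludes to can be sketched with triangular fuzzy cash flows. Because discounting multiplies each flow by a positive constant and sums, the (low, mode, high) components of a triangular NPV can be computed component-wise; the general discrete-interval case is where the optimization machinery is needed. All figures below are hypothetical.

```python
# Fuzzy NPV sketch: each period's cash flow is a triangular fuzzy number
# (low, mode, high). Component-wise discounting is valid here because
# the discount factors are positive constants. Values are invented.

def fuzzy_npv(rate, flows):
    """flows: list of (low, mode, high) per period, period 0 first."""
    out = [0.0, 0.0, 0.0]
    for t, tri in enumerate(flows):
        d = (1 + rate) ** t
        for k in range(3):
            out[k] += tri[k] / d
    return tuple(out)

flows = [(-105.0, -100.0, -95.0),   # hypothetical investment outlay
         (40.0, 50.0, 60.0),
         (40.0, 50.0, 60.0),
         (40.0, 50.0, 60.0)]
low, mode, high = fuzzy_npv(0.10, flows)
print(round(low, 2), round(mode, 2), round(high, 2))
```

The resulting triple is itself a triangular fuzzy estimate of the NPV: here the mode is positive while the pessimistic bound is slightly negative, which is exactly the kind of ambiguity the fuzzy-multiple apparatus is meant to express.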
ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE
Energy Technology Data Exchange (ETDEWEB)
Sathaye, Jayant; Dale, Larry; Larsen, Peter; Fitts, Gary; Koy, Kevin; Lewis, Sarah; Lucena, Andre
2011-06-22
This report outlines the results of a study of the impact of climate change on the energy infrastructure of California and the San Francisco Bay region, including impacts on power plant generation; transmission line and substation capacity during heat spells; wildfires near transmission lines; sea level encroachment upon power plants, substations, and natural gas facilities; and peak electrical demand. Some end-of-century impacts were projected: Expected warming will decrease gas-fired generator efficiency. The maximum statewide coincident loss is projected at 10.3 gigawatts (with current power plant infrastructure and population), an increase of 6.2 percent over current temperature-induced losses. By the end of the century, electricity demand for almost all summer days is expected to exceed the current ninetieth percentile per-capita peak load. As much as 21 percent growth is expected in ninetieth percentile peak demand (per-capita, exclusive of population growth). When generator losses are included in the demand, the ninetieth percentile peaks may increase up to 25 percent. As the climate warms, California's peak supply capacity will need to grow faster than the population. Substation capacity is projected to decrease an average of 2.7 percent. A 5°C (9°F) air temperature increase (the average increase predicted for hot days in August) will diminish the capacity of a fully loaded transmission line by an average of 7.5 percent. The potential exposure of transmission lines to wildfire is expected to increase with time. We have identified some lines whose probability of exposure to fire is expected to increase by as much as 40 percent. Up to 25 coastal power plants and 86 substations are at risk of flooding (or partial flooding) due to sea level rise.
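The transmission derating figure quoted above lends itself to a back-of-the-envelope calculation. This is only an illustrative linearization of the report's headline number (about 7.5 percent capacity loss per 5°C of warming), not the study's underlying thermal model; the line rating is invented.

```python
# Linearized derating sketch: capacity falls ~7.5% per 5 C of warming.
# The 1000 MW rating is hypothetical; the study uses detailed line models.

def derated_capacity(capacity_mw, delta_t_c, loss_per_5c=0.075):
    """Capacity of a fully loaded line after a temperature rise delta_t_c."""
    return capacity_mw * (1 - loss_per_5c * (delta_t_c / 5.0))

print(derated_capacity(1000.0, 5.0))   # about 925 MW under a 5 C rise
```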
Westgate, Philip M
2013-07-20
Generalized estimating equations (GEEs) are routinely used for the marginal analysis of correlated data. The efficiency of GEE depends on how closely the working covariance structure resembles the true structure, and therefore accurate modeling of the working correlation of the data is important. A popular approach is the use of an unstructured working correlation matrix, as it is not as restrictive as simpler structures such as exchangeable and AR-1 and thus can theoretically improve efficiency. However, because of the potential for having to estimate a large number of correlation parameters, variances of regression parameter estimates can be larger than theoretically expected when utilizing the unstructured working correlation matrix. Therefore, standard error estimates can be negatively biased. To account for this additional finite-sample variability, we derive a bias correction that can be applied to typical estimators of the covariance matrix of parameter estimates. Via simulation and in application to a longitudinal study, we show that our proposed correction improves standard error estimation and statistical inference. Copyright © 2012 John Wiley & Sons, Ltd.
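The finite-sample problem described above is driven by how many nuisance correlation parameters each working structure must estimate for a cluster of size t. A quick count (the structure names follow common GEE software conventions; this is an illustration, not the paper's derivation):

```python
# Number of working-correlation parameters a GEE must estimate for
# clusters of size t, by structure. The unstructured count grows
# quadratically, which is the source of the extra finite-sample
# variability the bias correction targets.

def n_corr_params(structure, t):
    if structure == "independence":
        return 0
    if structure in ("exchangeable", "ar1"):
        return 1
    if structure == "unstructured":
        return t * (t - 1) // 2
    raise ValueError(structure)

for t in (3, 5, 10):
    print(t, n_corr_params("unstructured", t))
```

With ten repeated measures the unstructured matrix already needs 45 estimated correlations versus one for exchangeable or AR-1, so sandwich standard errors computed as if those 45 were known understate the true variability.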
A case-based reasoning approach for estimating the costs of pump station projects
Directory of Open Access Journals (Sweden)
Mohamed M. Marzouk
2011-10-01
Full Text Available The effective estimation of costs is crucial to the success of construction projects. Cost estimates are used to evaluate, approve and/or fund projects. Organizations use some form of classification system to identify the various types of estimates that may be prepared during the lifecycle of a project. This research presents a parametric cost model for pump station projects. Fourteen factors have been identified as important influences on the cost of pump station projects. A data set consisting of forty-four pump station projects (fifteen water and twenty-nine wastewater) is collected to build a Case-Based Reasoning (CBR) library and to test its performance. The results obtained from the CBR tool are processed and adopted to improve the accuracy of the results. A numerical example is presented to demonstrate the effectiveness of the tool.
Estimating unbiased economies of scale of HIV prevention projects: a case study of Avahan.
Lépine, Aurélia; Vassall, Anna; Chandrashekar, Sudha; Blanc, Elodie; Le Nestour, Alexis
2015-04-01
Governments and donors are investing considerable resources in HIV prevention in order to scale up these services rapidly. Given the current economic climate, providers of HIV prevention services increasingly need to demonstrate that these investments offer good 'value for money'. One of the primary routes to efficiency is to take advantage of economies of scale (a reduction in the average cost of a health service as provision scales up), yet empirical evidence on economies of scale is scarce. Methodologically, the estimation of economies of scale is hampered by several statistical issues that prevent causal inference and thus make the estimation complex. In order to estimate unbiased economies of scale when scaling up HIV prevention services, we apply our analysis to one of the few HIV prevention programmes globally delivered at a large scale: the Indian Avahan initiative. We costed the project by collecting data from the 138 Avahan NGOs and the supporting partners in the first four years of its scale-up, between 2004 and 2007. We develop a parsimonious empirical model and apply a system Generalized Method of Moments (GMM) and fixed-effects Instrumental Variable (IV) estimators to estimate unbiased economies of scale. At the programme level, we find that, after controlling for the endogeneity of scale, the scale-up of Avahan has generated high economies of scale. Our findings suggest that average cost reductions per person reached are achievable when scaling up HIV prevention in low- and middle-income countries. Copyright © 2015 Elsevier Ltd. All rights reserved.
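The economies-of-scale notion defined above (average cost per person reached falling as provision scales up) is easy to see with a stylized cost function. The fixed and marginal costs below are invented numbers, not Avahan estimates, and the real analysis must additionally deal with the endogeneity of scale:

```python
# Stylized illustration of economies of scale: with a fixed cost F and
# marginal cost c, total cost is F + c*q, so average cost F/q + c falls
# as the number of people reached q grows. Parameters are hypothetical.

def average_cost(scale, fixed=50000.0, marginal=2.0):
    """Total cost (F + c*q) divided by the number of people reached q."""
    return fixed / scale + marginal

small, large = average_cost(1000), average_cost(100000)
print(small, large)   # average cost falls as the programme scales up
```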
SEffEst: Effort estimation in software projects using fuzzy logic and neural networks
Directory of Open Access Journals (Sweden)
Israel
2012-08-01
Full Text Available Academia and practitioners confirm that software project effort prediction is crucial for an accurate software project management. However, software development effort estimation is uncertain by nature. Literature has developed methods to improve estimation correctness, using artificial intelligence techniques in many cases. Following this path, this paper presents SEffEst, a framework based on fuzzy logic and neural networks designed to increase effort estimation accuracy on software development projects. Trained using ISBSG data, SEffEst presents remarkable results in terms of prediction accuracy.
Zengmei, L.; Guanghua, Q.; Zishen, C.
2015-05-01
The direct benefit of a waterlogging control project is reflected in the reduction or avoidance of waterlogging loss. Before and after the construction of a waterlogging control project, the disaster-inducing environment in the waterlogging-prone zone is generally different, and the category, quantity and spatial distribution of the disaster-bearing bodies also change to some extent. Therefore, under a changing environment, the direct benefit of a waterlogging control project should be the reduction in waterlogging losses compared with conditions under which no control project exists. Moreover, the waterlogging losses with and without the project should be the mathematical expectations of the waterlogging losses when rainstorms of all frequencies meet various water levels in the drainage-accepting zone. An estimation model of the direct benefit of waterlogging control is therefore proposed. First, on the basis of a copula function, the joint distribution of rainstorms and water levels is established so as to obtain their joint probability density function. Second, according to the two-dimensional joint probability density distribution, the two-dimensional domain of integration is determined and divided into small domains; for each small domain the probability is calculated, along with the difference between the average waterlogging loss with and without a control project (the regional benefit of the waterlogging control project) under the condition that rainstorms in the waterlogging-prone zone meet the water level in the drainage-accepting zone. Finally, the weighted mean of the benefit over all small domains, with probability as the weight, gives the benefit of the waterlogging control project. Taking the benefit estimation of a waterlogging control project in Yangshan County, Guangdong Province, as an example, the paper briefly explains the procedures in waterlogging control project benefit estimation.
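The final weighted-mean step can be sketched on a hypothetical discretized joint distribution. In the paper the cell probabilities come from the copula-based joint density of rainstorm magnitude and receiving water level; here both the probabilities and the losses are invented to show only the arithmetic:

```python
# Weighted-mean benefit over a discretized (rainstorm x water level)
# domain: benefit per cell is the loss without the project minus the
# loss with it, weighted by the cell probability. Values are invented.

def expected_benefit(cell_probs, loss_without, loss_with):
    """Probability-weighted mean of per-cell loss reduction."""
    total = 0.0
    for p, lw, lp in zip(cell_probs, loss_without, loss_with):
        total += p * (lw - lp)
    return total

# 2x2 grid flattened: (rainstorm low/high) x (water level low/high)
cell_probs   = [0.5, 0.2, 0.2, 0.1]    # must sum to 1
loss_without = [0.0, 4.0, 3.0, 10.0]   # hypothetical losses
loss_with    = [0.0, 1.0, 1.0, 4.0]
print(expected_benefit(cell_probs, loss_without, loss_with))
```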
Kernel and wavelet density estimators on manifolds and more general metric spaces
DEFF Research Database (Denmark)
Cleanthous, G.; Georgiadis, Athanasios; Kerkyacharian, G.
We consider the problem of estimating the density of observations taking values in classical or nonclassical spaces such as manifolds and more general metric spaces. Our setting is quite general but also sufficiently rich in allowing the development of smooth functional calculus with well localized spectral kernels, Besov regularity spaces, and wavelet type systems. Kernel and both linear and nonlinear wavelet density estimators are introduced and studied. Convergence rates for these estimators are established, which are analogous to the existing results in the classical setting of real…
International Nuclear Information System (INIS)
Zeng, G.L.; Gullberg, G.T.
1995-01-01
It is common practice to estimate kinetic parameters from dynamically acquired tomographic data by first reconstructing a dynamic sequence of three-dimensional reconstructions and then fitting the parameters to time-activity curves generated from the time-varying reconstructed images. However, in SPECT, the pharmaceutical distribution can change during the acquisition of a complete tomographic data set, which can bias the estimated kinetic parameters. It is hypothesized that more accurate estimates of the kinetic parameters can be obtained by fitting to the projection measurements instead of the reconstructed time sequence. Estimation from projections requires knowledge of the relationship between the tissue regions of interest or voxels with particular kinetic parameters and the projection measurements, which results in a complicated nonlinear estimation problem with a series of exponential factors with multiplicative coefficients. A technique is presented in this paper in which the exponential decay parameters are estimated separately using linear time-invariant system theory. Once the exponential factors are known, the coefficients of the exponentials can be estimated using linear estimation techniques. Computer simulations demonstrate that estimation of the kinetic parameters directly from the projections is more accurate than estimation from the reconstructed images.
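The two-stage idea, recover the decay constants first, then solve the now-linear problem for the multiplicative coefficients, can be sketched for a single exponential. This is a generic illustration on noiseless synthetic samples, not the paper's SPECT estimator:

```python
# Two-stage fit of y(t) = A * exp(-k * t): stage 1 recovers the decay
# constant k from the slope of log(y) versus t; stage 2, with k fixed,
# solves for A by linear least squares. Data are synthetic and noiseless.

import math

def fit_single_exponential(ts, ys):
    n = len(ts)
    logs = [math.log(y) for y in ys]
    tbar = sum(ts) / n
    lbar = sum(logs) / n
    slope = sum((t - tbar) * (l - lbar) for t, l in zip(ts, logs)) \
            / sum((t - tbar) ** 2 for t in ts)
    k = -slope
    # With k known, A enters linearly: least squares against exp(-k*t).
    basis = [math.exp(-k * t) for t in ts]
    A = sum(b * y for b, y in zip(basis, ys)) / sum(b * b for b in basis)
    return A, k

ts = [0.0, 1.0, 2.0, 3.0]
ys = [5.0 * math.exp(-0.4 * t) for t in ts]
A, k = fit_single_exponential(ts, ys)
print(round(A, 6), round(k, 6))
```

Splitting the nonlinear problem this way is what makes the projection-domain fit tractable: each stage is a linear estimation once the other's output is held fixed.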
Reliance on and Reliability of the Engineer’s Estimate in Heavy Civil Projects
Okere, George
2017-01-01
To the contractor, the engineer’s estimate is the target number to aim for, and the basis for a contractor to evaluate the accuracy of their estimate. To the owner, the engineer’s estimate is the basis for funding, evaluation of bids, and for predicting project costs. As such the engineer’s estimate is the benchmark. This research sought to investigate the reliance on, and the reliability of, the engineer’s estimate in heavy civil cost estimating. The research objective was to characterize the e...
Estimation of unknown nuclear masses by means of the generalized mass relations. Pt. 3
International Nuclear Information System (INIS)
Popa, S.M.
1980-01-01
A survey of the estimation of unknown nuclear masses by means of the generalized mass relations is presented. The new hypotheses supplementing the original Garvey-Kelson scheme are discussed, and the generalized mass relations and formulae are reviewed according to the present status of this new formalism. A critical discussion is given of the reliability of these new Garvey-Kelson-type extrapolation procedures. (author)
Galili, Tal; Meilijson, Isaac
2016-01-02
The Rao-Blackwell theorem offers a procedure for converting a crude unbiased estimator of a parameter θ into a "better" one, in fact unique and optimal if the improvement is based on a minimal sufficient statistic that is complete. In contrast, behind every minimal sufficient statistic that is not complete, there is an improvable Rao-Blackwell improvement. This is illustrated via a simple example based on the uniform distribution, in which a rather natural Rao-Blackwell improvement is uniformly improvable. Furthermore, in this example the maximum likelihood estimator is inefficient, and an unbiased generalized Bayes estimator performs exceptionally well. Counterexamples of this sort can be useful didactic tools for explaining the true nature of a methodology and possible consequences when some of the assumptions are violated. [Received December 2014. Revised September 2015.].
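The uniform-distribution setting behind examples of this kind is easy to reproduce numerically. For X1..Xn ~ U(0, θ), the crude unbiased estimator 2·mean can be improved via the sufficient statistic max(X), giving the unbiased estimator (n+1)/n·max with much smaller variance. The simulation below is a generic textbook illustration, not the article's specific counterexample:

```python
# Monte Carlo comparison of two unbiased estimators of theta for
# X1..Xn ~ Uniform(0, theta): the crude 2*mean versus (n+1)/n * max,
# the latter obtained by conditioning on the sufficient statistic.

import random

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

random.seed(0)
theta, n, reps = 1.0, 10, 4000
crude, improved = [], []
for _ in range(reps):
    xs = [random.uniform(0, theta) for _ in range(n)]
    crude.append(2 * sum(xs) / n)
    improved.append((n + 1) / n * max(xs))

print(variance(crude), variance(improved))
```

The theoretical variances are θ²/(3n) versus θ²/(n(n+2)), a fourfold reduction at n = 10, which the simulation reproduces.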
Energy Technology Data Exchange (ETDEWEB)
Casey, Daniel
1984-10-01
This assessment addresses the impacts on wildlife populations and wildlife habitats of the Hungry Horse Dam project on the South Fork of the Flathead River, and previous mitigation of these losses. In order to develop and focus mitigation efforts, it was first necessary to estimate the wildlife and wildlife habitat losses attributable to the construction and operation of the project. The purpose of this report was to document the best available information concerning the degree of impacts on target wildlife species. Indirect benefits to wildlife species not listed will be identified during the development of alternative mitigation measures. Wildlife species incurring positive impacts attributable to the project were identified.
Specifying general activity clusters for ERP projects aimed at effort prediction
Janssens, G.; Kusters, R.J.; Heemstra, F.J.; Gunasekaran, A.; Shea, T.
2010-01-01
ERP implementation projects affect large parts of an implementing organization and lead to changes in the way an organization performs its tasks. The costs needed for the effort to implement these systems are hard to estimate. Research indicates that the size of an ERP project can be a useful
Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang
2010-07-01
We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
2012-12-01
Estimates of value of time (VOT) and value of travel time savings (VTTS) are critical elements in benefit-cost analyses of transportation projects and in developing congestion pricing policies. In addition, differences in VTTS among various modes ...
Cost estimation using ministerial regulation of public work no. 11/2013 in construction projects
Arumsari, Putri; Juliastuti; Khalifah Al'farisi, Muhammad
2017-12-01
One of the first tasks in starting a construction project is to estimate the total cost of building it. In Indonesia there are several standards used to calculate the cost estimate of a project, one of which is based on the Ministerial Regulation of Public Work No. 11/2013. However, in a construction project the contractor often has its own cost estimate based on its own calculation. This research aimed to compare construction project total costs calculated under the Ministerial Regulation of Public Work No. 11/2013 against the contractors' calculations. Two projects were used as case studies: a 4-storey building located in the Pantai Indah Kapuk area (West Jakarta) and a warehouse located in Sentul (West Java), built by two different contractors. The cost estimates from both contractors' calculations were compared to those based on the Ministerial Regulation of Public Work No. 11/2013. It was found that the two calculations differed by around 1.80%-3.03% in total cost, with the estimate based on the Ministerial Regulation being higher than the contractors' calculations.
Integrating Portfolio Management and Simulation Concepts in the ERP Project Estimation Practice
Daneva, Maia; Paech, B.; Rolland, C
2008-01-01
This paper presents a two-site case study on requirements-based effort estimation practices in enterprise resource planning projects. Specifically, the case study investigated the question of how to handle qualitative data and highly volatile values of project context characteristics. We counterpart
Directory of Open Access Journals (Sweden)
Akanda Md. Abdus Salam
2017-03-01
Individual heterogeneity in capture probabilities and time dependence are fundamentally important for estimating the closed animal population parameters in capture-recapture studies. A generalized estimating equations (GEE) approach accounts for linear correlation among capture-recapture occasions, and individual heterogeneity in capture probabilities, in a closed population capture-recapture individual heterogeneity and time variation model. The estimated capture probabilities are used to estimate animal population parameters. Two real data sets are used for illustrative purposes. A simulation study is carried out to assess the performance of the GEE estimator. A Quasi-Likelihood Information Criterion (QIC) is applied for the selection of the best fitting model. This approach performs well when the estimated population parameters depend on the individual heterogeneity and the nature of linear correlation among capture-recapture occasions.
Cheng, Guang
2014-02-01
We consider efficient estimation of the Euclidean parameters in generalized partially linear additive models for longitudinal/clustered data when multiple covariates need to be modeled nonparametrically, and propose an estimation procedure based on a spline approximation of the nonparametric part of the model and on generalized estimating equations (GEE). Although the model in consideration is natural and useful in many practical applications, the literature on this model is very limited because of challenges in dealing with dependent data for nonparametric additive models. We show that the proposed estimators are consistent and asymptotically normal even if the covariance structure is misspecified. An explicit consistent estimate of the asymptotic variance is also provided. Moreover, we derive the semiparametric efficiency score and information bound under general moment conditions. By showing that our estimators achieve the semiparametric information bound, we effectively establish their efficiency in a stronger sense than what is typically considered for GEE. The derivation of our asymptotic results relies heavily on the empirical process tools that we develop for longitudinal/clustered data. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2014 ISI/BS.
Generalized projective synchronization of two coupled complex networks of different sizes
International Nuclear Information System (INIS)
Li Ke-Zan; He En; Zeng Zhao-Rong; Chi K. Tse
2013-01-01
We investigate a new generalized projective synchronization between two complex dynamical networks of different sizes. To the best of our knowledge, most of the current studies on projective synchronization have dealt with coupled networks of the same size. By generalized projective synchronization, we mean that the states of the nodes in each network can realize complete synchronization, and the states of a pair of nodes from both networks can achieve projective synchronization. Using the stability theory of the dynamical system, several sufficient conditions for guaranteeing the existence of the generalized projective synchronization under feedback control and adaptive control are obtained. As an example, we use Chua's circuits to demonstrate the effectiveness of our proposed approach.
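For readers unfamiliar with projective synchronization, a minimal single-node sketch may help. It uses simple active feedback control rather than the network scheme of this record; the Lorenz node dynamics, scale factor α and gain k are assumptions of the sketch. The error e = w − αx obeys ė = −ke and decays, so the response state tracks α times the drive state:

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0/3.0):
    x, y, z = s
    return np.array([sigma*(y - x), x*(rho - z) - y, x*y - beta*z])

alpha = 2.0                        # projective scale factor (assumed)
k = 5.0                            # feedback gain (assumed)
dt, steps = 1e-3, 20000

x = np.array([1.0, 1.0, 1.0])      # drive state
w = np.array([-3.0, 4.0, 7.0])     # response state
for _ in range(steps):
    dx = lorenz(x)
    # active control: w' = alpha*f(x) - k*(w - alpha*x),
    # so the error e = w - alpha*x satisfies e' = -k*e
    dw = alpha*dx - k*(w - alpha*x)
    x = x + dt*dx
    w = w + dt*dw

err = np.linalg.norm(w - alpha*x)
print(err)                         # near zero: projective synchronization
```

Changing α to −1 gives anti-synchronization and α = 1 complete synchronization, which is the sense in which the scale factor generalizes both.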
GENERAL ALGORITHMIC SCHEMA OF THE PROCESS OF THE CHILL AUXILIARIES PROJECTION
Directory of Open Access Journals (Sweden)
A. N. Chichko
2006-01-01
The general algorithmic diagram systematizing the existing approaches to the design process is offered, and the foundation is laid for a computer system for the construction of chill mold tooling.
Directory of Open Access Journals (Sweden)
Nicola Koper
2012-03-01
Resource selection functions (RSF) are often developed using satellite (ARGOS) or Global Positioning System (GPS) telemetry datasets, which provide a large amount of highly correlated data. We discuss and compare the use of generalized linear mixed-effects models (GLMM) and generalized estimating equations (GEE) for using this type of data to develop RSFs. GLMMs directly model differences among caribou, while GEEs depend on an adjustment of the standard error to compensate for correlation of data points within individuals. Empirical standard errors, rather than model-based standard errors, must be used with either GLMMs or GEEs when developing RSFs. There are several important differences between these approaches; in particular, GLMMs are best for producing parameter estimates that predict how management might influence individuals, while GEEs are best for predicting how management might influence populations. As the interpretation, value, and statistical significance of both types of parameter estimates differ, it is important that users select the appropriate analytical method. We also outline the use of k-fold cross validation to assess fit of these models. Both GLMMs and GEEs hold promise for developing RSFs as long as they are used appropriately.
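The central practical point here, that empirical (sandwich) standard errors rather than model-based ones should be used with clustered telemetry-like data, can be sketched on synthetic data. The code below is an illustrative GEE with an independence working model (logistic regression fitted by Newton-Raphson); all data-generating values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_clusters, m = 200, 20                       # e.g. animals x locations
u = rng.normal(0.0, 1.5, n_clusters)          # induces within-animal correlation
x = rng.normal(0.0, 1.0, (n_clusters, m))     # a habitat covariate
eta = -0.5 + 1.0 * x + u[:, None]
y = (rng.uniform(size=eta.shape) < 1.0 / (1.0 + np.exp(-eta))).astype(float)

X = np.stack([np.ones_like(x), x], axis=-1)   # design: (clusters, m, 2)
beta = np.zeros(2)
for _ in range(25):                           # Newton-Raphson under an
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))     # independence working model
    grad = np.einsum('cmk,cm->k', X, y - p)
    H = np.einsum('cmk,cm,cml->kl', X, p * (1 - p), X)
    beta += np.linalg.solve(H, grad)

p = 1.0 / (1.0 + np.exp(-(X @ beta)))
Hinv = np.linalg.inv(np.einsum('cmk,cm,cml->kl', X, p * (1 - p), X))
naive_se = np.sqrt(np.diag(Hinv))             # model-based standard errors
S = np.einsum('cmk,cm->ck', X, y - p)         # per-cluster score sums
robust_se = np.sqrt(np.diag(Hinv @ (S.T @ S) @ Hinv))  # empirical/sandwich
print(naive_se, robust_se)
```

With correlated observations the empirical standard errors (especially for cluster-constant terms such as the intercept) exceed the naive model-based ones, which is exactly why the abstract insists on them.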
International Nuclear Information System (INIS)
Li Liang; Chen Zhiqiang; Xing Yuxiang; Zhang Li; Kang Kejun; Wang Ge
2006-01-01
In recent years, image reconstruction methods for cone-beam computed tomography (CT) have been extensively studied. However, few of these studies discussed computing parallel-beam projections from cone-beam projections. In this paper, we focus on the exact synthesis of complete or incomplete parallel-beam projections from cone-beam projections. First, an extended central slice theorem is described to establish a relationship between the Radon space and the Fourier space. Then, data sufficiency conditions are proposed for computing parallel-beam projection data from cone-beam data. Using these results, a general filtered backprojection algorithm is formulated that can exactly synthesize parallel-beam projection data from cone-beam projection data. As an example, we prove that parallel-beam projections can be exactly synthesized in an angular range in the case of circular cone-beam scanning. Interestingly, this angular range is larger than that derived in the Feldkamp reconstruction framework. Numerical experiments are performed in the circular scanning case to verify our method.
DOTD support for UTC project : travel time estimation using bluetooth, [research project capsule].
2013-10-01
Travel time estimates are useful tools for measuring congestion in an urban area. Current practice involves using probe vehicles or video cameras to measure travel time, but this is a labor-intensive and expensive means of obtaining the information....
The USL NASA PC R and D project: General specifications of objectives
Dominick, Wayne D. (Editor)
1984-01-01
Given here are the general specifications of the objectives of the University of Southwestern Louisiana Data Base Management System (USL/DBMS) NASA PC R and D Project, a project initiated to address future R and D issues related to PC-based processing environments acquired pursuant to the NASA contract work; namely, the IBM PC/XT systems.
Bulk material management mode of general contractors in nuclear power project
International Nuclear Information System (INIS)
Zhang Jinyong; Zhao Xiaobo
2011-01-01
The paper introduces the characteristics of bulk material management mode in construction project, and the advantages and disadvantages of bulk material management mode of general contractors in nuclear power project. In combination with the bulk material management mode of China Nuclear Power Engineering Co., Ltd, some improvement measures have been put forward as well. (authors)
International Nuclear Information System (INIS)
Ruehrnschopf, Ernst-Peter; Klingenbeck, Klaus
2011-01-01
The main components of scatter correction procedures are scatter estimation and a scatter compensation algorithm. This paper completes a previous paper where a general framework for scatter compensation was presented under the prerequisite that a scatter estimation method is already available. In the current paper, the authors give a systematic review of the variety of scatter estimation approaches. Scatter estimation methods are based on measurements, mathematical-physical models, or combinations of both. For completeness they present an overview of measurement-based methods, but the main topic is the theoretically more demanding models, such as analytical, Monte Carlo, and hybrid models. Further classifications are 3D image-based and 2D projection-based approaches. The authors present a system-theoretic framework, which allows one to proceed top-down from a general 3D formulation, by successive approximations, to efficient 2D approaches. A widely useful method is the beam-scatter-kernel superposition approach. Together with the review of standard methods, the authors discuss their limitations and how to take into account the issues of object dependency, spatial variance, deformation of scatter kernels, and external and internal absorbers. Open questions for further investigations are indicated. Finally, the authors comment on some special issues and applications, such as the bow-tie filter, offset detector, truncated data, and dual-source CT.
Koch, Stefan; Mitlöhner, Johann
2010-08-01
ERP implementation projects have received enormous attention in recent years, due to their importance for organisations, as well as the costs and risks involved. The estimation of effort and costs associated with new projects is therefore an important topic. Unfortunately, there is still a lack of models that can cope with the special characteristics of these projects. As the main focus lies in adapting and customising a complex system, and even changing the organisation, traditional models like COCOMO cannot easily be applied. In this article, we apply effort estimation based on social choice in this context. Social choice deals with aggregating the preferences of a number of voters into a collective preference, and we apply this idea by substituting the voters with project attributes. Therefore, instead of supplying numeric values for various project attributes, a new project only needs to be placed into rankings per attribute, necessitating only ordinal values, and the resulting aggregate ranking can be used to derive an estimation. We describe the estimation process using a data set of 39 projects, and compare the results to other approaches proposed in the literature.
Generalized allometric regression to estimate biomass of Populus in short-rotation coppice
Energy Technology Data Exchange (ETDEWEB)
Ben Brahim, Mohammed; Gavaland, Andre; Cabanettes, Alain [INRA Centre de Toulouse, Castanet-Tolosane Cedex (France). Unite Agroforesterie et Foret Paysanne
2000-07-01
Data from four different stands were combined to establish a single generalized allometric equation to estimate the above-ground biomass of individual Populus trees grown in short-rotation coppice. The generalized model was fitted using diameter at breast height together with the mean diameter and mean height of each site as independent variables, and was then compared with the stand-specific regressions using an F-test. Results showed that this single regression estimates tree biomass well at each stand and does not introduce bias with increasing diameter.
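A minimal sketch of the underlying log-log allometric fit (the coefficients and diameters below are hypothetical; the generalized model of the paper additionally includes stand mean diameter and mean height as predictors):

```python
import numpy as np

# noiseless synthetic allometry: biomass = a * dbh^b (a, b hypothetical)
a_true, b_true = 0.08, 2.4
dbh = np.array([2.0, 3.5, 5.0, 7.5, 10.0, 12.5, 15.0])   # cm
biomass = a_true * dbh ** b_true

# log-log least squares: log(biomass) = log(a) + b * log(dbh)
b_hat, log_a_hat = np.polyfit(np.log(dbh), np.log(biomass), 1)
a_hat = np.exp(log_a_hat)
print(a_hat, b_hat)      # recovers a_true and b_true on noiseless data
```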
Prediction of Monte Carlo errors by a theory generalized to treat track-length estimators
International Nuclear Information System (INIS)
Booth, T.E.; Amster, H.J.
1978-01-01
Present theories for predicting expected Monte Carlo errors in neutron transport calculations apply to estimates of flux-weighted integrals sampled directly by scoring individual collisions. To treat track-length estimators, the recent theory of Amster and Djomehri is generalized to allow the score distribution functions to depend on the coordinates of two successive collisions. It has long been known that the expected track length in a region of phase space equals the expected flux integrated over that region, but that the expected statistical error of the Monte Carlo estimate of the track length is different from that of the flux integral obtained by sampling the sum of the reciprocals of the cross sections for all collisions in the region. These conclusions are shown to be implied by the generalized theory, which provides explicit equations for the expected values and errors of both types of estimators. Sampling expected contributions to the track-length estimator is also treated. Other general properties of the errors for both estimators are derived from the equations and physically interpreted. The actual values of these errors are then obtained and interpreted for a simple specific example.
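The equality of expectations and inequality of variances can be checked in a deliberately simplified setting: a purely absorbing 1D medium with a unit source at the origin, where both estimators target the flux integrated over a region [a, b]. All parameter values are assumptions of the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, a, b, n = 1.0, 1.0, 2.0, 200_000   # total cross section, region [a, b]

d = rng.exponential(1.0 / sigma, n)       # distance to the (absorbing) collision

# track-length estimator: path length each particle travels inside [a, b]
track = np.clip(np.minimum(d, b) - a, 0.0, None)
# collision estimator: score 1/sigma for each collision landing in [a, b]
coll = ((d >= a) & (d < b)) / sigma

# analytic flux integral: int_a^b exp(-sigma*x) dx
exact = (np.exp(-sigma * a) - np.exp(-sigma * b)) / sigma
print(track.mean(), coll.mean(), exact)
print(track.var(), coll.var())            # equal means, unequal variances
```

Both sample means agree with the analytic value, while the variances differ (here the track-length estimator is the less noisy of the two), mirroring the distinction drawn in the abstract.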
Non-parametric estimation of the availability in a general repairable system
International Nuclear Information System (INIS)
Gamiz, M.L.; Roman, Y.
2008-01-01
This work deals with repairable systems with unknown failure and repair time distributions. We focus on the estimation of the instantaneous availability, that is, the probability that the system is functioning at a given time, which we consider the most significant measure for evaluating the effectiveness of a repairable system. The estimation of the availability function is not, in general, an easy task; analytical techniques are difficult to apply. We propose a smooth estimator of the availability based on kernel estimators of the cumulative distribution functions (CDFs) of the failure and repair times, for which the bandwidth parameters are obtained by bootstrap procedures. The consistency properties of the availability estimator are established by using techniques based on the Laplace transform.
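The kernel CDF estimator that this availability estimator builds on can be sketched briefly. The gamma-distributed sample and rule-of-thumb bandwidth below are assumptions (the paper selects bandwidths by bootstrap and plugs such CDF estimates into the availability via renewal/Laplace-transform arguments, which this sketch does not reproduce):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)
repair_times = rng.gamma(shape=2.0, scale=1.5, size=300)   # unknown law

def kernel_cdf(t, data, h):
    # Gaussian-kernel smoothed empirical CDF: average of Phi((t - X_i) / h)
    return float(np.mean([0.5 * (1.0 + erf((t - xi) / (h * sqrt(2.0))))
                          for xi in data]))

h = 1.06 * repair_times.std() * len(repair_times) ** (-0.2)  # rule of thumb
t_med = float(np.median(repair_times))
print(kernel_cdf(t_med, repair_times, h))   # near 0.5 at the sample median
```

Unlike the step-function empirical CDF, this estimate is smooth in t, which is what makes the subsequent availability computation tractable.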
Definition of the generalized criterion of estimation of ecological purity of textile products
International Nuclear Information System (INIS)
Gintibidze, N.; Valishvili, T.
2009-01-01
An important problem is the estimation of the hygienic and ecological properties of fabrics on the basis of data on the properties of the initial fiber. In the present article, the definition of a generalized criterion for estimating the ecological purity of textile products is discussed. The estimation is based on the International Standard EKO-TEX-100, which regulates the content of inorganic and organic compounds in textile production. The listed substances are determined according to the appropriate technique for each parameter; the quantity of each substance is determined and compared with the norms. The judgement about ecological purity is thus made parameter by parameter: there is no single parameter that could estimate the overall degree of ecological purity of textile products. To calculate the generalized criterion of estimation of ecological purity of textile products, it is proposed to score each criterion with points corresponding to each factor. The textile product is recognized as ecologically pure (environment friendly) if the total score is more than 1. (author)
Robust-BD Estimation and Inference for General Partially Linear Models
Directory of Open Access Journals (Sweden)
Chunming Zhang
2017-11-01
The classical quadratic loss for the partially linear model (PLM) and the likelihood function for the generalized PLM are not resistant to outliers. This inspires us to propose a class of "robust-Bregman divergence (BD)" estimators of both the parametric and nonparametric components in the general partially linear model (GPLM), which allows the distribution of the response variable to be partially specified, without being fully known. Using the local-polynomial function estimation method, we propose a computationally efficient procedure for obtaining "robust-BD" estimators and establish the consistency and asymptotic normality of the "robust-BD" estimator of the parametric component β_o. For inference procedures of β_o in the GPLM, we show that the Wald-type test statistic W_n constructed from the "robust-BD" estimators is asymptotically distribution free under the null, whereas the likelihood ratio-type test statistic Λ_n is not. This provides an insight into the distinction from the asymptotic equivalence (Fan and Huang 2005) between W_n and Λ_n in the PLM constructed from profile least-squares estimators using the non-robust quadratic loss. Numerical examples illustrate the computational effectiveness of the proposed "robust-BD" estimators and the robust Wald-type test in the presence of outlying observations.
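Robust-BD estimation itself is beyond a short example, but the outlier-resistance idea for the parametric part can be conveyed by a classical Huber M-estimator fitted by iteratively reweighted least squares — a deliberate stand-in, not the authors' method; the data and tuning constant are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
x = rng.normal(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5, n)
y[:10] += 15.0                      # plant a few gross outliers

X = np.column_stack([np.ones(n), x])

def huber_fit(X, y, c=1.345, iters=50):
    # iteratively reweighted least squares with Huber weights
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # start at OLS
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745            # robust (MAD) scale
        w = np.minimum(1.0, c * s / np.maximum(np.abs(r), 1e-12))
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

ols = np.linalg.lstsq(X, y, rcond=None)[0]
rob = huber_fit(X, y)
print(ols, rob)          # the robust fit stays near the true (1, 2)
```

The quadratic loss lets the ten contaminated points drag the OLS intercept upward, while the bounded-influence loss downweights them — the same qualitative behavior the robust-BD construction targets in the semiparametric setting.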
A Computer-Based Laboratory Project for the Study of Stimulus Generalization and Peak Shift
Derenne, Adam; Loshek, Eevett
2009-01-01
This paper describes materials designed for classroom projects on stimulus generalization and peak shift. A computer program (originally written in QuickBASIC) is used for data collection and a Microsoft Excel file with macros organizes the raw data on a spreadsheet and creates generalization gradients. The program is designed for use with human…
International Nuclear Information System (INIS)
Wu Di; Li Juan-Juan
2010-01-01
Based on the improved state observer and the pole placement technique, and by adding a constant that extends the scope of use of the original system, a new design method for generalized projective synchronization is proposed. With this method, by changing the projective synchronization scale factor, one can achieve not only complete synchronization, but also anti-synchronization, as well as an arbitrary percentage of projective synchronization, so that the system may attain arbitrary synchronization in a relatively short period of time, which makes this study more meaningful. By numerical simulation with appropriately chosen scale factors, the results of repeated experiments verify that this method is highly effective and satisfactory. Finally, based on this method and the relevant feedback concept, a novel secure communication project is designed. Numerical simulation verifies that this secure communication project is valid, and the decryption time in the experimental results is greatly improved.
Generalized algorithm for X-ray projections generation in cone-beam tomography
International Nuclear Information System (INIS)
Qin Zhongyuan; Mu Xuanqin; Wang Ping; Cai Yuanlong; Hou Chuanjian
2002-01-01
In order to get rid of random factors in the measurement, so as to support subsequent 3D reconstruction, a general approach is presented for obtaining X-ray projections in cone-beam tomography. The phantom is first discretized into a cubic volume through an inverse transformation; then a generalized projection procedure is applied to the digitized result without regard to what the phantom exactly is. In the second step, line integrals are calculated to obtain the projection of each X-ray by accumulating tri-linear interpolations. To handle projection angles, a rotation matrix is applied to the X-ray source and the detector plane, so projections at arbitrary angles can be obtained. In this approach the algorithm is easy to extend, and irregular objects can also be processed. The algorithm is implemented in Visual C++ and experiments are done using different models. Satisfactory results are obtained, providing good preparation for the subsequent reconstruction.
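A 2D parallel-beam simplification of the described procedure — line integrals accumulated from (bi)linear interpolation of a digitized phantom, with a rotation applied to obtain arbitrary angles — can be sketched as follows; the disk phantom and sampling parameters are assumptions of the sketch:

```python
import numpy as np

def bilinear(img, xs, ys):
    # sample img at fractional coordinates, treating outside as zero
    h, w = img.shape
    x0, y0 = np.floor(xs).astype(int), np.floor(ys).astype(int)
    fx, fy = xs - x0, ys - y0
    out = np.zeros(xs.shape)
    for dx, dy, wt in [(0, 0, (1-fx)*(1-fy)), (1, 0, fx*(1-fy)),
                       (0, 1, (1-fx)*fy),     (1, 1, fx*fy)]:
        xi, yi = x0 + dx, y0 + dy
        ok = (xi >= 0) & (xi < w) & (yi >= 0) & (yi < h)
        out[ok] += wt[ok] * img[yi[ok], xi[ok]]
    return out

def parallel_projection(img, theta, n_det=129, step=0.5):
    # line integrals along parallel rays at angle theta, accumulated
    # from bilinear interpolation (2D analogue of the tri-linear scheme)
    h, w = img.shape
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    t = np.linspace(-w / 2, w / 2, n_det)      # detector coordinate
    s = np.arange(-w, w, step)                 # samples along each ray
    ct, st = np.cos(theta), np.sin(theta)
    T, S = np.meshgrid(t, s, indexing='ij')
    xs = cx + T * ct - S * st                  # ray point = t*u + s*v
    ys = cy + T * st + S * ct
    return bilinear(img, xs, ys).sum(axis=1) * step

N, r = 64, 20                                  # hypothetical disk phantom
yy, xx = np.mgrid[0:N, 0:N]
disk = (((xx - (N-1)/2)**2 + (yy - (N-1)/2)**2) <= r*r).astype(float)

det_spacing = 64 / (129 - 1)                   # 0.5
p0 = parallel_projection(disk, 0.0)
p45 = parallel_projection(disk, np.pi / 4)
print(p0.sum() * det_spacing, p45.sum() * det_spacing, disk.sum())
```

Total projected mass is (approximately) conserved across angles and matches the phantom's integral, a quick sanity check analogous to those used when validating synthesized projections.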
de Sousa-Uva, Mafalda; Antunes, L; Nunes, B; Rodrigues, A P; Simões, J A; Ribeiro, R T; Boavida, J M; Matias-Dias, C
2016-10-01
Diabetes is known as a major cause of morbidity and mortality worldwide, and Portugal is the European country with the highest prevalence of this disease. While diabetes prevalence data are updated annually in Portugal, the General Practitioners' (GP) Sentinel Network represents the only data source on diabetes incidence. This study describes the trends in diabetes incidence between 1992 and 2015 and estimates projections of future incidence rates in Portugal until 2024. An ecological time-series study was conducted using data from the GP Sentinel Network between 1992 and 2015. Family doctors reported all new cases of diabetes in their patients' lists. Annual trends, as well as future incidence rates (until 2024), were estimated through Poisson regression models, stratified by sex and age group. Incidence rate projections were adjusted to the distribution of the resident Portuguese population given Statistics Portugal projections. The average increase in the diabetes incidence rate was 4.29% (95% CI 3.80-4.80) per year over the study period. Until 1998-2000 the annual incidence rate was higher in women, and from 1998-2000 to 2013-2015 it turned out to be higher in men. The incidence rate projected for 2022-2024 was 972.77 per 10^5 inhabitants in total, and 846.74 per 10^5 and 1114.42 per 10^5 in women and men, respectively. This is the first study in Portugal to estimate diabetes incidence rate projections. These disturbing projections seem realistic if past trends continue, and effective public health policies will need to be undertaken to minimize this alarming future scenario. Copyright © 2016 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
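The trend-plus-projection arithmetic can be illustrated with a log-linear fit on synthetic rates — a simplification of the Poisson regression used in the study; the baseline rate is made up, and the 4.29% figure from the abstract is merely reused as the synthetic growth rate:

```python
import numpy as np

# hypothetical incidence series growing at 4.29% per year
years = np.arange(1992, 2016)
rates = 300.0 * 1.0429 ** (years - 1992)   # per 10^5 person-years (made up)

# log-linear trend fit, then extrapolation to 2024
slope, intercept = np.polyfit(years - 1992, np.log(rates), 1)
annual_increase = np.exp(slope) - 1.0      # multiplicative yearly growth
proj_2024 = np.exp(intercept + slope * (2024 - 1992))
print(annual_increase, proj_2024)
```

On this noiseless series the fit recovers the growth rate exactly; compounding ~4.3% over three decades roughly quadruples the rate, which is why modest annual increases yield the alarming projected figures.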
Directory of Open Access Journals (Sweden)
Yun-zhi Zou
2012-01-01
A new class of generalized dynamical systems involving generalized f-projection operators is introduced and studied in Banach spaces. By using the fixed-point theorem due to Nadler, the equilibrium point set of this class of generalized global dynamical systems is proved to be nonempty and closed under some suitable conditions. Moreover, the solution set of the systems with set-valued perturbation is shown to be continuous with respect to the initial value.
Energy Technology Data Exchange (ETDEWEB)
Bifulco, P; Cesarelli, M; Roccasalva Firenze, M; Verso, E; Sansone, M; Bracale, M [University of Naples, Federico II, Electronic Engineering Department, Bioengineering Unit, Via Claudio, 21 - 80125 Naples (Italy)
1999-12-31
The aim of this study is to develop a method to estimate the 3D positioning of an anatomic structure using the knowledge of its volume (provided by CT or MRI) combined with a single radiographic projection. This method could be applied in stereotactic surgery or in the study of 3D body joint kinematics. The knowledge of the 3D anatomical structure, available from CT (or in future MRI), is used to estimate the orientation of the projection that best matches the actual available 2D projection. For this purpose it was necessary to develop an algorithm to simulate the radiographic projections. The radiographic image formation process has been simulated utilizing the geometrical characteristics of a real radiographic device and the volumetric anatomical data of the patient, obtained from 3D diagnostic CT images. The position of the patient volume with respect to the radiological device is estimated by comparing the actual radiographic projection with those simulated, maximising a similarity index. To assess the estimation, the 3D positioning of a segmented vertebra has been used as a test volume. The assessment has been carried out only by means of simulation, and estimation errors have been statistically evaluated. Conditions of mispositioning and noise have also been considered. The results of the simulation show the feasibility of the method. From the analysis of the errors it emerges that the searching procedure is robust with respect to the addition of white Gaussian noise. (authors) 13 refs., 4 figs., 1 tab.
Directory of Open Access Journals (Sweden)
van den Dungen C
2011-11-01
Background: General practice based registration networks (GPRNs) provide information on morbidity rates in the population. Morbidity rate estimates from different GPRNs, however, reveal considerable, unexplained differences. We studied the range and variation in morbidity estimates, as well as the extent to which the differences in morbidity rates between general practices and networks change if socio-demographic characteristics of the listed patient populations are taken into account. Methods: The variation in incidence and prevalence rates of thirteen diseases among six Dutch GPRNs and the influence of age, gender, socio-economic status (SES), urbanization level, and ethnicity are analyzed using multilevel logistic regression analysis. Results are expressed in median odds ratios (MOR). Results: We observed large differences in morbidity rate estimates both on the level of general practices and on the level of networks. The differences in SES, urbanization level and ethnicity distribution among the networks' practice populations are substantial. The variation in morbidity rate estimates among networks did not decrease after adjusting for these socio-demographic characteristics. Conclusion: Socio-demographic characteristics of populations do not explain the differences in morbidity estimations among GPRNs.
Biermans, M.C.J.; Verheij, R.A.; Bakker, D.H. de; Zielhuis, G.A.; Vries Robbé, P.F. de
2008-01-01
Objectives: In this study, we evaluated the internal validity of EPICON, an application for grouping ICPC-coded diagnoses from electronic medical records into episodes of care. These episodes are used to estimate morbidity rates in general practice. Methods: Morbidity rates based on EPICON were
A multivariate family-based association test using generalized estimating equations : FBAT-GEE
Lange, C; Silverman, SK; Xu; Weiss, ST; Laird, NM
In this paper we propose a multivariate extension of family-based association tests based on generalized estimating equations. The test can be applied to multiple phenotypes and to phenotypic data obtained in longitudinal studies without making any distributional assumptions for the phenotypic
Application of generalized estimating equations to a study in vitro of radiation sensitivity
International Nuclear Information System (INIS)
Cologne, J.B.; Carter, R.L.; Fujita, Shoichiro; Ban, Sadayuki.
1993-08-01
We describe an application of the generalized estimating equation (GEE) method (Liang K-Y, Zeger SL: Longitudinal data analysis using generalized linear models. Biometrika 73:13-22, 1986) for regression analyses of correlated Poisson data. As an alternative to the use of an arbitrarily chosen working correlation matrix, we demonstrate the use of GEE with a reasonable model for the true covariance structure among repeated observations within individuals. We show that, under such a split-plot design with large clusters, the asymptotic relative efficiency of GEE with simple (independence or exchangeable) working correlation matrices is rather low. We also illustrate the use of GEE with an empirically estimated model for overdispersion in a large study of radiation sensitivity where cluster size is small and a simple working correlation structure is sufficient. We conclude by summarizing issues and needs for further work concerning the efficiency of the GEE parameter estimates in practice. (author)
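The overdispersion issue that motivates moving beyond a naive Poisson analysis can be sketched with clustered counts; the intercept-only model, quasi-likelihood (Pearson) dispersion correction and cluster-level empirical standard error below are illustrative assumptions, not the study's split-plot analysis:

```python
import numpy as np

rng = np.random.default_rng(5)
clusters, m = 60, 8                          # e.g. donors x replicate dishes
lam = rng.gamma(shape=4.0, scale=2.5, size=clusters)   # cluster-level rates
y = rng.poisson(lam[:, None], size=(clusters, m)).astype(float)

mu = y.mean()                                # intercept-only Poisson fit
n = y.size
naive_se = np.sqrt(mu / n)                   # model-based: assumes Var = mean
phi = np.sum((y - mu)**2 / mu) / (n - 1)     # Pearson overdispersion estimate
quasi_se = naive_se * np.sqrt(phi)           # quasi-likelihood correction
cluster_means = y.mean(axis=1)               # empirical (GEE-style) SE from
emp_se = cluster_means.std(ddof=1) / np.sqrt(clusters)  # cluster summaries
print(naive_se, quasi_se, emp_se)
```

Because the cluster-level rates vary, the counts are overdispersed (φ well above 1) and both corrected standard errors exceed the naive Poisson one — the situation in which a covariance model or empirical variance estimate becomes essential.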
Generalized synchronization-based multiparameter estimation in modulated time-delayed systems
Ghosh, Dibakar; Bhattacharyya, Bidyut K.
2011-09-01
We propose a nonlinear active observer based generalized synchronization scheme for multiparameter estimation in time-delayed systems with periodic time delay. A sufficient condition for parameter estimation is derived using Krasovskii-Lyapunov theory, and the suggested scheme is shown to be globally and asymptotically stable by the same method. With this effective method, parameter identification and generalized synchronization of modulated time-delayed systems can be achieved simultaneously even when all the system parameters are unknown. We restrict our study to multiple parameter estimation in modulated time-delayed systems with a single state variable. Theoretical proof and numerical simulation demonstrate the effectiveness and feasibility of the proposed technique. The block diagram of an electronic circuit for the multiple time delay system shows that the method is easily applicable to practical communication problems.
International Nuclear Information System (INIS)
Wu, Xiangjun; Fu, Zhengye; Kurths, Jürgen
2015-01-01
In this paper, a new five-dimensional hyperchaotic system is proposed based on the Lü hyperchaotic system. Some of its basic dynamical properties, such as equilibria, Lyapunov exponents, bifurcations and various attractors, are investigated. Furthermore, a new secure communication scheme based on generalized function projective synchronization (GFPS) of this hyperchaotic system with an uncertain parameter is presented. The communication scheme is composed of the modulation, the chaotic transmitter, the chaotic receiver and the demodulation. The modulation mechanism is to modulate the message signal into the system parameter. The chaotic signals are then sent to the receiver via a public channel. At the receiver end, by designing the controllers and the parameter update rule, GFPS between the transmitter and receiver systems is achieved and the unknown parameter is estimated simultaneously. The message signal can finally be recovered using the identified parameter and the corresponding demodulation method. There is no limitation on the message size. Numerical simulations are performed to show the validity and feasibility of the presented secure communication scheme. (paper)
Lake Darling Flood Control Project, Souris River, North Dakota. General Project Design.
1983-06-01
[Abstract garbled during extraction. Recoverable fragments concern design computations for the project: the difference between air temperature measured at 10 feet and the snow-surface temperature, the corresponding dewpoint-temperature difference, atmospheric conditions in the river reach between the outlet of the dam and the gage, and total dissolved solids (TDS).]
Hoyer, Erik H; Friedman, Michael; Lavezza, Annette; Wagner-Kosmakos, Kathleen; Lewis-Cherry, Robin; Skolnik, Judy L; Byers, Sherrie P; Atanelov, Levan; Colantuoni, Elizabeth; Brotman, Daniel J; Needham, Dale M
2016-05-01
To determine whether a multidisciplinary mobility promotion quality-improvement (QI) project would increase patient mobility and reduce hospital length of stay (LOS). Implemented using a structured QI model, the project took place between March 1, 2013 and March 1, 2014 on 2 general medicine units in a large academic medical center. There were 3352 patients admitted during the QI project period. The Johns Hopkins Highest Level of Mobility (JH-HLM) scale, an 8-point ordinal scale ranging from bed rest (score = 1) to ambulating ≥250 feet (score = 8), was used to quantify mobility. Changes in JH-HLM scores were compared for the first 4 months of the project (ramp-up phase) versus 4 months after project completion (post-QI phase) using generalized estimating equations. We compared the relative change in median LOS for the project months versus 12 months prior among the QI units, using multivariable linear regression analysis adjusting for 7 demographic and clinically relevant variables. Comparing the ramp-up versus post-QI phases, the proportion of patients reaching JH-HLM's ambulation status increased from 43% to 70%, and the proportion with improved mobility scores between admission and discharge increased from 32% to 45%. More complex patients with longer expected LOS (>7 days) were associated with a significantly greater adjusted median reduction in LOS of 1.11 days (95% CI: -1.53 to -0.65). Increased mobility was not associated with an increase in injurious falls compared to 12 months prior on the QI units (P = 0.73). Active prevention of a decline in physical function that commonly occurs during hospitalization may be achieved with a structured QI approach. In an adult medicine population, our QI project was associated with improved mobility, and this may have contributed to a reduction in LOS, particularly for more complex patients with longer expected hospital stay. Journal of Hospital Medicine 2016. © 2016 Society of Hospital Medicine.
Early cost estimating for road construction projects using multiple regression techniques
Directory of Open Access Journals (Sweden)
Ibrahim Mahamid
2011-12-01
The objective of this study is to develop early cost estimating models for road construction projects using multiple regression techniques, based on 131 sets of data collected in the West Bank in Palestine. As cost estimates are required at the early stages of a project, consideration was given to the fact that the input data for the required regression models could be easily extracted from sketches or the scope definition of the project. Eleven regression models were developed to estimate the total cost of a road construction project in US dollars; 5 of them include bid quantities as input variables and 6 include road length and road width. The coefficient of determination R² for the developed models ranges from 0.92 to 0.98, which indicates that the values predicted by the forecast models fit the real-life data well. The mean absolute percentage error (MAPE) values of the developed regression models range from 13% to 31%. These results compare favorably with past research, which has shown that estimate accuracy in the early stages of a project is between ±25% and ±50%.
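The modeling and evaluation steps described above can be sketched as follows. This is a hedged illustration on synthetic data (not the 131 West Bank projects); the cost model, coefficients and noise level are assumptions, but the fit statistics (R² and MAPE) are computed exactly as defined in the study.

```python
import numpy as np

# Synthetic "early stage" inputs: road length and width for 131 projects.
rng = np.random.default_rng(1)
length_km = rng.uniform(0.5, 10.0, 131)
width_m = rng.uniform(4.0, 12.0, 131)
# Assumed true cost process: proportional to area, with multiplicative noise.
cost = 30_000 * length_km * width_m * rng.lognormal(0.0, 0.15, 131)

# Multiple regression with an interaction term, fit by least squares.
X = np.column_stack([np.ones(131), length_km, width_m, length_km * width_m])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
pred = X @ beta

# Goodness-of-fit measures used in the paper.
r2 = 1 - np.sum((cost - pred) ** 2) / np.sum((cost - np.mean(cost)) ** 2)
mape = np.mean(np.abs((cost - pred) / cost)) * 100
print(f"R^2 = {r2:.3f}, MAPE = {mape:.1f}%")
```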
Carbon Footprint Estimation Tool for Residential Buildings for Non-Specialized Users: OERCO2 Project
Directory of Open Access Journals (Sweden)
Jaime Solís-Guzmán
2018-04-01
Existing tools for the environmental certification of buildings fail to reach the general public and to create social awareness, since they require not only specialized knowledge of construction and energy sources but also environmental knowledge. In this paper, an open-source online tool for the estimation of the carbon footprint of residential buildings by non-specialized users is presented as a product of the OERCO2 Erasmus+ project. The internal calculations, data management and operation of this tool are explained in detail. The ten most common building typologies built in the last decade in Spain are analysed using the OERCO2 tool, and the order of magnitude of the results is assessed by comparing them to the ranges determined by other authors. The OERCO2 tool proves reliable, with its results falling within the defined logical value ranges. Moreover, the major simplification of its interface allows non-specialized users to evaluate the sustainability of buildings. Further research is oriented towards its inclusion in other environmental certification tools and in Building Information Modeling (BIM) environments.
A New 4D Hyperchaotic System and Its Generalized Function Projective Synchronization
Directory of Open Access Journals (Sweden)
Yuan Gao
2013-01-01
A new four-dimensional hyperchaotic system is investigated. Numerical and analytical studies are carried out on its basic dynamical properties, such as equilibrium point, Lyapunov exponents, Poincaré maps, and chaotic dynamical behaviors. We verify the realizability of the new system via an electronic circuit by using Multisim software. Furthermore, a generalized function projective synchronization scheme of two different hyperchaotic systems with uncertain parameters is proposed, which includes some existing projective synchronization schemes, such as generalized projection synchronization and function projective synchronization. Based on the Lyapunov stability theory, a controller with parameters update laws is designed to realize synchronization. Using this controller, we realize the synchronization between Chen hyperchaotic system and the new system to verify the validity and feasibility of our method.
International Nuclear Information System (INIS)
Sokoli, Florian; Alber, Gernot
2014-01-01
Projective norms are capable of measuring entanglement of multipartite quantum states. However, typically, the explicit computation of these distance-based geometric entanglement monotones is very difficult even for finite dimensional systems. Motivated by the significance of Schmidt decompositions for our quantitative understanding of bipartite quantum entanglement, a generalization of this concept to multipartite scenarios is proposed, in the sense that generalized Schmidt decomposability of a multipartite pure state implies that its projective norm can be calculated in a simple way analogous to the bipartite case. Thus, this concept of generalized Schmidt decomposability of multipartite quantum states is linked in a natural way to projective norms as entanglement monotones. Therefore, it may not only be a convenient tool for calculations, but may also shed new light onto the intricate features of multipartite entanglement in an analogous way as the ‘classical’ Schmidt decomposition does for bipartite quantum systems. (paper)
Risk Consideration and Cost Estimation in Construction Projects Using Monte Carlo Simulation
Directory of Open Access Journals (Sweden)
Claudius A. Peleskei
2015-06-01
Construction projects usually involve high investments. They are therefore a risky venture for companies, as the actual costs of construction projects nearly always exceed the planned ones. This is due to the various risks and the large uncertainty existing within the industry. Determining and quantifying risks and their impact on project costs is described as one of the most difficult areas in the construction industry. This paper analyses how the cost of construction projects can be estimated using Monte Carlo simulation. It investigates whether the different cost elements in a construction project follow a specific probability distribution, and examines the effect of correlation between different project costs on the result of the Monte Carlo simulation. The paper finds that Monte Carlo simulation can be a helpful tool for risk managers and can be used for the cost estimation of construction projects. The research has shown that cost distributions are positively skewed and that cost elements seem to have some interdependent relationships.
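The two findings above (right-skewed cost distributions and correlated cost elements) can be combined in a minimal Monte Carlo sketch. All numbers below are made up for illustration: three cost elements with lognormal marginals are given a positive correlation through a Gaussian copula built from a Cholesky factor, and the simulated total cost comes out positively skewed.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
# Assumed correlation between the three cost elements.
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])
L = np.linalg.cholesky(corr)
z = rng.standard_normal((n, 3)) @ L.T          # correlated standard normals

means = np.log([500, 300, 200])                # median element costs (k$), assumed
sigmas = np.array([0.25, 0.35, 0.30])          # spread of each element, assumed
costs = np.exp(means + sigmas * z)             # right-skewed lognormal marginals
total = costs.sum(axis=1)                      # simulated total project cost

p50, p80 = np.percentile(total, [50, 80])      # typical risk-reporting percentiles
skew = np.mean(((total - total.mean()) / total.std()) ** 3)
print(f"P50={p50:.0f}, P80={p80:.0f}, skewness={skew:.2f}")
```

The gap between the P80 and P50 percentiles is one common way to express cost contingency; ignoring the correlations would understate it.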
Liu, Qingshan; Cao, Jinde
2010-06-01
Based on the projection operator, a recurrent neural network is proposed for solving extended general variational inequalities (EGVIs). Sufficient conditions are provided to ensure the global convergence of the proposed neural network based on Lyapunov methods. Compared with the existing neural networks for variational inequalities, the proposed neural network is a modified version of the general projection neural network existing in the literature and capable of solving the EGVI problems. In addition, simulation results on numerical examples show the effectiveness and performance of the proposed neural network.
Tang, Yang; Fang, Jian-an
2008-03-01
This work is concerned with general methods for the modified projective synchronization of hyperchaotic systems. A systematic method of active control is developed to synchronize two hyperchaotic systems with known parameters. Moreover, by combining the adaptive control and linear feedback methods, general sufficient conditions for the modified projective synchronization of identical or different chaotic systems with fully unknown or partially unknown parameters are presented. Meanwhile, the speed of parameter identification can be regulated by adjusting the adaptive gain matrix. Numerical simulations verify the effectiveness of the proposed methods.
The SKI-project External events - Phase 2. Estimation of fire frequencies per plant and per building
International Nuclear Information System (INIS)
Poern, K.
1996-08-01
The Swedish-Finnish handbook of initiating event frequencies, the I-Book, does not contain any fire frequencies. This omission is not defensible considering the substantial risk contribution from fires. In the PSAs performed hitherto, the initiating fire frequencies have been determined case by case. Because data in these areas are usually very scarce, it is very important to develop unambiguous definitions, to systematically utilize both international and national experience, and to establish an appropriate statistical estimation method. It is also important to present the accumulated experience so that it can be used for different purposes, not only within PSA but also in concrete fire-prevention work. During phase 1 of the project External Events, an inventory was made of existing methods for probabilistic fire analysis in general. During phase 2 of the project it was decided to begin work on a complementary handbook, called the X-Book, covering the frequencies of system-external events, i.e. initiating events caused by events occurring outside the system boundaries. In Version 1 of the X-Book, attention is mainly focused on the estimation of initiating fire frequencies, per plant and per building. This estimation is based on reports that the power companies have collected for this specific purpose. This report describes the statistical model and method used in the estimation process. The methodological results achieved may, possibly after some modification, also be applicable to other types of system-external events.
Han, Fang; Liu, Han
2017-02-01
Correlation matrices play a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state of the art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions with possible outliers. As a robust alternative, Han and Liu (2013b) advocated the use of a transformed version of the Kendall's tau sample correlation matrix for estimating the high dimensional latent generalized correlation matrix under the transelliptical distribution family (or elliptical copula). The transelliptical family assumes that, after unspecified marginal monotone transformations, the data follow an elliptical distribution. In this paper, we study the theoretical properties of the Kendall's tau sample correlation matrix and its transformed version proposed in Han and Liu (2013b) for estimating the population Kendall's tau correlation matrix and the latent Pearson's correlation matrix under both the spectral and restricted spectral norms. With regard to the spectral norm, we highlight the role of the "effective rank" in quantifying the rate of convergence. With regard to the restricted spectral norm, we present for the first time a "sign subgaussian condition" which is sufficient to guarantee that the rank-based correlation matrix estimator attains the optimal rate of convergence. In both cases, we do not need any moment condition.
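The transformation at the heart of the estimator above is the classical relation rho = sin(pi/2 * tau) between Kendall's tau and the latent Pearson correlation under an elliptical copula. A small sketch (illustrative data and sample size, not the paper's simulations): the marginals are distorted by unknown monotone transforms, yet the rank-based estimate still recovers the latent correlation.

```python
import numpy as np

def kendall_tau(x, y):
    # O(n^2) sample Kendall's tau; fine for moderate n.
    n = len(x)
    s = 0
    for i in range(n):
        s += np.sum(np.sign(x[i] - x[i + 1:]) * np.sign(y[i] - y[i + 1:]))
    return 2.0 * s / (n * (n - 1))

rng = np.random.default_rng(3)
rho = 0.6                                       # latent Pearson correlation
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=2000)
# Unknown monotone marginal transforms (the "transelliptical" ingredient):
x, y = np.exp(z[:, 0]), z[:, 1] ** 3

tau = kendall_tau(x, y)                         # invariant to the transforms
rho_hat = np.sin(np.pi / 2 * tau)               # back to the latent correlation
print(rho_hat)
```

Pearson's sample correlation of (x, y) would be badly biased here; the rank-based route is not, which is the motivation for the matrix-valued version studied in the paper.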
Directory of Open Access Journals (Sweden)
Sanjay Kumar Singh
2011-06-01
In this paper we propose Bayes estimators of the parameters of the Exponentiated Exponential distribution and its reliability function under the General Entropy loss function for Type-II censored samples. The proposed estimators are compared with the corresponding Bayes estimators obtained under the Squared Error loss function and with maximum likelihood estimators in terms of their simulated risks (average loss over the sample space).
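As a hedged aside on the loss function used above: for the general entropy loss L(d, t) proportional to (d/t)^p - p*log(d/t) - 1, the Bayes rule is commonly given as (E[theta^(-p)])^(-1/p) under the posterior, with p = -1 recovering the posterior mean (the squared-error estimator). The sketch below applies that rule to stand-in posterior draws; the posterior itself is an assumption, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(6)
# Stand-in posterior draws for a positive parameter (assumed Gamma posterior).
posterior = rng.gamma(shape=20.0, scale=0.1, size=200_000)

def ge_bayes(draws, p):
    # Bayes estimator under general entropy loss with shape parameter p.
    return np.mean(draws ** (-p)) ** (-1.0 / p)

for p in (-1.0, 0.5, 2.0):
    print(p, ge_bayes(posterior, p))
```

The estimator decreases as p grows, so p > 0 penalizes overestimation more heavily than underestimation, which is the usual reason for choosing this loss in reliability work.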
Balancing uncertainty of context in ERP project estimation: an approach and a case study
Daneva, Maia
2010-01-01
The increasing demand for Enterprise Resource Planning (ERP) solutions as well as the high rates of troubled ERP implementations and outright cancellations calls for developing effort estimation practices to systematically deal with uncertainties in ERP projects. This paper describes an approach -
Modified generalized method of moments for a robust estimation of polytomous logistic model
Directory of Open Access Journals (Sweden)
Xiaoshan Wang
2014-07-01
The maximum likelihood estimation (MLE) method typically used for polytomous logistic regression is prone to bias due to both misclassification in the outcome and contamination in the design matrix. Hence, robust estimators are needed. In this study, we propose such a method for nominal response data with continuous covariates. A generalized method of weighted moments (GMWM) approach is developed for dealing with contaminated polytomous response data. In this approach, distances are calculated based on individual sample moments, and Huber weights are applied to those observations with large distances. Mallows-type weights are also used to downweight leverage points. We describe the theoretical properties of the proposed approach. Simulations suggest that the GMWM performs very well in correcting contamination-caused biases. An empirical application of the GMWM estimator to data from a survey demonstrates its usefulness.
Tan, Jun; Nie, Zaiping
2018-05-12
Direction of Arrival (DOA) estimation of low-altitude targets is difficult due to multipath coherent interference from the ground-reflection image of the targets, especially for very high frequency (VHF) radars, whose antennas are severely restricted in aperture and height. In this paper, the polarization smoothing generalized multiple signal classification (MUSIC) algorithm, which combines polarization smoothing with the generalized MUSIC algorithm for polarization sensitive arrays (PSAs), is proposed to solve this problem. Firstly, polarization smoothing pre-processing is exploited to eliminate the coherence between the direct and specular signals. Secondly, we construct the generalized MUSIC algorithm for low-angle estimation. Finally, based on the geometry of the symmetric multipath model, the proposed algorithm converts the two-dimensional search into a one-dimensional search, thus reducing the computational burden. Numerical results verify the effectiveness of the proposed method, showing significantly improved angle estimation performance in the low-angle region compared with available methods, especially when the grazing angle is near zero.
National data analysis of general radiography projection method in medical imaging
Energy Technology Data Exchange (ETDEWEB)
Kim, Jung Su; Seo, Deok Nam; Choi, In Seok [Dept. of Bio-Convergence Engineering, Korea University Graduate School, Seoul (Korea, Republic of); and others
2014-09-15
According to the database of medical institutions of the Health Insurance Review and Assessment Service in 2013, 1,118 hospitals and clinics in Korea have a department of radiology, equipped with CT, fluoroscopic and general radiographic equipment. Among these, general radiographic equipment is the most commonly used in the radiology department, and most of it is changing from film-screen radiography systems to digital radiography systems. However, most digital radiography departments still rely on techniques developed for film-screen systems. Therefore, in this study, we surveyed the present state of technical items for general radiography used in hospitals and investigated general radiographic techniques in domestic medical institutions. We analyzed 26 radiographic projection methods, including chest, skull, spine and pelvis, which are commonly used in the radiography department.
Directory of Open Access Journals (Sweden)
P. Ribereau
2008-12-01
Since the pioneering work of Landwehr et al. (1979), Hosking et al. (1985) and their collaborators, the Probability Weighted Moments (PWM) method has been a popular, simple and efficient way to estimate the parameters of the Generalized Extreme Value (GEV) distribution when modeling the distribution of maxima (e.g., annual maxima of precipitation) in the Identically and Independently Distributed (IID) context. When the IID assumption is not satisfied, a flexible alternative, the Maximum Likelihood Estimation (MLE) approach, offers an elegant way to handle non-stationarities by letting the GEV parameters be time dependent. Despite its qualities, the MLE applied to the GEV distribution does not always provide accurate return level estimates, especially for small sample sizes or heavy tails. These drawbacks are particularly pronounced in some non-stationary situations. To reduce these negative effects, we propose to extend the PWM method to a more general framework that enables us to model temporal covariates and provide accurate GEV-based return levels. Theoretical properties of our estimators are discussed. Small and moderate sample size simulations in a non-stationary context are analyzed, and two brief applications to annual maxima of CO₂ and seasonal maxima of cumulated daily precipitation are presented.
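The stationary PWM/GEV machinery the abstract builds on can be sketched in a few lines. This follows the standard Hosking-style recipe (sample PWMs b0, b1, b2, then the usual rational approximation for the shape k); the simulated sample and true parameters are assumptions for illustration, not the paper's data.

```python
import numpy as np
from math import gamma, log

# Simulate a GEV(xi, alpha, k) sample via its quantile function
# x = xi + alpha * (1 - (-log u)^k) / k  (Hosking sign convention for k).
rng = np.random.default_rng(42)
xi, alpha, k = 10.0, 2.0, 0.1                 # true location, scale, shape
u = rng.uniform(size=5000)
x = np.sort(xi + alpha * (1.0 - (-np.log(u)) ** k) / k)

# Sample probability weighted moments b0, b1, b2 (x sorted ascending).
n = len(x)
j = np.arange(1, n + 1)
b0 = x.mean()
b1 = np.sum((j - 1) / (n - 1) * x) / n
b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n

# PWM estimators of the GEV parameters (rational approximation for k).
c = (2 * b1 - b0) / (3 * b2 - b0) - log(2) / log(3)
k_hat = 7.8590 * c + 2.9554 * c ** 2
alpha_hat = (2 * b1 - b0) * k_hat / (gamma(1 + k_hat) * (1 - 2 ** -k_hat))
xi_hat = b0 + alpha_hat * (gamma(1 + k_hat) - 1) / k_hat
print(xi_hat, alpha_hat, k_hat)
```

The paper's contribution is to extend exactly this kind of estimator to a non-stationary setting with temporal covariates, which the IID sketch above does not cover.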
International Nuclear Information System (INIS)
Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup
2015-01-01
In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF
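A minimal sketch of the GPD tail-modeling setting the abstract builds on, using SciPy's `genpareto`. The AIC-based threshold selection is the paper's contribution and is not reproduced here; instead a threshold is simply fixed at the 90th percentile, and all numbers are assumptions. Exceedances of an exponential sample are exactly GPD with shape 0, so the fit can be sanity-checked.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)
sample = rng.exponential(scale=2.0, size=10_000)

# Fix a threshold at the 90th percentile (the paper chooses this via AIC).
u = np.quantile(sample, 0.90)
exc = sample[sample > u] - u                     # exceedances over the threshold
c, loc, scale = genpareto.fit(exc, floc=0.0)     # GPD fit, location pinned at 0

# Tail quantile estimate: combine the empirical exceedance rate with the GPD.
p_exceed = np.mean(sample > u)
q999 = u + genpareto.ppf(1 - (1 - 0.999) / p_exceed, c, 0.0, scale)
print(f"shape={c:.3f}, scale={scale:.3f}, 99.9% quantile={q999:.2f}")
```

For an Exp(2) population the true 99.9% quantile is -2*ln(0.001) ≈ 13.82, so the tail model can be checked against it; the paper's point is that a poorly chosen threshold u degrades exactly this kind of estimate.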
International Nuclear Information System (INIS)
Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.
2001-01-01
The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image-based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and from using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of
Working with Climate Projections to Estimate Disease Burden: Perspectives from Public Health
Directory of Open Access Journals (Sweden)
Kathryn C. Conlon
2016-08-01
There is interest among agencies and public health practitioners in the United States (USA) in estimating the future burden of climate-related health outcomes. Calculating disease burden projections can be especially daunting, given the complexities of climate modeling and the multiple pathways by which climate influences public health. Interdisciplinary coordination between public health practitioners and climate scientists is necessary for scientifically derived estimates. We describe a unique partnership of state and regional climate scientists and public health practitioners assembled by the Florida Building Resilience Against Climate Effects (BRACE) program. We provide a background on climate modeling and projections developed specifically for public health practitioners, describe methodologies for combining climate and health data to project disease burden, and demonstrate three examples of this process as used in Florida.
Sparse Adaptive Channel Estimation Based on lp-Norm-Penalized Affine Projection Algorithm
Directory of Open Access Journals (Sweden)
Yingsong Li
2014-01-01
We propose an lp-norm-penalized affine projection algorithm (LP-APA) for broadband multipath adaptive channel estimation. The proposed LP-APA is realized by incorporating an lp-norm penalty into the cost function of the conventional affine projection algorithm (APA) to exploit the sparsity of the broadband wireless multipath channel, by which the convergence speed and steady-state performance of the APA are significantly improved. The implementation of the LP-APA is equivalent to adding a zero attractor to its iterations. Simulation results, obtained from a sparse channel estimation, demonstrate that the proposed LP-APA can efficiently improve channel estimation performance in terms of both convergence speed and steady-state performance when the channel is exactly sparse.
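The "zero attractor" mentioned above can be demonstrated on the simpler LMS cousin of the APA (the full LP-APA additionally projects over a window of past input vectors, which is omitted here). Everything below is a hypothetical sketch: channel taps, step size mu and attractor strength rho are made-up values. The extra -rho*sign(w) term nudges inactive taps toward zero, which is what speeds convergence on sparse channels.

```python
import numpy as np

rng = np.random.default_rng(2)
N, taps = 5000, 16
h = np.zeros(taps)
h[[2, 7, 12]] = [0.9, -0.5, 0.3]                 # sparse multipath channel (assumed)

x = rng.standard_normal(N + taps)                # white training input
w = np.zeros(taps)                               # adaptive filter weights
mu, rho = 0.01, 1e-4                             # step size, zero-attractor strength
for n in range(N):
    xn = x[n:n + taps][::-1]                     # regressor, most recent sample first
    d = h @ xn + 0.01 * rng.standard_normal()    # noisy channel output
    e = d - w @ xn                               # a priori estimation error
    w += mu * e * xn - rho * np.sign(w)          # LMS update + zero attractor
print(np.round(w, 2))
```

With rho = 0 this reduces to plain LMS; the attractor trades a small bias on the active taps (on the order of rho/mu) for faster decay of the inactive ones.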
International Nuclear Information System (INIS)
Alvarez, G.; Li Shujun; Montoya, F.; Pastor, G.; Romera, M.
2005-01-01
This paper describes the security weaknesses of a recently proposed secure communication method based on chaotic masking using projective synchronization of two chaotic systems. We show that the system is insecure and how to break it in two different ways: by high-pass filtering and by generalized synchronization.
On the Kähler-Ricci Flow on Projective Manifolds of General Type
Institute of Scientific and Technical Information of China (English)
Gang TIAN; Zhou ZHANG
2006-01-01
This note concerns the global existence and convergence of the solution of the Kähler-Ricci flow equation when the canonical class, K_X, is numerically effective and big. We clarify some known results regarding this flow on projective manifolds of general type and also present some new observations and refined results.
2010-07-01
34 CFR § 350.10 — What are the general requirements for Disability and Rehabilitation Research Projects? (Education, Regulations of the Offices of the Department of Education, 2010-07-01.) A project must carry out one or more of the following types of activities, as specified in §§ 350.13-350.19: (1) Research. (2) ...
Generalized projective synchronization of chaotic nonlinear gyros coupled with dead-zone input
International Nuclear Information System (INIS)
Hung, M.-L.; Yan, J.-J.; Liao, T.-L.
2008-01-01
This paper addresses the synchronization problem of drive-response chaotic gyros coupled with dead-zone nonlinear input. Using the sliding mode control technique, a novel control law is established which guarantees generalized projective synchronization even when the dead-zone nonlinearity is present. Numerical simulations are presented to verify that the synchronization can be achieved by using the proposed synchronization scheme
A projection-based approach to general-form Tikhonov regularization
DEFF Research Database (Denmark)
Kilmer, Misha E.; Hansen, Per Christian; Espanol, Malena I.
2007-01-01
We present a projection-based iterative algorithm for computing general-form Tikhonov regularized solutions to the problem min_x ||Ax - b||_2^2 + lambda^2 ||Lx||_2^2, where the regularization matrix L is not the identity. Our algorithm is designed for the common case where lambda is not known a priori...
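For small dense problems the general-form Tikhonov solution above can be computed directly by stacking [A; lambda*L] and solving one ordinary least-squares problem; the paper's projection-based iteration targets the large-scale case where this is infeasible. The test problem below (discretized integration operator, first-derivative L, chosen lambda) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
A = np.tril(np.ones((n, n))) / n               # discretized integration (ill-posed)
x_true = np.sin(np.linspace(0, np.pi, n))      # smooth exact solution
b = A @ x_true + 1e-3 * rng.standard_normal(n) # noisy right-hand side

# First-derivative regularization operator L (general form: L != identity).
L = (-np.eye(n) + np.eye(n, k=1))[:-1]
lam = 1e-2                                     # assumed regularization parameter

# min ||Ax-b||^2 + lam^2 ||Lx||^2  ==  least squares on the stacked system.
K = np.vstack([A, lam * L])
rhs = np.concatenate([b, np.zeros(n - 1)])
x_reg, *_ = np.linalg.lstsq(K, rhs, rcond=None)

rel_err = np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true)
print(f"relative error = {rel_err:.3f}")
```

Since lambda is rarely known a priori (the case the paper addresses), in practice one would sweep lambda and pick it by a parameter-choice rule such as the L-curve or GCV.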
International Nuclear Information System (INIS)
Wang, Cong; Zhang, Hong-li; Fan, Wen-hui
2017-01-01
In this paper, we propose a new method to improve the safety of secure communication. This method combines generalized dislocated lag projective synchronization and function projective synchronization to form a new generalized dislocated lag function projective synchronization. The paper takes the fractional order Chen system and the Lü system with uncertain parameters as illustrations. As the parameters of the two systems are uncertain, the nonlinear controller and parameter update algorithms are designed based on fractional stability theory and the adaptive control method. This synchronization form and control method are applied to secure communication via chaotic masking modulation. Many information signals can be recovered and validated. Finally, simulations show the validity and feasibility of the proposed scheme.
A General Method to Estimate Earthquake Moment and Magnitude using Regional Phase Amplitudes
Energy Technology Data Exchange (ETDEWEB)
Pasyanos, M E
2009-11-19
This paper presents a general method of estimating earthquake magnitude using regional phase amplitudes, called regional M{sub o} or regional M{sub w}. Conceptually, this method combines an earthquake source model with an attenuation model and geometrical spreading that accounts for propagation, allowing it to utilize regional amplitudes of any phase and frequency. Amplitudes are corrected to yield a source term from which one can estimate the seismic moment. Moment magnitudes can then be reliably determined from sets of observed phase amplitudes rather than predetermined ones, and afterwards averaged to robustly determine this parameter. We first examine several events in detail to demonstrate the methodology. We then look at various ensembles of phases and frequencies, and compare results to existing regional methods. We find regional M{sub o} to be a stable estimator of earthquake size that has several advantages over other methods. Because of its versatility, it is applicable to many more events, particularly smaller ones. We make moment estimates for earthquakes ranging from magnitude 2 to as large as 7. Even with diverse input amplitude sources, we find magnitude estimates to be more robust than typical magnitudes and existing regional methods, and they might be tuned further to improve upon them. The method yields the more meaningful quantity of seismic moment, which can be recast as M{sub w}. Lastly, it is applied here to the Middle East region using an existing calibration model, but it would be easy to transport to any region with suitable attenuation calibration.
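The moment-to-magnitude conversion underlying such estimates is a short calculation. The sketch below uses the standard Hanks-Kanamori relation, a general seismological convention rather than a formula quoted from this paper:

```python
import math

def moment_magnitude(m0):
    """Standard moment-magnitude relation (Hanks-Kanamori / IASPEI form):
    Mw = (2/3) * (log10(M0) - 9.1), with the seismic moment M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)
```

For example, a moment of about 4 x 10^16 N*m corresponds to Mw near 5.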
International Nuclear Information System (INIS)
Bavio, José; Marrón, Beatriz
2014-01-01
Quality of service (QoS) for internet traffic management requires good traffic models and good estimation of shared network resources. A link of a network processes all traffic and is designed with a certain capacity C and buffer size B. A Generalized Markov Fluid model (GMFM), introduced by Marrón (2011), is assumed for the sources because it describes the traffic in a versatile way, allows estimation based on traffic traces, and also permits consistent effective bandwidth estimation. QoS, interpreted as buffer overflow probability, can be estimated for the GMFM through effective bandwidth estimation and by solving the optimization problem presented in Courcoubetis (2002), the so-called inf-sup formulas. In this work we implement a code to solve the inf-sup problem and other related optimizations, which allows us to do traffic engineering on links of data networks and calculate both the minimum capacity required when QoS and buffer size are given and the minimum buffer size required when QoS and capacity are given.
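As a simplified illustration of the effective-bandwidth idea, the sketch below estimates an empirical effective bandwidth from iid per-slot workload samples and sizes the link with a large-buffer approximation. This is a hedged simplification, not the GMFM estimator or the full inf-sup computation of the paper:

```python
import numpy as np

def effective_bandwidth(samples, s):
    """Empirical effective bandwidth alpha(s) = log E[exp(s X)] / s,
    estimated from per-slot workload samples X (iid simplification)."""
    x = np.asarray(samples, dtype=float)
    return np.log(np.mean(np.exp(s * x))) / s

def min_capacity(samples, buffer_b, overflow_eps):
    """Large-buffer approximation P(overflow) ~ exp(-theta * B): for a
    target overflow probability eps, the required decay rate is
    theta = -ln(eps) / B, and the minimum service capacity is the
    effective bandwidth evaluated at that theta."""
    theta = -np.log(overflow_eps) / buffer_b
    return effective_bandwidth(samples, theta)
```

For a constant source of rate c the effective bandwidth is c at every s, so the required capacity is exactly the source rate, a useful sanity check.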
Mayer, Gerhard; Quast, Christian; Felden, Janine; Lange, Matthias; Prinz, Manuel; Pühler, Alfred; Lawerenz, Chris; Scholz, Uwe; Glöckner, Frank Oliver; Müller, Wolfgang; Marcus, Katrin; Eisenacher, Martin
2017-10-30
Sustainable noncommercial bioinformatics infrastructures are a prerequisite to use and take advantage of the potential of big data analysis for research and economy. Consequently, funders, universities and institutes as well as users ask for a transparent value model for the tools and services offered. In this article, a generally applicable lightweight method is described by which bioinformatics infrastructure projects can estimate the value of tools and services offered without determining exactly the total costs of ownership. Five representative scenarios for value estimation, from a rough estimation to a detailed breakdown of costs, are presented. To account for the diversity in bioinformatics applications and services, the notion of service-specific 'service provision units' is introduced together with the factors influencing them and the main underlying assumptions for these 'value influencing factors'. Special attention is given to how to handle personnel costs and indirect costs such as electricity. Four examples are presented for the calculation of the value of tools and services provided by the German Network for Bioinformatics Infrastructure (de.NBI): one for tool usage, one for (Web-based) database analyses, one for consulting services and one for bioinformatics training events. Finally, from the discussed values, the costs of direct funding and the costs of payment of services by funded projects are calculated and compared. © The Author 2017. Published by Oxford University Press.
General Aviation in Nebraska: Nebraska SATS Project Background Paper No. 1
Smith, Russell; Wachal, Jocelyn
2000-01-01
The Nebraska SATS project is a state-level component of NASA's Small Aircraft Transportation System (SATS). During the next several years the project will examine several different factors affecting SATS implementation in Nebraska. These include economic and taxation issues, public policy issues, airport planning processes, information dissemination strategies, and systemic change factors. This background paper profiles the general aviation system in Nebraska. It is written to provide information about the "context" within which SATS will be pursued. The primary focus is thus on describing and providing background information about the current situation. A secondary focus is on drawing general conclusions about the ability of the current system to incorporate the types of changes implied by SATS. First, some brief information on the U.S. aviation system is provided. The next two sections profile the current general aviation aircraft and pilot base. Nebraska's system of general aviation airports is then described. Within this section of the paper, information is provided on the different types of general aviation airports in Nebraska, airport activity levels and current infrastructure. The fourth major section of the background paper looks at Nebraska's local airport authorities. These special purpose local governments oversee the majority of the general aviation airports in the state. Among the items examined are total expenditures, capital expenditures and planning activities. Next, the paper provides background information on the Nebraska Department of Aeronautics (NDA) and recent Federal funding for general aviation in Nebraska. The final section presents summary conclusions.
International Nuclear Information System (INIS)
Andrasi, A.; Bailey, M.; Puncher, M.; Berkovski, V.; Blanchardon, E.; Jourdain, J.-R.; Castellani, C.-M.; Doerfel, H.; Hurtgen, Ch.; Le Guen, B.
2003-01-01
Several international intercomparison exercises on intake and internal dose assessments from monitoring data led to the conclusion that the results calculated by different participants varied significantly, mainly because of the wide variety of methods and assumptions applied in the assessment procedure. Based on these experiences, the need for harmonisation of the procedures has been formulated as an EU research project under the 5th Framework Programme (2001-2005), with the aim of developing general guidelines for standardising assessments of intakes and internal doses. In the IDEAS project eight institutions from seven European countries are participating, also using input from internal dosimetry professionals across Europe to ensure broad consensus in the outcome of the project. The IDEAS project is explained
The demand for refined petroleum products in Iran: Estimation and projection
International Nuclear Information System (INIS)
Kianian, A.M.
1990-01-01
The estimation and projection of the demand for refined petroleum products of the OPEC states are important for the world petroleum market from both the demand and supply sides. In this context, this study forms an econometric model to estimate the demand for the total and four major refined petroleum products (RPPs) in Iran and to project their future trends to the year 2000. The fact that Iran has the largest domestic demand for RPPs among all the OPEC members has motivated some research primarily to study the structure of the demand for such products. None, however, has utilized econometric models to estimate or project the demand for RPPs. The first section of this study discusses the structure of the Iranian energy market. Next, the demand functions for gasoline, kerosine, gas oil, fuel oil, and the total RPPs in Iran are estimated. The third section puts together the demand functions to form a model used to project the demand for RPPs up to the year 2000 under an historical scenario. Finally, some conclusions are offered. 7 tabs
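A minimal sketch of the kind of log-log econometric demand model such studies typically estimate. All numbers here are synthetic and the elasticities are assumptions for illustration, not Iranian data or the paper's estimates:

```python
import numpy as np

# Synthetic annual observations: real price index P and real income index Y,
# with demand Q generated from assumed elasticities so the fit can be checked.
P = np.array([1.00, 0.98, 0.95, 0.90, 0.88, 0.85])
Y = np.array([50.0, 52.0, 55.0, 59.0, 61.0, 64.0])
a_true, b_true, c_true = 1.0, -0.3, 1.2      # intercept, price, income elasticity
Q = np.exp(a_true + b_true * np.log(P) + c_true * np.log(Y))

# Log-log demand function: ln Q = a + b ln P + c ln Y
X = np.column_stack([np.ones_like(Q), np.log(P), np.log(Y)])
coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
a, b, c = coef

# Project demand under an assumed future price/income scenario
Q_future = np.exp(a + b * np.log(0.80) + c * np.log(70.0))
```

With noiseless synthetic data the regression recovers the assumed elasticities exactly; real projections would add scenario assumptions for prices and income.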
Estimating Climatological Bias Errors for the Global Precipitation Climatology Project (GPCP)
Adler, Robert; Gu, Guojun; Huffman, George
2012-01-01
A procedure is described to estimate bias errors for mean precipitation by using multiple estimates from different algorithms, satellite sources, and merged products. The Global Precipitation Climatology Project (GPCP) monthly product is used as a base precipitation estimate, with other input products included when they are within +/- 50% of the GPCP estimates on a zonal-mean basis (ocean and land separately). The standard deviation s of the included products is then taken to be the estimated systematic, or bias, error. The results allow one to examine monthly climatologies and the annual climatology, producing maps of estimated bias errors, zonal-mean errors, and estimated errors over large areas such as ocean and land for both the tropics and the globe. For ocean areas, where there is the largest question as to absolute magnitude of precipitation, the analysis shows spatial variations in the estimated bias errors, indicating areas where one should have more or less confidence in the mean precipitation estimates. In the tropics, relative bias error estimates (s/m, where m is the mean precipitation) over the eastern Pacific Ocean are as large as 20%, as compared with 10%-15% in the western Pacific part of the ITCZ. An examination of latitudinal differences over ocean clearly shows an increase in estimated bias error at higher latitudes, reaching up to 50%. Over land, the error estimates also locate regions of potential problems in the tropics and larger cold-season errors at high latitudes that are due to snow. An empirical technique to area average the gridded errors (s) is described that allows one to make error estimates for arbitrary areas and for the tropics and the globe (land and ocean separately, and combined). Over the tropics this calculation leads to a relative error estimate for tropical land and ocean combined of 7%, which is considered to be an upper bound because of the lack of sign-of-the-error canceling when integrating over different areas with a
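The core bias-error rule described above (the standard deviation of the input products that fall within +/- 50% of the GPCP base estimate) can be sketched in a few lines. The function below is an illustrative reading of that rule, not the GPCP production code:

```python
import numpy as np

def bias_error(base, products, tol=0.5):
    """Estimated bias (systematic) error: the standard deviation of the
    product estimates falling within +/- tol (default 50%) of the base
    (GPCP) estimate; products outside that band are excluded."""
    p = np.asarray(products, dtype=float)
    included = p[np.abs(p - base) <= tol * base]
    return included.std(ddof=0)
```

For example, with a base of 10 and products [9, 10, 11, 30], the outlier 30 is excluded and the bias error is the spread of the remaining three.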
Yu, M.; Wu, B.
2017-12-01
As an important part of coupled eco-hydrological processes, evaporation is the bond for exchange of energy and heat between the surface and the atmosphere. However, the estimation of evaporation remains a challenge compared with other main hydrological factors in the water cycle. The complementary relationship proposed by Bouchet (1963) has laid the foundation for various approaches to estimate evaporation from land surfaces; the essence of the principle is a relationship between three types of evaporation in the environment. It can be simply implemented with routine meteorological data, without the need for resistance parameters of the vegetation and bare land, which are difficult to observe and complicated to estimate in most surface flux models. On this basis, the generalized nonlinear formulation was proposed by Brutsaert (2015). Daily evaporation can be estimated once the potential evaporation (Epo) and apparent potential evaporation (Epa) are known. The new formulation has a strong physical basis and can be expected to perform better under natural water stress conditions; nevertheless, the model has not been widely validated over different climate types and underlying surface patterns. In this study, we apply the generalized nonlinear complementary relationship in North China. Three flux stations with different vegetation types are used for testing the universality and accuracy of this model against observed evaporation: Guantao Site, Miyun Site and Huailai Site. Guantao Site has double-cropping systems and crop rotations with summer maize and winter wheat; the other two sites are dominated by spring maize. Detailed measurements of meteorological factors at certain heights above the ground surface from automatic weather stations provide the necessary parameters for daily evaporation estimation. Using the Bowen ratio, the surface energy measured by the eddy covariance systems at the flux stations is adjusted on a daily scale
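Brutsaert's (2015) generalized nonlinear complementary relationship is commonly cited in the polynomial form y = (2 - x)x^2, with x = Epo/Epa and y = E/Epa. A minimal sketch of that form follows; verify the exact formulation against the original paper before relying on it:

```python
def brutsaert_2015(e_po, e_pa):
    """Generalized nonlinear complementary relationship (Brutsaert, 2015),
    commonly cited polynomial form: with x = Epo / Epa, the actual
    evaporation is E = Epa * (2 - x) * x**2. It gives E = Epa in the
    wet limit (x = 1) and E -> 0 under strong water stress (x -> 0)."""
    x = e_po / e_pa
    return e_pa * (2.0 - x) * x ** 2
```

The two limiting behaviours make a quick check: equal potential rates reproduce the wet-environment evaporation, while a small Epo/Epa ratio drives E toward zero.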
PMP Estimations at Sparsely Controlled Andinian Basins and Climate Change Projections
Lagos Zúñiga, M. A.; Vargas, X.
2012-12-01
Probable Maximum Precipitation (PMP) estimation implies an extensive review of hydrometeorological data and an understanding of precipitation formation processes. Different methodological approaches exist for its estimation, and all of them require a good spatial and temporal representation of storms. The estimation of hydrometeorological PMP in sparsely controlled basins is a difficult task, especially if the studied area has an important orographic effect due to mountains and mixed precipitation occurrence during the most severe storms. The main task of this study is to estimate PMP in a sparsely controlled basin with abrupt topography and mixed hydrology, also analyzing statistical uncertainties in the estimates and possible climate change effects on them. PMP estimation under statistical and hydrometeorological approaches (watershed-based and traditional depth-area-duration analysis) was done in a semi-arid zone at Puclaro dam in northern Chile. Due to the lack of good spatial meteorological representation in the study zone, we propose a methodology to account for the orographic effects of the Andes based on patterns from the RCM PRECIS-DGF and annual isohyetal maps. Estimations were validated against precipitation patterns for given winters, considering snow-route and rainfall gauges along the preferential wind direction, with good results. The estimations are also compared with the largest areal storms in the USA, Australia, India and China, and with frequency analysis at local rain gauge stations, in order to decide on the most adequate approach for the study zone. Climate change projections were evaluated with the ECHAM5 GCM, due to its good representation of the seasonality and magnitude of meteorological variables. Temperature projections for the 2040-2065 period show that there would be a rise in the catchment contributing area that would lead to an increase of the
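One widely used statistical PMP approach in this family is Hershfield's method. A minimal sketch follows; the frequency factor Km = 15 is the classical upper-bound default, an assumption for illustration rather than this study's calibrated value:

```python
import numpy as np

def hershfield_pmp(annual_maxima, km=15.0):
    """Hershfield's statistical PMP estimate: PMP = mean + Km * std of
    the annual maximum precipitation series. Km = 15 is the classical
    upper-bound frequency factor; site-specific values are usually lower."""
    x = np.asarray(annual_maxima, dtype=float)
    return x.mean() + km * x.std(ddof=1)   # sample std (ddof=1)
```

For an annual-maximum series of [40, 50, 60] mm this gives 50 + 15 * 10 = 200 mm.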
Directory of Open Access Journals (Sweden)
Kalpeshkumar Rohitbhai Patil
2016-10-01
Proper synchronization of a Distributed Generator with the grid, and its performance in grid-connected mode, relies on fast and precise estimation of the phase and amplitude of the fundamental component of the grid voltage. However, the accuracy with which the frequency is estimated depends on the type of grid voltage abnormality and the structure of the phase-locked loop or frequency-locked loop control scheme. Among various control schemes, the second-order generalized integrator based frequency-locked loop (SOGI-FLL) is reported to have the most promising performance. It tracks the frequency of the grid voltage accurately even when the grid voltage is characterized by sag, swell, harmonics, imbalance, frequency variations etc. However, the estimated frequency contains low-frequency oscillations when the sensed grid voltage has a dc offset. This paper presents a modified dual second-order generalized integrator frequency-locked loop (MDSOGI-FLL) for three-phase systems to cope with non-ideal three-phase grid voltages having all types of abnormalities, including dc offset. The complexity of the control scheme is almost the same as the standard dual SOGI-FLL, but the performance is enhanced. Simulation results show that the proposed MDSOGI-FLL is effective under all abnormal grid voltage conditions. The results are validated experimentally to justify the superior performance of MDSOGI-FLL under adverse conditions.
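The core SOGI building block can be sketched with a simple forward-Euler simulation. This shows only the fixed-frequency quadrature signal generator; the FLL frequency adaptation and the paper's dc-offset rejection are deliberately omitted, and the gain k = 1.41 is the commonly used sqrt(2) tuning, an assumption here:

```python
import numpy as np

def sogi(u, omega, dt, k=1.41):
    """Forward-Euler simulation of a second-order generalized integrator
    (SOGI) tuned to a fixed resonant frequency omega [rad/s]: v tracks
    the fundamental component of the input u, and qv is its 90-degree
    lagged quadrature companion."""
    v, qv = 0.0, 0.0
    vs, qvs = [], []
    for ui in u:
        dv = omega * (k * (ui - v) - qv)   # band-pass (in-phase) state
        dqv = omega * v                    # quadrature state
        v += dv * dt
        qv += dqv * dt
        vs.append(v)
        qvs.append(qv)
    return np.array(vs), np.array(qvs)

# 50 Hz test signal: after settling, v ~ sin(w t) and qv ~ -cos(w t)
t = np.arange(0.0, 0.2, 1e-5)
w = 2 * np.pi * 50.0
u = np.sin(w * t)
v, qv = sogi(u, w, 1e-5)
```

After a few milliseconds of transient, the in-phase output reproduces the input and the quadrature output lags it by 90 degrees, which is what downstream phase and frequency estimators rely on.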
Why Don't They Just Give Us Money? Project Cost Estimating and Cost Reporting
Comstock, Douglas A.; Van Wychen, Kristin; Zimmerman, Mary Beth
2015-01-01
Successful projects require an integrated approach to managing cost, schedule, and risk. This is especially true for complex, multi-year projects involving multiple organizations. To explore solutions and leverage valuable lessons learned, NASA's Virtual Project Management Challenge will kick off a three-part series examining some of the challenges faced by project and program managers when it comes to managing these important elements. In this first session of the series, we will look at cost management, with an emphasis on the critical roles of cost estimating and cost reporting. By taking a proactive approach to both of these activities, project managers can better control life cycle costs, maintain stakeholder confidence, and protect other current and future projects in the organization's portfolio. Speakers will be Doug Comstock, Director of NASA's Cost Analysis Division, Kristin Van Wychen, Senior Analyst in the GAO Acquisition and Sourcing Management Team, and Mary Beth Zimmerman, Branch Chief for NASA's Portfolio Analysis Branch, Strategic Investments Division. Moderator Ramien Pierre is from NASA's Academy for Program/Project and Engineering Leadership (APPEL).
Implications of applying solar industry best practice resource estimation on project financing
International Nuclear Information System (INIS)
Pacudan, Romeo
2016-01-01
Solar resource estimation risk is one of the main solar PV project risks that influence lenders' decisions in providing financing and in determining the cost of capital. More recently, a number of measures have emerged to mitigate this risk. The study focuses on the solar industry's best practice energy resource estimation and assesses its financing implications for a 27 MWp solar PV project study in Brunei Darussalam. The best practice in resource estimation uses multiple data sources combined through the measure-correlate-predict (MCP) technique, as compared with the standard practice that relies solely on a modelled data source. The best practice case generates resource data with lower uncertainty and yields a superior high-confidence energy production estimate compared to the standard practice case. Using project financial parameters in Brunei Darussalam for project financing and adopting the international debt-service coverage ratio (DSCR) benchmark rates, the best practice case yields DSCRs that surpass the target rates, while those of the standard practice case stay below the reference rates. The best practice case could also accommodate a higher debt share and have a lower levelized cost of electricity (LCOE), while the standard practice case would require a lower debt share while having a higher LCOE. - Highlights: •Best practice solar energy resource estimation uses multiple datasets. •Multiple datasets are combined through the measure-correlate-predict technique. •Correlated data have lower uncertainty and yield a superior high-confidence energy production estimate. •Best practice case yields debt-service coverage ratios (DSCRs) that surpass the benchmark rates. •Best practice case accommodates a higher debt share and has a lower levelized cost of electricity.
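The DSCR screening logic described above reduces to a per-period ratio compared against a benchmark. A minimal sketch, with an illustrative 1.30 target rather than the paper's figure:

```python
def dscr_series(cfads, debt_service):
    """Per-period debt-service coverage ratio: cash flow available for
    debt service divided by scheduled principal plus interest."""
    return [c / d for c, d in zip(cfads, debt_service)]

def meets_benchmark(cfads, debt_service, target=1.30):
    """A financing structure passes if the minimum DSCR over the debt
    tenor stays at or above the benchmark (1.30 here is illustrative)."""
    return min(dscr_series(cfads, debt_service)) >= target
```

A higher-confidence (lower-uncertainty) production estimate raises the projected CFADS floor, which is exactly what lifts the minimum DSCR above the benchmark in the best practice case.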
Chrysoulakis, Nektarios; Marconcini, Mattia; Gastellu-Etchegorry, Jean-Philippe; Grimmond, Sue; Feigenwinter, Christian; Lindberg, Fredrik; Del Frate, Fabio; Klostermann, Judith; Mitraka, Zina; Esch, Thomas; Landier, Lucas; Gabey, Andy; Parlow, Eberhard; Olofson, Frans
2017-04-01
The H2020-Space project URBANFLUXES (URBan ANthropogenic heat FLUX from Earth observation Satellites) investigates the potential of Copernicus Sentinels to retrieve the anthropogenic heat flux, a key component of the Urban Energy Budget (UEB). URBANFLUXES advances the current knowledge of the impacts of UEB fluxes on the urban heat island and consequently on energy consumption in cities. In URBANFLUXES, the anthropogenic heat flux is estimated as a residual of the UEB. Therefore, the remaining UEB components, namely the net all-wave radiation, the net change in heat storage, and the turbulent sensible and latent heat fluxes, are independently estimated from Earth Observation (EO), whereas the advection term is included in the error of the anthropogenic heat flux estimation from the UEB closure. The Discrete Anisotropic Radiative Transfer (DART) model is employed to improve the estimation of the net all-wave radiation balance, whereas the Element Surface Temperature Method (ESTM), adjusted to satellite observations, is used to improve the estimation of the net change in heat storage. Furthermore, the estimation of the turbulent sensible and latent heat fluxes is based on the Aerodynamic Resistance Method (ARM). Based on these outcomes, QF is estimated by regressing the sum of the turbulent heat fluxes versus the available energy. In-situ flux measurements are used to evaluate URBANFLUXES outcomes, whereas uncertainties are specified and analyzed. URBANFLUXES is expected to prepare the ground for further innovative exploitation of EO in scientific activities (climate variability studies at local and regional scales) and future and emerging applications (sustainable urban planning, mitigation technologies) to benefit climate change mitigation/adaptation. This study presents the results of the second phase of the project and detailed information on URBANFLUXES is available at: http://urbanfluxes.eu
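The residual closure of the urban energy budget described above reduces to one line. The sketch assumes all fluxes are already in consistent units (e.g. W m^-2):

```python
def anthropogenic_heat_flux(q_star, q_h, q_e, d_qs):
    """Urban energy budget closure Q* + QF = QH + QE + dQS, solved for
    the anthropogenic heat flux: QF = QH + QE + dQS - Q*.
    Advection, not modelled, ends up in the residual's error term."""
    return q_h + q_e + d_qs - q_star
```

For example, net radiation of 400, sensible 250, latent 120 and storage change 80 W m^-2 imply a residual QF of 50 W m^-2.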
Directory of Open Access Journals (Sweden)
Iman Yousefi
2015-01-01
This paper presents parameter estimation of a Permanent Magnet Synchronous Motor (PMSM) using a combinatorial algorithm. A nonlinear fourth-order state-space model of the PMSM is selected. This model is rewritten in linear regression form without linearization. Noise is imposed on the system in order to provide a realistic condition, and then the combinatorial Orthogonal Projection Algorithm and Recursive Least Squares (OPA&RLS) method is applied to the system in linear regression form. Results of this method are compared to those of the Orthogonal Projection Algorithm (OPA) and Recursive Least Squares (RLS) methods alone to validate the feasibility of the proposed method. Simulation results validate the efficacy of the proposed algorithm.
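A minimal recursive least squares (RLS) sketch of the kind combined with OPA in the paper, written for a generic linear regression form y = phi^T theta rather than the PMSM model itself:

```python
import numpy as np

def rls(phis, ys, n_params, lam=1.0, delta=1e3):
    """Recursive least squares for y = phi^T theta: the estimate theta
    and the covariance-like matrix P are updated one sample at a time
    (no batch matrix inversion), with forgetting factor lam
    (lam = 1 means no forgetting)."""
    theta = np.zeros(n_params)
    P = delta * np.eye(n_params)                 # large initial covariance
    for phi, y in zip(phis, ys):
        phi = np.asarray(phi, dtype=float)
        gain = P @ phi / (lam + phi @ P @ phi)   # Kalman-like gain
        theta = theta + gain * (y - phi @ theta)
        P = (P - np.outer(gain, phi @ P)) / lam
    return theta
```

With noiseless synthetic data generated from a known parameter vector, the recursion converges to that vector, which is the standard correctness check before adding measurement noise.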
International Nuclear Information System (INIS)
Song, N; Frey, E C; He, B; Wahl, R L
2011-01-01
Optimizing targeted radionuclide therapy requires patient-specific estimation of organ doses. The organ doses are estimated from quantitative nuclear medicine imaging studies, many of which involve planar whole body scans. We have previously developed the quantitative planar (QPlanar) processing method and demonstrated its ability to provide more accurate activity estimates than conventional geometric-mean-based planar (CPlanar) processing methods using physical phantom and simulation studies. The QPlanar method uses the maximum likelihood-expectation maximization algorithm, 3D organ volumes of interest (VOIs), and rigorous models of physical image degrading factors to estimate organ activities. However, the QPlanar method requires alignment between the 3D organ VOIs and the 2D planar projections and assumes uniform activity distribution in each VOI. This makes application to patients challenging. As a result, in this paper we propose an extended QPlanar (EQPlanar) method that provides independent-organ rigid registration and includes multiple background regions. We have validated this method using both Monte Carlo simulation and patient data. In the simulation study, we evaluated the precision and accuracy of the method in comparison to the original QPlanar method. For the patient studies, we compared organ activity estimates at 24 h after injection with those from conventional geometric mean-based planar quantification using a 24 h post-injection quantitative SPECT reconstruction as the gold standard. We also compared the goodness of fit of the measured and estimated projections obtained from the EQPlanar method to those from the original method at four other time points where gold standard data were not available. In the simulation study, more accurate activity estimates were provided by the EQPlanar method for all the organs at all the time points compared with the QPlanar method. Based on the patient data, we concluded that the EQPlanar method provided a
General guidance and procedures for estimating and reporting national GHG emissions for agriculture
International Nuclear Information System (INIS)
Rypdal, K.
2002-01-01
Greenhouse gas (GHG) emissions from agriculture account for a large share of total GHG emissions in most countries. Methane from ruminants, animal manure and rice fields, and nitrous oxide from agricultural soils are among the most important sources. In general, these emission estimates also are more uncertain than most other parts of the GHG emission inventory. IPCC has developed guidelines for estimating and reporting emissions of GHG. These guidelines shall be followed to secure complete, consistent, accurate and transparent reporting of emissions. However, the recommended methodologies are tiered, and choice of methods shall preferably reflect national circumstances, the national importance of a source, and different resources to prepare inventories. A country may also apply a national methodology given that it is well documented and not in conflict with good practice. Emission data reported under the United Nation Framework Convention on Climate Change are subject to external control, and the methodologies are reviewed by experts on agricultural inventories. (au)
International Nuclear Information System (INIS)
Xu Yuhua; Zhou Wuneng; Fang Jianan
2009-01-01
This paper introduces a modified Lue chaotic system, and some basic dynamical properties are studied. Based on these properties, we present a hybrid dislocated control method for stabilizing chaos to an unstable equilibrium and limit cycle. In addition, based on the Lyapunov stability theorem, general hybrid projective dislocated synchronization (GHPDS) is proposed, which includes complete dislocated synchronization, dislocated anti-synchronization and projective dislocated synchronization as special cases. The drive and response systems discussed in this paper can be strictly different dynamical systems (including systems of different dimension). As examples, the modified Lue chaotic system, Chen chaotic system and hyperchaotic Chen system are discussed. Numerical simulations are given to show the effectiveness of these methods.
Directory of Open Access Journals (Sweden)
V.V. Krivorotov
2008-06-01
The main tendencies of waste product formation and recycling in the Russian Federation and in the Sverdlovsk region have been analyzed, and the principal factors restraining the inclusion of anthropogenic formations into the economic circulation have been revealed in the work. A technical approach to the estimation of the integrated ecological and economic efficiency of recycling projects is proposed which, in the authors' opinion, secures higher objectivity of this estimation as well as the validity of the decisions made on their realization.
Using Intelligent Techniques in Construction Project Cost Estimation: 10-Year Survey
Directory of Open Access Journals (Sweden)
Abdelrahman Osman Elfaki
2014-01-01
Cost estimation is the most important preliminary process in any construction project. Therefore, construction cost estimation has the lion's share of the research effort in construction management. In this paper, we have analysed and studied proposals for construction cost estimation from the last 10 years. To implement this survey, we have proposed and applied a methodology that consists of two parts. The first part concerns data collection, for which we have chosen specialist journals as sources for the surveyed proposals. The second part concerns the analysis of the proposals. To analyse each proposal, the following four questions have been set: Which intelligent technique is used? How have data been collected? How are the results validated? And which construction cost estimation factors have been used? From the results of this survey, two main contributions have been produced. The first is the definition of the research gap in this area, which has not been fully covered by previous proposals for construction cost estimation. The second is the proposal and highlighting of future directions for forthcoming proposals, aimed ultimately at finding the optimal construction cost estimation. Moreover, we consider the second part of our methodology as one of the contributions of this paper; it has been proposed as a standard benchmark for construction cost estimation proposals.
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2014-01-01
For decades researchers have claimed that particularly demand forecasts and construction cost estimations are affected by a large degree of uncertainty. Massively, articles, research documents and reports agree that there exists a tendency towards underestimating the costs... in demand and cost estimations and hence the evaluation of transport infrastructure projects. Currently, research within this area is scarce and scattered, with no common agreement on how to embed and operationalise the huge amount of empirical data that exists within the frame of Optimism Bias. Therefore... converting deterministic benefit-cost ratios (BCRs) into stochastic interval results. A new data collection (2009-2013) forms the empirical basis for any risk simulation embedded within the so-called UP database (UNITE project database), revealing the inaccuracy of both construction costs and demand forecasts. Accordingly...
Estimating customer electricity savings from projects installed by the U.S. ESCO industry
Energy Technology Data Exchange (ETDEWEB)
Carvallo, Juan Pablo [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Larsen, Peter H. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Goldman, Charles A. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)
2014-11-25
The U.S. energy service company (ESCO) industry has a well-established track record of delivering substantial energy and dollar savings in the public and institutional facilities sector, typically through the use of energy savings performance contracts (ESPC) (Larsen et al. 2012; Goldman et al. 2005; Hopper et al. 2005, Stuart et al. 2013). This ~$6.4 billion industry, which is expected to grow significantly over the next five years, may play an important role in achieving demand-side energy efficiency under local/state/federal environmental policy goals. To date, there has been little or no research in the public domain to estimate electricity savings for the entire U.S. ESCO industry. Estimating these savings levels is a foundational step in order to determine total avoided greenhouse gas (GHG) emissions from demand-side energy efficiency measures installed by U.S. ESCOs. We introduce a method to estimate the total amount of electricity saved by projects implemented by the U.S. ESCO industry using the Lawrence Berkeley National Laboratory (LBNL) /National Association of Energy Service Companies (NAESCO) database of projects and LBNL’s biennial industry survey. We report two metrics: incremental electricity savings and savings from ESCO projects that are active in a given year (e.g., 2012). Overall, we estimate that in 2012 active U.S. ESCO industry projects generated about 34 TWh of electricity savings—15 TWh of these electricity savings were for MUSH market customers who did not rely on utility customer-funded energy efficiency programs (see Figure 1). This analysis shows that almost two-thirds of 2012 electricity savings in municipal, local and state government facilities, universities/colleges, K-12 schools, and healthcare facilities (i.e., the so-called “MUSH” market) were not supported by a utility customer-funded energy efficiency program.
Chatzidakis, Stylianos; Liu, Zhengzhi; Hayward, Jason P.; Scaglione, John M.
2018-03-01
This work presents a generalized muon trajectory estimation algorithm to estimate the path of a muon in either uniform or nonuniform media. The use of cosmic ray muons in nuclear nonproliferation and safeguards verification applications has recently gained attention due to the non-intrusive and passive nature of the inspection, the penetrating capabilities, as well as recent advances in detectors that measure the position and direction of individual muons before and after traversing the imaged object. However, muon image reconstruction techniques are limited in resolution due to low muon flux and the effects of multiple Coulomb scattering (MCS). Current reconstruction algorithms, e.g., point of closest approach (PoCA) or straight-line path (SLP), rely on overly simple assumptions for muon path estimation through the imaged object. For robust muon tomography, efficient and flexible physics-based algorithms are needed to model the MCS process and accurately estimate the most probable trajectory of a muon as it traverses an object. In the present work, the use of a Bayesian framework and a Gaussian approximation of MCS is explored for estimation of the most likely path of a cosmic ray muon traversing uniform or nonuniform media and undergoing MCS. The algorithm's precision is compared to Monte Carlo simulated muon trajectories. The algorithm is expected to predict muon tracks to within 1.5 mm root mean square (RMS) error for 0.5 GeV muons and 0.25 mm RMS for 3 GeV muons, a 50% improvement compared to SLP and a 15% improvement compared to PoCA. Further, a 30% increase in useful muon flux was observed relative to PoCA. Muon track prediction improved for higher muon energies or smaller penetration depths, where energy loss is not significant. The effect of energy loss due to ionization is investigated, and an easy-to-use linear energy loss relation is proposed.
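Gaussian approximations of MCS of the kind used in such Bayesian path estimators commonly take their width from the standard PDG Highland formula. The sketch below is a generic illustration of that formula, not the authors' code; the function name and the sample numbers are ours.

```python
import math

def highland_theta0(p_mev, beta, x_over_x0):
    """RMS plane scattering angle (radians) from the PDG Highland formula.

    p_mev: muon momentum in MeV/c; beta: v/c; x_over_x0: thickness in radiation lengths.
    """
    return (13.6 / (beta * p_mev)) * math.sqrt(x_over_x0) * (1 + 0.038 * math.log(x_over_x0))

# A 3 GeV muon (beta ~ 1) crossing 10 radiation lengths scatters much less
# than a 0.5 GeV one, consistent with the smaller RMS track error reported above.
assert highland_theta0(3000.0, 1.0, 10.0) < highland_theta0(500.0, 1.0, 10.0)
```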
van Zyl, J. Martin
2012-01-01
Random variables of the generalized Pareto distribution can be transformed to those of the Pareto distribution. Explicit expressions exist for the maximum likelihood estimators of the parameters of the Pareto distribution. The performance of estimating the shape parameter of the generalized Pareto distribution using transformed observations, based on the probability weighted moment method, is tested. The transformation was found to improve the performance of the probability weighted estimator, and it performs well wit...
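The distributional identity behind this transformation can be sketched as follows: if X follows a GPD with shape ξ > 0 and scale σ, then Y = 1 + ξX/σ is Pareto with minimum 1 and tail index 1/ξ, so the Pareto MLE gives ξ̂ = mean(log Y). This is a minimal numerical check with hypothetical parameter values, treating ξ and σ in the transform as known (in practice they come from a preliminary estimate, e.g. probability weighted moments), not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
xi, sigma, n = 0.5, 1.0, 100_000

# Simulate GPD(xi, sigma) by inverse-CDF sampling.
u = rng.uniform(size=n)
x = sigma * ((1 - u) ** (-xi) - 1) / xi

# Transform to Pareto: Y = 1 + xi*X/sigma has survival y**(-1/xi) for y >= 1,
# so the Pareto maximum likelihood estimate of the shape is mean(log y).
y = 1 + xi * x / sigma
xi_hat = np.mean(np.log(y))
print(round(xi_hat, 2))  # close to 0.5
```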
Rational Choice of the Investment Project Using Interval Estimates of the Initial Parameters
Directory of Open Access Journals (Sweden)
Kotsyuba Oleksiy S.
2016-11-01
The article is dedicated to the development of instruments to support decision-making on the problem of choosing the best investment project in a situation when the initial quantitative parameters of the considered investment alternatives are described by interval estimates. In terms of managing the risk caused by interval uncertainty of the initial data, the study is limited to the aspect of risk measured as the degree of possibility of discrepancy between the resulting economic indicator (criterion) and its normative level (the norm). An important hypothesis underlying the formalization of the problem proposed in this work is the presence, for some or all of the projects from which the choice is made, of a risk of a poor rate of return in terms of net present (current) value. Based upon relevant developments within the framework of fuzzy-set methodology and interval analysis, a model is formulated for choosing an optimal investment project from a set of alternative options under the interval formulation of the problem. It is assumed that the indicators of economic attractiveness (performance) of the compared directions of real investment are described either by interval estimates or by possibility distribution functions. The proposed model was tested on an illustrative numerical example, which demonstrated its practical viability.
Generalized Projective Synchronization between Two Complex Networks with Time-Varying Coupling Delay
International Nuclear Information System (INIS)
Mei, Sun; Chang-Yan, Zeng; Li-Xin, Tian
2009-01-01
Generalized projective synchronization (GPS) between two complex networks with time-varying coupling delay is investigated. Based on the Lyapunov stability theory, a nonlinear controller and adaptive updated laws are designed. Feasibility of the proposed scheme is proven in theory. Moreover, two numerical examples are presented, using the energy resource system and Lü's system [Physica A 382 (2007) 672] as the nodes of the networks. GPS between two energy resource complex networks with time-varying coupling delay is achieved. This study can widen the application range of the generalized synchronization methods and will be instructive for the demand–supply of energy resource in some regions of China.
International Nuclear Information System (INIS)
Niu Yu-Jun; Wang Xing-Yuan; Nian Fu-Zhong; Wang Ming-Jun
2010-01-01
Based on the stability theory of the fractional order system, the dynamic behaviours of a new fractional order system are investigated theoretically. The lowest order we found to have chaos in the new three-dimensional system is 2.46, and the period routes to chaos in the new fractional order system are also found. The effectiveness of our analysis results is further verified by numerical simulations and positive largest Lyapunov exponent. Furthermore, a nonlinear feedback controller is designed to achieve the generalized projective synchronization of the fractional order chaotic system, and its validity is proved by Laplace transformation theory. (general)
International Nuclear Information System (INIS)
Kai, Michiaki; Kusama, Tomoko
1990-01-01
Lifetime cancer risk estimates depend on risk projection models. While the increasing lengths of the follow-up observation periods of atomic bomb survivors in Hiroshima and Nagasaki bring about changes in cancer risk estimates, the validity of the two risk projection models, the additive risk projection model (AR) and the multiplicative risk projection model (MR), comes into question. This paper compares the lifetime risk or loss of life expectancy between the two projection models on the basis of the BEIR-III report or a recently published RERF report. With Japanese cancer statistics, the estimates of MR were greater than those of AR, but a reversal of these results was seen when the cancer hazard function for India was used. When we investigated the validity of the two projection models using epidemiological human data and animal data, the results suggested that MR was superior to AR with respect to temporal change, but there was little evidence to support its validity. (author)
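The divergence between the two models can be illustrated with a toy calculation (entirely hypothetical hazard numbers, not the BEIR-III or RERF values): the additive model adds a constant excess rate at every age, while the multiplicative model scales the baseline hazard, so its cumulative excess grows wherever the baseline is high, which is why the ranking of AR and MR flips with the baseline cancer hazard of the population.

```python
import numpy as np

ages = np.arange(40, 90)
baseline = 1e-4 * np.exp(0.08 * (ages - 40))  # illustrative rising baseline cancer hazard

# Additive (AR): constant excess rate added at every age.
# Multiplicative (MR): baseline hazard scaled by a constant relative risk.
excess_ar = np.full_like(baseline, 2e-4)
excess_mr = 0.3 * baseline

# With this steeply rising baseline, the multiplicative excess dominates.
print(f"AR excess: {excess_ar.sum():.4f}, MR excess: {excess_mr.sum():.4f}")
```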
Directory of Open Access Journals (Sweden)
Tomáš Bayer
2017-03-01
Modern techniques of map analysis allow full or partial geometric reconstruction of a map's content. The projection is described by a set of estimated constant values: transformed pole position, standard parallel latitude, longitude of the central meridian, and a constant parameter. Analogously, the analyzed map is represented by its constant values: auxiliary sphere radius, origin shifts, and angle of rotation. Several new methods, denoted M6-M9, for the estimation of an unknown map projection and its parameters have been developed; they differ in the number of determined parameters, reliability, robustness, and convergence, while their computational demands are similar. Instead of directly measuring the dissimilarity of two projections, the analyzed map in an unknown projection and the image of the sphere in a well-known (i.e., analyzed) projection are compared. Several distance functions for the similarity measurements, based on location as well as shape similarity approaches, are proposed. A poorly scaled unconstrained global optimization problem with large residuals for the vector of unknown parameters is solved by the hybrid BFGS method, which can switch between first- and second-order methods to avoid a slower convergence rate on small-residual problems. Such an analysis is beneficial and interesting for historic, old, or current maps without information about the projection. Its importance lies primarily in the refinement of spatial georeferencing for medium- and small-scale maps, analysis of knowledge about the former world, analysis of incorrectly or inaccurately drawn regions, and appropriate cataloguing of maps. The proposed algorithms have been implemented in the new version of the detectproj software.
General-purpose heat source project and space nuclear safety and fuels program. Progress report
International Nuclear Information System (INIS)
Maraman, W.J.
1979-12-01
This formal monthly report covers the studies related to the use of 238PuO2 in radioisotopic power systems carried out for the Advanced Nuclear Systems and Projects Division of the Los Alamos Scientific Laboratory. The two programs involved are general-purpose heat source development and space nuclear safety and fuels. Most of the studies discussed here are of a continuing nature; results and conclusions described may change as the work continues.
A generalization of the Friedrichs angle and the method of alternating projections
Czech Academy of Sciences Publication Activity Database
Badea, C.; Grivaux, S.; Müller, Vladimír
2010-01-01
Vol. 348, No. 1-2 (2010), pp. 53-56. ISSN 1631-073X. R&D Projects: GA ČR(CZ) GA201/06/0128. Institutional research plan: CEZ:AV0Z10190503. Keywords: Hilbert space; Neumann-Halperin method. Subject RIV: BA - General Mathematics. Impact factor: 0.399, year: 2010. http://www.sciencedirect.com/science/article/pii/S1631073X09004002
Directory of Open Access Journals (Sweden)
Theocharis Theofanidis
2016-01-01
Real hypersurfaces satisfying the condition ϕl = lϕ, where l = R(·,ξ)ξ, have been studied by many authors under at least one additional condition, since the class of these hypersurfaces is quite difficult to classify. The aim of the present paper is the classification of real hypersurfaces in the complex projective plane CP2 satisfying a generalization of ϕl = lϕ under an additional restriction on a specific function.
International Nuclear Information System (INIS)
1989-04-01
Since 1976, DOE preliminary investigations for a high level nuclear waste repository at Yucca Mountain, Nevada, have caused widespread disturbances of the landscape. This report addresses the areal extent of those disturbances that have accrued up to June 1988, and identifies expected associated reclamation costs. It was first necessary to identify disturbances, next to classify them for reclamation purposes, and then to assign general reclamation costs. The purposes of the analysis were: (1) to establish the amount of disturbance that already exists in the area of Yucca Mountain in order to identify alterations of the landscape that comprise the existing baseline conditions; (2) to identify estimated general reclamation costs for repair of the disturbances; (3) to provide information needed to establish disturbance models, and eventually environmental impact models, that can be applied to future DOE activities during Site Characterization and later phases of repository development, if they occur; and (4) to provide indicators of the needs for reclamation of future disturbances created by DOE's Site Characterization program. Disturbances were estimated using measurements from large-scale color aerial photography taken in June 1988. Two reconnaissance site visits were also conducted. The identified disturbance totals by type are displayed in tabular form in the appendices. 84 refs., 2 figs., 9 tabs.
International Nuclear Information System (INIS)
Doerfel, H.; Andrasi, A.; Bailey, M.; Blanchardon, E.; Cruz-Suarez, R.; Berkovski, V.; Castellani, C. M.; Hurtgen, C.; Leguen, B.; Malatova, I.; Marsh, J.; Stather, J.; Zeger, J.
2007-01-01
In recent major international intercomparison exercises on intake and internal dose assessments from monitoring data, the results calculated by different participants varied significantly. Based on this experience, the need for harmonisation of the procedures was formulated within an EU 5th Framework Programme research project. The aim of the project, IDEAS, was to develop general guidelines for standardising assessments of intakes and internal doses. The IDEAS project started in October 2001 and ended in June 2005. The project is closely related to some goals of the work of Committee 2 of the ICRP, and since 2003 there has been close cooperation between the two groups. To ensure that the guidelines are applicable to a wide range of practical situations, the first step was to compile a database of well-documented cases of internal contamination. In parallel, an improved version of an existing software package was developed and distributed to the partners for further use. A large number of cases from the database were evaluated independently by the partners and the results reviewed. Based on these evaluations, guidelines were drafted and discussed with dosimetry professionals from around the world by means of a virtual workshop on the Internet early in 2004. The guidelines have been revised and refined on the basis of the experiences and discussions in this virtual workshop. The general philosophy of the guidelines is presented here, focusing on the principles of harmonisation, optimisation and proportionality. Finally, the proposed Levels of Task used to structure the approach to internal dose evaluation are reported. (authors)
Directory of Open Access Journals (Sweden)
Pengfei Sun
Pose estimation aims at measuring the position and orientation of a calibrated camera using known image features. The pinhole model is the dominant camera model in this field; however, its imaging precision is not accurate enough for an advanced pose estimation algorithm. In this paper, a new camera model, called the incident ray tracking model, is introduced. More importantly, an advanced pose estimation algorithm based on the perspective ray in the new camera model is proposed. The perspective ray, determined by two positioning points, is an abstract mathematical equivalent of the incident ray. In the proposed pose estimation algorithm, called perspective-ray-based scaled orthographic projection with iteration (PRSOI), an approximate ray-based projection is calculated by a linear system and refined by iteration. Experiments on the PRSOI have been conducted, and the results demonstrate that it is of high accuracy in six degrees of freedom (DOF) motion. It also outperforms three other state-of-the-art algorithms in terms of accuracy in a comparative experiment.
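The scaled orthographic (weak-perspective) camera that such iterative schemes start from is simple to state: depth is dropped, and image coordinates are a uniform scale plus a translation. The sketch below shows only that projection model; it is our own illustration with made-up numbers, not the PRSOI algorithm itself.

```python
import numpy as np

def scaled_orthographic(points_3d, s, tx, ty):
    """Weak-perspective camera: drop depth, then scale and translate in the image plane."""
    p = np.asarray(points_3d, dtype=float)
    return s * p[:, :2] + np.array([tx, ty])

# Two hypothetical object points at roughly the same depth.
pts = np.array([[0.0, 0.0, 5.0], [1.0, 2.0, 5.2]])
print(scaled_orthographic(pts, s=100.0, tx=320.0, ty=240.0))
```

Iterative algorithms in this family refine the scale s (and the rotation hidden inside the 3D coordinates) until the weak-perspective image matches the observed one.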
R&P: the multiple meaning of a research project in general practice.
Visentin, Giorgio
2005-08-01
Rischio e Prevenzione (Risk and Prevention) is a research project that is becoming the paradigm of Italian research in general practice. It started from a survey showing that treatment and control of cardiovascular risk are still far from optimal even in very high-risk patients. A group of general practitioners, coordinated by the Istituto Mario Negri, wrote the protocol of the study with several aims: creating a research network; building research infrastructure with good research capacity; building a 'therapeutic alliance' with the patient when presenting the research, rather than only obtaining a signature for a 'bureaucratic' informed consent; and having the 'Collaborative Group' as the 'sponsor' of the research even if the funds come from the pharmaceutical industry. It is a randomised controlled trial (RCT) carried out in primary care with the normal patients of our daily work, so its results are highly transferable. The way patients are enrolled, and the request to specify the reason for not joining the outcome study, make this a kind of participatory research. The outcome study can become a model for implementing a new strategy on cardiovascular risk. A specific questionnaire will enquire into the different points of view of the patient and of the general practitioner/researcher. The results of this project will help us understand the phenomenon of poor compliance among high-risk patients. First results during enrollment allow some optimism.
Channel Selection and Feature Projection for Cognitive Load Estimation Using Ambulatory EEG
Directory of Open Access Journals (Sweden)
Tian Lan
2007-01-01
We present an ambulatory cognitive state classification system to assess the subject's mental load based on EEG measurements. The ambulatory cognitive state estimator is utilized in the context of a real-time augmented cognition (AugCog) system that aims to enhance the cognitive performance of a human user through computer-mediated assistance based on assessments of cognitive states using physiological signals including, but not limited to, EEG. This paper focuses particularly on the offline channel selection and feature projection phases of the design and aims to present mutual-information-based techniques that use a simple sample estimator for this quantity. Analyses conducted on data collected from 3 subjects performing 2 tasks (n-back/Larson) at 2 difficulty levels (low/high) demonstrate that the proposed mutual-information-based dimensionality reduction scheme can achieve up to 94% cognitive load estimation accuracy.
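A minimal plug-in (sample) estimator of the mutual information used for such channel/feature ranking might look like the following. This is our own sketch, assuming a histogram-binned scalar feature and integer class labels; it is not the authors' estimator.

```python
import numpy as np

def mutual_information(feature, labels, bins=8):
    """Plug-in mutual information estimate (nats) between a binned feature and class labels."""
    edges = np.histogram_bin_edges(feature, bins=bins)
    f = np.digitize(feature, edges[1:-1])           # bin indices 0..bins-1
    joint = np.zeros((bins, len(set(labels))))
    for fi, yi in zip(f, labels):
        joint[fi, yi] += 1
    joint /= joint.sum()
    pf = joint.sum(axis=1, keepdims=True)           # feature marginal
    py = joint.sum(axis=0, keepdims=True)           # label marginal
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pf @ py)[nz])).sum())

# A feature that shifts with the class label carries more information than pure noise.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 2000)
informative = y + 0.5 * rng.standard_normal(2000)
noise = rng.standard_normal(2000)
assert mutual_information(informative, y) > mutual_information(noise, y)
```

Features (or channels) would then be ranked by this score and the top ones kept, which is the general shape of mutual-information-based dimensionality reduction.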
Rock mass mechanical property estimations for the Yucca Mountain Site Characterization Project
International Nuclear Information System (INIS)
Lin, M.; Hardy, M.P.; Bauer, S.J.
1993-06-01
Rock mass mechanical properties are important in the design of drifts and ramps. These properties are used in evaluations of the impacts of thermomechanical loading of potential host rock within the Yucca Mountain Site Characterization Project. Representative intact rock and joint mechanical properties were selected for welded and nonwelded tuffs from the currently available data sources. Rock mass qualities were then estimated using both the Norwegian Geotechnical Institute (Q) and Geomechanics Rating (RMR) systems. Rock mass mechanical properties were developed based on estimates of rock mass quality, the current knowledge of intact properties, and fracture/joint characteristics. Empirical relationships developed to correlate the rock mass quality indices and the rock mass mechanical properties were then used to estimate the range of rock mass mechanical properties.
Projection-based circular constrained state estimation and fusion over long-haul links
Energy Technology Data Exchange (ETDEWEB)
Liu, Qiang [ORNL; Rao, Nageswara S. [ORNL
2017-07-01
In this paper, we consider a scenario where sensors are deployed over a large geographical area for tracking a target with circular nonlinear constraints on its motion dynamics. The sensor state estimates are sent over long-haul networks to a remote fusion center for fusion. We are interested in different ways to incorporate the constraints into the estimation and fusion process in the presence of communication loss. In particular, we consider closed-form projection-based solutions, including rules for fusing the estimates and for incorporating the constraints, which jointly can guarantee the timely fusion often required in real-time systems. We test the performance of these methods in the long-haul tracking environment using a simple example.
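The closed-form projection idea can be sketched simply: the point on a circle closest to an unconstrained estimate lies along the ray from the circle's center through that estimate. The snippet below is our own illustration with made-up numbers, using plain inverse-variance fusion rather than the paper's specific fusion rules.

```python
import numpy as np

def project_to_circle(est, center, radius):
    """Closest point on the circular constraint to an unconstrained position estimate."""
    d = est - center
    return center + radius * d / np.linalg.norm(d)

# Fuse two sensor estimates by inverse-variance weighting, then enforce the constraint.
e1, v1 = np.array([3.2, 4.1]), 0.5
e2, v2 = np.array([2.9, 3.8]), 1.0
fused = (e1 / v1 + e2 / v2) / (1 / v1 + 1 / v2)
constrained = project_to_circle(fused, center=np.array([0.0, 0.0]), radius=5.0)
print(np.round(np.linalg.norm(constrained), 6))  # 5.0 — the result lies on the constraint
```

Projecting after fusion (rather than projecting each estimate first) is one of the orderings such schemes compare; both are closed-form, which is what keeps the fusion step fast enough for real-time use.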
Generalized Hurst exponent estimates differentiate EEG signals of healthy and epileptic patients
Lahmiri, Salim
2018-01-01
The aim of our current study is to check whether the multifractal patterns of the electroencephalographic (EEG) signals of normal and epileptic patients are statistically similar or different. In this regard, the generalized Hurst exponent (GHE) method is used for robust estimation of the multifractals in each type of EEG signal, and three powerful statistical tests are performed to check for differences between the GHEs estimated from healthy control subjects and from epileptic patients. The obtained results show that multifractals exist in both types of EEG signals. In particular, the degree of fractality is more pronounced in short variations of normal EEG signals than in short variations of EEG signals with seizure-free intervals. Conversely, it is more pronounced in long variations of EEG signals with seizure-free intervals than in normal EEG signals. Importantly, both parametric and nonparametric statistical tests show strong evidence that the estimated GHEs of normal EEG signals are significantly different from those of signals with seizure-free intervals. Therefore, GHEs can be efficiently used to distinguish between healthy subjects and patients suffering from epilepsy.
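The GHE method estimates H(q) from the scaling of the q-th order moments of signal increments, E|x(t+τ) − x(t)|^q ∝ τ^(qH(q)). The sketch below is a minimal implementation of that scaling fit (our own, not the authors' code), checked on Brownian motion, for which H ≈ 0.5.

```python
import numpy as np

def generalized_hurst(x, q=2, taus=range(1, 20)):
    """Estimate H(q) from the log-log slope of q-th order increment moments vs. lag."""
    taus = np.array(list(taus))
    kq = np.array([np.mean(np.abs(x[t:] - x[:-t]) ** q) for t in taus])
    slope = np.polyfit(np.log(taus), np.log(kq), 1)[0]
    return slope / q

rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(20_000))  # Brownian motion: expect H(2) near 0.5
print(round(generalized_hurst(bm, q=2), 2))
```

Multifractality shows up when H(q) varies with q; repeating the fit for several q values on short and long EEG segments yields the exponents that the statistical tests then compare across groups.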
Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems
Directory of Open Access Journals (Sweden)
Hagit Messer
2007-11-01
In recent years, more and more wireless communications systems are required to also provide a positioning measurement. In code division multiple access (CDMA) communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI) caused by other users in the system. This MAI is commonly managed by a power control mechanism, and yet MAI has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism. In this mechanism, a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC), while all other users receive more reliable data during these quiet periods. Previous research has shown that the implementation of a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves the positioning accuracy. We focus on time-of-arrival (TOA) based positioning systems. We analyze the TOA estimation performance in a generalized CDMA system in which the probability control mechanism is employed, where the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous CDMA system.
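For context, the classical interference-free Cramer-Rao bound on time-delay estimation states that the variance scales inversely with the SNR and with the squared effective (RMS) signal bandwidth; the paper's modified bounds add the effects of MAI and noncontinuous transmission on top of this. A generic sketch of the baseline bound (not the paper's derivation):

```python
import math

def toa_crlb(snr_linear, rms_bandwidth_hz):
    """Classical Cramer-Rao lower bound on time-delay estimation variance (s^2)."""
    return 1.0 / (8 * math.pi ** 2 * rms_bandwidth_hz ** 2 * snr_linear)

# Lowering MAI (e.g., via probability control) raises the effective SNR:
# doubling SNR halves the achievable delay variance.
assert toa_crlb(20.0, 1e6) < toa_crlb(10.0, 1e6)
```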
General guidelines for the Assessment of Internal Dose from Monitoring Data (Project IDEAS)
International Nuclear Information System (INIS)
Doerfel, H.; Andrasi, A.; Bailey, M.; Blanchardon, E.; Berkovski, V.; Castellani, C. M.; Hurtgen, C.; Jourdain, J. R.; LeGuen, B.; Puncher, M.
2004-01-01
In recent major international intercomparison exercises on intake and internal dose assessments from monitoring data the results calculated by different participants varied significantly. This was mainly due to the broad variety of methods and assumptions applied in the assessment procedure. Based on these experiences the need for harmonisation of the procedures has been formulated within an EU research project under the 5th Framework Programme. The aim of the project, IDEAS, is to develop general guidelines for standardising assessments of intakes and internal doses. The IDEAS project started in October 2001 and will end in March 2005. Eight institutions from seven European countries are participating. Inputs from internal dosimetry professionals from across Europe are also being used to ensure a broad consensus in the outcome of the project. The IDEAS project is closely related to some goals of the work of Committee 2 of the ICRP and since 2003 there has been close cooperation between the two groups. To ensure that the guidelines are applicable to a wide range of practical situations, the first step has been to compile a database of well-documented cases of internal contamination. In parallel, an improved version of an existing software package has been developed and distributed to the partners for further use. A large number of cases from the database have been evaluated independently by partners in the project using the same software and the results have been reviewed. Based on these evaluations guidelines are being drafted and will be discussed with dosimetry professionals from around the world by means of a virtual workshop on the Internet early in 2004. The guidelines will be revised and refined on the basis of the experiences and discussions of this virtual workshop and the outcome of an intercomparison exercise organised as part of the project. This will be open to all internal dosimetry professionals. (Author) 10 refs
Wang, Chaolong; Zhan, Xiaowei; Liang, Liming; Abecasis, Gonçalo R.; Lin, Xihong
2015-01-01
Accurate estimation of individual ancestry is important in genetic association studies, especially when a large number of samples are collected from multiple sources. However, existing approaches developed for genome-wide SNP data do not work well with modest amounts of genetic data, such as in targeted sequencing or exome chip genotyping experiments. We propose a statistical framework to estimate individual ancestry in a principal component ancestry map generated by a reference set of individuals. This framework extends and improves upon our previous method for estimating ancestry using low-coverage sequence reads (LASER 1.0) to analyze either genotyping or sequencing data. In particular, we introduce a projection Procrustes analysis approach that uses high-dimensional principal components to estimate ancestry in a low-dimensional reference space. Using extensive simulations and empirical data examples, we show that our new method (LASER 2.0), combined with genotype imputation on the reference individuals, can substantially outperform LASER 1.0 in estimating fine-scale genetic ancestry. Specifically, LASER 2.0 can accurately estimate fine-scale ancestry within Europe using either exome chip genotypes or targeted sequencing data with off-target coverage as low as 0.05×. Under the framework of LASER 2.0, we can estimate individual ancestry in a shared reference space for samples assayed at different loci or by different techniques. Therefore, our ancestry estimation method will accelerate discovery in disease association studies not only by helping model ancestry within individual studies but also by facilitating combined analysis of genetic data from multiple sources. PMID:26027497
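The projection Procrustes idea can be sketched with the classical orthogonal Procrustes solution, which finds the rotation best aligning one set of principal-component coordinates with a reference set via an SVD. This toy (our own, with synthetic data) omits the translation, scaling, and dimension reduction that the full method also handles.

```python
import numpy as np

def procrustes_project(sample_pcs, ref_pcs):
    """Rotate sample PC coordinates onto a reference PC space (orthogonal Procrustes)."""
    u, _, vt = np.linalg.svd(sample_pcs.T @ ref_pcs)
    return sample_pcs @ (u @ vt)

rng = np.random.default_rng(0)
ref = rng.standard_normal((50, 4))           # 50 individuals in a 4-PC reference space
angle = np.deg2rad(30)
rot = np.eye(4)
rot[:2, :2] = [[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]]
sample = ref @ rot.T                         # same individuals seen in a rotated basis
recovered = procrustes_project(sample, ref)
print(np.allclose(recovered, ref))  # True
```

In the ancestry setting, the rotation is learned on reference individuals genotyped in both spaces and then applied to place new samples onto the shared reference map.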
Estimation and Projection of Prevalence of Colorectal Cancer in Iran, 2015–2020
Directory of Open Access Journals (Sweden)
Hossein Molavi Vardanjani
2018-01-01
Background: Population aging and a more prevalent westernized lifestyle would be expected to result in a markedly rising burden of colorectal cancer (CRC) in future years. The aim of this study is to estimate the limited-time prevalence of CRC in Iran between 2015 and 2020. Materials and Methods: Aggregated CRC incidence data were extracted from the Iranian national cancer registry (IR.NCR) reports for 2003–2009 and from the GLOBOCAN-2012 database for 2012. Incidence trends were analyzed by age group, gender, and histopathologic and topographic subtype to estimate annual percentage changes. Incidence was projected for 2020. The prevalence was estimated by applying an adapted version of a previously introduced equation that estimates limited-time prevalence from incidence and survival data. Monte Carlo sensitivity analyses were applied to estimate 95% uncertainty levels (ULs). In each scenario, incidence, survival, annual percentage changes, and completeness of case ascertainment at IR.NCR were replaced under pre-assumed distributions. Results: Numbers of estimated within 1, 2–3 and 4–5-year CRC patients in 2015 were 13676 (95% UL: 10051–18807), 20964 (15835–28268), and 14485 (11188–19293), respectively. The estimated 5-year prevalence for 2020 (99463; 75150–134744) was 2.03 times that for 2015. The highest 5-year prevalence was estimated in ages 55–59 for females and 75+ for males. Adenocarcinoma (41376; 31227–55898) was the most prevalent histologic subtype. The most prevalent tumor location was the colon (30822; 23262–41638). Conclusion: A substantial growth in the prevalence of CRC survivors is highly expected for future years in Iran. Establishment of specialized institutes is highly recommended to provide medical and especially social support for Iranian CRC survivors.
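The limited-duration prevalence equation used in such studies combines past incidence with survival: cases diagnosed k years ago contribute in proportion to the probability of surviving k years. A toy version with entirely hypothetical counts (not the Iranian registry data):

```python
# Hypothetical yearly incidence counts (most recent year first) and the
# probability of surviving k full years after diagnosis.
incidence = [8000, 7600, 7200, 6900, 6500]
survival = [1.00, 0.78, 0.68, 0.62, 0.58]

# 5-year limited-duration prevalence: people diagnosed in the last 5 years
# who are still alive, i.e. sum over k of incidence(k years ago) * S(k).
prevalence_5y = sum(i * s for i, s in zip(incidence, survival))
print(round(prevalence_5y))  # 26872
```

Monte Carlo sensitivity analysis then amounts to redrawing the incidence and survival inputs from assumed distributions and recomputing this sum to obtain uncertainty levels.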
International Nuclear Information System (INIS)
Iizumi, T.; Nishimori, M.; Yokozawa, M.
2008-01-01
For this study, we developed a new statistical model to estimate daily accumulated global solar radiation at the earth's surface and used the model to generate a high-resolution climate change scenario of the radiation field in Japan. The statistical model mainly relies on precipitable water vapor, calculated from surface air temperature and relative humidity, to estimate seasonal changes in global solar radiation. To estimate daily radiation fluctuations, the model uses either the diurnal temperature range or relative humidity. The diurnal temperature range, calculated from the daily maximum and minimum temperatures, and relative humidity are general outputs of most climate models, and pertinent observation data are comparatively easy to access. The statistical model performed well when estimating the monthly mean value, daily fluctuation statistics, and regional differences in the radiation field in Japan. To project the change in the radiation field for the years 2081 to 2100, we applied the statistical model to the climate change scenario of a high-resolution Regional Climate Model with a 20-km mesh size (RCM20), developed at the Meteorological Research Institute based on the Special Report on Emissions Scenarios (SRES) A2 scenario. The projected change shows the following tendency: global solar radiation will increase in the warm season and decrease in the cool season in many areas of Japan, indicating that global warming may change the radiation field in Japan. The generated climate change scenario for the radiation field is linked to long-term and short-term changes in air temperature and relative humidity obtained from the RCM20 and, consequently, is expected to complement the RCM20 datasets for an impact assessment study in the agricultural sector.
Comparing avian and bat fatality rate estimates among North American wind energy projects
Energy Technology Data Exchange (ETDEWEB)
Smallwood, Shawn
2011-07-01
Full text: Wind energy development has expanded rapidly, and so have concerns over bird and bat impacts caused by wind turbines. To assess and compare impacts due to collisions, investigators use a common metric, fatalities/MW/year, but estimates of fatality rates have come from various wind turbine models, tower heights, environments, fatality search methods, and analytical methods. To improve comparability and assess large-scale impacts, I applied a common set of assumptions and methods to data in fatality monitoring reports to estimate fatality rates of birds and bats at 71 wind projects across North America (52 outside the Altamont Pass Wind Resource Area, APWRA). The data were from wind turbines of 27 sizes (range 0.04-3.00 MW) and 28 tower heights (range 18.5-90 m), searched at 40 periodic intervals (range 1-90 days) and out to 20 distances from turbines (range 30-126 m). Estimates spanned the years 1982 to 2010, and involved 1-1,345 turbines per unique combination of project, turbine size, tower height, and search methodology. I adjusted fatality rates for search detection rates averaged from 425 detection trials, and for scavenger removal rates based on 413 removal trials. I also adjusted fatality rates for turbine tower height and maximum search radius, based on logistic functions fit to cumulative counts of carcasses detected at 1-m distance intervals from the turbine. For each tower height, I estimated the distance at which cumulative carcass counts reached an asymptote, and for each project I calculated the proportion of fatalities likely not found because the maximum search radius fell short of the model-predicted distance asymptote. I used the same estimator in all cases. I estimated mean fatalities/MW/year among North American wind projects at 12.6 bats (80% CI: 8.1-17.1) and 11.1 birds (80% CI: 9.5-12.7), including 1.6 raptors (80% CI: 1.3-2.0), and excluding the Altamont Pass I estimated fatality rates at 17.2 bats (80% CI: 9
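The adjustment chain this abstract describes (raw carcass counts corrected for searcher detection, scavenger removal, and the fraction of carcasses expected inside the searched radius, then normalized per MW per year) can be sketched as follows. The function name and every numeric input are hypothetical, not values from the study:

```python
def adjusted_fatality_rate(carcasses, detection_rate, persistence_rate,
                           prop_within_radius, mw, years):
    """Fatalities/MW/year adjusted for searcher detection, scavenger
    removal, and the proportion of fatalities predicted to fall within
    the searched radius (all three are probabilities in (0, 1])."""
    if not (0 < detection_rate <= 1 and 0 < persistence_rate <= 1
            and 0 < prop_within_radius <= 1):
        raise ValueError("rates must lie in (0, 1]")
    estimated = carcasses / (detection_rate * persistence_rate * prop_within_radius)
    return estimated / (mw * years)

# 24 carcasses found at a hypothetical 10 MW project over 2 years, with
# 60% detection, 75% carcass persistence, and 80% of fatalities
# predicted to land within the searched radius.
rate = adjusted_fatality_rate(24, 0.60, 0.75, 0.80, mw=10, years=2)
```

Dividing by the product of the three correction probabilities inflates the raw count toward the true number of fatalities, which is why low detection or persistence rates dominate the final estimate.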
A Novel Method Based on Oblique Projection Technology for Mixed Sources Estimation
Directory of Open Access Journals (Sweden)
Weijian Si
2014-01-01
Full Text Available Reducing the computational complexity of near-field and far-field source localization algorithms has been considered a serious problem in the field of array signal processing. A novel algorithm for mixed-source location estimation based on oblique projection is proposed in this paper. The sources are estimated at two different stages, and the sensor noise power is estimated and eliminated from the covariance matrix, which improves the accuracy of the estimation of mixed sources. Using the idea of compression, the range information of near-field sources is obtained by searching a partial area instead of the whole Fresnel area, which reduces the processing time. Compared with traditional algorithms, the proposed algorithm has lower computational complexity and is able to resolve two closely spaced sources with high resolution and accuracy. The duplication of range estimation is also avoided. Finally, simulation results are provided to demonstrate the performance of the proposed method.
Energy Technology Data Exchange (ETDEWEB)
Russell, E.W.; Clarke, W. [Lawrence Livermore National Lab., CA (United States); Domian, H.A. [Babcock and Wilcox Co., Lynchburg, VA (United States); Madson, A.A. [Kaiser Engineers California Corp., Oakland, CA (United States)
1991-08-01
This report summarizes the bottom-up cost estimates for fabrication of high-level radioactive waste disposal containers based on the Site Characterization Plan Conceptual Design (SCP-CD). These estimates were acquired by Babcock and Wilcox (B&W) under subcontract to Lawrence Livermore National Laboratory (LLNL) for the Yucca Mountain Site Characterization Project (YMP). The estimates were obtained for two leading container candidate materials (Alloy 825 and CDA 715), and from three other vendors who were selected from a list of twenty solicited. Three types of container designs were analyzed, representing containers for spent fuel and for vitrified high-level waste (HLW). The container internal structures were assumed to be AISI-304 stainless steel in all cases, with an annual production rate of 750 containers. Subjective techniques were used for estimating QA/QC costs based on vendor experience and the specifications derived for the LLNL-YMP Quality Assurance program. In addition, an independent QA/QC analysis, prepared by Kaiser Engineering, is reported. Based on the cost estimates developed, LLNL recommends that values of $825K and $62K be used for the 1991 TSLCC for the spent fuel and HLW containers, respectively. These numbers represent the most conservative among the three vendors, and are for the high-nickel austenitic steel (Alloy 825). 6 refs., 7 figs.
International Nuclear Information System (INIS)
Stripling, H.F.; Anitescu, M.; Adams, M.L.
2013-01-01
variable. We apply the framework to two problems: a simple pendulum problem with known solution, which serves to verify our global error estimation procedure, and a 1D model of a traveling wave reactor. We then emphasize the utility of the framework by adding heat transfer physics to the reactor problem and showing that no reformulation of the adjoint framework is required for application to the new physics. We conclude that the abstraction of the adjoint approach into a general framework will facilitate multiphysics reactor modeling in large-scale software projects.
International Nuclear Information System (INIS)
1983-01-01
On a country-by-country basis, the International Uranium Resources Evaluation Project (IUREP) estimates for 1982-1983 are reviewed. Information provided includes exploration work, airborne surveys, radiometric surveys, gamma-ray spectrometric surveys, estimates of speculative resources, uranium occurrences, uranium deposits, uranium mineralization, agreements for uranium exploration, feasibility studies, geological classification of resources, proposed revised resource ranges, and uranium production estimates.
International Nuclear Information System (INIS)
Farivar, Faezeh; Aliyari Shoorehdeli, Mahdi; Nekoui, Mohammad Ali; Teshnehlab, Mohammad
2012-01-01
Highlights: ► A systematic procedure for GPS of unknown heavy chaotic gyroscope systems. ► Proposed methods are based on Lyapunov stability theory. ► Without calculating Lyapunov exponents and Eigen values of the Jacobian matrix. ► Capable to extend for a variety of chaotic systems. ► Useful for practical applications in the future. - Abstract: This paper proposes the chaos control and the generalized projective synchronization methods for heavy symmetric gyroscope systems via Gaussian radial basis adaptive variable structure control. Because of the nonlinear terms of the gyroscope system, the system exhibits chaotic motions. Occasionally, the extreme sensitivity to initial states in a system operating in chaotic mode can be very destructive to the system because of unpredictable behavior. In order to improve the performance of a dynamic system or avoid the chaotic phenomena, it is necessary to control a chaotic system with a periodic motion beneficial for working with a particular condition. As chaotic signals are usually broadband and noise like, synchronized chaotic systems can be used as cipher generators for secure communication. This paper presents chaos synchronization of two identical chaotic motions of symmetric gyroscopes. In this paper, the switching surfaces are adopted to ensure the stability of the error dynamics in variable structure control. Using the neural variable structure control technique, control laws are established which guarantees the chaos control and the generalized projective synchronization of unknown gyroscope systems. In the neural variable structure control, Gaussian radial basis functions are utilized to on-line estimate the system dynamic functions. Also, the adaptation laws of the on-line estimator are derived in the sense of Lyapunov function. Thus, the unknown gyro systems can be guaranteed to be asymptotically stable. Also, the proposed method can achieve the control objectives. Numerical simulations are presented to
Directory of Open Access Journals (Sweden)
Klusák J.
2009-12-01
Full Text Available The study of bi-material notches has become a topical problem, as they can efficiently model geometrical or material discontinuities. When assessing crack initiation conditions in bi-material notches, the generalized stress intensity factors H have to be calculated. Contrary to the determination of the K-factor for a crack in an isotropic homogeneous medium, no procedure for ascertaining the H-factor is incorporated in the standard calculation systems, and the calculation of these fracture parameters requires experience. Direct methods of estimating H-factors usually require choosing a length parameter that enters the calculation. On the other hand, the method combining the application of the reciprocal theorem (Ψ-integral) with FEM does not require any length parameter and is capable of extracting the near-tip information directly from the far-field deformation.
A general approach for the estimation of loss of life due to natural and technological disasters
International Nuclear Information System (INIS)
Jonkman, S.N.; Lentz, A.; Vrijling, J.K.
2010-01-01
In assessing the safety of engineering systems in the context of quantitative risk analysis one of the most important consequence types concerns the loss of life due to accidents and disasters. In this paper, a general approach for loss of life estimation is proposed which includes three elements: (1) the assessment of physical effects associated with the event; (2) determination of the number of exposed persons (taking into account warning and evacuation); and (3) determination of mortality amongst the population exposed. The typical characteristics of and modelling approaches for these three elements are discussed. This paper focuses on 'small probability-large consequences' events within the engineering domain. It is demonstrated how the proposed approach can be applied to various case studies, such as tunnel fires, earthquakes and flood events.
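The three-element chain described above (physical effects, exposed population after warning and evacuation, mortality among those exposed) can be sketched as a single expected-value calculation. The function name and all numbers are hypothetical illustrations, not values from the paper:

```python
def expected_loss_of_life(population, exposed_fraction, evacuated_fraction,
                          mortality):
    """Three-step estimate: (1) event severity is summarized by the
    mortality fraction; (2) exposure accounts for warning/evacuation;
    (3) mortality is applied to those who remain exposed."""
    exposed = population * exposed_fraction * (1.0 - evacuated_fraction)
    return exposed * mortality

# Hypothetical flood: 100,000 residents, 40% living in the flooded zone,
# 70% of them evacuated after warning, 1% mortality among those exposed.
deaths = expected_loss_of_life(100_000, 0.40, 0.70, 0.01)
```

In a full risk analysis each factor would itself be a model (e.g. mortality as a function of water depth and rise rate), but the multiplicative structure of the estimate stays the same.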
FEATURES OF THE ESTIMATION OF INVESTMENT PROJECTS AT AVIATION INSTRUMENT-MAKING ENTERPRISES
Directory of Open Access Journals (Sweden)
Petr P. Dobrov
2016-01-01
Full Text Available The relevance of this study is due to the fact that the current situation in Russia is complicated by the negative effects of market reforms in the economy and by the economic sanctions adopted against the country, and in particular against companies at different levels. In view of this, issues related to the assessment of investment projects are highly relevant to effectively managing the activities and development of aviation instrument-making companies and enterprises of different ownership forms. The general crisis that engulfed almost all industry in Russia demanded a new ideology for organizing and managing investment projects, as well as for their assessment at aviation instrument-making enterprises. In Russia, a new stage in the development of project management has begun: the establishment of a domestic methodology, a complex of tools, and training for professional project management on the basis of domestic achievements, global experience, and their creative adaptation to the actual conditions of the country. The need for project management methodology in Russia is determined by two factors: the increasing complexity of projects and of the organizations that execute them, and the fact that project management is widely used in countries with market economies. Projects at aviation instrument-making enterprises, and their evaluation, are characterized by complexity and uncertainty, and by significant dependence on a dynamic environment, including socio-economic, political, financial, economic, and legislative influences of both the state and competing companies. This paper studies modern methods of evaluating investment projects at aviation instrument-making enterprises. Methodology. The methodological basis of this paper comprised comparative and economic-mathematical analysis methods. Results. In the course of this article the author found that the activity of modern companies is not linear and is
Scoring the Icecap-A Capability Instrument. Estimation of a UK General Population Tariff†
Flynn, Terry N; Huynh, Elisabeth; Peters, Tim J; Al-Janabi, Hareth; Clemens, Sam; Moody, Alison; Coast, Joanna
2015-01-01
This paper reports the results of a best–worst scaling (BWS) study to value the Investigating Choice Experiments Capability Measure for Adults (ICECAP-A), a new capability measure among adults, in a UK setting. A main effects plan plus its foldover was used to estimate weights for each of the four levels of all five attributes. The BWS study was administered to 413 randomly sampled individuals, together with sociodemographic and other questions. Scale-adjusted latent class analyses identified two preference and two (variance) scale classes. Ability to characterize preference and scale heterogeneity was limited, but data quality was good, and the final model exhibited a high pseudo-r-squared. After adjusting for heterogeneity, a population tariff was estimated. This showed that ‘attachment’ and ‘stability’ each account for around 22% of the space, and ‘autonomy’, ‘achievement’ and ‘enjoyment’ account for around 18% each. Across all attributes, greater value was placed on the difference between the lowest levels of capability than between the highest. This tariff will enable ICECAP-A to be used in economic evaluation both within the field of health and across public policy generally. © 2013 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24254584
Mamo, C; Farina, E; Cicio, R; Fanì, M
2014-01-01
The aim of the study was to obtain local estimates of the prevalence of anxiety and dysthymic disorders among primary care attendees, useful for better management of health care services. The study was conducted in Health District no. 2 of Turin (an industrial town in northwest Italy). The criteria for identification of cases were based on drug prescriptions made by general practitioners (GPs), selected to assure high specificity. The study involved 86 physicians (with 87,885 attendees). As expected, the crude and standardized prevalences were higher in women (anxiety: 2.9% vs 1.3% in men; dysthymia: 3.8% vs 1.7% in men), with a peak in women aged over 75 years (anxiety: 4.8%; dysthymia: 6.2%). In comparison to male GPs, female GPs had a higher prevalence of patients with anxiety disorders, whereas the prevalences of dysthymia were similar. Despite the limitations discussed, the methodology used yields sufficiently reliable estimates of the prevalence of common mental disorders at the local level, providing information useful for organizing primary care in the Health District.
Wang, Z.
2015-12-01
For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-resolution hydrological simulation has refined spatial descriptions of hydrological behavior. Meanwhile, this trend has been accompanied by growing model complexity and numbers of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), which combines Monte Carlo sampling with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets with large likelihoods. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method is efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
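The core GLUE loop (sample candidate parameters, score each set with an informal likelihood, keep the "behavioral" sets above a threshold, and weight them by likelihood) can be sketched on a toy model. The sampler below is plain random sampling; the heuristic optimizers the abstract adopts would simply supply better candidates through the same interface. All names and numbers are illustrative:

```python
import random

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency, a common informal likelihood in GLUE."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def glue(model, obs, sampler, n=2000, threshold=0.5, seed=0):
    """Keep 'behavioral' parameter sets whose likelihood exceeds the
    threshold and normalize their likelihoods into weights."""
    rng = random.Random(seed)
    behavioral = []
    for _ in range(n):
        theta = sampler(rng)
        score = nash_sutcliffe(obs, model(theta))
        if score > threshold:
            behavioral.append((score, theta))
    total = sum(s for s, _ in behavioral)
    return [(s / total, t) for s, t in behavioral]

# Toy "hydrological model": linear response y = a * x with unknown a ~ 2.
xs = [1, 2, 3, 4, 5]
obs = [2.1, 3.9, 6.2, 8.0, 9.8]
model = lambda a: [a * x for x in xs]
posterior = glue(model, obs, sampler=lambda rng: rng.uniform(0, 4))
best = max(posterior)[1]  # parameter with the highest weight
```

Replacing `sampler` with draws proposed by a genetic algorithm or differential evolution concentrates candidates in high-likelihood regions, which is exactly the efficiency gain the study reports.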
Directory of Open Access Journals (Sweden)
Ana Calabrese
2011-01-01
Full Text Available In the auditory system, the stimulus-response properties of single neurons are often described in terms of the spectrotemporal receptive field (STRF), a linear kernel relating the spectrogram of the sound stimulus to the instantaneous firing rate of the neuron. Several algorithms have been used to estimate STRFs from responses to natural stimuli; these algorithms differ in their functional models, cost functions, and regularization methods. Here, we characterize the stimulus-response function of auditory neurons using a generalized linear model (GLM). In this model, each cell's input is described by: (1) a stimulus filter (STRF); and (2) a post-spike filter, which captures dependencies on the neuron's spiking history. The output of the model is given by a series of spike trains rather than instantaneous firing rate, allowing the prediction of spike train responses to novel stimuli. We fit the model by maximum penalized likelihood to the spiking activity of zebra finch auditory midbrain neurons in response to conspecific vocalizations (songs) and modulation-limited (ML) noise. We compare this model to normalized reverse correlation (NRC), the traditional method for STRF estimation, in terms of predictive power and the basic tuning properties of the estimated STRFs. We find that a GLM with a sparse prior predicts novel responses to both stimulus classes significantly better than NRC. Importantly, we find that STRFs from the two models derived from the same responses can differ substantially and that GLM STRFs are more consistent between stimulus classes than NRC STRFs. These results suggest that a GLM with a sparse prior provides a more accurate characterization of spectrotemporal tuning than does the NRC method when responses to complex sounds are studied in these neurons.
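The two-filter GLM structure (a stimulus filter plus a post-spike history term, combined through an exponential nonlinearity to set a Poisson firing rate) can be sketched in miniature. This is a bare-bones maximum-likelihood fit, not the penalized sparse-prior method of the paper; the weights, sample size, and optimizer settings are all illustrative:

```python
import math
import random

def fit_poisson_glm(X, y, lr=0.1, iters=1500):
    """Maximum-likelihood fit of a Poisson GLM with exp link by plain
    gradient ascent. Columns of X: bias, stimulus value, and the
    previous time step's spike count (the 'post-spike' regressor)."""
    w = [0.0] * len(X[0])
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            rate = math.exp(sum(wj * xj for wj, xj in zip(w, xi)))
            for j, xj in enumerate(xi):
                grad[j] += (yi - rate) * xj
        w = [wj + lr * gj / len(y) for wj, gj in zip(w, grad)]
    return w

# Simulate spikes from a known GLM, then recover the weights.
rng = random.Random(7)
stim = [rng.uniform(-1, 1) for _ in range(400)]
true_w = [-0.2, 1.0, -0.5]  # bias, stimulus weight, history weight
y, prev = [], 0
for s in stim:
    rate = math.exp(true_w[0] + true_w[1] * s + true_w[2] * prev)
    # Crude Poisson draw by inversion of the CDF.
    u, k, p = rng.random(), 0, math.exp(-rate)
    c = p
    while u > c:
        k += 1
        p *= rate / k
        c += p
    y.append(k)
    prev = k
X = [[1.0, s, p] for s, p in zip(stim, [0] + y[:-1])]
w = fit_poisson_glm(X, y)
```

The negative history weight plays the role of the post-spike filter: it suppresses the rate immediately after a spike, a dependency that a purely stimulus-driven STRF model cannot capture.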
Mechanism design of reverse auction on concession period and generalized quality for PPP projects
Institute of Scientific and Technical Information of China (English)
Xianjia WANG; Shiwei WU
2017-01-01
Reverse auctions of PPP projects usually require the bid to specify several characteristics of quality and the concession period to be fulfilled. This paper sets up a summary function of generalized quality, which contributes to reducing the dimensions of information. Thus, the multidimensional reverse auction model of a PPP project can be replaced by a two-dimensional direct mechanism based on the concession period and the generalized quality. Based on the revelation principle, the feasibility conditions, equilibrium solution, and generalized quality requirements of such a mechanism are described, considering the influence of a variable investment structure. Moreover, two feasible multidimensional reverse auctions for implementing such a direct mechanism are built: adjusting the scoring function, and establishing a special reverse auction rule. The analysis shows that in these types of reverse auctions, optimal allocation can be achieved, the social benefit under incomplete information is maximized, and the private sector with the highest integrated management level wins the bid. In such a direct mechanism, the investment and financial pressure on the public sector can be reduced.
Westgate, Philip M
2016-01-01
When generalized estimating equations (GEE) incorporate an unstructured working correlation matrix, the variances of regression parameter estimates can inflate due to the estimation of the correlation parameters. In previous work, an approximation for this inflation that results in a corrected version of the sandwich formula for the covariance matrix of regression parameter estimates was derived. Use of this correction for correlation structure selection also reduces the over-selection of the unstructured working correlation matrix. In this manuscript, we conduct a simulation study to demonstrate that an increase in variances of regression parameter estimates can occur when GEE incorporates structured working correlation matrices as well. Correspondingly, we show the ability of the corrected version of the sandwich formula to improve the validity of inference and correlation structure selection. We also study the relative influences of two popular corrections to a different source of bias in the empirical sandwich covariance estimator.
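The sandwich (robust) covariance at the heart of this abstract can be sketched for the simplest case: a linear marginal model with an independence working correlation, where the GEE point estimate reduces to pooled OLS and the sandwich "meat" sums score contributions cluster by cluster. This is a plain uncorrected sandwich, not the corrected version the manuscript studies; the simulated data and function name are illustrative:

```python
import random

def cluster_sandwich_se(clusters):
    """Slope and its sandwich SE for y = alpha + beta*x fit by pooled
    OLS over clustered observations; 'clusters' is a list of lists of
    (x, y) pairs, one inner list per cluster."""
    xs = [x for c in clusters for x, _ in c]
    ys = [y for c in clusters for _, y in c]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    alpha = my - beta * mx
    # "Bread" for the slope is 1/sxx; the "meat" sums squared
    # per-cluster score contributions, preserving within-cluster
    # correlation of the residuals.
    meat = 0.0
    for c in clusters:
        score = sum((x - mx) * (y - alpha - beta * x) for x, y in c)
        meat += score ** 2
    return beta, (meat / sxx ** 2) ** 0.5

# Simulate 60 clusters of 3 observations sharing a random cluster
# effect; the true slope is 2.
rng = random.Random(3)
clusters = []
for _ in range(60):
    u = rng.gauss(0, 1.0)  # shared cluster effect
    clusters.append([(x, 1.0 + 2.0 * x + u + rng.gauss(0, 0.5))
                     for x in (0.0, 1.0, 2.0)])
beta, se = cluster_sandwich_se(clusters)
```

Estimating working-correlation parameters (exchangeable, unstructured, etc.) adds variability on top of this, which is the inflation the corrected sandwich formula in the abstract approximates.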
Implicit Knowledge of General Upper Secondary School in a Bridge-building Project
DEFF Research Database (Denmark)
Rasmussen, Annette; Andreasen, Karen Egedal
2016-01-01
Bridge-building activities are practiced widely in the education systems of Europe. They are meant to bridge transitions between lower and upper secondary school and form a mandatory part of the youth guidance system in Denmark. By giving pupils the opportunity to experience the different educational context of upper secondary school, bridge-building activities are meant to facilitate their decision-making on educational paths, but also to attract more and new groups of pupils. However, the premises of the inherent differences of educational contexts and of pupils' lack of knowledge of upper secondary education can be questioned. In this ethnographic case study of a bridge-building project in a rural area in Denmark, we analyse the implicit knowledge of the general upper secondary school as it is practiced in a bridge-building project, and how it is experienced by the pupils on the background
Directory of Open Access Journals (Sweden)
Thomson Peter C
2003-05-01
Full Text Available To date, most statistical developments in QTL detection methodology have been directed at continuous traits with an underlying normal distribution. This paper presents a method for QTL analysis of non-normal traits using a generalized linear mixed model approach. Development of this method has been motivated by a backcross experiment involving two inbred lines of mice that was conducted in order to locate a QTL for litter size. A Poisson regression form is used to model litter size, with allowances made for under- as well as over-dispersion, as suggested by the experimental data. In addition to fixed parity effects, random animal effects have also been included in the model. However, the method is not fully parametric, as the model is specified only in terms of means, variances and covariances, and not as a full probability model. Consequently, a generalized estimating equations (GEE) approach is used to fit the model. For statistical inferences, permutation tests and bootstrap procedures are used. This method is illustrated with simulated as well as experimental mouse data. Overall, the method is found to be quite reliable, and with modification, can be used for QTL detection for a range of other non-normally distributed traits.
Projection-based Bayesian recursive estimation of ARX model with uniform innovations
Czech Academy of Sciences Publication Activity Database
Kárný, Miroslav; Pavelková, Lenka
2007-01-01
Roč. 56, 9/10 (2007), s. 646-655 ISSN 0167-6911 R&D Projects: GA AV ČR 1ET100750401; GA MŠk 2C06001; GA MDS 1F43A/003/120 Institutional research plan: CEZ:AV0Z10750506 Keywords : ARX model * Bayesian recursive estimation * Uniform distribution Subject RIV: BC - Control Systems Theory Impact factor: 1.634, year: 2007 http://dx.doi.org/10.1016/j.sysconle.2007.03.005
Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig
2008-01-01
The U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) project and the Earthquake Engineering Research Institute’s World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using in part laboratory testing.
A technique for estimating 4D-CBCT using prior knowledge and limited-angle projections
International Nuclear Information System (INIS)
Zhang, You; Yin, Fang-Fang; Ren, Lei; Segars, W. Paul
2013-01-01
Purpose: To develop a technique to estimate onboard 4D-CBCT using prior information and limited-angle projections for potential 4D target verification of lung radiotherapy. Methods: Each phase of onboard 4D-CBCT is considered as a deformation from one selected phase (prior volume) of the planning 4D-CT. The deformation field maps (DFMs) are solved using a motion modeling and free-form deformation (MM-FD) technique. In the MM-FD technique, the DFMs are estimated using a motion model which is extracted from planning 4D-CT based on principal component analysis (PCA). The motion model parameters are optimized by matching the digitally reconstructed radiographs of the deformed volumes to the limited-angle onboard projections (data fidelity constraint). Afterward, the estimated DFMs are fine-tuned using a FD model based on the data fidelity constraint and deformation energy minimization. The 4D digital extended-cardiac-torso phantom was used to evaluate the MM-FD technique. A lung patient with a 30 mm diameter lesion was simulated with various anatomical and respirational changes from planning 4D-CT to onboard volume, including changes of respiration amplitude, lesion size and lesion average-position, and phase shift between lesion and body respiratory cycle. The lesions were contoured in both the estimated and “ground-truth” onboard 4D-CBCT for comparison. 3D volume percentage-difference (VPD) and center-of-mass shift (COMS) were calculated to evaluate the estimation accuracy of three techniques: MM-FD, MM-only, and FD-only. Different onboard projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. Results: For all simulated patient and projection acquisition scenarios, the mean VPD (±S.D.)/COMS (±S.D.) between lesions in prior images and “ground-truth” onboard images were 136.11% (±42.76%)/15.5 mm (±3.9 mm). Using orthogonal-view 15°-each scan angle, the mean VPD/COMS between the lesion
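The motion-modeling step of the MM-FD technique (parameterize the deformation field map as a mean plus PCA components, then fit the coefficients by minimizing the projection-domain residual) can be illustrated in a toy one-component, linear-projector setting, where the data-fidelity fit is closed-form. Everything here (the "projector", the field values, the coefficient) is a made-up stand-in for the real DRR-matching optimization:

```python
def forward_project(dfm, angles=(0.3, 0.7, 1.1)):
    """Toy linear 'projector': a few weighted sums of the deformation
    field stand in for limited-angle DRR pixel values."""
    return [sum(w ** i * v for i, v in enumerate(dfm)) for w in angles]

def fit_motion_coefficient(mean_dfm, pc, measured):
    """Solve min_a || P(mean + a*pc) - measured ||^2 for the single
    PCA coefficient a; with a linear projector P this is closed-form
    least squares along the projected principal-component direction."""
    base = forward_project(mean_dfm)
    direction = forward_project(pc)
    num = sum(d * (m - b) for d, m, b in zip(direction, measured, base))
    den = sum(d * d for d in direction)
    return num / den

# Synthesize "measured" limited-angle projections from a known
# coefficient, then recover it from the projections alone.
mean_dfm = [0.5, 1.0, -0.2, 0.3]
pc = [0.1, -0.4, 0.8, 0.2]
true_a = 1.7
measured = forward_project([m + true_a * p for m, p in zip(mean_dfm, pc)])
a_hat = fit_motion_coefficient(mean_dfm, pc, measured)
```

The real method optimizes several coefficients against nonlinear DRR generation and then refines the result with free-form deformation, but the principle is the same: the low-dimensional PCA parameterization is what makes limited-angle data sufficient.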
Critical analysis of the Hanford spent nuclear fuel project activity based cost estimate
Energy Technology Data Exchange (ETDEWEB)
Warren, R.N.
1998-09-29
In 1997, the SNFP developed a baseline change request (BCR) and submitted it to DOE-RL for approval. The schedule was formally evaluated to have a 19% probability of success [Williams, 1998]. In December 1997, DOE-RL Manager John Wagoner approved the BCR contingent upon a subsequent independent review of the new baseline. The SNFP took several actions during the first quarter of 1998 to prepare for the independent review. The project developed the Estimating Requirements and Implementation Guide [DESH, 1998] and trained cost account managers (CAMs) and other personnel involved in the estimating process in activity-based cost (ABC) estimating techniques. The SNFP then applied ABC estimating techniques to develop the basis for the December Baseline (DB) and documented that basis in Basis of Estimate (BOE) books. These BOEs were provided to DOE in April 1998. DOE commissioned Professional Analysis, Inc. (PAI) to perform a critical analysis (CA) of the DB. PAI's review formally began on April 13. PAI performed the CA, provided three sets of findings to the SNFP contractor, and initiated reconciliation meetings. During the course of PAI's review, DOE directed the SNFP to develop a new baseline with a higher probability of success. The contractor transmitted the new baseline, which is referred to as the High Probability Baseline (HPB), to DOE on April 15, 1998 [Williams, 1998]. The HPB was estimated to approach a 90% confidence level on the start of fuel movement [Williams, 1998]. This high probability resulted in an increased cost and a schedule extension. To implement the new baseline, the contractor initiated 26 BCRs with supporting BOEs. PAI's scope was revised on April 28 to add reviewing the HPB and the associated BCRs and BOEs.
Phillips, N.; Crosson, E.; Down, A.; Hutyra, L.; Jackson, R. B.; McKain, K.; Rella, C.; Raciti, S. M.; Wofsy, S. C.
2012-12-01
Lost and unaccounted natural gas can amount to over 6% of Massachusetts' total annual greenhouse gas inventory (expressed as equivalent CO2 tonnage). An unknown portion of this loss is due to natural gas leaks in pipeline distribution systems. The objective of the Boston Methane Project is to estimate the overall leak rate from natural gas systems in metropolitan Boston, and to compare this flux with fluxes from the other primary methane emissions sources. Companion talks at this meeting describe the atmospheric measurement and modeling framework, and chemical and isotopic tracers that can partition total atmospheric methane flux into natural gas and non-natural gas components. This talk focuses on estimation of surface emissions that inform the atmospheric modeling and partitioning. These surface emissions include over 3,300 pipeline natural gas leaks in Boston. For the state of Massachusetts as a whole, the amount of natural gas reported as lost and unaccounted for by utility companies was greater than estimated landfill emissions by an order of magnitude. Moreover, these landfill emissions were overwhelmingly located outside of metro Boston, while gas leaks are concentrated in exactly the opposite pattern, increasing from suburban Boston toward the urban core. Work is in progress to estimate spatial distribution of methane emissions from wetlands and sewer systems. We conclude with a description of how these spatial data sets will be combined and represented for application in atmospheric modeling.
A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.
Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa
2016-05-17
Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are true matches), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling based method to estimate both precision and recall following record linkage. In the sampling based method, record-pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record-pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using the Fleiss Kappa statistic was 0.601). This method presents a possible means of accurately estimating matching quality and refining linkages in population level linkage studies. The sampling approach is especially important for large project linkages where the number of record pairs produced may be very large, often running into millions.
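The sampling idea above can be made concrete: bin record-pairs by comparison score, clerically review a fixed sample per bin, and scale the reviewed match rate up to estimate true matches above and below the acceptance cut-off. The bin sizes, match rates, and variable names below are invented for illustration, not taken from the paper.

```python
import random

random.seed(1)

# Synthetic record pairs: (score_bin, is_true_match). Higher bins are purer.
pairs = []
for b, (n, match_rate) in enumerate([(5000, 0.01), (800, 0.30), (400, 0.95)]):
    pairs += [(b, random.random() < match_rate) for _ in range(n)]

cutoff = 2              # bins >= cutoff are accepted as links
sample_per_bin = 200

est_matches = {}        # estimated true matches per bin, scaled from the sample
for b in range(3):
    bin_pairs = [p for p in pairs if p[0] == b]
    reviewed = random.sample(bin_pairs, min(sample_per_bin, len(bin_pairs)))
    rate = sum(m for _, m in reviewed) / len(reviewed)
    est_matches[b] = rate * len(bin_pairs)

accepted = [p for p in pairs if p[0] >= cutoff]
tp_est = sum(est_matches[b] for b in range(cutoff, 3))
all_matches_est = sum(est_matches.values())

precision = tp_est / len(accepted)
recall = tp_est / all_matches_est
print(f"precision~{precision:.2f} recall~{recall:.2f}")
```

Because the below-threshold bins are also sampled, the denominator of recall includes estimated missed matches, which is exactly what pure precision audits cannot see.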
Generalized Projective Synchronization between Two Different Neural Networks with Mixed Time Delays
Directory of Open Access Journals (Sweden)
Xuefei Wu
2012-01-01
The generalized projective synchronization (GPS) between two different neural networks with nonlinear coupling and mixed time delays is considered. Several kinds of nonlinear feedback controllers are designed to achieve GPS between two such neural networks. Some results for GPS of these neural networks are proved theoretically by using the Lyapunov stability theory and the LaSalle invariance principle. Moreover, by comparison, we determine an optimal nonlinear controller from several ones and provide an adaptive update law for it. Computer simulations are provided to show the effectiveness and feasibility of the proposed methods.
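The GPS goal (response state converging to a scaling of the drive state) can be illustrated with a minimal active-control sketch. The Lorenz system stands in for the delayed neural networks of the paper, and the gain `k` and scaling `alpha` are illustrative assumptions; the controller cancels the response dynamics and injects scaled drive dynamics so the error obeys de/dt = -k e.

```python
import numpy as np

def lorenz(x, s=10.0, r=28.0, b=8.0 / 3.0):
    return np.array([s * (x[1] - x[0]),
                     x[0] * (r - x[2]) - x[1],
                     x[0] * x[1] - b * x[2]])

alpha, k, dt = 2.0, 5.0, 1e-3
x = np.array([1.0, 1.0, 1.0])      # drive state
y = np.array([-3.0, 4.0, 7.0])     # response state

for _ in range(20000):
    e = y - alpha * x
    # Feedback linearizing controller: cancel f(y), add alpha*f(x), damp error.
    u = alpha * lorenz(x) - lorenz(y) - k * e
    x = x + dt * lorenz(x)
    y = y + dt * (lorenz(y) + u)

print(np.linalg.norm(y - alpha * x))   # synchronization error
```

With Euler stepping the error contracts by (1 - k dt) each step, so after 20 time units the response tracks alpha times the drive to numerical precision.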
International Nuclear Information System (INIS)
Maraman, W.J.
1980-05-01
This formal monthly report covers the studies related to the use of 238PuO2 in radioisotopic power systems carried out for the Advanced Nuclear Systems and Projects Division of the Los Alamos Scientific Laboratory. The two programs involved are: General-Purpose Heat Source Development and Space Nuclear Safety and Fuels. Most of the studies discussed here are of a continuing nature. Results and conclusions described may change as the work continues. Published reference to the results cited in this report should not be made without the explicit permission of the person in charge of the work.
International Nuclear Information System (INIS)
Saito, Reiko; Uemura, Koji; Uchiyama, Akihiko; Toyama, Hinako; Ishii, Kenji; Senda, Michio
2001-01-01
The purpose of this paper is to estimate the extent of atrophy and the decline in brain function objectively and quantitatively. Two-dimensional (2D) projection images of three-dimensional (3D) transaxial images of positron emission tomography (PET) and magnetic resonance imaging (MRI) were made by means of the Mollweide method, which preserves the area of the brain surface. A correlation image was generated between 2D projection images of MRI and cerebral blood flow (CBF) or 18F-fluorodeoxyglucose (FDG) PET images, and the sulcus was extracted from the correlation image clustered by the K-means method. Furthermore, the extent of atrophy was evaluated from the extracted sulcus on the 2D-projection MRI, the cerebral cortical function such as blood flow or glucose metabolic rate was assessed in the cortex excluding the sulcus on the 2D-projection PET image, and then the relationship between cerebral atrophy and function was evaluated. This method was applied to two groups, young and aged normal subjects, and the relationship between age and the rate of atrophy or the cerebral blood flow was investigated. This method was also applied to FDG-PET and MRI studies in normal controls and in patients with corticobasal degeneration. The mean rate of atrophy in the aged group was found to be higher than that in the young. The mean value and the variance of the cerebral blood flow for the young were greater than those of the aged. The sulci were similarly extracted using either CBF or FDG PET images. The proposed method using 2D projection images of MRI and PET is clinically useful for quantitative assessment of atrophic change and functional disorder of the cerebral cortex. (author)
2014-01-01
We propose a smooth approximation l0-norm constrained affine projection algorithm (SL0-APA) to improve the convergence speed and the steady-state error of the affine projection algorithm (APA) for sparse channel estimation. The proposed algorithm ensures improved performance in terms of the convergence speed and the steady-state error via the incorporation of a smooth approximation l0-norm (SL0) penalty on the coefficients into the standard APA cost function, which gives rise to a zero attractor that promotes the sparsity of the channel taps in the channel estimation and hence accelerates the convergence speed and reduces the steady-state error when the channel is sparse. The simulation results demonstrate that our proposed SL0-APA is superior to the standard APA and its sparsity-aware algorithms in terms of both the convergence speed and the steady-state behavior in a designated sparse channel. Furthermore, SL0-APA is shown to have smaller steady-state error than the previously proposed sparsity-aware algorithms when the number of nonzero taps in the sparse channel increases. PMID:24790588
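A hedged sketch of the idea: the standard APA update is combined with a zero attractor derived from a smooth l0 approximation, 1 - exp(-beta*|w|), whose gradient pulls inactive taps toward zero. The step sizes, `beta`, channel length, and projection order below are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(3)

N, K = 16, 4                               # channel length, projection order
h = np.zeros(N); h[[2, 9]] = [0.8, -0.5]   # sparse "true" channel

mu, eps, gamma, beta = 0.5, 1e-3, 5e-5, 10.0
w = np.zeros(N)          # adaptive filter taps
x = np.zeros(N)          # input delay line
X = np.zeros((N, K))     # last K input vectors

for n in range(4000):
    x = np.roll(x, 1); x[0] = rng.normal()
    X = np.roll(X, 1, axis=1); X[:, 0] = x
    d = X.T @ h + 1e-3 * rng.normal(size=K)       # desired (noisy) outputs
    e = d - X.T @ w
    # Standard affine projection step (regularized), then the SL0 attractor.
    w += mu * X @ np.linalg.solve(X.T @ X + eps * np.eye(K), e)
    w -= gamma * beta * np.sign(w) * np.exp(-beta * np.abs(w))

print(np.linalg.norm(w - h))      # misalignment after adaptation
```

The attractor term is near zero for the two large taps (exp(-beta*0.8) is tiny) but acts at full strength on the 14 inactive taps, which is where the steady-state gain over plain APA comes from.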
A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania
Directory of Open Access Journals (Sweden)
Merger Eduard
2012-08-01
Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs), ranging between US$ 4.5 - 12.2 tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs in a range of US$ 0.06 – 0.11 tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to
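The opportunity-cost element has a simple arithmetic core: the discounted profit stream forgone by not converting forest to another land use, divided by the avoided CO2 emissions per hectare, gives a cost in US$ per tCO2. All numbers below are invented for illustration and are not from the study.

```python
annual_profit = 120.0      # US$/ha/yr from the alternative land use (assumed)
discount_rate = 0.10
horizon = 30               # years

# Net present value of the forgone profit stream.
npv = sum(annual_profit / (1 + discount_rate) ** t for t in range(1, horizon + 1))

carbon_stock = 35.0        # tC/ha difference between forest and cropland (assumed)
co2_per_ha = carbon_stock * 44.0 / 12.0   # convert tC to tCO2 (molar mass ratio)

opportunity_cost = npv / co2_per_ha
print(f"US$ {opportunity_cost:.1f} per tCO2")
```

Negative opportunity costs, as in the reported range starting at US$ -7.8, arise when the alternative land use is actually less profitable than keeping the forest.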
Camera-pose estimation via projective Newton optimization on the manifold.
Sarkis, Michel; Diepold, Klaus
2012-04-01
Determining the pose of a moving camera is an important task in computer vision. In this paper, we derive a projective Newton algorithm on the manifold to refine the pose estimate of a camera. The main idea is to benefit from the fact that the 3-D rigid motion is described by the special Euclidean group, which is a Riemannian manifold. The latter is equipped with a tangent space defined by the corresponding Lie algebra. This enables us to compute the optimization direction, i.e., the gradient and the Hessian, at each iteration of the projective Newton scheme on the tangent space of the manifold. Then, the motion is updated by projecting back the variables on the manifold itself. We also derive another version of the algorithm that employs homeomorphic parameterization to the special Euclidean group. We test the algorithm on several simulated and real image data sets. Compared with the standard Newton minimization scheme, we are now able to obtain the full numerical formula of the Hessian with a 60% decrease in computational complexity. Compared with Levenberg-Marquardt, the results obtained are more accurate while having a rather similar complexity.
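The "optimize on the tangent space, project back to the manifold" idea can be sketched for the rotation part alone: an update vector in the Lie algebra so(3) is mapped through the exponential (Rodrigues' formula) so every iterate stays a valid rotation. The fixed gradient step below is a stand-in assumption for the paper's projective Newton step, and the point-alignment cost is invented for illustration.

```python
import numpy as np

def hat(w):
    # so(3) hat operator: 3-vector to skew-symmetric matrix.
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def exp_so3(w):
    # Rodrigues' formula: tangent vector to rotation matrix.
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    W = hat(w / th)
    return np.eye(3) + np.sin(th) * W + (1 - np.cos(th)) * (W @ W)

R_true = exp_so3(np.array([0.3, -0.2, 0.5]))
R = np.eye(3)

pts = np.random.default_rng(4).normal(size=(3, 50))   # 3-D points
for _ in range(100):
    # Gradient of 0.5*sum ||R p - R_true p||^2 w.r.t. a left-multiplied twist.
    err = R @ pts - R_true @ pts
    grad = sum(hat(R @ p) @ e for p, e in zip(pts.T, err.T))
    R = exp_so3(-0.005 * grad) @ R        # retraction back onto SO(3)

print(np.linalg.norm(R - R_true))
```

Because the update is composed through the exponential map, no re-orthogonalization step is ever needed, which is the practical benefit of optimizing on the manifold.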
A Projection free method for Generalized Eigenvalue Problem with a nonsmooth Regularizer.
Hwang, Seong Jae; Collins, Maxwell D; Ravi, Sathya N; Ithapu, Vamsi K; Adluru, Nagesh; Johnson, Sterling C; Singh, Vikas
2015-12-01
Eigenvalue problems are ubiquitous in computer vision, covering a very broad spectrum of applications ranging from estimation problems in multi-view geometry to image segmentation. Few other linear algebra problems have a more mature set of numerical routines available and many computer vision libraries leverage such tools extensively. However, the ability to call the underlying solver only as a "black box" can often become restrictive. Many 'human in the loop' settings in vision frequently exploit supervision from an expert, to the extent that the user can be considered a subroutine in the overall system. In other cases, there is additional domain knowledge, side or even partial information that one may want to incorporate within the formulation. In general, regularizing a (generalized) eigenvalue problem with such side information remains difficult. Motivated by these needs, this paper presents an optimization scheme to solve generalized eigenvalue problems (GEP) involving a (nonsmooth) regularizer. We start from an alternative formulation of GEP where the feasibility set of the model involves the Stiefel manifold. The core of this paper presents an end to end stochastic optimization scheme for the resultant problem. We show how this general algorithm enables improved statistical analysis of brain imaging data where the regularizer is derived from other 'views' of the disease pathology, involving clinical measurements and other image-derived representations.
Estimating asbestos abatement projects: Excellence or 'You said I missed what'?
International Nuclear Information System (INIS)
Frawley, R.F.
1992-01-01
Between 1900 and 1980, 30 million tons of asbestos were put in place. Because of the known health hazards and increasing federal, state, and local regulations, building owners are now facing the problem of asbestos abatement. There are four basic approaches to dealing with asbestos: (1) removal, (2) enclosure, (3) encapsulation, and (4) deferred action in conjunction with a well-defined operations and maintenance program. Once the full extent of the problem is determined, the decision can be made on which action or combination of actions to take, and estimating the cost of the asbestos abatement project can begin. There are no high-tech methods of asbestos removal. It is hot, wet, labor-intensive work, and the methods of removal are archaic. Removal means manpower and man-hours; labor is a big-ticket item and an important factor in cost estimating. Become very familiar with the scope of the project and be sure to fully understand the depth of the asbestos problem. The products, supplies, tools, and in some cases the machines are all disposable items. If one overlooks something or underestimates the time involved for removal, not only will the material estimate be way off, the labor costs will soar. Be very observant on walk-throughs; notice everything. Be sure to get clear, accurate test results on the material to be removed. Once all this is done, one can make a good takeoff with confidence. Finally, when in doubt, always remember the 11th commandment of asbestos abatement cost estimating: 'If thou can't figure it out … thou best figure it in.'
Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.
2013-12-01
In flood risk assessment, methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods have historically been preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The object of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992) and the Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010), the Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than standard flood frequency analysis. Another interesting result is that the differences between the various extreme flood quantile estimations of the compared methods increase with return period, staying relatively moderate up to 100-year return levels. Results and discussions are here illustrated throughout with the example
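The "standard flood frequency analysis" baseline in (i) can be sketched end to end: fit a Gumbel distribution to an annual-maximum sample (here by the method of moments, one of several common choices) and read off the T-year return level. The synthetic data and parameter values are invented for illustration.

```python
import math
import random

random.seed(7)

# Synthetic annual maxima drawn from Gumbel(mu=100, beta=30) via the inverse CDF.
sample = [100.0 - 30.0 * math.log(-math.log(random.random())) for _ in range(200)]

n = len(sample)
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / (n - 1)

beta_hat = math.sqrt(6.0 * var) / math.pi          # scale, method of moments
mu_hat = mean - 0.5772156649 * beta_hat            # location (Euler-Mascheroni)

def return_level(T):
    # Gumbel quantile at non-exceedance probability 1 - 1/T.
    return mu_hat - beta_hat * math.log(-math.log(1.0 - 1.0 / T))

print(round(return_level(100), 1))   # estimated 100-year flood
```

The widening disagreement between methods at large return periods, noted in the abstract, reflects the fact that the 100-year quantile sits far in the extrapolated tail of fits like this one.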
Estimating Value at Risk with the Generalized Kalman Filter
Institute of Scientific and Technical Information of China (English)
赵利锋; 张崇岐
2009-01-01
Building on the use of the Kalman Filter method to estimate the time-varying risk coefficient β, the Generalized Kalman Filter method is introduced to estimate the time-varying β coefficient. The VaR of a portfolio is then computed through the Sharpe diagonal model, and backtesting is applied to judge the accuracy of the VaR estimates produced by the two methods.
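The time-varying beta idea can be sketched with a plain scalar Kalman filter (an illustration, not the paper's Generalized Kalman Filter): the market model r_asset = beta_t * r_mkt + noise with a random-walk beta_t is a linear state-space model, so the filter tracks beta_t; a portfolio VaR could then be built from the tracked betas via a diagonal model. All noise variances and names below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)
T = 500
beta_true = 1.0 + np.cumsum(rng.normal(0, 0.01, T))   # random-walk beta
r_mkt = rng.normal(0, 0.02, T)                        # market returns
r_asset = beta_true * r_mkt + rng.normal(0, 0.002, T)

q, r = 1e-4, 0.002 ** 2      # state and observation noise variances
beta, P = 0.0, 1.0           # filter state and its variance

for t in range(T):
    P += q                                   # predict (random-walk state)
    H = r_mkt[t]                             # time-varying observation "matrix"
    K = P * H / (H * P * H + r)              # Kalman gain
    beta += K * (r_asset[t] - H * beta)      # update with the return innovation
    P *= (1 - K * H)

print(abs(beta - beta_true[-1]))             # tracking error at the last step
```

Rolling-window OLS betas lag abrupt changes; the state-space formulation lets the filter weight each day's observation by how informative the market move was (large |r_mkt| gives a large gain).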
ESTIMATION OF SURVIVAL IN PATIENTS WITH ADVANCED OVARIAN CANCER – ABSTRACT OF THE RESEARCH PROJECT
Directory of Open Access Journals (Sweden)
Špela Smrkolj
2018-02-01
Background: Morbidity and mortality caused by cancer persist as an important health problem worldwide and in the European Union member states as well. In Slovenia, most ovarian cancer cases are detected in advanced stages, hence a rather high mortality rate. Aims: The purpose of this research project is to analyze primary cytoreduction in patients with advanced ovarian cancer. The main objective of the project is to assess the use of laparoscopy in the prediction of optimal cytoreduction in these patients. The applicative research project 'Estimation of survival in patients with advanced ovarian cancer based on primary laparoscopical assessment of optimal cytoreduction' (L3-2371) was approved and has been financed by the Slovene Research Agency and co-financed by the Ministry of Health of RS; Duration: May 1, 2009–April 30, 2012. Methods: The research project will consist of a retrospective and a prospective study. In all the patients with advanced ovarian cancer managed at the Department of Obstetrics and Gynecology, University Medical Centre Ljubljana in the years 2003–2008, and in whom optimal primary cytoreduction was made using either laparoscopy or laparotomy, certain clinical and pathomorphological factors will be compared, and the effects of all analyzed factors on the outcome of treatment assessed. In the prospective study, we will aim at assessing the use of laparoscopy in the prediction of optimal cytoreduction in all newly detected cases using a laparoscopy-based score (Fagotti's scoring system). Conclusions: The standard management of advanced ovarian cancer patients consists of primary surgical optimal and/or suboptimal cytoreduction followed by aggressive cytotoxic chemotherapy. In line with our experience and with that published most recently, laparoscopy seems to be a promising method with which we will attempt to most accurately assess the optimal cytoreduction in surgical treatment of ovarian cancer patients.
Navarro-Mateu, Fernando; Tormo, MJ; Vilagut, G; Alonso, J; Ruíz-Merino, G; Escámez, T; Salmerón, D; Júdez, J; Martínez, S; Navarro, C
2013-01-01
Background Multidisciplinary collaboration between clinicians, epidemiologists, neurogeneticists and statisticians on research projects has been encouraged to improve our knowledge of the complex mechanisms underlying the aetiology and burden of mental disorders. The PEGASUS-Murcia (Psychiatric Enquiry to General Population in Southeast Spain-Murcia) project was designed to assess the prevalence of common mental disorders and to identify the risk and protective factors, and it also included the collection of biological samples to study the gene–environmental interactions in the context of the World Mental Health Survey Initiative. Methods and analysis The PEGASUS-Murcia project is a new cross-sectional face-to-face interview survey based on a representative sample of non-institutionalised adults in the Region of Murcia (Mediterranean Southeast, Spain). Trained lay interviewers used the latest version of the computer-assisted personal interview of the Composite International Diagnostic Interview (CIDI 3.0) for use in Spain, specifically adapted for the project. Two biological samples of buccal mucosal epithelium will be collected from each interviewed participant, one for DNA extraction for genomic and epigenomic analyses and the other to obtain mRNA for gene expression quantification. Several quality control procedures will be implemented to assure the highest reliability and validity of the data. This article describes the rationale, sampling methods and questionnaire content as well as the laboratory methodology. Ethics and dissemination Informed consent will be obtained from all participants and a Regional Ethics Research Committee has approved the protocol. Results will be disseminated in peer-reviewed publications and presented at the national and the international conferences. Discussion Cross-sectional studies, which combine detailed personal information with biological data, offer new and exciting opportunities to study the gene
Directory of Open Access Journals (Sweden)
Ackchai Sirikijpanichkul
2015-01-01
For agricultural-based countries, transportation infrastructure requirements should not be limited to accommodating general traffic but should also cover the transportation of crops and agricultural products during the harvest seasons. Most past research focuses on the development of truck trip estimation techniques for urban, statewide, or nationwide freight movement but neglects the importance of rural freight movement, which contributes to pavement deterioration on rural roads, especially during harvest seasons. Recently, the Thai Government initiated a plan to construct a network of reservoirs within the northeastern region, aiming at improving the existing irrigation system, particularly in the areas where a more effective irrigation system is needed. It is expected to bring in new opportunities for expanding the cultivation areas, increasing the economy of scale and enlarging the market extent of the area. As a consequence, its effects on truck trip generation need to be investigated to assure the service quality of the related transportation infrastructure. This paper proposes a combinatory input-output commodity-based approach to estimate truck trips on a rural highway infrastructure network. The large-scale irrigation project for the northeastern region of Thailand is demonstrated as a case study.
Directory of Open Access Journals (Sweden)
Gehendra Kharel
2018-04-01
Background Water level fluctuations in endorheic lakes are highly susceptible to even slight changes in climate and land use. Devils Lake (DL) in North Dakota, USA is an endorheic system that has undergone multi-decade flooding driven by changes in regional climate. Flooding mitigation strategies have centered on the release of lake water to a nearby river system through artificial outlets, resulting in legal challenges and environmental concerns related to water quality, downstream flooding, species migration, stakeholder opposition, and transboundary water conflicts between the US and Canada. Despite these drawbacks, running the outlets would result in low overspill risks in the next 30 years. Methods In this study we evaluated the efficacy of this outlet-based mitigation strategy under scenarios based on the latest IPCC future climate projections. We used Coupled Model Intercomparison Project (CMIP-5) weather patterns from 17 general circulation models (GCMs) obtained under four representative concentration pathway (RCP) scenarios and downscaled to the DL region. Then, we simulated the changes in lake water levels using a Soil and Water Assessment Tool (SWAT) based hydrological model of the watershed. We estimated the probability of future flood risks under those scenarios and compared them with previously estimated overspill risks under the CMIP-3 climate. Results The CMIP-5 ensemble projected a mean annual temperature of 5.78 °C and mean daily precipitation of 1.42 mm/day; both are higher than the existing CMIP-3 future estimates of 4.98 °C and 1.40 mm/day, respectively. The increased precipitation and higher temperature resulted in a significant increase of DL's overspill risks: 24.4–47.1% without release from outlets and 3.5–14.4% even if the outlets are operated at their combined full 17 m3/s capacity. Discussion The modeled increases in overspill risks indicate a greater frequency of water releases through the artificial outlets. Future
Ideal observer estimation and generalized ROC analysis for computer-aided diagnosis
International Nuclear Information System (INIS)
Edwards, Darrin C.
2004-01-01
The research presented in this dissertation represents an innovative application of computer-aided diagnosis and signal detection theory to the specific task of early detection of breast cancer in the context of screening mammography. A number of automated schemes have been developed in our laboratory to detect masses and clustered microcalcifications in digitized mammograms, on the one hand, and to classify known lesions as malignant or benign, on the other. The development of fully automated classification schemes is difficult, because the output of a detection scheme will contain false-positive detections in addition to detected malignant and benign lesions, resulting in a three-class classification task. Researchers have so far been unable to extend successful tools for analyzing two-class classification tasks, such as receiver operating characteristic (ROC) analysis, to three-class classification tasks. The goals of our research were to use Bayesian artificial neural networks to estimate ideal observer decision variables to both detect and classify clustered microcalcifications and mass lesions in mammograms, and to derive substantial theoretical results indicating potential avenues of approach toward the three-class classification task. Specifically, we have shown that an ideal observer in an N-class classification task achieves an optimal ROC hypersurface, just as the two-class ideal observer achieves an optimal ROC curve; and that an obvious generalization of a well-known two-class performance metric, the area under the ROC curve, is not useful as a performance metric in classification tasks with more than two classes. This work is significant for three reasons. First, it involves the explicit estimation of feature-based (as opposed to image-based) ideal observer decision variables in the tasks of detecting and classifying mammographic lesions. Second, it directly addresses the three-class classification task of distinguishing malignant lesions, benign
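The two-class ideal observer the dissertation generalizes from can be illustrated numerically: for two equal-variance Gaussian classes the likelihood ratio is monotone in the score, so ranking by score is optimal, and the empirical AUC approaches the ideal-observer value Phi(d'/sqrt(2)). The class means and sample sizes below are invented for illustration.

```python
import math
import random

random.seed(5)
neg = [random.gauss(0.0, 1.0) for _ in range(2000)]   # benign / false-positive class
pos = [random.gauss(1.5, 1.0) for _ in range(2000)]   # malignant class

# Empirical AUC via the rank-sum: probability a positive outranks a negative.
auc = sum(p > n for p in pos for n in neg) / (len(pos) * len(neg))

# Ideal-observer AUC for equal-variance Gaussians: Phi(d'/sqrt(2)),
# written with erf since Phi(x) = 0.5*(1 + erf(x/sqrt(2))).
d_prime = 1.5
auc_ideal = 0.5 * (1 + math.erf(d_prime / 2))

print(round(auc, 3), round(auc_ideal, 3))
```

The dissertation's point is precisely that this scalar summary has no useful analogue once a third class (false-positive detections) enters: the ROC curve becomes a hypersurface, and the area under it stops behaving like a performance metric.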
International Nuclear Information System (INIS)
Stolk, D.J.
1987-04-01
On request of the Netherlands government FEL-TNO is developing a decision support system with the acronym RAMBOS for the assessment of the off-site consequences of an accident with hazardous materials. This is a user friendly interactive computer program, which uses very sophisticated graphical means. RAMBOS supports the emergency planning organization in two ways. Firstly, the risk to the residents in the surroundings of the accident is quantified in terms of severity and magnitude (number of casualties, etc.). Secondly, the consequences of countermeasures, such as sheltering and evacuation, are predicted. By evaluating several countermeasures the user can determine an optimum policy to reduce the impact of the accident. Within the framework of the EC project 'Benchmark exercise on dose estimation in a regulatory context' on request of the Ministry of Housing, Physical Planning and Environment calculations were carried out with the RAMBOS system. This report contains the results of these calculations. 3 refs.; 2 figs.; 10 tabs
International Nuclear Information System (INIS)
Li Hongmin; Li Chunlai
2012-01-01
In this paper, we investigate two switched synchronization schemes, namely partial and complete switched generalized function projective synchronization, using the adaptive control method. Partial switched synchronization of chaotic systems means that the state variables of the drive system synchronize with partially different state variables of the response system, whereas complete switched synchronization means that all the state variables of the drive system synchronize with completely different state variables of the response system. Because the switched synchronization scheme admits many combinations, it is a promising type of synchronization, as it provides greater security in secure communications. Based on Lyapunov stability theory, adaptive control laws and parameter update laws are derived to make the states of two identical/different hyperchaotic systems asymptotically synchronized up to a desired scaling function. Finally, numerical simulations are performed to verify and illustrate the analytical results.
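As a sketch of the underlying idea (not the paper's adaptive scheme: system parameters are assumed known here, whereas the paper estimates them online with adaptive update laws), function projective synchronization of two identical Lorenz systems can be demonstrated with a feedback-linearizing controller that drives the error e = y − α(t)x to zero:

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def simulate(T=10.0, dt=1e-3, k=20.0):
    """Drive-response function projective synchronization (Euler scheme).

    The controller cancels the response dynamics and adds linear error
    feedback, so the synchronization error obeys e' = -k e and decays to
    zero; alpha(t) is the desired (time-varying) scaling function.
    """
    alpha = lambda t: 2.0 + np.sin(t)
    dalpha = lambda t: np.cos(t)
    x = np.array([1.0, 1.0, 1.0])      # drive system state
    y = np.array([-5.0, 7.0, 20.0])    # response system state
    steps = int(round(T / dt))
    for n in range(steps):
        t = n * dt
        e = y - alpha(t) * x
        u = dalpha(t) * x + alpha(t) * lorenz(x) - lorenz(y) - k * e
        x = x + dt * lorenz(x)
        y = y + dt * (lorenz(y) + u)
    return np.linalg.norm(y - alpha(T) * x)

final_err = simulate()   # residual synchronization error after transients
```

The scaling function, gains and initial states are illustrative choices; a chattering-free higher-order integrator would tighten the residual further.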
International Nuclear Information System (INIS)
Yau, H.-T.
2008-01-01
This Letter presents a robust control scheme for generalized projective synchronization between two identical two-degrees-of-freedom heavy symmetric gyroscopes with dead-zone nonlinear inputs. Because of the nonlinear terms of the gyroscope system, the system exhibits complex and chaotic motions. Using Lyapunov stability theory with control terms, two suitable sliding surfaces are proposed to ensure the stability of the controlled closed-loop system in sliding mode. Then, two sliding mode controllers (SMC) are designed to guarantee the hitting of the sliding surfaces even when the control inputs contain dead-zone nonlinearity. This method allows the scaling factor to be steered to any desired value. Numerical simulations show that the method works well for the proposed controller.
Directory of Open Access Journals (Sweden)
Wu Chi-Yeh
2010-01-01
Background: MicroRNAs (miRNAs) are short non-coding RNA molecules, which play an important role in post-transcriptional regulation of gene expression. There have been many efforts to discover miRNA precursors (pre-miRNAs) over the years. Recently, ab initio approaches have attracted more attention because they do not depend on homology information and provide broader applications than comparative approaches. Kernel-based classifiers such as the support vector machine (SVM) are extensively adopted in these ab initio approaches due to the prediction performance they achieve. On the other hand, logic-based classifiers such as decision trees, whose constructed models are interpretable, have attracted less attention. Results: This article reports the design of a predictor of pre-miRNAs with a novel kernel-based classifier named the generalized Gaussian density estimator (G2DE) based classifier. The G2DE is a kernel-based algorithm designed to provide interpretability by utilizing a few but representative kernels for constructing the classification model. The performance of the proposed predictor has been evaluated with 692 human pre-miRNAs and has been compared with two kernel-based and two logic-based classifiers. The experimental results show that the proposed predictor is capable of achieving prediction performance comparable to that delivered by the prevailing kernel-based classification algorithms, while providing the user with an overall picture of the distribution of the data set. Conclusion: Software predictors that identify pre-miRNAs in genomic sequences have been exploited by biologists to facilitate molecular biology research in recent years. The G2DE employed in this study can deliver prediction accuracy comparable with the state-of-the-art kernel-based machine learning algorithms. Furthermore, biologists can obtain valuable insights about the different characteristics of the sequences of pre-miRNAs with the models generated by the G2DE.
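As an illustrative stand-in for the family G2DE belongs to, the sketch below implements a generic Bayes classifier with class-conditional Gaussian kernel density estimates (the kernel-selection step that gives G2DE its interpretability is not reproduced; all names, bandwidths and data are hypothetical):

```python
import numpy as np

class KDEClassifier:
    """Bayes classifier with class-conditional Gaussian kernel densities.

    Predicts the class maximizing prior(c) * p_hat(x | c), where p_hat is
    a kernel density estimate built from the training points of class c.
    """
    def __init__(self, h=0.5):
        self.h = h                      # kernel bandwidth (assumed shared)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.X_ = {c: X[y == c] for c in self.classes_}
        self.prior_ = {c: np.mean(y == c) for c in self.classes_}
        return self

    def _density(self, Xc, x):
        d = Xc.shape[1]
        z = np.sum((Xc - x) ** 2, axis=1) / self.h ** 2
        return np.mean(np.exp(-0.5 * z)) / (2 * np.pi * self.h ** 2) ** (d / 2)

    def predict(self, X):
        out = []
        for x in X:
            scores = {c: self.prior_[c] * self._density(self.X_[c], x)
                      for c in self.classes_}
            out.append(max(scores, key=scores.get))
        return np.array(out)

# Two well-separated synthetic 2-D classes
rng = np.random.default_rng(4)
X0 = rng.normal(0.0, 1.0, (100, 2))
X1 = rng.normal(3.0, 1.0, (100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)
clf = KDEClassifier(h=0.7).fit(X, y)
acc = np.mean(clf.predict(X) == y)
```

Using all training points as kernels (as here) trades away exactly the compactness G2DE optimizes for.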
Rivera, Diego; Rivas, Yessica; Godoy, Alex
2015-02-01
Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters may have equally robust goodness-of-fit indicators, a phenomenon known as equifinality. We assessed the outputs of a lumped conceptual hydrological model for an agricultural watershed in central Chile under strong interannual variability (coefficient of variation of 25%) using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³ s⁻¹ after fixing the parameter controlling the areal precipitation over the watershed. This decrease is equivalent to reducing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite criticisms of the GLUE methodology, such as its lack of statistical formality, it is identified as a useful tool for assisting the modeller with the identification of critical parameters.
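The GLUE procedure referred to above can be sketched in a few steps: Monte Carlo sampling of parameters, a likelihood measure, a subjective behavioural threshold, and likelihood-weighted quantile bounds. The sketch uses a toy one-parameter linear-reservoir model rather than the study's water balance model; the threshold and prior range are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_reservoir(k, rain, s0=10.0):
    """Toy one-parameter rainfall-runoff model: outflow = k * storage."""
    s, q = s0, []
    for r in rain:
        out = k * s
        s = s + r - out
        q.append(out)
    return np.array(q)

# Synthetic "observations" from a known parameter plus noise
rain = rng.uniform(0.0, 5.0, 100)
q_obs = linear_reservoir(0.3, rain) + rng.normal(0.0, 0.2, 100)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# 1. Monte Carlo sampling from a uniform prior on the parameter
ks = rng.uniform(0.05, 0.9, 2000)
scores = np.array([nse(linear_reservoir(k, rain), q_obs) for k in ks])

# 2. Behavioural parameter sets: above a subjective acceptance threshold
behavioural = scores > 0.7
k_b = ks[behavioural]
w = scores[behavioural]
w = w / w.sum()                           # likelihood weights

# 3. Uncertainty bounds: likelihood-weighted 5%/95% quantiles of the
#    behavioural ensemble at every time step
sims = np.array([linear_reservoir(k, rain) for k in k_b])
bounds = []
for t in range(sims.shape[1]):
    idx = np.argsort(sims[:, t])
    cum = np.cumsum(w[idx])
    lo = sims[idx[np.searchsorted(cum, 0.05)], t]
    hi = sims[idx[np.searchsorted(cum, 0.95)], t]
    bounds.append((lo, hi))
```

Narrowing the behavioural set (e.g. by raising the threshold or fixing a well-identified parameter) shrinks the bounds, which is the mechanism the abstract describes.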
Developing a general practice library: a collaborative project between a GP and librarian.
Pearson, D; Rossall, H
2001-12-01
The authors report on a self-completed questionnaire study from a North Yorkshire-based general practice regarding the information needs of its clinicians. The work was carried out with a particular focus on the practice library, and the findings identified that a new approach to maintaining and developing the library was needed. The literature regarding the information needs of primary care clinicians and the role of practice libraries is reviewed and compared with the needs of the clinicians at the practice. There follows a discussion of how a collaborative project was set up between the practice and a librarian based at the local NHS Trust library in order to improve the existing practice library. Difficulties encountered and issues unique to the project are explored, including the training implications presented by the implementation of electronic resources. The marketing activities implemented are discussed, along with how the library will operate in its new capacity and how ongoing support and maintenance of the library will be carried out. It is concluded that although scepticism still exists regarding librarian involvement in practice libraries, collaboration between clinicians and librarians is an effective approach to the successful development and maintenance of a practice library, and recommendations are therefore made for similar collaborative work.
APhoRISM FP7 project: the Multi-platform volcanic Ash Cloud Estimation (MACE) infrastructure
Merucci, Luca; Corradini, Stefano; Bignami, Christian; Stramondo, Salvatore
2014-05-01
APhoRISM is an FP7 project that aims to develop innovative products to support the management and mitigation of volcanic and seismic crises. Satellite and ground measurements will be managed in a novel manner to provide new and improved products in terms of accuracy and quality of information. The Multi-platform volcanic Ash Cloud Estimation (MACE) infrastructure will exploit the complementarity between geostationary and polar satellite sensors and ground measurements to improve ash detection and retrieval and to fully characterize volcanic ash clouds from the source to the atmosphere. The basic idea behind the proposed method is to manage, in a novel manner, the volcanic ash retrievals at the space-time scale of typical geostationary observations, using both polar satellite estimations and in-situ measurements. The typical ash thermal infrared (TIR) retrieval will be integrated with a wider spectral range from visible (VIS) to microwave (MW), and ash detection will be extended to cases of cloudy atmosphere or steam plumes. All the MACE ash products will be tested on three recent eruptions representative of different eruption styles under clear or cloudy atmospheric conditions: Eyjafjallajokull (Iceland) 2010, Grimsvotn (Iceland) 2011 and Etna (Italy) 2011-2012. The MACE infrastructure will be suitable for implementation in the next generation of ESA Sentinel satellite missions.
The Lick AGN Monitoring Project: recalibrating single-epoch virial black hole mass estimates
Energy Technology Data Exchange (ETDEWEB)
Park, Daeseong; Woo, Jong-Hak [Astronomy Program, Department of Physics and Astronomy, Seoul National University, Seoul 151-742 (Korea, Republic of); Treu, Tommaso; Bennert, Vardha N. [Department of Physics, University of California, Santa Barbara, CA 93106 (United States); Barth, Aaron J.; Walsh, Jonelle [Department of Physics and Astronomy, 4129 Frederick Reines Hall, University of California, Irvine, CA 92697-4575 (United States); Bentz, Misty C. [Department of Physics and Astronomy, Georgia State University Atlanta, GA 30303 (United States); Canalizo, Gabriela [Department of Physics and Astronomy, University of California, Riverside, 900 University Ave., Riverside, CA 92521 (United States); Filippenko, Alexei V. [Department of Astronomy, University of California, Berkeley, CA 94720-3411 (United States); Gates, Elinor [Lick Observatory, P.O. Box 85, Mount Hamilton, CA 95140 (United States); Greene, Jenny E. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Malkan, Matthew A., E-mail: woo@astro.snu.ac.kr [Department of Physics and Astronomy, University of California, Los Angeles, CA 90024 (United States)
2012-03-01
We investigate the calibration and uncertainties of black hole (BH) mass estimates based on the single-epoch (SE) method, using homogeneous and high-quality multi-epoch spectra obtained by the Lick Active Galactic Nucleus (AGN) Monitoring Project for nine local Seyfert 1 galaxies with BH masses < 10⁸ M☉. By decomposing the spectra into their AGN and stellar components, we study the variability of the SE Hβ line width (full width at half-maximum intensity, FWHM_Hβ, or dispersion, σ_Hβ) and of the AGN continuum luminosity at 5100 Å (L5100). From the distribution of the 'virial products' (∝ FWHM_Hβ² L5100^0.5 or σ_Hβ² L5100^0.5) measured from SE spectra, we estimate the uncertainty due to the combined variability as ~0.05 dex (12%). This is subdominant with respect to the total uncertainty in SE mass estimates, which is dominated by uncertainties in the size-luminosity relation and virial coefficient, and is estimated to be ~0.46 dex (a factor of ~3). By comparing the Hβ line profile of the SE, mean, and root-mean-square (rms) spectra, we find that the Hβ line is broader in the mean (and SE) spectra than in the rms spectra by ~0.1 dex (25%) for our sample with FWHM_Hβ < 3000 km s⁻¹. This result is at variance with larger-mass BHs, where the difference is typically found to be much less than 0.1 dex. To correct for this systematic difference in the Hβ line profile, we introduce a line-width-dependent virial factor, resulting in a recalibration of SE BH mass estimators for low-mass AGNs.
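As a worked illustration of the single-epoch recipe (virial product combined with a radius-luminosity relation), the following uses representative literature values for the calibration constants and virial factor, which are assumptions rather than this paper's recalibrated values:

```python
import math

G = 6.674e-8          # gravitational constant [cm^3 g^-1 s^-2]
M_SUN = 1.989e33      # solar mass [g]
LT_DAY = 2.59e15      # one light-day [cm]

def se_virial_mass(fwhm_kms, l5100, log_f=0.05, K=1.527, alpha=0.533):
    """Single-epoch virial BH mass in solar masses (illustrative sketch).

    R-L relation: log10(R_BLR / lt-day) = K + alpha * log10(L5100 / 1e44),
    then M_BH = f * FWHM^2 * R_BLR / G, with K, alpha and log_f set to
    representative values (assumptions, not this paper's calibration).
    """
    r_blr = 10.0 ** (K + alpha * math.log10(l5100 / 1e44)) * LT_DAY
    v = fwhm_kms * 1e5                # km/s -> cm/s
    return 10.0 ** log_f * v ** 2 * r_blr / G / M_SUN

# A Seyfert-like object: FWHM(Hbeta) = 3000 km/s, L5100 = 1e44 erg/s
m_bh = se_virial_mass(3000.0, 1e44)   # of order 1e7-1e8 solar masses
```

The ~0.46 dex systematic uncertainty quoted in the abstract dwarfs the 0.05 dex variability term, so any such point estimate should be read as good to roughly a factor of three.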
Han, Fang; Liu, Han
2016-01-01
Correlation matrices play a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions. As a robust alternative, Han and Liu [J. Am. Stat. Assoc. 109 (2015) 275-2...
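The rank-based idea behind such robust alternatives can be sketched as follows: estimate each pairwise Kendall's tau and map it through sin(πτ/2), which recovers the Pearson correlation under Gaussian (more generally, elliptical) models while remaining insensitive to heavy tails. This is a generic sketch of the transform, not necessarily the exact estimator of the cited paper:

```python
import numpy as np
from itertools import combinations

def kendall_tau_corr(X):
    """Correlation matrix estimate via sin(pi/2 * Kendall's tau).

    X: (n, d) data matrix. For Gaussian data the transform maps the
    population tau exactly back to the Pearson correlation, but the
    rank-based tau is robust to heavy-tailed contamination.
    """
    n, d = X.shape
    R = np.eye(d)
    for j, k in combinations(range(d), 2):
        xj, xk = X[:, j], X[:, k]
        s = 0
        for a in range(n - 1):
            # sign agreement of all pairs (a, b) with b > a
            s += np.sum(np.sign(xj[a] - xj[a + 1:]) *
                        np.sign(xk[a] - xk[a + 1:]))
        tau = 2.0 * s / (n * (n - 1))
        R[j, k] = R[k, j] = np.sin(np.pi * tau / 2.0)
    return R

# Bivariate Gaussian check: true correlation 0.6
rng = np.random.default_rng(5)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], 800)
R = kendall_tau_corr(X)
```

For large d the O(d² n²) pairwise loop would be replaced by a faster O(n log n) tau algorithm.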
Energy Technology Data Exchange (ETDEWEB)
Ju, Lili; Tian, Li; Wang, Desheng
2008-10-31
In this paper, we present a residual-based a posteriori error estimate for the finite volume discretization of steady convection-diffusion-reaction equations defined on surfaces in R³, which are often implicitly represented as level sets of smooth functions. Reliability and efficiency of the proposed a posteriori error estimator are rigorously proved. Numerical experiments are also conducted to verify the theoretical results and demonstrate the robustness of the error estimator.
2009-01-01
A kernel estimator of the conditional quantile is defined for a scalar response variable given a covariate taking values in a semi-metric space. The approach generalizes the median's L1-norm estimator. The almost complete consistency and asymptotic normality are stated.
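A minimal finite-dimensional analogue of such a kernel conditional quantile estimator (a real covariate rather than a semi-metric functional one) inverts the Nadaraya-Watson estimate of the conditional distribution function:

```python
import numpy as np

def kernel_cond_quantile(x0, X, Y, tau=0.5, h=0.5):
    """Kernel estimator of the tau-th conditional quantile of Y given X=x0.

    Gaussian-kernel Nadaraya-Watson weights define a weighted empirical
    conditional CDF; the quantile is the smallest y where it reaches tau.
    """
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    w = w / w.sum()
    order = np.argsort(Y)
    cdf = np.cumsum(w[order])
    return Y[order][np.searchsorted(cdf, tau)]

# Synthetic check: Y = X^2 + noise, so the conditional median at x0=1 is ~1
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, 4000)
Y = X ** 2 + rng.normal(0.0, 0.3, 4000)
q50 = kernel_cond_quantile(1.0, X, Y, tau=0.5, h=0.2)
```

The functional-covariate case in the abstract replaces |X - x0| by a semi-metric d(X, x0); the weighting and inversion steps are unchanged.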
Poggenburg, Stephanie; Reinisch, Manuel; Höfler, Reinhild; Stigler, Florian; Avian, Alexander; Siebenhofer, Andrea
2017-11-01
Increasing recognition of general practice is reflected in the growing number of university institutes devoted to the subject, and Health Services Research (HSR) is flourishing as a result. In May 2015 the Institute of General Practice and Evidence-based Health Services Research, Medical University of Graz, initiated a survey of Styrian GPs. The aim of the survey was to determine their willingness to take part in HSR projects, to collect sociodemographic data from GPs who were interested, and to identify factors affecting participation in research projects. Of the 1015 GPs who received the questionnaire, 142 (14%) responded and 135 (13%) were included in the analysis. Overall, 106 (10%) GPs indicated their willingness to take part in research projects. Factors inhibiting participation were lack of time, administrative workload, and lack of assistance. Thus, 10% of Styrian GPs were willing to participate in research projects. Knowledge of the circumstances under which family doctors are prepared to participate in HSR projects will help in the planning of future projects.
Energy Technology Data Exchange (ETDEWEB)
Amini, Nina H. [Stanford University, Edward L. Ginzton Laboratory, Stanford, CA (United States); CNRS, Laboratoire des Signaux et Systemes (L2S) CentraleSupelec, Gif-sur-Yvette (France); Miao, Zibo; Pan, Yu; James, Matthew R. [Australian National University, ARC Centre for Quantum Computation and Communication Technology, Research School of Engineering, Canberra, ACT (Australia); Mabuchi, Hideo [Stanford University, Edward L. Ginzton Laboratory, Stanford, CA (United States)
2015-12-15
The purpose of this paper is to study the problem of generalizing the Belavkin-Kalman filter to the case where the classical measurement signal is replaced by a fully quantum non-commutative output signal. We formulate a least mean squares estimation problem that involves a non-commutative system as the filter processing the non-commutative output signal. We solve this estimation problem within the framework of non-commutative probability. Also, we find the necessary and sufficient conditions which make these non-commutative estimators physically realizable. These conditions are restrictive in practice. (orig.)
LENUS (Irish Health Repository)
Jonker, W R
2014-06-29
As part of the 5th National Audit Project of the Royal College of Anaesthetists and the Association of Anaesthetists of Great Britain and Ireland concerning accidental awareness during general anaesthesia, we issued a questionnaire to every consultant anaesthetist in each of 46 public hospitals in Ireland, represented by 41 local co-ordinators. The survey ascertained the number of new cases of accidental awareness becoming known to them for patients under their care or supervision for a calendar year, as well as their career experience. Consultants from all hospitals responded, with an individual response rate of 87% (299 anaesthetists). There were eight new cases of accidental awareness that became known to consultants in 2011, an estimated incidence of 1:23,366. Two of the eight cases (25%) occurred at or after induction of anaesthesia but before surgery; four cases (50%) occurred during surgery; and two cases (25%) occurred after surgery was complete but before full emergence. Four cases were associated with pain or distress (50%), one after an experience at induction and three after experiences during surgery. There were no formal complaints or legal actions that arose in 2011 related to awareness. Depth-of-anaesthesia monitoring was reported to be available in 33 (80%) departments and was used by 184 consultants (62%), 18 of whom (6%) used it routinely. None of the 46 hospitals had a policy to prevent or manage awareness. Similar to the results of a larger survey in the UK, the disparity between the incidence of awareness as known to anaesthetists and that reported in trials warrants explanation. Compared with UK practice, there appears to be greater use of depth-of-anaesthesia monitoring in Ireland, although this is still infrequent.
Duchêne, Sebastián; Geoghegan, Jemma L; Holmes, Edward C; Ho, Simon Y W
2016-11-15
In rapidly evolving pathogens, including viruses and some bacteria, genetic change can accumulate over short time-frames. Accordingly, their sampling times can be used to calibrate molecular clocks, allowing estimation of evolutionary rates. Methods for estimating rates from time-structured data vary in how they treat phylogenetic uncertainty and rate variation among lineages. We compiled 81 virus data sets and estimated nucleotide substitution rates using root-to-tip regression, least-squares dating and Bayesian inference. Although estimates from these three methods were often congruent, this largely relied on the choice of clock model. In particular, relaxed-clock models tended to produce higher rate estimates than methods that assume constant rates. Discrepancies in rate estimates were also associated with high among-lineage rate variation, and phylogenetic and temporal clustering. These results provide insights into the factors that affect the reliability of rate estimates from time-structured sequence data, emphasizing the importance of clock-model testing. Contact: sduchene@unimelb.edu.au or garzonsebastian@hotmail.com. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved.
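The simplest of the three methods, root-to-tip regression, can be sketched as follows (simulated strict-clock data; a real analysis would compute root-to-tip distances from an inferred phylogeny):

```python
import numpy as np

def root_to_tip_rate(distances, dates):
    """Root-to-tip regression for time-structured sequence data.

    distances: root-to-tip genetic distance of each sampled sequence
    dates:     decimal sampling year of each sequence
    The slope of distance vs. time estimates the substitution rate; the
    x-intercept estimates the date of the root (the TMRCA).
    """
    slope, intercept = np.polyfit(dates, distances, 1)
    root_date = -intercept / slope
    return slope, root_date

# Strict-clock simulation: rate 1e-3 subs/site/year, root in 1990
rng = np.random.default_rng(2)
dates = rng.uniform(2000.0, 2020.0, 60)
dist = 1e-3 * (dates - 1990.0) + rng.normal(0.0, 1e-3, 60)
rate, tmrca = root_to_tip_rate(dist, dates)
```

Because the regression ignores the shared ancestry of the tips, it serves as a quick diagnostic of clock-like behaviour rather than a replacement for the least-squares or Bayesian estimates compared in the study.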
International Nuclear Information System (INIS)
Passell, H.D.; Barber, D.S.; Kadyrzhanov, K.K; Solodukhin, V.P.; Chernykh, E.E.; Arutyunyan, R.V.; Valyaev, A.N.; Kadik, A.A.; Stepanetts, O.V; Vernadsky, V.I.; Alizade, A.A.; Gutiev, I.S.; Mamedov, R.F.; Nadareishvili, K.S.; Chkhartishvili, A.G.; Tsitskishvili, M.S.; Chubaryan, E. V; Gevorgyan, R.G.; Puskyulyan, K.I.
2005-01-01
Full text: The scientific community of six countries (USA, Kazakhstan, Russia, Georgia, Armenia, and Azerbaijan) has developed the ecological project 'Joint International Researches and Creation of the General System of Radiation and Hydro-chemical Monitoring of Rivers of the Caspian Sea Basin'. The purpose of this project is to investigate and characterize contamination by radionuclides and toxic and chemically hazardous elements and to create a valid system of radiation and hydro-chemical monitoring of the main river basins of the Caspian region: the Volga, Ural, Emba and Kura. The basins of these rivers cover large parts of Europe and Asia, including parts of Russia, West Kazakhstan and the South Caucasus, including territories of Georgia, Armenia and Azerbaijan. The total area of the basins of these rivers exceeds the combined area of such large European states as France, Spain and Germany, and comprises 1631 sq. km. All these rivers are the main life-supporting water arteries for the region's inhabitants, a population that comprises tens of millions of people. Also, the outlets of these rivers determine the condition of the aquatic environment of the Caspian Sea. The ecological condition of the aquatic environment of all the rivers of the Caspian Sea is to a great extent determined by contamination, the main components of which are anthropogenic radioactive elements, heavy metals and oil products. At present, information about contamination levels of these rivers is incomplete and occasionally contradictory. In this connection there is an obvious need for a qualified investigation of contamination levels and their character in the basins of these rivers and for creating a common monitoring system to assess the quality of the aquatic environment. The present project is devoted to this matter. The project provides for the following main tasks: 1. Wide-ranging investigation of the level and character of contamination by radionuclides and toxic elements of the river basins
International Nuclear Information System (INIS)
Zhang, H; Kong, V; Jin, J; Ren, L; Zhang, Y; Giles, W
2015-01-01
Purpose: A synchronized moving grid (SMOG) has been proposed to reduce scatter and lag artifacts in cone beam computed tomography (CBCT). However, information is missing in each projection because certain areas are blocked by the grid. A previous solution to this issue is acquiring two complementary projections at each position, which increases scanning time. This study reports our first result using an inter-projection sensor fusion (IPSF) method to estimate the missing projection data in our prototype SMOG-based CBCT system. Methods: An in-house SMOG assembly with a 1:1 grid of 3 mm gap has been installed in a CBCT benchtop. The grid moves back and forth with a 3-mm amplitude and a frequency of up to 20 Hz. A control program in LabView synchronizes the grid motion with the platform rotation and x-ray firing so that the grid patterns for any two neighboring projections are complementary. A Catphan phantom was scanned with 360 projections. After scatter correction, the IPSF algorithm was applied to estimate the missing signal for each projection using the information from the two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was applied to reconstruct CBCT images. The CBCTs were compared to those reconstructed using normal projections without applying the SMOG system. Results: The SMOG-IPSF method may reduce image dose by half due to the radiation blocked by the grid. The method almost completely removed scatter-related artifacts, such as cupping artifacts. The evaluation of line-pair patterns in the Catphan suggested that the spatial resolution degradation was minimal. Conclusion: The SMOG-IPSF is promising in reducing scatter artifacts and improving image quality while reducing radiation dose.
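The inter-projection estimation idea can be sketched with a deliberately simplified toy: a static object, exact 1:1 complementary stripe masks, and plain averaging of the two neighbouring views (the actual IPSF algorithm is more elaborate):

```python
import numpy as np

def ipsf_fill(projections, masks):
    """Fill grid-blocked pixels from neighbouring projections (sketch).

    projections: (n, H, W) array; pixels blocked by the grid are zeroed.
    masks:       (n, H, W) boolean, True where the detector saw signal.
    A blocked pixel in projection i is estimated as the mean of the same
    pixel in projections i-1 and i+1, whose complementary grid patterns
    leave that pixel exposed.
    """
    n = len(projections)
    filled = projections.copy()
    for i in range(n):
        prev_p = projections[(i - 1) % n]
        next_p = projections[(i + 1) % n]
        blocked = ~masks[i]
        filled[i][blocked] = 0.5 * (prev_p[blocked] + next_p[blocked])
    return filled

# Synthetic demo: smooth static "object", alternating even/odd stripe grids
n, H, W = 8, 16, 16
truth = np.fromfunction(lambda a, y, x: 100.0 + y + x + 0 * a, (n, H, W))
masks = np.zeros((n, H, W), dtype=bool)
for i in range(n):
    masks[i, (i % 2)::2, :] = True        # grid parity alternates per view
proj = truth * masks                      # blocked stripes read as zero
est = ipsf_fill(proj, masks)
```

In a real scan the object projection changes between neighbouring angles, so simple averaging only approximates the missing signal; that residual is what the IPSF algorithm is designed to handle.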
Czech Academy of Sciences Publication Activity Database
Pekár, S.; Brabec, Marek
2018-01-01
Vol. 124, No. 2 (2018), pp. 86-93. ISSN 0179-1613. Institutional support: RVO:67985807. Keywords: correlated data; generalized estimating equations; marginal model; regression models; statistical analysis. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 1.398 (2016)
Kahsay, T.N.; Kuik, O.J.; Brouwer, R.; van der Zaag, P.
2015-01-01
Employing a multi-region multi-sector computable general equilibrium (CGE) modeling framework, this study estimates the direct and indirect economic impacts of the Grand Ethiopian Renaissance Dam (GERD) on the Eastern Nile economies. The study contributes to the existing literature by evaluating the
Estimates of general and emotional intelligence for self and parents in Iran.
Yousefi, Farideh
2009-08-01
Estimations of IQ and emotional intelligence for self and parents were investigated. Previous studies in both Western and African cultures have found significant sex differences in self-estimates of IQ and emotional intelligence, while IQ was rated higher for fathers than mothers. These prior results suggest the findings should be invariant across cultures, and they were expected to be replicated here in a predominantly Islamic society that has undergone great sociopolitical change since the Islamic Revolution. 187 Iranian university students estimated their own and their parents' scores on IQ and 15 facets of emotional intelligence on a normal distribution graph. The present results showed no significant sex differences in self-estimates of these variables, while fathers were rated higher on IQ. The implications of these findings are discussed in light of the sociopolitical changes of the last three decades in Iran.
Estimation of CANDU reactor zone controller level by generalized perturbation method
International Nuclear Information System (INIS)
Kim, Do Heon; Kim, Jong Kyung; Choi, Hang Bok; Roh, Gyu Hong; Yang, Won Sik
1998-01-01
The zone controller level change due to refueling operations has been studied using a generalized perturbation method. The method provides the sensitivity of zone power to individual refueling operations and the incremental change of zone controller level. The zone controller level change was then obtained by constructing a system equation for each zone power. The details and a proposed model for future work are described.
Applications of Small Area Estimation to Generalization with Subclassification by Propensity Scores
Chan, Wendy
2018-01-01
Policymakers have grown increasingly interested in how experimental results may generalize to a larger population. However, recently developed propensity score-based methods are limited by small sample sizes, where the experimental study is generalized to a population that is at least 20 times larger. This is particularly problematic for methods…
DEFF Research Database (Denmark)
Jacobsen, Martin; Martinussen, Torben
2016-01-01
Pseudo-values have proven very useful in censored data analysis in complex settings such as multi-state models. The approach was originally suggested by Andersen et al., Biometrika, 90, 2003, 335, who also suggested estimating standard errors using classical generalized estimating equation results. These results were studied more formally in Graw et al., Lifetime Data Anal., 15, 2009, 241, which derived some key results based on a second-order von Mises expansion. However, results concerning large sample properties of estimates based on regression models for pseudo-values still seem unclear. In this paper, we study these large sample properties in the simple setting of survival probabilities and show that the estimating function can be written as a U-statistic of second order, giving rise to an additional term that does not vanish asymptotically. We further show that previously advocated standard error
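For a survival probability, the pseudo-value construction referred to above is a leave-one-out jackknife applied to the Kaplan-Meier estimator; a minimal sketch:

```python
import numpy as np

def km_surv(time, event, t):
    """Kaplan-Meier estimate of S(t) from right-censored data."""
    s = 1.0
    for u in np.unique(time[event & (time <= t)]):   # event times up to t
        d = np.sum((time == u) & event)              # events at u
        r = np.sum(time >= u)                        # at risk just before u
        s *= 1.0 - d / r
    return s

def pseudo_values(time, event, t):
    """Jackknife pseudo-observations for S(t), as in Andersen et al. (2003).

    The i-th pseudo-value is n*S_hat(t) - (n-1)*S_hat^(-i)(t), where
    S_hat^(-i) omits subject i; the pseudo-values can then be used as
    outcomes in a generalized-estimating-equation regression.
    """
    n = len(time)
    full = km_surv(time, event, t)
    idx = np.arange(n)
    return np.array([n * full - (n - 1) * km_surv(time[idx != i],
                                                  event[idx != i], t)
                     for i in range(n)])

# Exponential event times (mean 2) with independent exponential censoring
rng = np.random.default_rng(3)
n = 200
t_event = rng.exponential(2.0, n)      # true S(1) = exp(-0.5) ~ 0.61
t_cens = rng.exponential(4.0, n)
time = np.minimum(t_event, t_cens)
event = t_event <= t_cens
pv = pseudo_values(time, event, t=1.0)
```

The mean of the pseudo-values tracks the Kaplan-Meier estimate itself; the paper's point concerns the second-order U-statistic structure underlying their variance, which this sketch does not address.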
Abejuela, Harmony Raylen; Osser, David N
2016-01-01
This revision of previous algorithms for the pharmacotherapy of generalized anxiety disorder was developed by the Psychopharmacology Algorithm Project at the Harvard South Shore Program. Algorithms from 1999 and 2010 and associated references were reevaluated. Newer studies and reviews published from 2008-14 were obtained from PubMed and analyzed with a focus on their potential to justify changes in the recommendations. Exceptions to the main algorithm for special patient populations, such as women of childbearing potential, pregnant women, the elderly, and those with common medical and psychiatric comorbidities, were considered. Selective serotonin reuptake inhibitors (SSRIs) are still the basic first-line medication. Early alternatives include duloxetine, buspirone, hydroxyzine, pregabalin, or bupropion, in that order. If response is inadequate, then the second recommendation is to try a different SSRI. Additional alternatives now include benzodiazepines, venlafaxine, kava, and agomelatine. If the response to the second SSRI is unsatisfactory, then the recommendation is to try a serotonin-norepinephrine reuptake inhibitor (SNRI). Other alternatives to SSRIs and SNRIs for treatment-resistant or treatment-intolerant patients include tricyclic antidepressants, second-generation antipsychotics, and valproate. This revision of the GAD algorithm responds to issues raised by new treatments under development (such as pregabalin) and organizes the evidence systematically for practical clinical application.
General-purpose heat source project and space nuclear safety and fuels program. Progress report
International Nuclear Information System (INIS)
Maraman, W.J.
1980-02-01
Studies related to the use of ²³⁸PuO₂ in radioisotopic power systems, carried out for the Advanced Nuclear Systems and Projects Division of LASL, are presented. The three programs involved are: general-purpose heat source development; space nuclear safety; and the fuels program. Three impact tests were conducted to evaluate the effects of a high-temperature reentry pulse and the use of CBCF on impact performance. Additionally, two ²³⁸PuO₂ pellets were encapsulated in Ir-0.3% W for impact testing. Results of the clad development test and vent testing are noted. Results of the environmental tests are summarized. Progress on the Stirling isotope power systems test and the status of the improved MHW tests are indicated. The examination of the impact failure of the iridium shell of MHFT-65 at a fuel pass-through continued. A test plan was written for vibration testing of the assembled lightweight radioisotopic heater unit. Progress on fuel processing is reported.
McNab, Duncan; McKay, John; Bowie, Paul
2015-11-01
Small-scale quality improvement projects are expected to make a significant contribution towards improving the quality of healthcare. Enabling doctors-in-training to design and lead quality improvement projects is important preparation for independent practice, and participation is mandatory in specialty training curricula. However, provision of training and ongoing support in quality improvement methods and practice is variable. We aimed to design and deliver a quality improvement training package to core medical and general practice specialty trainees and evaluate its impact in terms of project participation, completion and publication in a healthcare journal. A quality improvement training package was developed and delivered to core medical trainees and general practice specialty trainees in the west of Scotland, encompassing a 1-day workshop and mentoring during completion of a quality improvement project over 3 months. A mixed-methods evaluation was undertaken and data collected via questionnaire surveys, knowledge assessment, and formative assessment of project proposals, completed quality improvement projects and publication success. Twenty-three participants attended the training day, with 20 submitting a project proposal (87%). Ten completed quality improvement projects (43%), eight were judged as satisfactory (35%), and four were submitted and accepted for journal publication (17%). Knowledge and confidence in aspects of quality improvement improved during the pilot, while early feedback on project proposals was valued (85.7%). This small study reports modest success in training core medical trainees and general practice specialty trainees in quality improvement. Many gained knowledge of, confidence in and experience of quality improvement, while journal publication was shown to be possible. The development of educational resources to aid quality improvement project completion and mentoring support is necessary if expectations for quality improvement are to be met.
International Nuclear Information System (INIS)
Rautman, C.A.
1991-02-01
The spatial correlation structure of volcanic tuffs at and near the site of the proposed high-level nuclear waste repository at Yucca Mountain, Nevada, is estimated using samples obtained from surface outcrops and drill holes. Data are examined for four rock properties: porosity, air permeability, saturated hydraulic conductivity, and dry bulk density. Spatial continuity patterns are identified in both lateral and vertical (stratigraphic) dimensions. The data are examined for the Calico Hills tuff stratigraphic unit and also without regard for stratigraphy. Variogram models fitted to the sample data from the tuffs of Calico Hills indicate that porosity is correlated laterally over distances of up to 3000 feet. If air permeability and saturated conductivity values are viewed as semi-interchangeable for purposes of identifying spatial structure, the data suggest a maximum range of correlation of 300 to 500 feet without any obvious horizontal to vertical anisotropy. Continuity exists over vertical distances of roughly 200 feet. Similar variogram models fitted to sample data taken from vertical drill holes without regard for stratigraphy suggest that correlation exists over distances of 500 to 800 feet for each rock property examined. Spatial correlation of rock properties violates the sample-independence assumptions of classical statistics to a degree not usually acknowledged. In effect, the existence of spatial structure reduces the "equivalent" number of samples below the number of physical samples. This reduction in the effective sampling density has important implications for site characterization for the Yucca Mountain Project. 19 refs., 43 figs., 5 tabs.
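The correlation ranges reported above come from fitting variogram models to empirical semivariograms. As a rough illustration of that first step, the sketch below computes an empirical semivariogram from a synthetic one-dimensional profile; the AR(1) data, 10-ft spacing, and lag width are illustrative assumptions, not the report's data:

```python
import random

def empirical_variogram(positions, values, lag_width):
    """Bin half squared differences of sample pairs by separation distance.

    Returns {lag_bin_center: semivariance}, where semivariance is
    0.5 * mean (v_i - v_j)^2 over pairs whose separation falls in the bin.
    """
    sums, counts = {}, {}
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            b = int(abs(positions[i] - positions[j]) // lag_width)
            sums[b] = sums.get(b, 0.0) + 0.5 * (values[i] - values[j]) ** 2
            counts[b] = counts.get(b, 0) + 1
    return {(b + 0.5) * lag_width: sums[b] / counts[b] for b in sums}

# Synthetic drill-hole profile: porosity-like values sampled every 10 ft with
# short-range AR(1) correlation, so the semivariance should rise with lag.
random.seed(1)
positions = [10.0 * k for k in range(60)]
values, v = [], 0.0
for _ in positions:
    v = 0.8 * v + random.gauss(0, 1)
    values.append(v)
gamma = empirical_variogram(positions, values, lag_width=20.0)
```

A variogram model (spherical, exponential, etc.) would then be fitted to `gamma` to read off the range of correlation.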
Final report on the project research 'stochastic effects of irradiation and risk estimation'
International Nuclear Information System (INIS)
1989-03-01
The title project research was carried out through 1983-1987 by three groups for the studies of radiation carcinogenesis, human genetic effects, and radiotoxicology. 8 reports by the first group, 3 by the second, and 6 by the third group are collected in this issue. The contents are as follows. Serial sacrifice study on tumorigenesis in male C57BL/6J mice exposed to gamma-ray or fast neutron radiation; Influence of biological variables on radiation carcinogenesis; Studies on radiation-induced thymic lymphomagenesis; Modifying factors of radiation-induced myeloid leukemia of C3H/He mouse; Cell kinetic studies on radiation-induced leukemogenesis; Cytogenetical studies on the mechanism of radiation-induced neoplasms; Molecular biological study on genetic stability of the genome; Protein factors regulating proliferation and differentiation of normal and neoplastic cells; Studies on dose-radiation relationships for induction of chromosome aberrations in stem-spermatogonia of three crab-eating monkeys after low and high dose rate γ-irradiation; Risk estimation of radiation mutagenesis in man by using cultured mammalian cells; Effects of ionizing radiation on male germ cells of the crab-eating monkey; Movement and metabolism of radioactive particles in the respiratory tract; Studies on dosimetry for internally deposited alpha-emitters; Comparative toxicological studies on the effects of internal exposures; Studies on treatment of alpha-radioactive wastes; Methodological studies on the inhalation of radioactive aerosols; Removal of transuranic elements by DTPA. (A. Y.)
Estimates for a general fractional relaxation equation and application to an inverse source problem
Bazhlekova, Emilia
2018-01-01
A general fractional relaxation equation is considered with a convolutional derivative in time introduced by A. Kochubei (Integr. Equ. Oper. Theory 71 (2011), 583-600). This equation generalizes the single-term, multi-term and distributed-order fractional relaxation equations. The fundamental and the impulse-response solutions are studied in detail. Properties such as analyticity and subordination identities are established and employed in the proof of an upper and a lower bound. The obtained...
Rural Fuel-wood and Poles Research Project in Malawi: a general account
Energy Technology Data Exchange (ETDEWEB)
Nkaonja, R S.W.
1981-01-01
The Rural Fuel-wood and Poles Research Project was initiated to provide information about afforestation in the dry silvicultural zones. Plantation forestry in Malawi has concentrated on production of timber, poles, and pulpwood. It is estimated that 90% of Malawi's population of 5.5 million live in rural communities, and that the purely domestic wood requirement is 4.05 cubic m per family (of five) annually. In addition, wood is required for agricultural purposes such as tobacco curing. The remaining indigenous forest cannot meet the demand. There is an urgent need for plantations. Rather than simply planting trees, the aim is to make local communities self-sufficient in forest products. In view of the shortage of land, great emphasis is placed on trying species which have many end-uses (e.g., poles, fuel-wood, mulch, fodder, and shade) and those which can be grown together with farm crops, a concept known as "agroforestry." Over 20 ha of trials were established at locations in the three regions of the country. Acacia albida allows maize and other farm crops to grow under it, provides good shade and fodder, and, as a legume, enriches the soil with nitrogen. Eucalypts were included because most produce straight poles for construction, are drought-hardy, and are rated higher than Gmelina arborea in calorific value, durability, and strength. Another tree favored for its multiple uses is Leucaena leucocephala (Hawaiian giant), but it appears that there is considerable mixture of varieties in the seeds. With the exception of one trial at Bwanje, trials have not included farm crops, but the agroforestry element will be a very important consideration in future trials.
International Nuclear Information System (INIS)
Edwards, R.G.; Durfee, R.C.
1976-09-01
The GRIDOT computer program draws overlay grids on a Calcomp plotter for use in digitizing information from maps, rectified aerial photographs, and other sources of spatially distributed data related to regional environmental problems. The options of the program facilitate use of the overlays with standard maps and map projections of the continental United States. The overlay grid may be defined as a latitude-longitude grid (geodetic grid), a Universal Transverse Mercator Grid, or one of the standard state-plane coordinate system grids. The map for which the overlay is intended may be in an Albers Equal Area projection, a Lambert Conformal projection, a Polyconic projection, a Transverse Mercator projection, a Universal Transverse Mercator projection, or any of the standard state-plane projections
Directory of Open Access Journals (Sweden)
Carlos E. Galván-Tejada
2018-02-01
The indoor location of individuals is a key contextual variable for commercial and assisted location-based services and applications. Commercial centers and medical buildings (e.g., hospitals) require location information of their users/patients to offer the services that are needed at the correct moment. Several approaches have been proposed to tackle this problem. In this paper, we present the development of an indoor location system which relies on the human activity recognition approach, using sound as an information source to infer the indoor location based on the contextual information of the activity performed at the moment. In this work, we analyze the sound information to estimate the location using the contextual information of the activity. A feature extraction approach is applied to the sound signal to feed a random forest algorithm in order to generate a model to estimate the location of the user. We evaluate the quality of the resulting model in terms of sensitivity and specificity for each location, and we also perform out-of-bag error estimation. Our experiments were carried out in five representative residential homes. Each home had four individual indoor rooms. Eleven activities (brewing coffee, cooking eggs, taking a shower, etc.) were performed to provide the contextual information. Experimental results show that developing an indoor location system (ILS) that uses contextual information from human activities (identified with data provided from the environmental sound) can achieve an estimation that is 95% correct.
DEFF Research Database (Denmark)
Varneskov, Rasmus T.
2014-01-01
Flat-top estimators are shown to be consistent, asymptotically unbiased, and mixed Gaussian at the optimal rate of convergence, n^(1/4). Exact bounds on lower-order terms are obtained using maximal inequalities, and these are used to derive a conservative, MSE-optimal flat-top shrinkage. Additionally, bounds...
Monte Carlo Maximum Likelihood Estimation for Generalized Long-Memory Time Series Models
Mesters, G.; Koopman, S.J.; Ooms, M.
2016-01-01
An exact maximum likelihood method is developed for the estimation of parameters in a non-Gaussian nonlinear density function that depends on a latent Gaussian dynamic process with long-memory properties. Our method relies on the method of importance sampling and on a linear Gaussian approximating model.
Directory of Open Access Journals (Sweden)
Cécile Souty
2016-11-01
Abstract Background In surveillance networks based on voluntary participation of health-care professionals, there is little choice regarding the selection of participants’ characteristics. External information about participants, for example local physician density, can help reduce bias in incidence estimates reported by the surveillance network. Methods There is an inverse association between the number of reported influenza-like illness (ILI) cases and local general practitioner (GP) density. We formulated and compared estimates of ILI incidence using this relationship. To compare estimates, we simulated epidemics using a spatially explicit disease model and their observation by surveillance networks with different characteristics: random, maximum coverage, largest cities, etc. Results In the French practice-based surveillance network – the “Sentinelles” network – GPs reported 3.6% (95% CI [3;4]) fewer ILI cases as local GP density increased by 1 GP per 10,000 inhabitants. Incidence estimates varied markedly depending on scenarios for participant selection in surveillance, yet accounting for the GP density of participants reduced this bias. Applied to data from the Sentinelles network, changes in overall incidence ranged between 1.6 and 9.9%. Conclusions Local GP density is a simple measure that provides a way to reduce bias in estimating disease incidence in general practice. It can contribute to improving disease monitoring when it is not possible to choose the characteristics of participants.
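The per-GP coefficient above suggests a simple multiplicative correction of reported incidence for local GP density. The sketch below is our own illustration of such an adjustment; only the 3.6% figure comes from the abstract, while the function name, the densities, and the multiplicative functional form are assumptions:

```python
def adjusted_incidence(reported_rate, local_density, reference_density,
                       pct_drop_per_unit=3.6):
    """Rescale a participant's reported incidence to a reference GP density.

    Assumes reporting falls by `pct_drop_per_unit` percent for each extra
    GP per 10,000 inhabitants (multiplicative model; the coefficient is the
    abstract's, the functional form is an assumption for illustration).
    """
    factor = (1 - pct_drop_per_unit / 100.0) ** (reference_density - local_density)
    return reported_rate * factor

# A practice in an area with 12 GPs per 10,000 inhabitants reports
# 50 cases per 100,000; rescale to a reference density of 9 GPs per 10,000.
rate = adjusted_incidence(50.0, local_density=12, reference_density=9)
```

Because denser areas under-report, rescaling to a lower reference density raises the estimate.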
2010-07-01
... 32 National Defense 1 2010-07-01 2010-07-01 false Must I be able to estimate project expenditures... I be able to estimate project expenditures precisely in order to justify use of a fixed-support TIA... purposes of this illustration, let that minimum recipient cost sharing be 40% of the total project costs...
Buffalano, C.; Fogleman, S.; Gielecki, M.
1976-01-01
A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the Rand Corporation for systematically eliciting and evaluating group judgments in an objective manner. Use of the Delphi allows for the integration of expert opinion into the cost-estimating process in a consistent and rigorous fashion. This approach can also signal potential cost-problem areas. This result can be a useful tool in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
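A Monte Carlo pass over Delphi-elicited ranges can be sketched as follows; the task names, dollar figures, and choice of triangular distributions are illustrative assumptions, not values from the report:

```python
import random

def simulate_project_cost(tasks, n_draws=20000, seed=42):
    """Aggregate expert (low, most-likely, high) cost estimates per task
    with triangular distributions; return the mean total cost and an
    80% interval, useful for sizing contingency funds."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_draws):
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks.values()))
    totals.sort()
    mean = sum(totals) / n_draws
    return mean, (totals[int(0.1 * n_draws)], totals[int(0.9 * n_draws)])

# Hypothetical R&D tasks with Delphi-elicited (low, most likely, high) costs in $k.
tasks = {"design": (40, 60, 100),
         "prototype": (80, 120, 200),
         "testing": (20, 30, 60)}
mean_cost, (p10, p90) = simulate_project_cost(tasks)
```

The gap between `p90` and the mean is one way to quantify a contingency allowance.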
Anzehaee, Mohammad Mousavi; Haeri, Mohammad
2011-07-01
New estimators are designed based on the modified force balance model to estimate the detaching droplet size, detached droplet size, and mean value of droplet detachment frequency in a gas metal arc welding process. The proper droplet size for the process to be in the projected spray transfer mode is determined based on the modified force balance model and the designed estimators. Finally, the droplet size and the melting rate are controlled using two proportional-integral (PI) controllers to achieve high weld quality by retaining the transfer mode and generating appropriate signals as inputs of the weld geometry control loop. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Olga Flores Uzeta
2008-09-01
The main goal of this study is to provide estimations of mean mortality rate of vegetative shoots of the seagrass Zostera marina in a meadow near Ensenada, Baja California, using a technique that minimizes destructive sampling. Using cohorts and Leslie matrices, three life tables were constructed, each representing a season within the period of monthly sampling (April 1999 to April 2000). Ages for the cohorts were established in terms of Plastochrone Interval (PI). The matrices were projected through time to estimate the mean total number of individuals at time t, n(t), as well as mortality. We found no statistical differences between observed and predicted mean values for these variables (t=-0.11, p=0.92 for n(t) and t=0.69, p=0.5 for mean rate of mortality). We found high correlation coefficient values between observed and projected values for monthly number of individuals (r=0.70, p=0.007) and monthly mortality rates (r=0.81, p=0.001). If at a certain time t a sudden environmental change occurs, and as long as the perturbation does not provoke the killing of all the individuals of a given age i for 0 ≤ i ≤ x - 1, there will be a prevailing number of individuals of age or stage x at a time t+1. This nondestructive technique reduces the number of field visits and samples needed for the demographic analysis of Z. marina, and therefore decreases the disturbance caused by researchers to the ecosystem. Rev. Biol. Trop. 56 (3): 1015-1022. Epub 2008 September 30.
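The projection step behind such life tables can be sketched with a minimal Leslie-matrix iteration; the three age classes and vital rates below are illustrative assumptions, not the study's estimates:

```python
def project_leslie(fecundity, survival, n0, steps):
    """Project age-class abundances forward one PI step at a time.

    fecundity[i]: new shoots per individual of age class i per step;
    survival[i]: probability of moving from age class i to i + 1.
    Equivalent to repeated multiplication by the Leslie matrix.
    """
    n = list(n0)
    for _ in range(steps):
        births = sum(f * x for f, x in zip(fecundity, n))
        n = [births] + [s * x for s, x in zip(survival, n[:-1])]
    return n

# Hypothetical three-age-class vital rates (illustrative values only).
fecundity = [0.0, 1.2, 0.8]
survival = [0.6, 0.4]
n_t = project_leslie(fecundity, survival, n0=[100.0, 50.0, 20.0], steps=12)
```

With these rates the net reproductive number is below 1, so the projected population declines; observed counts would be compared against `n_t` as in the study.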
He, Wu
2014-01-01
Currently, a work breakdown structure (WBS) approach is used as the most common cost estimation approach for online course production projects. To improve the practice of cost estimation, this paper proposes a novel framework to estimate the cost for online course production projects using a case-based reasoning (CBR) technique and a WBS.
A Generalized Schwartz Model for Energy Spot Prices - Estimation using a Particle MCMC Method
DEFF Research Database (Denmark)
Lunde, Asger; Brix, Anne Floor; Wei, Wei
We propose an energy spot price model featuring a two-factor price process and a two-component stochastic volatility process. The first factor in the price process captures the normal variations; the second accounts for spikes. The two-component volatility allows for a flexible autocorrelation structure. Instead of using various filtering techniques for splitting the two factors, as often found in the literature, we estimate the model in one step using an adaptive MCMC method with a Rao-Blackwellized particle filter. We fit the model to UK natural gas spot prices and investigate the importance of spikes and stochastic volatility. We find that the inclusion of stochastic volatility is crucial and that it strongly impacts the jump intensity in the spike process. Furthermore, our estimation method enables us to consider both continuous and purely jump-driven volatility processes, and thereby assess...
Cho, Yumi
2018-05-01
We study nonlinear elliptic problems with nonstandard growth and ellipticity related to an N-function. We establish global Calderón-Zygmund estimates of the weak solutions in the framework of Orlicz spaces over bounded non-smooth domains. Moreover, we prove a global regularity result for asymptotically regular problems which are getting close to the regular problems considered, when the gradient variable goes to infinity.
Low Complexity Sparse Bayesian Learning for Channel Estimation Using Generalized Mean Field
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri
2014-01-01
We derive low complexity versions of a wide range of algorithms for sparse Bayesian learning (SBL) in underdetermined linear systems. The proposed algorithms are obtained by applying the generalized mean field (GMF) inference framework to a generic SBL probabilistic model. In the GMF framework, we...
Thomas B. Lynch; Jeffrey H. Gove
2014-01-01
The typical "double counting" application of the mirage method of boundary correction cannot be applied to sampling systems such as critical height sampling (CHS) that are based on a Monte Carlo sample of a tree (or debris) attribute because the critical height (or other random attribute) sampled from a mirage point is generally not equal to the critical...
Roberts, James S.; Bao, Han; Huang, Chun-Wei; Gagne, Phill
Characteristic curve approaches for linking parameters from the generalized partial credit model were examined for cases in which common (anchor) items are calibrated separately in two groups. Three of these approaches are simple extensions of the test characteristic curve (TCC), item characteristic curve (ICC), and operating characteristic curve…
Felleki, M; Lee, D; Lee, Y; Gilmour, A R; Rönnegård, L
2012-12-01
The possibility of breeding for uniform individuals by selecting animals expressing a small response to environment has been studied extensively in animal breeding. Bayesian methods for fitting models with genetic components in the residual variance have been developed for this purpose, but have limitations due to the computational demands. We use the hierarchical (h)-likelihood from the theory of double hierarchical generalized linear models (DHGLM) to derive an estimation algorithm that is computationally feasible for large datasets. Random effects for both the mean and residual variance parts of the model are estimated together with their variance/covariance components. An important feature of the algorithm is that it can fit a correlation between the random effects for mean and variance. An h-likelihood estimator is implemented in the R software and an iterative reweighted least square (IRWLS) approximation of the h-likelihood is implemented using ASReml. The difference in variance component estimates between the two implementations is investigated, as well as the potential bias of the methods, using simulations. IRWLS gives the same results as h-likelihood in simple cases with no severe indication of bias. For more complex cases, only IRWLS could be used, and bias did appear. The IRWLS is applied on the pig litter size data previously analysed by Sorensen & Waagepetersen (2003) using Bayesian methodology. The estimates we obtained by using IRWLS are similar to theirs, with the estimated correlation between the random genetic effects being -0.52 for IRWLS and -0.62 in Sorensen & Waagepetersen (2003).
Rabideau, Dustin J; Pei, Pamela P; Walensky, Rochelle P; Zheng, Amy; Parker, Robert A
2018-02-01
The expected value of sample information (EVSI) can help prioritize research but its application is hampered by computational infeasibility, especially for complex models. We investigated an approach by Strong and colleagues to estimate EVSI by applying generalized additive models (GAM) to results generated from a probabilistic sensitivity analysis (PSA). For 3 potential HIV prevention and treatment strategies, we estimated life expectancy and lifetime costs using the Cost-effectiveness of Preventing AIDS Complications (CEPAC) model, a complex patient-level microsimulation model of HIV progression. We fitted a GAM-a flexible regression model that estimates the functional form as part of the model fitting process-to the incremental net monetary benefits obtained from the CEPAC PSA. For each case study, we calculated the expected value of partial perfect information (EVPPI) using both the conventional nested Monte Carlo approach and the GAM approach. EVSI was calculated using the GAM approach. For all 3 case studies, the GAM approach consistently gave similar estimates of EVPPI compared with the conventional approach. The EVSI behaved as expected: it increased and converged to EVPPI for larger sample sizes. For each case study, generating the PSA results for the GAM approach required 3 to 4 days on a shared cluster, after which EVPPI and EVSI across a range of sample sizes were evaluated in minutes. The conventional approach required approximately 5 weeks for the EVPPI calculation alone. Estimating EVSI using the GAM approach with results from a PSA dramatically reduced the time required to conduct a computationally intense project, which would otherwise have been impractical. Using the GAM approach, we can efficiently provide policy makers with EVSI estimates, even for complex patient-level microsimulation models.
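The core of the Strong-style approach described above, regressing PSA net-benefit draws on a parameter and averaging the positive part of the fit, can be sketched with a crude binned-mean smoother standing in for a GAM; the synthetic PSA draws, effect sizes, and bin count are all assumptions for illustration:

```python
import random
import statistics

def evppi_from_psa(theta, inb, n_bins=20):
    """Estimate EVPPI for one parameter from PSA output.

    A binned-mean smoother stands in for the GAM fit: the conditional mean
    of incremental net benefit (INB) given theta is approximated by the mean
    INB within theta's bin. Then
        EVPPI = E[max(0, E[INB | theta])] - max(0, E[INB]).
    """
    lo, hi = min(theta), max(theta)
    width = (hi - lo) / n_bins or 1.0
    bins = [[] for _ in range(n_bins)]
    for t, b in zip(theta, inb):
        bins[min(int((t - lo) / width), n_bins - 1)].append(b)
    fitted = [statistics.fmean(bins[min(int((t - lo) / width), n_bins - 1)])
              for t in theta]
    return (statistics.fmean(max(0.0, f) for f in fitted)
            - max(0.0, statistics.fmean(inb)))

# Synthetic PSA: INB of strategy B vs A depends on theta plus noise.
rng = random.Random(7)
theta = [rng.gauss(0.0, 1.0) for _ in range(4000)]
inb = [1000.0 * t + rng.gauss(0.0, 500.0) for t in theta]
evppi = evppi_from_psa(theta, inb)
```

The point of the regression-based route is exactly the speed-up the abstract reports: one PSA run feeds the smoother, replacing a nested Monte Carlo loop.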
Komendera, Erik E.; Adhikari, Shaurav; Glassner, Samantha; Kishen, Ashwin; Quartaro, Amy
2017-01-01
Autonomous robotic assembly by mobile field robots has seen significant advances in recent decades, yet practicality remains elusive. Identified challenges include better use of state estimation and reasoning with uncertainty, spreading out tasks to specialized robots, and implementing representative joining methods. This paper proposes replacing 1) self-correcting mechanical linkages with generalized joints for improved applicability, 2) assembly serial manipulators with parallel manipulators for higher precision and stability, and 3) all-in-one robots with a heterogeneous team of specialized robots for agent simplicity. This paper then describes a general assembly algorithm utilizing state estimation. Finally, these concepts are tested in the context of solar array assembly, requiring a team of robots to assemble, bond, and deploy a set of solar panel mockups to a backbone truss to an accuracy not built into the parts. This paper presents the results of these tests.
Kittisuwan, Pichid
2015-03-01
The application of image processing in industry has shown remarkable success over the last decade, for example, in security and telecommunication systems. The denoising of natural images corrupted by Gaussian noise is a classical problem and an indispensable step in image processing. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. A crux of Bayesian image denoising algorithms is estimating the statistical parameters of the image. Here, we employ maximum a posteriori (MAP) estimation of the local observed variance, with a generalized Gamma density prior for the local variance and a Laplacian or Gaussian distribution for the noisy wavelet coefficients. Our selection of prior distribution is motivated by the efficient and flexible properties of the generalized Gamma density. The experimental results show that the proposed method yields good denoising results.
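For a concrete, simplified instance of MAP estimation in this setting: with a Gaussian noise model and a Laplacian prior, the MAP estimate of a coefficient reduces to soft thresholding. The sketch below uses a fixed signal scale in place of the paper's generalized-Gamma variance prior, and the coefficient values are illustrative:

```python
import math

def map_soft_threshold(y, noise_var, signal_scale):
    """MAP estimate of a clean wavelet coefficient w from noisy y, under
    Gaussian noise and a Laplacian prior on w.

    Solves argmax_w [ -(y - w)^2 / (2 * noise_var) - sqrt(2) * |w| / signal_scale ],
    which reduces to soft-thresholding with T = sqrt(2) * noise_var / signal_scale.
    """
    t = math.sqrt(2) * noise_var / signal_scale
    return math.copysign(max(abs(y) - t, 0.0), y)

coeffs = [-3.2, -0.4, 0.1, 0.9, 4.5]     # toy noisy wavelet coefficients
denoised = [map_soft_threshold(y, noise_var=1.0, signal_scale=2.0)
            for y in coeffs]
```

Small coefficients (likely pure noise) are zeroed; large ones are shrunk toward zero by the threshold.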
Directory of Open Access Journals (Sweden)
Qingwu Gao
2012-01-01
We discuss the uniformly asymptotic estimate of the finite-time ruin probability for all times in a generalized compound renewal risk model, where the interarrival times of successive accidents and all the claim sizes caused by an accident are two sequences of random variables following a wide dependence structure. This wide dependence structure allows random variables to be either negatively dependent or positively dependent.
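Finite-time ruin probabilities of this kind can also be approximated by simulation. The sketch below does so for a compound Poisson special case with independent exponential interarrivals and claim sizes, not the paper's wide dependence structure; all parameter values are illustrative:

```python
import random

def ruin_probability(u, premium_rate, horizon, lam, claim_mean,
                     n_paths=20000, seed=3):
    """Monte Carlo finite-time ruin probability in a compound Poisson risk
    model (a simple special case of the generalized compound renewal model).

    u: initial surplus; premium_rate: premium income per unit time;
    lam: claim arrival rate; claim_mean: mean exponential claim size.
    Ruin occurs if surplus u + c*t - S(t) drops below 0 at a claim epoch
    before the horizon.
    """
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)              # next accident time
            if t > horizon:
                break                              # survived the window
            claims += rng.expovariate(1.0 / claim_mean)
            if u + premium_rate * t - claims < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# Initial surplus 10, 20% premium loading, unit claim rate and mean size.
psi = ruin_probability(u=10.0, premium_rate=1.2, horizon=50.0,
                       lam=1.0, claim_mean=1.0)
```

Such simulations give a numerical check on asymptotic estimates of the type derived in the paper.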
Estimation of Disability Weights in the General Population of South Korea Using a Paired Comparison
Ock, Minsu; Ahn, Jeonghoon; Yoon, Seok-Jun; Jo, Min-Woo
2016-01-01
We estimated the disability weights in the South Korean population by using a paired comparison-only model wherein ‘full health’ and ‘being dead’ were included as anchor points, without resorting to a cardinal method, such as person trade-off. The study was conducted via 2 types of survey: a household survey involving computer-assisted face-to-face interviews and a web-based survey (similar to that of the GBD 2010 disability weight study). With regard to the valuation methods, paired comparison, visual analogue scale (VAS), and standard gamble (SG) were used in the household survey, whereas paired comparison and population health equivalence (PHE) were used in the web-based survey. Accordingly, we described a total of 258 health states, with ‘full health’ and ‘being dead’ designated as anchor points. In the analysis, 4 models were considered: a paired comparison-only model; hybrid model between paired comparison and PHE; VAS model; and SG model. A total of 2,728 and 3,188 individuals participated in the household and web-based survey, respectively. The Pearson correlation coefficients of the disability weights of health states between the GBD 2010 study and the current models were 0.802 for Model 2, 0.796 for Model 1, 0.681 for Model 3, and 0.574 for Model 4 (all P-values<0.001). The discrimination of values according to health state severity was most suitable in Model 1. Based on these results, the paired comparison-only model was selected as the best model for estimating disability weights in South Korea, and for maintaining simplicity in the analysis. Thus, disability weights can be more easily estimated by using paired comparison alone, with ‘full health’ and ‘being dead’ as one of the health states. As noted in our study, we believe that additional evidence regarding the universality of disability weight can be observed by using a simplified methodology of estimating disability weights. PMID:27606626
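A minimal version of a paired-comparison-only model is a Bradley-Terry fit with 'full health' and 'being dead' included among the compared states and then used to anchor the scale. The sketch below runs Hunter's MM updates on invented toy counts; the states, counts, and the log-worth anchoring step are our assumptions for illustration, not the study's exact method:

```python
import math

def bradley_terry(wins, n_iter=200):
    """Fit Bradley-Terry 'severity' worths from paired-comparison counts
    via Hunter's MM updates.

    wins[(a, b)] = number of respondents judging state a as worse than b.
    Returns positive worth parameters (larger = judged worse).
    """
    items = sorted({x for pair in wins for x in pair})
    w = {i: 1.0 for i in items}
    for _ in range(n_iter):
        for i in items:
            num = sum(c for (a, b), c in wins.items() if a == i)
            den = sum(c / (w[a] + w[b])
                      for (a, b), c in wins.items() if i in (a, b))
            if den > 0:
                w[i] = num / den
        s = sum(w.values())                 # rescale (worths are scale-free)
        for i in items:
            w[i] *= len(items) / s
    return w

# Toy data with 'full_health' and 'dead' among the compared states.
wins = {("dead", "blindness"): 18, ("blindness", "dead"): 2,
        ("blindness", "mild_anemia"): 17, ("mild_anemia", "blindness"): 3,
        ("dead", "mild_anemia"): 19, ("mild_anemia", "dead"): 1,
        ("blindness", "full_health"): 20, ("dead", "full_health"): 20,
        ("mild_anemia", "full_health"): 15, ("full_health", "mild_anemia"): 5}
w = bradley_terry(wins)
# Anchor log-worths so full health = 0 and being dead = 1 on the scale.
lo, hi = math.log(w["full_health"]), math.log(w["dead"])
dw = {s: (math.log(v) - lo) / (hi - lo) for s, v in w.items()}
```

With the anchors fixed, each remaining state's disability weight falls between 0 and 1 according to how often it was judged worse.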
Weber, Lisa C.; Wiley, Michael J.; Wilcox, Douglas A.
2016-01-01
The use of diurnal water-table fluctuation methods to calculate evapotranspiration (ET) and groundwater flow is of increasing interest in ecohydrological studies. Most studies of this type, however, have been located in riparian wetlands of semi-arid regions where groundwater levels are consistently below topographic surface elevations and precipitation events are infrequent. Current methodologies preclude application to a wider variety of wetland systems. In this study, we extended a method for estimating sub-daily ET and groundwater flow rates from water-level fluctuations to fit highly dynamic, non-riparian wetland scenarios. Modifications included (1) varying the specific yield to account for periodic flooded conditions and (2) relating empirically derived ET to estimated potential ET for days when precipitation events masked the diurnal signal. To demonstrate the utility of this method, we estimated ET and groundwater fluxes over two growing seasons (2006–2007) in 15 wetlands within a ridge-and-swale wetland complex of the Laurentian Great Lakes under flooded and non-flooded conditions. Mean daily ET rates for the sites ranged from 4.0 mm d⁻¹ to 6.6 mm d⁻¹. Shallow groundwater discharge rates resulting from evaporative demand ranged from 2.5 mm d⁻¹ to 4.3 mm d⁻¹. This study helps to expand our understanding of the evapotranspirative demand of plants under various hydrologic and climate conditions. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
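The fluctuation calculation referred to above is usually a variant of the White (1932) equation. The sketch below implements that basic form, with the abstract's flooded-conditions modification reduced to setting specific yield to 1; the numbers and the exact flooded-case handling are assumptions:

```python
def white_method_et(night_rise_rate, daily_change, specific_yield,
                    flooded=False):
    """Daily ET from diurnal water-table fluctuations (White, 1932 form):

        ET = Sy * (24 * r - delta_s)

    night_rise_rate r: water-table recovery rate (mm/h) in the pre-dawn
    hours, taken as the groundwater inflow rate; daily_change delta_s: net
    water-table change over the day (positive = rise). When the site is
    flooded we set Sy = 1, a simplification of the paper's varying-Sy idea.
    """
    sy = 1.0 if flooded else specific_yield
    return sy * (24.0 * night_rise_rate - daily_change)

# Hypothetical day: the table recovers 0.8 mm/h at night and ends 6 mm lower.
et = white_method_et(night_rise_rate=0.8, daily_change=-6.0,
                     specific_yield=0.25)
```

With Sy = 0.25 this gives 6.3 mm per day, on the order of the daily rates the study reports.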
Gillman, Max; Otto, Glen
2006-01-01
The paper presents and tests a theory of the demand for money that is derived from a general equilibrium, endogenous growth economy, which in effect combines a special case of the shopping time exchange economy with the cash-in-advance framework. The model predicts that both higher inflation and financial innovation that reduces the cost of credit induce agents to substitute away from money towards exchange credit. The implied interest elasticity of money demand rises with the inflation rate.
Turning Simulation into Estimation: Generalized Exchange Algorithms for Exponential Family Models.
Directory of Open Access Journals (Sweden)
Maarten Marsman
Full Text Available The Single Variable Exchange algorithm is based on a simple idea: any model that can be simulated can be estimated by producing draws from the posterior distribution. We build on this simple idea by framing the Exchange algorithm as a mixture of Metropolis transition kernels and propose strategies that automatically select the more efficient transition kernels. In this manner we achieve significant improvements in convergence rate and autocorrelation of the Markov chain without relying on more than being able to simulate from the model. We focus on statistical models in the exponential family and use two simple models from educational measurement to illustrate the contribution.
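The core exchange move can be sketched for a one-parameter exponential family p(x|θ) ∝ exp(θ·s(x)) with a flat prior and a symmetric random-walk proposal: an auxiliary draw from the model at the proposed θ' makes the intractable normalizer cancel in the Metropolis ratio. This toy sketch is ours, not the authors' implementation:

```python
import math
import random

def exchange_step(theta, s_data, simulate_stat, rng, proposal_sd=0.5):
    """One Single Variable Exchange update for p(x|theta) ∝ exp(theta*s(x)).

    simulate_stat(theta_prop) must return s(x') for auxiliary data x'
    drawn from the model at theta_prop; the normalizer cancels in the
    log acceptance ratio (theta' - theta) * (s(x) - s(x')).
    """
    theta_prop = theta + rng.gauss(0.0, proposal_sd)
    s_aux = simulate_stat(theta_prop)
    log_accept = (theta_prop - theta) * (s_data - s_aux)
    return theta_prop if math.log(rng.random()) < log_accept else theta

# Toy usage: Bernoulli trials (n = 200) with natural parameter theta (log-odds)
rng = random.Random(0)
def simulate_stat(theta):
    p = 1.0 / (1.0 + math.exp(-theta))
    return sum(rng.random() < p for _ in range(200))

theta, draws = 0.0, []
for t in range(4000):
    theta = exchange_step(theta, s_data=120, simulate_stat=simulate_stat, rng=rng)
    if t >= 2000:
        draws.append(theta)
posterior_mean = sum(draws) / len(draws)  # should sit near log(0.6/0.4)
```

The mixture-of-kernels strategies proposed in the paper would sit on top of this basic step, choosing among several proposal mechanisms.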
Estimation of Disability Weights in the General Population of South Korea Using a Paired Comparison.
Directory of Open Access Journals (Sweden)
Minsu Ock
Full Text Available We estimated the disability weights in the South Korean population by using a paired comparison-only model wherein 'full health' and 'being dead' were included as anchor points, without resorting to a cardinal method such as person trade-off. The study was conducted via 2 types of survey: a household survey involving computer-assisted face-to-face interviews and a web-based survey (similar to that of the GBD 2010 disability weight study). With regard to the valuation methods, paired comparison, visual analogue scale (VAS), and standard gamble (SG) were used in the household survey, whereas paired comparison and population health equivalence (PHE) were used in the web-based survey. Accordingly, we described a total of 258 health states, with 'full health' and 'being dead' designated as anchor points. In the analysis, 4 models were considered: a paired comparison-only model; a hybrid model between paired comparison and PHE; a VAS model; and an SG model. A total of 2,728 and 3,188 individuals participated in the household and web-based surveys, respectively. The Pearson correlation coefficients of the disability weights of health states between the GBD 2010 study and the current models were 0.802 for Model 2, 0.796 for Model 1, 0.681 for Model 3, and 0.574 for Model 4 (all P-values <0.001). The discrimination of values according to health state severity was most suitable in Model 1. Based on these results, the paired comparison-only model was selected as the best model for estimating disability weights in South Korea, and for maintaining simplicity in the analysis. Thus, disability weights can be more easily estimated by using paired comparison alone, with 'full health' and 'being dead' included among the health states. As noted in our study, we believe that additional evidence regarding the universality of disability weights can be obtained by using this simplified methodology of estimating disability weights.
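Fitting latent severity values from paired-comparison data is commonly done with a Bradley-Terry-type model. A minimal sketch of the standard MM (minorization-maximization) updates for such a fit; this is a generic illustration, not the authors' probit formulation with anchor points:

```python
def bradley_terry(wins, iters=500):
    """Fit Bradley-Terry strengths from pairwise choices.

    wins[i][j] = number of times item i was chosen over item j.
    Uses the classical MM update p_i <- W_i / sum_j n_ij / (p_i + p_j),
    then rescales p to fix the arbitrary overall scale.
    """
    n = len(wins)
    p = [1.0] * n
    for _ in range(iters):
        new = []
        for i in range(n):
            w_i = sum(wins[i][j] for j in range(n) if j != i)
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(n) if j != i)
            new.append(w_i / denom if denom > 0 else p[i])
        total = sum(new)
        p = [n * v / total for v in new]  # identifiability constraint
    return p

# Three hypothetical health states; state 0 judged most severe most often
strengths = bradley_terry([[0, 8, 9], [2, 0, 7], [1, 3, 0]])
```

Anchoring with 'full health' and 'being dead', as in the study, amounts to including those two states among the compared items so the fitted scale can be mapped onto [0, 1].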
Cost benchmarking of railway projects in Europe – dealing with uncertainties in cost estimates
DEFF Research Database (Denmark)
Trabo, Inara
Past experience in the construction of high-speed railway projects demonstrates both positive and negative financial outcomes relative to the project's initial budget. Usually some uncertainty value is included in initial budget calculations. Uncertainty is related to the increase of material prices...... per main cost drivers were compared and analyzed. Nine railway projects comparable to the Copenhagen-Ringsted project were observed. The results of this comparison provided an overview of the cost range in different budget disciplines. The Copenhagen-Ringsted project is positioned right...
International Nuclear Information System (INIS)
Thas, Koen
2006-01-01
In Am. Math. Monthly (73 1-23 (1966)), Kac asked his famous question 'Can one hear the shape of a drum?', which was eventually answered negatively in Gordon et al (1992 Invent. Math. 110 1-22) by construction of planar isospectral pairs. Giraud (2005 J. Phys. A: Math. Gen. 38 L477-83) observed that most of the known examples can be generated from solutions of a certain equation which involves a set of involutions of an n-dimensional projective space over some finite field. He then generated all possible solutions for n = 2, when the involutions fix the same number of points. In Thas (2006 J. Phys. A: Math. Gen. 39 L385-8) we showed that no other examples arise for any other dimension, still assuming that the involutions fix the same number of points. In this paper we study the problem for involutions not necessarily fixing the same number of points, and solve the problem completely
A generalized public goods game with coupling of individual ability and project benefit
Zhong, Li-Xin; Xu, Wen-Juan; He, Yun-Xin; Zhong, Chen-Yang; Chen, Rong-Da; Qiu, Tian; Shi, Yong-Dong; Ren, Fei
2017-08-01
Facing a heavy task, any single person can make only a limited contribution, and team cooperation is needed. As everyone enjoys the benefit of the public goods, the potential benefits of the project are not always maximized and may be partly wasted. By incorporating individual ability and project benefit into the original public goods game, we study the coupling effect of four parameters on the evolution of cooperation: the upper limit of individual contribution, the upper limit of individual benefit, the needed project cost, and the upper limit of project benefit. Coevolving with the individual-level group size preferences, an increase in the upper limit of individual benefit promotes cooperation while an increase in the upper limit of individual contribution inhibits cooperation. The coupling of the upper limit of individual contribution and the needed project cost determines the critical point of the upper limit of project benefit, where the equilibrium frequency of cooperators reaches its highest level. Above the critical point, an increase in the upper limit of project benefit inhibits cooperation. The evolution of cooperation is closely related to the preferred group-size distribution. A functional relation between the frequency of cooperators and the dominant group size is found.
International Nuclear Information System (INIS)
Dai Hao; Jia Li-Xin; Zhang Yan-Bin
2012-01-01
The adaptive generalized matrix projective lag synchronization between two different complex networks with non-identical nodes and different dimensions is investigated in this paper. Based on Lyapunov stability theory and Barbalat's lemma, generalized matrix projective lag synchronization criteria are derived by using the adaptive control method. Furthermore, each network can be undirected or directed, connected or disconnected, and nodes in either network may have identical or different dynamics. The proposed strategy is applicable to almost all kinds of complex networks. In addition, numerical simulation results are presented to illustrate the effectiveness of this method, showing that the synchronization speed is sensitively influenced by the adaptive law strength, the network size, and the network topological structure. (general)
International Nuclear Information System (INIS)
Stevens, J. L.; Titus, R.; Sanford, P. C.
2002-01-01
The Rocky Flats Closure Site is implementing an aggressive approach in an attempt to complete Site closure by 2006. The replanning effort to meet this goal required that the life-cycle decommissioning effort for the Site and for the major individual facilities be reexamined in detail. As part of the overall effort, the cost estimate for the Building 771 decommissioning project was revised to incorporate both actual cost data from a recently-completed similar project and detailed planning for all activities. This paper provides a brief overview of the replanning process and the original estimate, and then discusses the modifications to that estimate to reflect new data, methods, and planning rigor. It provides the new work breakdown structure and discusses the reasons for the final arrangement chosen. It follows with the process used to assign scope, cost, and schedule elements within the new structure, and development of the new code of accounts. Finally, it describes the project control methodology used to track the project, and provides lessons learned on cost tracking in the decommissioning environment
Scharfenberg, Janna; Schaper, Katharina; Krummenauer, Frank
2014-01-01
The German "Dr med" plays a specific role in doctoral thesis settings since students may start the underlying doctoral project during their studies at medical school. If a medical faculty encourages this approach in principle, it should support students in carrying out their projects as efficiently as possible. Consequently, it must be ensured that students are able to implement and complete a doctoral project in parallel to their studies. As a characteristic efficiency feature of these "Dr med" initiatives, the proportion of doctoral projects successfully completed shortly after graduation from medical school is proposed and illustrated. The proposed characteristic can be estimated from the time period between the state examination (date of completion of the qualifying medical examination) and the doctoral examination. Completion of the doctoral project "during medical studies" was then characterised by a doctoral examination no later than 12 months after the qualifying medical state examination. To illustrate the estimation and interpretation of this characteristic, it was retrospectively estimated on the basis of the full sample of all doctorates successfully completed between July 2009 and June 2012 at the Department of Human Medicine at the Faculty of Health of the University of Witten/Herdecke. During the period of investigation, a total of 56 doctoral examinations were documented, 30 % of which were completed within 12 months after the qualifying medical state examination (95% confidence interval 19 to 44 %). The median duration between state and doctoral examination was 27 months. The proportion of doctoral projects completed parallel to the medical studies increased during the investigation period from 14 % in the first year (July 2009 till June 2010) to 40 % in the third year (July 2011 till June 2012). Only about a third of all "Dr med" projects at the Witten/Herdecke Faculty of Health were completed during or close to
DEFF Research Database (Denmark)
Blasone, Roberta-Serena; Vrugt, Jasper A.; Madsen, Henrik
2008-01-01
In the last few decades hydrologists have made tremendous progress in using dynamic simulation models for the analysis and understanding of hydrologic systems. However, predictions with these models are often deterministic and as such they focus on the most probable forecast, without an explicit ... of applications. However, the MC based sampling strategy of the prior parameter space typically utilized in GLUE is not particularly efficient in finding behavioral simulations. This becomes especially problematic for high-dimensional parameter estimation problems, and in the case of complex simulation models ... We propose an alternative strategy to determine the value of the cutoff threshold based on the appropriate coverage of the resulting uncertainty bounds. We demonstrate the superiority of this revised GLUE method with three different conceptual watershed models of increasing complexity, using both synthetic ...
International Nuclear Information System (INIS)
Shabestani Monfared, A.; Abdi, R.
2006-01-01
The risks of low-dose ionizing radiation from radiology and nuclear medicine are not clearly determined. Effective dose to the population is a very important factor in risk estimation. The study aimed to determine the effective dose from diagnostic radiation medicine in a northern province of Iran. Materials and Methods: Data about various radiologic and nuclear medicine procedures were collected from all radiology and nuclear medicine departments in Mazandaran Province (population = 2,898,031); using the standard dosimetry tables, the total dose, dose per examination, and annual effective dose per capita as well as the annual gonadal dose per capita were estimated. Results: 655,730 radiologic examinations over a one-year period led to 1.45 mSv, 0.33 mSv and 0.31 mGy as average effective dose per examination, annual average effective dose to a member of the public, and annual average gonadal dose per capita, respectively. The frequency of medical radiologic examinations was 2,262 examinations annually per 10,000 members of the population. The total number of nuclear medicine examinations in the same period was 7,074, with 4.37 mSv, 9.6 μSv and 9.8 μGy as average effective dose per examination, annual average effective dose to a member of the public, and annual average gonadal dose per capita, respectively. The frequency of nuclear medicine examinations was 24 examinations annually per 10,000 members of the population. Conclusion: The average effective dose per examination was similar to that in other studies. However, the average annual effective dose and annual average gonadal dose per capita were less than the corresponding values in other reports, which could be due to the smaller number of radiation medicine examinations in the present study
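The per-capita figures follow directly from the reported totals; for the radiology numbers, for example:

```python
exams = 655_730            # radiologic examinations per year
dose_per_exam_msv = 1.45   # average effective dose per examination (mSv)
population = 2_898_031     # Mazandaran Province

# Annual average effective dose to a member of the public (mSv)
annual_dose_per_capita = exams * dose_per_exam_msv / population  # ~ 0.33

# Examination frequency per 10,000 members of the population
exams_per_10000 = 10_000 * exams / population  # ~ 2,262
```

Both derived values reproduce the figures quoted in the abstract to rounding precision.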
Energy Technology Data Exchange (ETDEWEB)
Toh, K.C.; Trefethen, L.N. [Cornell Univ., Ithaca, NY (United States)
1994-12-31
What properties of a nonsymmetric matrix A determine the convergence rate of iterations such as GMRES, QMR, and Arnoldi? If A is far from normal, should one replace the usual 'Ritz values → eigenvalues' notion of convergence of Arnoldi by alternative notions such as 'Arnoldi lemniscates → pseudospectra'? Since Krylov subspace iterations can be interpreted as minimization processes involving polynomials of matrices, the answers to questions such as these depend upon mathematical problems of the following kind. Given a polynomial p(z), how can one bound the norm of p(A) in terms of (1) the size of p(z) on various sets in the complex plane, and (2) the locations of the spectrum and pseudospectra of A? This talk reports some progress towards solving these problems. In particular, the authors present theorems that generalize the Kreiss matrix theorem from the unit disk (for the monomial A^n) to a class of general complex domains (for polynomials p(A)).
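The ε-pseudospectrum referred to here is the set of points z with σ_min(zI − A) ≤ ε. For a 2×2 example this can be checked directly from the Gram matrix, with no linear-algebra library; this toy sketch is ours, not from the talk:

```python
import math

def sigma_min_2x2(m):
    """Smallest singular value of a 2x2 complex matrix m = [[a, b], [c, d]].

    The Gram matrix G = M^H M is Hermitian 2x2; its smaller eigenvalue
    is sigma_min^2, available in closed form from trace and determinant.
    """
    (a, b), (c, d) = m
    g11 = abs(a) ** 2 + abs(c) ** 2
    g22 = abs(b) ** 2 + abs(d) ** 2
    g12 = a.conjugate() * b + c.conjugate() * d
    tr, det = g11 + g22, g11 * g22 - abs(g12) ** 2
    lam_min = (tr - math.sqrt(max(tr * tr - 4 * det, 0.0))) / 2.0
    return math.sqrt(max(lam_min, 0.0))

def in_pseudospectrum(z, A, eps):
    """z lies in the eps-pseudospectrum of A iff sigma_min(zI - A) <= eps."""
    (a, b), (c, d) = A
    return sigma_min_2x2([[z - a, -b], [-c, z - d]]) <= eps
```

For the nonnormal Jordan block A = [[0, 1], [0, 0]] (only eigenvalue 0), the point z = 0.5 sits at distance 0.5 from the spectrum yet σ_min(zI − A) ≈ 0.21, so it already belongs to the 0.25-pseudospectrum: exactly the far-from-normal behavior the abstract is concerned with.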
Directory of Open Access Journals (Sweden)
E. Yu. Antipenko
2010-03-01
Full Text Available The article carries out a structural analysis of the elements of the efficiency indices of organizational and technological solutions in construction project scheduling, in order to provide a high-quality basis for the planning processes and the subsequent realization of the projects.
Carbon accounting and cost estimation in forestry projects using CO2Fix V.3
Groen, T.A.; Nabuurs, G.J.; Schelhaas, M.J.
2006-01-01
Carbon and financial accounting of projects in the Land Use, Land-Use Change and Forestry sector is a topic of hot debate. Large uncertainty remains concerning the carbon dynamics, the way they should be accounted, and the cost efficiency of the projects. Part of the uncertainty can be alleviated by standardisation and transparency of reporting methods. For this reason we further developed CO2FIX, a forest ecosystem carbon model, with modules for carbon and financial accounting.
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications represent the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Han, Wenhua; Shen, Xiaohui; Xu, Jun; Wang, Ping; Tian, Guiyun; Wu, Zhengyang
2014-09-04
Magnetic flux leakage (MFL) inspection is one of the most important and sensitive nondestructive testing approaches. For online MFL inspection of a long-range railway track or oil pipeline, a fast and effective defect profile estimating method based on a multi-power affine projection algorithm (MAPA) is proposed, where the depth of a sampling point is related with not only the MFL signals before it, but also the ones after it, and all of the sampling points related to one point appear as serials or multi-power. Defect profile estimation has two steps: regulating a weight vector in an MAPA filter and estimating a defect profile with the MAPA filter. Both simulation and experimental data are used to test the performance of the proposed method. The results demonstrate that the proposed method exhibits high speed while maintaining the estimated profiles clearly close to the desired ones in a noisy environment, thereby meeting the demand of accurate online inspection.
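The affine projection family reuses recent input vectors in each adaptive-filter weight update; with projection order one it reduces to the familiar normalized-LMS recursion, which is easy to sketch. This toy version is ours and is not the multi-power variant proposed in the paper:

```python
import random

def apa1_step(w, x, d, mu=1.0, delta=1e-8):
    """Order-1 affine projection (NLMS) update of filter weights w.

    e = d - w.x ;  w <- w + mu * e * x / (delta + x.x)
    delta regularizes the step when the input energy x.x is small.
    """
    e = d - sum(wi * xi for wi, xi in zip(w, x))
    norm = delta + sum(xi * xi for xi in x)
    return [wi + mu * e * xi / norm for wi, xi in zip(w, x)], e

# Identify an unknown 2-tap response h = [1.0, -0.5] from noiseless data
rng = random.Random(1)
h, w = [1.0, -0.5], [0.0, 0.0]
for _ in range(300):
    x = [rng.gauss(0, 1), rng.gauss(0, 1)]
    d = sum(hi * xi for hi, xi in zip(h, x))
    w, _ = apa1_step(w, x, d)
```

Higher projection orders (and the paper's multi-power weighting across sampling points) generalize this step by solving a small regularized least-squares problem over the last K inputs instead of one.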
Directory of Open Access Journals (Sweden)
Horim Yi
2017-10-01
Full Text Available OBJECTIVES This study aims to investigate health disparities between lesbian, gay, and bisexual (LGB) adults and the general population in Korea, where there is low public acceptance of sexual minorities and a lack of research on the health of sexual minorities. METHODS The research team conducted a nationwide survey of 2,335 Korean LGB adults in 2016. Using the dataset, we estimated the age-standardized prevalence ratios (SPRs) for poor self-rated health, musculoskeletal pain, depressive symptoms, suicidal behaviors, smoking, and hazardous drinking. We then compared the SPRs of the LGB adults and the general population which participated in three different nationally representative surveys in Korea. SPRs were estimated for each of the four groups (i.e., gay men, bisexual men, lesbians, and bisexual women). RESULTS Korean LGB adults exhibited a statistically significantly higher prevalence of depressive symptoms, suicidal ideation and attempts, and musculoskeletal pain than the general population. Lesbian and bisexual women had a higher risk of poor self-rated health and smoking than the general women population, whereas gay and bisexual men showed no differences with the general men population. Higher prevalence of hazardous drinking was observed among lesbians, gay men, and bisexual women compared to the general population, but was not observed in bisexual men. CONCLUSIONS The findings suggest that LGB adults have poorer health conditions compared to the general population in Korea. These results suggest that interventions are needed to address the health disparities of Korean LGB adults.
Yi, Horim; Lee, Hyemin; Park, Jooyoung; Choi, Bokyoung; Kim, Seung-Sup
2017-01-01
This study aims to investigate health disparities between lesbian, gay, and bisexual (LGB) adults and the general population in Korea, where there is low public acceptance of sexual minorities and a lack of research on the health of sexual minorities. The research team conducted a nationwide survey of 2,335 Korean LGB adults in 2016. Using the dataset, we estimated the age-standardized prevalence ratios (SPRs) for poor self-rated health, musculoskeletal pain, depressive symptoms, suicidal behaviors, smoking, and hazardous drinking. We then compared the SPRs of the LGB adults and the general population which participated in three different nationally representative surveys in Korea. SPRs were estimated for each of the four groups (i.e., gay men, bisexual men, lesbians, and bisexual women). Korean LGB adults exhibited a statistically significantly higher prevalence of depressive symptoms, suicidal ideation and attempts, and musculoskeletal pain than the general population. Lesbian and bisexual women had a higher risk of poor self-rated health and smoking than the general women population, whereas gay and bisexual men showed no differences with the general men population. Higher prevalence of hazardous drinking was observed among lesbians, gay men, and bisexual women compared to the general population, but was not observed in bisexual men. The findings suggest that LGB adults have poorer health conditions compared to the general population in Korea. These results suggest that interventions are needed to address the health disparities of Korean LGB adults.
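An age-standardized prevalence ratio of the kind used in this study is the observed case count in the study sample divided by the count expected if the reference population's age-specific rates applied to the sample's age structure. A minimal sketch with hypothetical strata (names and numbers are ours):

```python
def standardized_prevalence_ratio(group_sizes, observed_cases, reference_rates):
    """SPR = observed / expected over age strata.

    group_sizes[g]     : study-sample size in age stratum g
    observed_cases[g]  : cases observed in stratum g
    reference_rates[g] : prevalence in the reference population, stratum g
    """
    expected = sum(n * reference_rates[g] for g, n in group_sizes.items())
    observed = sum(observed_cases.values())
    return observed / expected

# Toy strata: the study sample shows more cases than reference rates predict
spr = standardized_prevalence_ratio(
    {"20-29": 100, "30-39": 100},
    {"20-29": 30, "30-39": 10},
    {"20-29": 0.20, "30-39": 0.10},
)  # observed 40, expected 30 -> SPR = 4/3
```

An SPR above 1 indicates higher prevalence in the study group than age structure alone would predict, which is how the disparities above are read.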
Evaluation of generalized degrees of freedom for sparse estimation by replica method
Sakata, A.
2016-12-01
We develop a method to evaluate the generalized degrees of freedom (GDF) for linear regression with sparse regularization. The GDF is a key factor in model selection, and thus its evaluation is useful in many modelling applications. An analytical expression for the GDF is derived using the replica method in the large-system-size limit with random Gaussian predictors. The resulting formula has a universal form that is independent of the type of regularization, providing us with a simple interpretation. Within the framework of replica symmetric (RS) analysis, GDF has a physical meaning as the effective fraction of non-zero components. The validity of our method in the RS phase is supported by the consistency of our results with previous mathematical results. The analytical results in the RS phase are calculated numerically using the belief propagation algorithm.
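For context on the 'effective fraction of non-zero components' interpretation: in the orthogonal-design lasso, the GDF is known to equal the number of non-zero estimated coefficients (Zou, Hastie and Tibshirani, 2007), which the soft-thresholding solution makes explicit. A standard illustration, not the paper's replica computation:

```python
import math

def soft_threshold(y, lam):
    """Lasso solution under an orthogonal design: componentwise shrinkage."""
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in y]

def gdf_orthogonal_lasso(y, lam):
    """GDF estimate = number of surviving (non-zero) components."""
    return sum(1 for v in soft_threshold(y, lam) if v != 0.0)

gdf = gdf_orthogonal_lasso([3.0, -0.5, 2.0, 0.1], lam=1.0)  # two survive
```

The replica analysis above extends this picture to correlated random Gaussian predictors, where the simple count is no longer exact but the same 'effective non-zero fraction' meaning persists.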
Anderson, D Mark; Elsea, David
2015-12-01
In this note, we use data from the national and state Youth Risk Behavior Surveys for the period 1999 through 2011 to estimate the relationship between the Meth Project, an anti-methamphetamine advertising campaign, and meth use among high school students. During this period, a total of eight states adopted anti-meth advertising campaigns. After accounting for pre-existing downward trends in meth use, we find little evidence that the campaign curbed meth use in the full sample. We do find, however, some evidence that the Meth Project may have decreased meth use among White high school students. Copyright © 2014 John Wiley & Sons, Ltd.
Energy Technology Data Exchange (ETDEWEB)
Mahowald, Natalie [Cornell Univ., Ithaca, NY (United States)
2016-11-29
Soils in natural and managed ecosystems and wetlands are well known sources of methane, nitrous oxide, and reactive nitrogen gases, but the magnitudes of gas flux to the atmosphere are still poorly constrained. Thus, the reasons for the large increases in atmospheric concentrations of methane and nitrous oxide since the preindustrial time period are not well understood. The low atmospheric concentrations of methane and nitrous oxide, despite their being more potent greenhouse gases than carbon dioxide, make it difficult for empirical studies to provide explanations. In addition to climate concerns, the emissions of reactive nitrogen gases from soils are important to the changing nitrogen balance in the earth system, subject to human management, and may change substantially in the future. Thus improved modeling of the emission fluxes of these species from the land surface is important. Currently, there are emission modules for methane and some nitrogen species in the Community Earth System Model's Community Land Model (CLM-ME/N); however, there are large uncertainties and problems in the simulations, resulting in coarse estimates. In this proposal, we seek to improve these emission modules by combining state-of-the-art process modules for emissions, available data, and new optimization methods. In earth science problems, we often have substantial data and knowledge of processes in disparate systems, and thus we need to combine data and a general process-level understanding into a model for projections of future climate that are as accurate as possible. The best methodologies for optimization of parameters in earth system models are still being developed. In this proposal we will develop and apply surrogate algorithms that (a) were especially developed for computationally expensive simulations like the CLM-ME/N models, and (b) were demonstrated (in the earlier Stochastic RBF surrogate optimization) to perform very well on computationally expensive complex partial differential equations in
Variational estimation of process parameters in a simplified atmospheric general circulation model
Lv, Guokun; Koehl, Armin; Stammer, Detlef
2016-04-01
Parameterizations are used to simulate the effects of unresolved sub-grid-scale processes in current state-of-the-art climate models. The values of the process parameters, which determine the model's climatology, are usually adjusted manually to reduce the difference between the model mean state and the observed climatology. This process requires detailed knowledge of the model and its parameterizations. In this work, a variational method was used to estimate process parameters in the Planet Simulator (PlaSim). The adjoint code was generated using automatic differentiation of the source code. Some hydrological processes were switched off to remove the influence of zero-order discontinuities. In addition, the nonlinearity of the model limits the feasible assimilation window to about one day, which is too short to tune the model's climatology. To extend the feasible assimilation window, nudging terms for all state variables were added to the model's equations, which essentially suppress all unstable directions. In identical twin experiments, we found that the feasible assimilation window could be extended to over one year and accurate parameters could be retrieved. Although the nudging terms translate into a damping of the adjoint variables and therefore tend to erase the information in the data over time, assimilating climatological information is shown to provide sufficient information on the parameters. Moreover, the mechanism of this regularization is discussed.
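The nudging device amounts to adding a relaxation term γ(x_obs − x) to each model equation, which damps unstable directions and so lengthens the usable assimilation window. A one-variable sketch (ours, not PlaSim code):

```python
def nudged_euler(f, x0, obs, gamma, dt, steps):
    """Euler integration of dx/dt = f(x) + gamma * (obs - x).

    The relaxation term pulls the state toward obs and stabilizes
    otherwise-growing modes, at the cost of damping the dynamics.
    """
    x = x0
    for _ in range(steps):
        x += dt * (f(x) + gamma * (obs - x))
    return x

# f(x) = x is unstable on its own; nudging with gamma = 5 stabilizes it
# and relaxes the state to the fixed point gamma*obs/(gamma - 1) = 2.5
x_end = nudged_euler(lambda x: x, x0=0.0, obs=2.0, gamma=5.0, dt=0.01, steps=2000)
```

The fixed point is biased away from obs by the competition between f and the nudging term, which illustrates why, as noted above, the same damping also erases information in the data over time.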
Asymptotic scaling properties and estimation of the generalized Hurst exponents in financial data
Buonocore, R. J.; Aste, T.; Di Matteo, T.
2017-04-01
We propose a method to measure the Hurst exponents of financial time series. The scaling of the absolute moments against the aggregation horizon of real financial processes, and of both uniscaling and multiscaling synthetic processes, converges asymptotically towards linearity in log-log scale. In light of this, we found it appropriate to modify the usual scaling equation via the introduction of a filter function. We devised a measurement procedure which takes into account the presence of the filter function without needing to estimate it directly. We verified that the method is unbiased within the errors by applying it to synthetic time series with known scaling properties. Finally, we show an application to empirical financial time series where we fit the measured scaling exponents via a second- or a fourth-degree polynomial, which, because of theoretical constraints, have respectively only one and two degrees of freedom. We found that on our data set there is no clear preference between the second- and fourth-degree polynomials. Moreover, the study of the filter functions of each time series shows common patterns of convergence depending on the moment degree.
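The scaling being fitted is ⟨|X(t+τ) − X(t)|^q⟩ ∝ τ^{qH(q)}. The plain (unfiltered) estimator that the paper's modification starts from can be sketched as a log-log regression; this is the textbook version, without the filter function:

```python
import math
import random

def generalized_hurst(series, q=1.0, taus=range(1, 20)):
    """Estimate H(q) from <|x(t+tau) - x(t)|^q> ~ tau^(q*H(q))
    via a least-squares slope in log-log scale."""
    logs_tau, logs_m = [], []
    for tau in taus:
        diffs = [abs(series[i + tau] - series[i]) ** q
                 for i in range(len(series) - tau)]
        logs_tau.append(math.log(tau))
        logs_m.append(math.log(sum(diffs) / len(diffs)))
    n = len(logs_tau)
    mx, my = sum(logs_tau) / n, sum(logs_m) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(logs_tau, logs_m))
             / sum((x - mx) ** 2 for x in logs_tau))
    return slope / q

# Sanity check: a Gaussian random walk is uniscaling with H close to 0.5
rng = random.Random(42)
walk, x = [], 0.0
for _ in range(20000):
    x += rng.gauss(0.0, 1.0)
    walk.append(x)
h = generalized_hurst(walk)
```

On finite real data this plain estimator is biased by the non-asymptotic scaling regime, which is precisely what the paper's filter-function modification is designed to absorb.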
International Nuclear Information System (INIS)
Dragulescu, Emilian; Dragusin, Mitica; Popa, Victor; Boicu, Alin; Tuca, Carmen; Iorga, Ioan; Vrabie, Ionut; Mustata, Carmen
2003-01-01
A decommissioning project was developed for the WWR-S research reactor at Magurele-Bucharest, to remove the radioactive and hazardous materials and so exclude any risk to human health and the environment. The project involves four phases: assessment, development, operations and closeout. There are two major parts to the assessment phase: preliminary characterisation, and the review and decision-making process. Characterisation is needed to develop project baseline data, which should include sufficient chemical, physical, and radiological characterisation to meet planning needs. Based on the conclusions of these studies, possible decommissioning alternatives will be analyzed: the best alternative chosen, the final goal identified, and the risks evaluated. Also taken into account are: regulations supporting assessment, land use considerations, financial concerns, disposal availability, public involvement, and technology developments. After a decommissioning alternative is chosen, detailed engineering will begin, following appropriate regulatory guidance. The plan will include characterisation information, namely: a review of decommissioning alternatives; justification for the selected alternative; provision for regulatory compliance; and predictions of personnel exposure, radioactive waste volume, and cost. Other activities are: scheduling, preparation for decommissioning operations, coordination, documentation, the characterization report, feasibility studies, the Decommissioning Plan, the project daily report, radiological surveys, airborne sampling records, and the termination survey of the site. The operations imply: identification and sequencing of the operations on contaminated materials, storing the wastes on site awaiting processing or disposal, and packaging of materials for transport to processing or disposal facilities. The key operations are: worker protection, the health and safety program, review of planned work, work area assessment, and work area controls
Minh, Nghia Pham; Zou, Bin; Cai, Hongjun; Wang, Chengyi
2014-01-01
The estimation of forest parameters over mountain forest areas using polarimetric interferometric synthetic aperture radar (PolInSAR) images is of great interest in remote sensing applications. For mountain forest areas, scattering mechanisms are strongly affected by ground topography variations. Most of the previous studies in modeling microwave backscattering signatures of forest areas have been carried out over relatively flat areas. Therefore, a new algorithm for forest height estimation in mountain forest areas using the general model-based decomposition (GMBD) for PolInSAR images is proposed. This algorithm enables the retrieval of not only the forest parameters, but also the magnitude associated with each mechanism. In addition, general double- and single-bounce scattering models are proposed to fit the cross-polarization and off-diagonal terms by separating their independent orientation angles, which remained unachieved in previous model-based decompositions. The efficiency of the proposed approach is demonstrated with simulated data from PolSARProSim software and ALOS-PALSAR spaceborne PolInSAR datasets over the Kalimantan area, Indonesia. Experimental results indicate that forest height can be effectively estimated by GMBD.
The QOL-DASS Model to Estimate Overall Quality of Life and General Subjective Health.
Mazaheri, Mehrdad
2011-01-01
In order to find how ratings of the WHOQOL-BREF and DASS scales combine to produce an overall measure of quality of life and satisfaction with health, a QOL-DASS model was designed, and the strength of this hypothesized model was examined using structural equation modeling. Participants included a sample of 103 voluntary males who were divided into two groups, unhealthy (N=55) and healthy (N=48). To assess satisfaction and the negative emotions of depression, anxiety and stress among the participants, they were asked to fill out the WHOQOL-BREF and the Depression Anxiety Stress Scale (DASS-42). Our findings on running the hypothesized QOL-DASS model indicated that it fitted the data well for both the healthy and unhealthy groups. Our findings with CFA to evaluate the hypothesized QOL-DASS model indicated that the different satisfaction domain ratings and the negative emotions of depression, anxiety and stress, as observed variables, can represent the underlying constructs of general health and quality of life in both healthy and unhealthy groups.
The QOL-DASS Model to Estimate Overall Quality of Life and General Health
Directory of Open Access Journals (Sweden)
Mehrdad Mazaheri
2011-01-01
Full Text Available Objective: In order to find how ratings of the WHOQOL-BREF and DASS scales combine to produce an overall measure of quality of life and satisfaction with health, a QOL-DASS model was designed, and the strength of this hypothesized model was examined using structural equation modeling. Method: Participants included a sample of 103 voluntary males who were divided into two groups of unhealthy (N=55) and healthy (N=48). To assess satisfaction and negative emotions of depression, anxiety and stress among the participants, they were asked to fill out the WHOQOL-BREF and the Depression Anxiety Stress Scale (DASS-42). Results: Our findings on running the hypothesized QOL-DASS model indicated that it fitted the data well for both the healthy and unhealthy groups. Conclusion: Our findings with CFA to evaluate the hypothesized QOL-DASS model indicated that the different satisfaction domain ratings and the negative emotions of depression, anxiety and stress as the observed variables can represent the underlying constructs of general health and quality of life in both healthy and unhealthy groups.
Estimating Required Contingency Funds for Construction Projects using Multiple Linear Regression
National Research Council Canada - National Science Library
Cook, Jason J
2006-01-01
Cost overruns are a critical problem for construction projects. The common practice for dealing with cost overruns is the assignment of an arbitrary flat percentage of the construction budget as a contingency fund...
Cost estimate modeling of transportation management plans for highway projects : [research brief].
2012-05-01
Highway rehabilitation and reconstruction projects frequently cause road congestion and increase safety concerns while limiting access for road users. State Transportation Agencies (STAs) are challenged to find safer and more efficient ways to renew ...
2013-04-01
Solar PV; UESC Navy Marine Corps Logistics Base (MCLogB) Albany GA Renewable Energy Cogeneration ESPC using Biogas PPA Navy MCLogB Albany GA...Armed Services Committee directed GAO to assess the impact of base closures on such agreements and how DOD captures costs associated with projects...this analysis and GAO’s case study review, liabilities will likely exist for renewable energy and privatized utility projects in the event of base
Carbon accounting and cost estimation in forestry projects using CO2Fix V.3
Groen, T.A.; Nabuurs, G.J.; Schelhaas, M.J.
2006-01-01
Carbon and financial accounting of projects in the Land Use, Land-Use Change and Forestry sector is a topic of hot debate. Large uncertainty remains concerning the carbon dynamics, the way they should be accounted and the cost efficiency of the projects. Part of the uncertainty can be alleviated by standardisation and transparency of reporting methods. For this reason we further developed CO2FIX, a forest ecosystem carbon model, with modules for carbon and financial accounting. The model is a...
D'Isanto, A.; Polsterer, K. L.
2018-01-01
Context. The need to analyze the available large synoptic multi-band surveys drives the development of new data-analysis methods. Photometric redshift estimation is one field of application where such new methods have improved the results substantially. Up to now, the vast majority of applied redshift estimation methods have utilized photometric features. Aims: We aim to develop a method to derive probabilistic photometric redshifts directly from multi-band imaging data, rendering pre-classification of objects and feature extraction obsolete. Methods: A modified version of a deep convolutional network was combined with a mixture density network. The estimates are expressed as Gaussian mixture models representing the probability density functions (PDFs) in redshift space. In addition to the traditional scores, the continuous ranked probability score (CRPS) and the probability integral transform (PIT) were applied as performance criteria. We adopted a feature-based random forest and a plain mixture density network to compare performances in experiments with data from SDSS (DR9). Results: We show that the proposed method is able to predict redshift PDFs independently of the type of source, for example galaxies, quasars or stars. Thereby the prediction performance is better than that of both presented reference methods and is comparable to results from the literature. Conclusions: The presented method is extremely general and allows us to solve any kind of probabilistic regression problem based on imaging data, for example estimating the metallicity or star-formation rate of galaxies. This kind of methodology is tremendously important for the next generation of surveys.
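The CRPS used as a performance criterion above has a closed form for a single Gaussian forecast; for a Gaussian mixture it can be estimated by averaging over mixture samples. A minimal sketch of the single-Gaussian case (function name and test values are illustrative, not taken from the paper):

```python
import math

def norm_pdf(x):
    """Standard normal probability density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of a Gaussian N(mu, sigma^2) forecast against outcome y.

    Lower is better; the score rewards both calibration and sharpness.
    """
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm_cdf(z) - 1) + 2 * norm_pdf(z) - 1 / math.sqrt(math.pi))
```

A sharper forecast scores better when it is centered on the truth, and any forecast is penalized as the outcome moves away from its mean, which is why CRPS complements point-error metrics for redshift PDFs.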
International Nuclear Information System (INIS)
Ellenbroek, R.; Ballard-Tremeer, G.; Koeks, R.; Venendaal, R.
2000-08-01
The purpose of this guide is to provide information on the possibilities to invest in and carry out biomass energy projects in Central and Eastern Europe. In the first part of the guide, background information is given on countries in Central and Eastern Europe, focusing on bio-energy. A few cases are presented to illustrate different biomass energy concepts. Based on economic calculations, an indication is given of the feasibility of those concepts. The most relevant sources of information are also listed. In the second part, an overview is given of Dutch, European and international financial tools that can be used in biomass energy projects in Central and Eastern Europe.
A problem of finding an acceptable variant in generalized project networks
Directory of Open Access Journals (Sweden)
David Blokh
2005-01-01
Full Text Available A project network often has some activities or groups of activities which can be performed at different stages of the project. Then, the problem of finding an optimal/acceptable time or/and optimal/acceptable order of such an activity or a group of activities arises. Such a problem emerges, in particular, in house-building management when the beginnings of some activities may vary in time or/and order. We consider a mathematical formulation of the problem, show its computational complexity, and describe an algorithm for solving the problem.
International Nuclear Information System (INIS)
Gaussen, J.L.
2001-01-01
Bure URL project is one of the components of the French research program dedicated to the study of HLLLW (High Level Long Lived Radioactive Waste) disposal in a geologic repository within the framework of the 1991 Radioactive Waste Act. Pursuant to the said act, the objective of the URL project is to participate in the "evaluation of options for retrievable or non-retrievable disposal in deep geologic formations". More precisely, the goal of this URL, which is situated 300 km east of Paris, is to gain a better knowledge of a site capable of hosting a geologic repository. (author)
Availability estimation of repairable systems using reliability graph with general gates (RGGG)
International Nuclear Information System (INIS)
Goh, Gyoung Tae
2009-02-01
By performing risk analysis, we may obtain sufficient information about a system to redesign it and lower the probability of the occurrence of an accident or mitigate the ensuing consequences. The concept of reliability is widely used to express the risk of systems, but reliability applies to non-repairable systems, whereas nuclear power plant systems are repairable. With repairable systems, repairable components can improve the availability of a system because faults that are generated in components can be recovered. Hence, availability is the more appropriate concept for repairable systems. Reliability graph with general gates (RGGG) is one of the system reliability analysis methods. The RGGG is a very intuitive method compared with other methods, but it has not yet been applied to repairable systems. The objective of this study is to extend the RGGG to enable the analysis of repairable systems. Determining the probability table for each node is a critical process for calculating system availability in the RGGG method. Therefore, finding the proper algorithms and constructing probability tables for various situations is a major part of this study; the other part is an example of applying the RGGG method to a real system. We find the proper algorithms and probability tables for independent repairable systems, dependent series repairable systems, and k-out-of-m (K/M) redundant parallel repairable systems. Using these probability tables, we can evaluate the availability of a real system; an example is shown in the latter part of this study. For the purpose of this analysis, the charging pumps subsystem of the chemical and volume control system (CVCS) was selected. The RGGG method extended for repairable systems has the same characteristic of intuitiveness as the original RGGG method, and we can confirm that the availability analysis result from the repairable RGGG method is exact.
Yin, Honglian; Sun, Aihua; Liu, Quanru; Chen, Zhiyi
2018-03-01
Designing a rational incentive mechanism for the general contractor is key to motivating subcontractors to work hard and cooperate, and to ensuring that the overall goal of the project is met. Based on principal-agency theory, subcontractor effort is divided into two parts: individual effort and effort spent helping other subcontractors. Team-cooperation incentive models for multiple subcontractors are set up, and incentive schemes and intensities are derived. The results show that the general contractor may provide both individual and team incentives when subcontractors work independently, not affecting each other in time and space; otherwise, the general contractor may provide only individual incentives to entice team collaboration and mutual help between subcontractors. The conclusions can provide a reference for subcontract design in general- and sub-contractor dynamic alliances.
Sutton, Virginia Kay
This paper examines statistical issues associated with estimating paths of juvenile salmon through the intakes of Kaplan turbines. Passive sensors, hydrophones, detecting signals from ultrasonic transmitters implanted in individual fish released into the preturbine region were used to obtain the information to estimate fish paths through the intake. Aim and location of the sensors affects the spatial region in which the transmitters can be detected, and formulas relating this region to sensor aiming directions are derived. Cramer-Rao lower bounds for the variance of estimators of fish location are used to optimize placement of each sensor. Finally, a statistical methodology is developed for analyzing angular data collected from optimally placed sensors.
Krysa, Zbigniew; Pactwa, Katarzyna; Wozniak, Justyna; Dudek, Michal
2017-12-01
Geological variability is one of the main factors influencing the viability of mining investment projects and the technical risk of geology projects. To date, analyses of the economic viability of new extraction fields have been performed for the KGHM Polska Miedź S.A. underground copper mine at the Fore Sudetic Monocline under the assumption of a constant, averaged content of useful elements. The research presented in this article aims to verify the value of production from copper and silver ore for the same economic background using variable cash flows resulting from the local variability of useful-element content. Furthermore, the economic model of the ore is investigated for a significant difference between the model value estimated using a linear correlation between useful-element content and mine-face height, and an approach in which the correlation of model parameters is based on the copula best matching the information capacity criterion. The use of a copula allows the simulation to take multivariable dependencies into account simultaneously, giving a better reflection of the dependency structure than a linear correlation does. Calculation results of the economic model used for deposit value estimation indicate that the copper-silver correlation estimated with a copula generates a higher variation of possible project value than modelling based on linear correlation. The average deposit value remains unchanged.
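The copula-based simulation of dependent grades described above can be sketched as follows. The Gaussian copula, the marginal distributions, and all numeric parameters below are illustrative assumptions; the article does not specify which copula family or marginals were used:

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(rho, n, marginal_cu, marginal_ag, seed=0):
    """Draw n (Cu, Ag) grade pairs whose dependence comes from a Gaussian
    copula with latent correlation rho, with arbitrary marginal distributions."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # correlated latent Gaussians
    u = stats.norm.cdf(z)                                  # uniforms on [0, 1]^2
    cu = marginal_cu.ppf(u[:, 0])                          # invert each marginal CDF
    ag = marginal_ag.ppf(u[:, 1])
    return cu, ag

# hypothetical marginals: lognormal Cu grade (%), gamma Ag grade (g/t)
cu, ag = gaussian_copula_sample(0.6, 20000,
                                stats.lognorm(s=0.4, scale=1.8),
                                stats.gamma(a=2.0, scale=20.0))
```

Unlike feeding a single linear correlation coefficient into a bivariate normal, this construction preserves the chosen (possibly skewed) marginals while imposing the dependency structure, which is what drives the wider spread of simulated project values reported above.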
Ghisletta, Paolo; Spini, Dario
2004-01-01
Correlated data are very common in the social sciences. Most common applications include longitudinal and hierarchically organized (or clustered) data. Generalized estimating equations (GEE) are a convenient and general approach to the analysis of several kinds of correlated data. The main advantage of GEE resides in the unbiased estimation of…
Operationalization of biopsychosocial case complexity in general health care : the INTERMED project
de Jonge, P; Huyse, FJ; Slaets, JPJ; Sollner, W; Stiefel, FC
Objective: Lack of operationalization of the biopsychosocial model hinders its effective application to the increasingly prevalent problems of comorbidities in clinical presentations. Here, we describe the INTERMED, an instrument to assess biopsychosocial case complexity in general health care, and
Carbon Accounting and Cost Estimation in Forestry Projects Using CO2Fix V.3
International Nuclear Information System (INIS)
Groen, T.; Nabuurs, G.J.; Schelhaas, M.J.
2006-01-01
Carbon and financial accounting of projects in the Land Use, Land-Use Change and Forestry sector is a topic of hot debate. Large uncertainty remains concerning the carbon dynamics, the way they should be accounted and the cost efficiency of the projects. Part of the uncertainty can be alleviated by standardisation and transparency of reporting methods. For this reason we further developed CO2FIX, a forest ecosystem carbon model, with modules for carbon and financial accounting. The model is applied to four cases: (1) Joint implementation afforestation project in Romania, (2) Forest management project in Central Europe, (3) Reduced impact logging possibly under the Clean Development Mechanism (CDM) in the future, and (4) Afforestation with native species under the Clean Development Mechanism. The results show the wide applicability of CO2FIX, from degrading grasslands as baseline cases to multiple cohort forest ecosystems. Also the results show that Forest Management in the European case can generate considerable amounts of carbon emission reductions. Further, the results show that although reduced impact logging is not yet an allowed option under the Clean Development Mechanism, it shows promising results in that it is (1) very cost effective, (2) seems to be able to generate intermediate amounts of credits and (3) seems to us as a project type that is not prone to leakage issues. These results are yet another indication to seriously consider reduced impact logging as an eligible measure under the CDM
Energy Technology Data Exchange (ETDEWEB)
Dettmers, Dana Lee; Eide, Steven Arvid
2002-10-01
An analysis of completed decommissioning projects is used to construct predictive estimates for worker exposure to radioactivity during decommissioning activities. The preferred organizational method for the completed decommissioning project data is to divide the data by type of facility, whether decommissioning was performed on part of the facility or the complete facility, and the level of radiation within the facility prior to decommissioning (low, medium, or high). Additional data analysis shows that there is no downward trend in worker exposure data over time. Also, the use of a standard estimate for worker exposure to radioactivity may be a best estimate for low complete storage, high partial storage, and medium reactor facilities; a conservative estimate for some low-radiation facilities (reactor complete, research complete, pits/ponds, other), medium partial process facilities, and high complete research facilities; and an underestimate for the remaining facilities. Limited data are available to compare different decommissioning alternatives, so the available data are reported and no conclusions can be drawn. It is recommended that all DOE sites and the NRC use a similar method to document worker hours, worker exposure to radiation (person-rem), and standard industrial accidents, injuries, and deaths for all completed decommissioning activities.
International Nuclear Information System (INIS)
Piesch, E.; Boehm, J.; Heinzelmann, M.
1983-01-01
In radiation protection monitoring the need exists for an estimation of body dose due to external beta-rays, for instance if the beta-dose rate at the working area is expected to be high according to the data of source activity or room contamination, the indicated dose values of a personal dosemeter exceed the operational limit for the organ or tissue depth of interest, or a person was exposed to a significant dose. On behalf of the Federal Ministry of the Interior, Federal Republic of Germany, a guideline is now under preparation which offers a standardized concept for the estimation of beta-doses in personnel monitoring. The calculation models discussed here will be used as a basis for any case of external beta-irradiation where, in connection with the German Radiation Protection Ordinance, the ICRP dose equivalent limits are reached or the dosemeter readings are not representative of an individual exposure. The generalized concept discussed in the paper relates to: the calculation of the beta-dose on the basis of source activity or spectral particle fluence, taking into account the special cases of point sources, area sources and volume sources; the estimation of body dose on the basis of calculated data or measured results from area or personnel monitoring, taking into account the dose equivalent at different depths of tissue, in particular the dose equivalent to the skin, the lens of the eye and other organs; and finally the estimation of skin dose due to contamination of the skin surface. Basic reference data are presented in order to estimate the dose equivalent of interest, which varies significantly in the beta-radiation field as a function of the maximum beta-energy, distance to the source, size of the source, activity per area for surface contamination and activity per volume for air contamination.
General overview of the AxialT project: A partnership for low head turbine developments
International Nuclear Information System (INIS)
Deschenes, C; Ciocan, G D; Henau, V De; Flemming, F; Qian, R; Huang, J; Koller, M; Vu, T; Naime, F A; Page, M
2010-01-01
An overview of the AxialT project is presented. Initiated in 2007 by the Consortium on Hydraulic Machines, the aim of this four-year project is to contribute to the study of time-dependent hydraulic phenomena in a propeller turbine. The geometry of the entire turbine is generously shared by all partners. Numerical simulations carried out by all partners are confronted with experimental measurements carried out at the LAMH laboratory at Laval University. A mix of 2D LDA, 3D PIV and unsteady pressure measurements is adapted to yield precise measurements at eight strategic locations within the turbine and for nine operating points. Phase-resolved analysis is performed wherever applicable. An illustration of the analysis potential of the database is shown for the identification of a vortex in the runner at part load.
Energy Technology Data Exchange (ETDEWEB)
Grattidge, W.; Westbrook, J.; McCarthy, J.; Northrup, C. Jr.; Rumble, J. Jr.
1986-11-01
The National Bureau of Standards and the Department of Energy have embarked on a program to build a demonstration computerized materials data system called Materials Information for Science and Technology (MIST). This report documents the first two phases of the project. The emphasis of the first phase was on determining what information was needed and how it could impact user productivity. In the second phase, data from the Aerospace Metal Handbook on a set of alloys were digitized and incorporated in the system.
Pandit, J J; Andrade, J; Bogod, D G; Hitchman, J M; Jonker, W R; Lucas, N; Mackay, J H; Nimmo, A F; O'Connor, K; O'Sullivan, E P; Paul, R G; Palmer, J H M G; Plaat, F; Radcliffe, J J; Sury, M R J; Torevell, H E; Wang, M; Hainsworth, J; Cook, T M
2014-10-01
We present the main findings of the 5th National Audit Project (NAP5) on accidental awareness during general anaesthesia (AAGA). Incidences were estimated using reports of accidental awareness as the numerator, and a parallel national anaesthetic activity survey to provide denominator data. The incidence of certain/probable and possible accidental awareness cases was ~1:19,600 anaesthetics (95% confidence interval 1:16,700-23,450). However, there was considerable variation across subtypes of techniques or subspecialities. The incidence with neuromuscular block (NMB) was ~1:8200 (1:7030-9700), and without, it was ~1:135,900 (1:78,600-299,000). The cases of AAGA reported to NAP5 were overwhelmingly cases of unintended awareness during NMB. The incidence of accidental awareness during Caesarean section was ~1:670 (1:380-1300). Two-thirds (82, 66%) of cases of accidental awareness experiences arose in the dynamic phases of anaesthesia, namely induction of and emergence from anaesthesia. During induction of anaesthesia, contributory factors included: use of thiopental, rapid sequence induction, obesity, difficult airway management, NMB, and interruptions of anaesthetic delivery during movement from anaesthetic room to theatre. During emergence from anaesthesia, residual paralysis was perceived by patients as accidental awareness, and commonly related to a failure to ensure full return of motor capacity. One-third (43, 33%) of accidental awareness events arose during the maintenance phase of anaesthesia, mostly due to problems at induction or towards the end of anaesthesia. Factors increasing the risk of accidental awareness included: female sex, age (younger adults, but not children), obesity, anaesthetist seniority (junior trainees), previous awareness, out-of-hours operating, emergencies, type of surgery (obstetric, cardiac, thoracic), and use of NMB. The following factors were not risk factors for accidental awareness: ASA physical status, race, and use or omission
Cross-national comparability of burden of disease estimates: the European Disability Weights Project
Essink-Bot, Marie-Louise; Pereira, Joaquin; Packer, Claire; Schwarzinger, Michael; Burstrom, Kristina
2002-01-01
OBJECTIVE: To investigate the sources of cross-national variation in disability-adjusted life-years (DALYs) in the European Disability Weights Project. METHODS: Disability weights for 15 disease stages were derived empirically in five countries by means of a standardized procedure and the
An approach to estimation of degree of customization for ERP projects using prioritized requirements
Parthasarathy, Sudhaman; Daneva, Maia
2016-01-01
Customization in ERP projects is a risky, but unavoidable undertaking that companies need to initiate in order to achieve alignment between their acquired ERP solution and their organizational goals and business processes. Conscious about the risks, many companies commit to leveraging the
Svansdottir, Erla; Denollet, Johan; Thorsson, Bolli; Gudnason, Thorarinn; Halldorsdottir, Sigrun; Gudnason, Vilmundur; van den Broek, Krista C; Karlsson, Hrobjartur D
2013-04-01
Type D personality is associated with an increased morbidity and mortality risk in cardiovascular disease patients, but the mechanisms explaining this risk are unclear. We examined whether Type D was associated with coronary artery disease (CAD) risk factors, estimated risk of developing CAD, and previous cardiac events. Cross-sectional study in the general Icelandic population. A random sample of 4753 individuals (mean age 49.1 ± 12.0 years; 49% men) from the REFINE-Reykjavik study completed assessments for Type D personality and conventional CAD risk factors. Ten-year risk of developing CAD was estimated with the Icelandic risk calculator. Type D personality (22% of sample) was associated with a higher prevalence of hypertension (35 vs. 31%, p = 0.009), but less use of hypertension medication (58 vs. 65%, p = 0.013) in hypertensives, more diabetes (6 vs. 4%, p = 0.023), wider waist circumference (p = 0.007), and elevated body mass index (p = 0.025) and blood lipids (p lifestyle-related CAD risk factors, a higher estimated risk of developing CAD, and higher incidence of previous cardiac events. Unhealthy lifestyles may partly explain the adverse cardiovascular effect of Type D personality.
A Generalized SOC-OCV Model for Lithium-Ion Batteries and the SOC Estimation for LNMCO Battery
Directory of Open Access Journals (Sweden)
Caiping Zhang
2016-11-01
Full Text Available A state-of-charge (SOC) versus open-circuit-voltage (OCV) model developed for batteries should preferably be simple, especially for real-time SOC estimation. It should also be capable of representing different types of lithium-ion batteries (LIBs), regardless of temperature change and battery degradation. It must therefore be generic, robust and adaptive, in addition to being accurate. These challenges have now been addressed by proposing a generalized SOC-OCV model for representing a few of the most widely used LIBs. The model is developed by analyzing the electrochemical processes of the LIBs, arriving at the sum of a logarithmic, a linear and an exponential function with six parameters. Values for these parameters are determined by a nonlinear estimation algorithm, which progressively shows that only four parameters need to be updated in real time. The remaining two parameters can be kept constant, regardless of temperature change and aging. Fitting errors demonstrated with different types of LIBs have been found to be within 0.5%. The proposed model is thus accurate, and can be flexibly applied to different LIBs, as verified by hardware-in-the-loop simulation designed for real-time SOC estimation.
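The nonlinear parameter estimation described above can be sketched with a standard least-squares routine. The abstract does not give the exact six-parameter expression, so the particular logarithmic-plus-linear-plus-exponential form below, and all numeric values, are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import curve_fit

def ocv_model(soc, k0, k1, k2, k3, k4, k5):
    """One plausible six-parameter SOC-OCV curve combining linear,
    logarithmic and exponential terms (an assumed form, not the paper's)."""
    return (k0 + k1 * soc + k2 * np.log(soc)
            + k3 * np.log(1.0 - soc) + k4 * np.exp(k5 * soc))

# synthetic "measured" OCV points on a SOC grid, with small sensor noise
soc = np.linspace(0.05, 0.95, 40)
true_params = (3.4, 0.6, 0.05, -0.04, 0.01, 2.0)
ocv = ocv_model(soc, *true_params) + np.random.default_rng(1).normal(0, 1e-3, soc.size)

# nonlinear least-squares fit of all six parameters
params, _ = curve_fit(ocv_model, soc, ocv,
                      p0=(3.5, 0.5, 0.1, -0.1, 0.01, 1.0), maxfev=20000)
rmse = np.sqrt(np.mean((ocv_model(soc, *params) - ocv) ** 2))
```

In a real-time setting, the paper's observation that only four of the six parameters need updating would let the online estimator refit a reduced parameter vector while holding the other two fixed.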
Directory of Open Access Journals (Sweden)
Somayyeh Lotfi Noghabi
2012-07-01
Full Text Available Introduction: Epilepsy is a clinical syndrome in which seizures have a tendency to recur. Sodium valproate is the most effective drug in the treatment of all types of generalized seizures. Finding the optimal dosage (the lowest effective dose) of sodium valproate is a real challenge for neurologists. In this study, a new approach based on an Adaptive Neuro-Fuzzy Inference System (ANFIS) was presented for estimating the optimal dosage of sodium valproate in IGE (Idiopathic Generalized Epilepsy) patients. Methods: 40 patients with idiopathic generalized epilepsy, who were referred to the neurology department of Mashhad University of Medical Sciences between the years 2006-2011, were included in this study. ANFIS constructs a Fuzzy Inference System (FIS) whose membership function parameters are tuned (adjusted) using either a back-propagation algorithm alone, or in combination with a least-squares method (hybrid algorithm). In this study, we used the hybrid method for adjusting the parameters. Results: The R-square of the proposed system was 59.8% and the Pearson correlation coefficient was significant (P < 0.05). Although the accuracy of the model was not high, it was good enough to be applied for treating IGE patients with sodium valproate. Discussion: This paper presented a new application of ANFIS for estimating the optimal dosage of sodium valproate in IGE patients. Fuzzy set theory plays an important role in dealing with uncertainty when making decisions in medical applications. Collectively, it seems that ANFIS has a high capacity to be applied in medical sciences, especially neurology.
Market projections of cellulose nanomaterial-enabled products-- Part 2: Volume estimates
John Cowie; E.M. (Ted) Bilek; Theodore H. Wegner; Jo Anne Shatkin
2014-01-01
Nanocellulose has enormous potential to provide an important materials platform in numerous product sectors. This study builds on previous work by the same authors in which likely high-volume, low-volume, and novel applications for cellulosic nanomaterials were identified. In particular, this study creates a transparent methodology and estimates the potential annual...
Uncertainty in estimated values of forestry project: a case study of ...
African Journals Online (AJOL)
The information obtained was analyzed using Net Present Value, Benefit-Cost Ratio, Economic Rate of Return and Sensitivity Analysis. The results of this study indicate that the NPV and B/C ratio were sensitive to an increase in the discount factor. The values of estimates for a direct and taungya plantation at Ago-Owu forest ...
Stroke risk estimation across nine European countries in the MORGAM project
DEFF Research Database (Denmark)
Borglykke, Anders; Andreasen, Anne H; Kuulasmaa, Kari
2010-01-01
Previous tools for stroke risk assessment have either been developed for specific populations or lack data on non-fatal events or uniform data collection. The purpose of this study was to develop a stepwise model for the estimation of 10 year risk of stroke in nine different countries across Europe....
Heikkilä, M.; Solaimani, H. (Sam); Kuivaniemi, L.; Suoranta, M.
2014-01-01
Purpose: The objective of this paper is to propose and demonstrate a framework for estimating performance in a networked business model. Design/methodology/approach: Our approach is design science, utilising action research in studying a case of four independent firms in Health & Wellbeing sector
Estimating the Distance to the Moon--Its Relevance to Mathematics. Core-Plus Mathematics Project.
Stern, David P.
This document features an activity for estimating the distance from the earth to the moon during a solar eclipse based on calculations performed by the ancient Greek astronomer Hipparchus. Historical, mathematical, and scientific details about the calculation are provided. Internet resources for teachers to obtain more information on the subject…
A number-projected model with generalized pairing interaction in application to rotating nuclei
Energy Technology Data Exchange (ETDEWEB)
Satula, W. (Warsaw Univ., Poland; Joint Institute for Heavy Ion Research, Oak Ridge, TN, United States; Univ. of Tennessee, Knoxville, TN, United States; Royal Institute of Technology, Stockholm, Sweden); Wyss, R. (Royal Institute of Technology, Stockholm, Sweden)
1996-12-31
A cranked mean-field model that takes into account both T=1 and T=0 pairing interactions is presented. The like-particle pairing interaction is described by means of a standard seniority force. The neutron-proton channel simultaneously includes correlations among particles moving in time-reversed orbits (T=1) and identical orbits (T=0). The coupling between different pairing channels and nuclear rotation is taken into account self-consistently. Approximate number projection is included by means of the Lipkin-Nogami method. The transitions between different pairing phases are discussed as a function of neutron/proton excess, T_z, and rotational frequency, ℏω.
International Nuclear Information System (INIS)
2010-01-01
This document presents the scope of the draft order which defines the main requirements applicable to INBs (base nuclear installations) regarding the protection of people and the environment against risks of accident, pollution and other nuisances. More precisely, the document explains the scope of the several specific aspects addressed by this order: safety policy and management, accident risk management, management of nuisances and of the installation's impact on the population and the environment, management and elimination of wastes and of fuels spent by a base nuclear installation, management of emergency situations, public information, authorization request procedures, and other provisions.
Liu, Jingxia; Colditz, Graham A
2018-05-01
There is growing interest in conducting cluster randomized trials (CRTs). For simplicity in sample size calculation, the cluster sizes are often assumed to be identical across all clusters. However, equal cluster sizes are not guaranteed in practice. Therefore, the relative efficiency (RE) of unequal versus equal cluster sizes has been investigated when testing the treatment effect. One of the most important approaches to analyzing a set of correlated data is the generalized estimating equation (GEE) approach proposed by Liang and Zeger, in which a "working correlation structure" is introduced and the association pattern depends on a vector of association parameters denoted by ρ. In this paper, we utilize GEE models to test the treatment effect in a two-group comparison for continuous, binary, or count data in CRTs. The variances of the estimator of the treatment effect are derived for the different types of outcome. RE is defined as the ratio of the variance of the estimator of the treatment effect for equal to unequal cluster sizes. We discuss a commonly used structure in CRTs, the exchangeable structure, and derive a simpler formula for RE with continuous, binary, and count outcomes. Finally, REs are investigated for several scenarios of cluster-size distributions through simulation studies. We propose an adjusted sample size to account for the efficiency loss. Additionally, we propose an optimal sample size estimation based on the GEE models under a fixed budget, for both known and unknown association parameter (ρ) in the working correlation structure within the cluster. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
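For a continuous outcome under an exchangeable working correlation, the cluster-size effect on the variance of the treatment-effect estimator has a simple closed form. The sketch below assumes the standard GEE result that a cluster of size n contributes information proportional to n/(1+(n-1)ρ); this matches the structure described above but is not copied from the paper:

```python
import numpy as np

def gee_info(sizes, rho):
    """Total information under an exchangeable working correlation:
    each cluster of size n contributes n / (1 + (n - 1) * rho)."""
    n = np.asarray(sizes, dtype=float)
    return np.sum(n / (1.0 + (n - 1.0) * rho))

def relative_efficiency(sizes, rho):
    """RE = Var(equal) / Var(unequal) holding the total number of subjects
    fixed; values below 1 quantify the efficiency loss from unequal sizes."""
    nbar = np.mean(sizes)
    equal_info = len(sizes) * nbar / (1.0 + (nbar - 1.0) * rho)
    return gee_info(sizes, rho) / equal_info
```

Because n/(1+(n-1)ρ) is concave in n for ρ > 0, Jensen's inequality guarantees RE ≤ 1, i.e. any imbalance in cluster sizes costs efficiency, which motivates the adjusted sample size proposed above.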
Directory of Open Access Journals (Sweden)
Razieh Khajeh-Kazemi
2011-01-01
Background: The celebrated generalized estimating equations (GEE) approach is often used in longitudinal data analysis. While this method behaves robustly against misspecification of the working correlation structure, it has some limitations regarding the efficiency of estimators, goodness-of-fit tests, and model selection criteria. The quadratic inference functions (QIF) approach is a new statistical methodology that overcomes these limitations. Methods: We compared the use of QIF and GEE in comparing superior and inferior Ahmed glaucoma valve (AGV) implantation. With a focus on the efficiency of estimation and the use of model selection criteria, we compared the effect of implant location on intraocular pressure (IOP) in refractory glaucoma patients. We modeled the relationship between IOP and implant location, patient's sex and age, best corrected visual acuity, history of cataract surgery, preoperative IOP, and months after surgery, assuming an unstructured working correlation. Results: 63 eyes of 63 patients were included in this study, 28 eyes in the inferior group and 35 eyes in the superior group. The GEE analysis revealed that preoperative IOP has a significant effect on IOP (p = 0.011). However, QIF showed that preoperative IOP, months after surgery, and squared months are significantly associated with IOP after surgery (p < 0.05). Overall, estimates from QIF are more efficient than those from GEE (RE = 1.272). Conclusions: In the case of an unstructured working correlation, QIF is more efficient than GEE. There was no considerable difference between the two implant locations; our results confirm previously published work suggesting that glaucoma patients are better off undergoing superior AGV implantation
Izett, Jonathan G.; Fennel, Katja
2018-02-01
Rivers deliver large amounts of terrestrially derived materials (such as nutrients, sediments, and pollutants) to the coastal ocean, but a global quantification of the fate of this delivery is lacking. Nutrients can accumulate on shelves, potentially driving high levels of primary production with negative consequences like hypoxia, or be exported across the shelf to the open ocean where impacts are minimized. Global biogeochemical models cannot resolve the relatively small-scale processes governing river plume dynamics and cross-shelf export; instead, river inputs are often parameterized assuming an "all or nothing" approach. Recently, Sharples et al. (2017), https://doi.org/10.1002/2016GB005483 proposed the SP number—a dimensionless number relating the estimated size of a plume as a function of latitude to the local shelf width—as a simple estimator of cross-shelf export. We extend their work, which is solely based on theoretical and empirical scaling arguments, and address some of its limitations using a numerical model of an idealized river plume. In a large number of simulations, we test whether the SP number can accurately describe export in unforced cases and with tidal and wind forcings imposed. Our numerical experiments confirm that the SP number can be used to estimate export and enable refinement of the quantitative relationships proposed by Sharples et al. We show that, in general, external forcing has only a weak influence compared to latitude and derive empirical relationships from the results of the numerical experiments that can be used to estimate riverine freshwater export to the open ocean.
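As a rough illustration of the latitude dependence behind an SP-style estimator, the sketch below compares a plume width scale with the local shelf width. Note the plume width is taken here as the internal Rossby radius, a simplifying assumption: Sharples et al. use a discharge-based plume scaling, so the numbers below are purely illustrative.

```python
import numpy as np

OMEGA = 7.2921e-5  # Earth's rotation rate (rad/s)

def coriolis(lat_deg):
    # Coriolis parameter f = 2 * Omega * sin(latitude).
    return 2.0 * OMEGA * np.sin(np.radians(lat_deg))

def sp_number(lat_deg, shelf_width_m, g_prime=0.1, depth_m=10.0):
    # Illustrative SP-like ratio: plume width approximated by the internal
    # Rossby radius sqrt(g' h) / f, divided by the local shelf width.
    # (g_prime and depth_m are placeholder values, not from the paper.)
    rossby_radius = np.sqrt(g_prime * depth_m) / coriolis(lat_deg)
    return rossby_radius / shelf_width_m

# High latitude, wide shelf: small plume scale relative to the shelf,
# so little freshwater reaches the open ocean.
print(sp_number(60.0, shelf_width_m=100e3))
# Near the equator the plume scale grows as f shrinks, raising SP.
print(sp_number(5.0, shelf_width_m=100e3))
```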
Hernandez, D. W.
2012-12-01
The CDRP is a major construction project involving up to 400 workers using heavy earth moving equipment, blasting, drilling, rock crushing, and other techniques designed to move 7 million yards of earth. Much of this material is composed of serpentinite, blueschist, and other rocks that contain chrysotile, crocidolite, actinolite, tremolite, and Libby-class amphiboles. To date, over 1,000 personal, work area, and emission inventory related samples have been collected and analyzed by NIOSH 7400, NIOSH 7402, and CARB-AHERA methodology. Data indicate that various CDRP construction activities have the potential to generate significant mineral fibers and structures that could represent elevated on site and off site health risks. This presentation will review the Contractor's air monitoring program for this major project, followed by a discussion of predictive methods to evaluate potential onsite and offsite risks. Ultimately, the data are used for planning control strategies designed to achieve a Project Action Level of 0.01 f/cc (one tenth the Cal/OSHA PEL) and risk-based offsite target levels.
Energy Technology Data Exchange (ETDEWEB)
Grattidge, W.; Westbrook, J.; McCarthy, J.; Northrup, C. Jr.; Rumble, J. Jr.
1986-01-01
This report documents the initial phases of the Materials Information for Science and Technology (MIST) project jointly supported by the Department of Energy and the National Bureau of Standards. The purpose of MIST is to demonstrate the power and utility of computer access to materials property data. The initial goals include: to exercise the concept of a computer network of materials databases and to build a demonstration of such a system suitable for use as the core of operational systems in the future. Phases I and II are described in detail herein. In addition, a discussion is given of the expected usage of the system. The primary MIST prototype project is running on an IBM 3084 under STS at Stanford University's Information Technology Services (ITS). Users can access the Stanford system via ARPANET, TELENET, and TYMNET, as well as via commercial telephone lines. For fastest response time and use of the full screen PRISM interface, direct connection using a 2400 baud modem with the MNP error-correcting protocol over standard telephone lines gives the best results - though slower speed connections and a line-oriented interface are also available. This report gives detailed plans regarding the properties and the materials to be entered into the system.
Guo, J; Booth, M; Jenkins, J; Wang, H; Tanner, M
1998-12-01
The World Bank Loan Project for schistosomiasis control in China commenced field activities in 1992. In this paper, we describe disease control strategies for different levels of endemicity, and estimate unit costs and total expenditure of screening, treatment (cattle and humans) and snail control for 8 provinces where Schistosoma japonicum infection is endemic. Overall, we estimate that more than 21 million US dollars were spent on field activities during the first three years of the project. Mollusciciding (43% of the total expenditure) and screening (28% of the total) are estimated to have been the most expensive field activities. However, despite the expense of screening, a simple model predicts that selective chemotherapy could have been cheaper than mass chemotherapy in areas where infection prevalence was higher than 15%, which was the threshold for mass chemotherapy intervention. It is concluded that considerable cost savings could be made in the future by narrowing the scope of snail control activities, redefining the threshold infection prevalence for mass chemotherapy, defining smaller administrative units, and developing rapid assessment tools.
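The break-even logic of a simple screening-versus-mass-treatment cost model like the one mentioned above can be sketched as follows. The unit costs are hypothetical placeholders, not the paper's estimates.

```python
def cost_selective(n, prevalence, screen_cost, treat_cost):
    # Screen everyone, then treat only those found infected.
    return n * screen_cost + n * prevalence * treat_cost

def cost_mass(n, treat_cost):
    # Treat everyone without screening.
    return n * treat_cost

def breakeven_prevalence(screen_cost, treat_cost):
    # Selective chemotherapy is cheaper whenever
    # prevalence < 1 - screen_cost / treat_cost.
    return 1.0 - screen_cost / treat_cost

# Hypothetical unit costs (USD per person): screening 0.5, treatment 2.0.
print(breakeven_prevalence(0.5, 2.0))          # 0.75
print(cost_selective(10_000, 0.15, 0.5, 2.0))  # 8000.0
print(cost_mass(10_000, 2.0))                  # 20000.0
```

Under these placeholder costs, selective treatment stays cheaper well above a 15% prevalence threshold, consistent with the abstract's observation that selective chemotherapy could undercut mass chemotherapy even in higher-prevalence areas.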
Wenzel, Thomas J
2006-01-01
The laboratory component of a first-semester general chemistry course for science majors is described. The laboratory involves a semester-long project undertaken in a small-group format. Students are asked to examine whether plants grown in soil contaminated with lead take up more lead than those grown in uncontaminated soil. They are also asked to examine whether the acidity of the rainwater affects the amount of lead taken up by the plants. Groups are then given considerable independence in the design and implementation of the experiment. Once the seeds are planted, which takes about 4 wk into the term, several shorter experiments are integrated in before it is time to harvest and analyze the plants. The use of a project and small working groups allows for the development of a broader range of learning outcomes than occurs in a "traditional" general chemistry laboratory. The nature of these outcomes and some of the student responses to the laboratory experience are described. This particular project also works well at demonstrating the connections among chemistry, biology, geology, and environmental studies.
Energy Technology Data Exchange (ETDEWEB)
Anderson, D.M.; Bates, D.J.; Marsh, T.L.
1993-03-01
This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.
International Nuclear Information System (INIS)
Lin, M.; Brechtel, C.E.; Hardy, M.P.; Bauer, S.J.
1992-01-01
This paper presents a method of estimating the rock mass properties for the welded and nonwelded tuffs based on currently available information on intact rock and joint characteristics at the Yucca Mountain site. Variability of the expected ground conditions at the potential repository horizon (the TSw2 thermomechanical unit) and in the Calico Hills nonwelded tuffs is accommodated by defining five rock mass quality categories in each unit based upon assumed and observed distributions of the data
Estimation and Projection of Lung Cancer Incidence and Mortality in China
Directory of Open Access Journals (Sweden)
Xiaonong ZOU
2010-05-01
Background and objective: The aim of this study is to analyze the epidemiological trend of lung cancer and estimate the lung cancer burden in China. Methods: Lung cancer age-specific mortality and incidence rate ratios in different areas and sexes were obtained from the national cancer registration database in 2004 and 2005. Crude cancer mortalities were retrieved from the database of the third national death survey, 2004-2005. Age-specific incidence rates of lung cancer were calculated using mortality and M/I ratios. Annual percent change (APC) was estimated by a log regression model using Joinpoint software, analyzing pooled lung cancer incidence data from 10 cancer registries from 1988 to 2005. Results: The total estimated numbers of new cases and deaths from lung cancer in 2005 were 536,407 and 475,768, respectively; both were higher in males than in females. Lung cancer incidence increased by 1.63% per year from 1988 to 2005; however, the trend showed a slowdown, 0.55% annually, after adjustment for age. Conclusion: Lung cancer is one of the major health issues in China, and its burden is becoming serious. Population ageing is the main cause of the increasing incidence and mortality of lung cancer. Effective cancer prevention and control is imperative; in particular, tobacco control should be carried out nationwide.
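The APC calculation described in the Methods can be sketched in a few lines: fit a log-linear model to annual rates and transform the slope. The data below are synthetic, constructed only to show the mechanics, not the registry data used in the study.

```python
import numpy as np

def annual_percent_change(years, rates):
    # Fit ln(rate) = a + b * year by least squares; APC = 100 * (e^b - 1).
    years = np.asarray(years, dtype=float)
    b, a = np.polyfit(years, np.log(rates), 1)
    return 100.0 * (np.exp(b) - 1.0)

# Synthetic incidence rates growing 1.63% per year (illustrative only).
years = np.arange(1988, 2006)
rates = 30.0 * 1.0163 ** (years - 1988)
print(round(annual_percent_change(years, rates), 2))  # 1.63
```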
International Nuclear Information System (INIS)
Etkin, D.S.; Grey, C.; Wells, P.; Koefoed, J.; Nauke, M.; Meyer, T.; Campbell, J.; Reddy, S.
1998-01-01
A meeting of the Joint Group of Experts on the Scientific Aspects of Marine Protection (GESAMP), Working Group 32, was held to discuss a new approach for evaluating available data sources on the input of oil into the marine environment from sea-based activities. GESAMP Working Group on Estimates of Oil Entering the Marine Environment, Sea Based Activities (Working Group 32) will collect and analyze data on oil inputs over the last decade from shipping, offshore and coastal exploration and production, pipelines, atmospheric emissions from sea-based activities, coastal refineries and storage facilities, oil reception facilities, materials disposed of at sea, and natural seepage. The group will compare its oil input estimate model to estimates made by the National Research Council, the International Maritime Organization (IMO), and GESAMP in previous decades, to evaluate the efficacy of IMO conventions and other pollution reduction efforts in the last 10 years. The group will also consider the amounts of oil entering the sea through operational and accidental spillage in relation to the quantities of oil transported by ship and through pipelines, and in relation to offshore and coastal oil production. 7 refs., 4 tabs
Farnsworth, K. L.; House, M.; Hovan, S. A.
2013-12-01
A recent workshop sponsored by SERC-On the Cutting Edge brought together science educators from a range of schools across the country to discuss new approaches in teaching oceanography. In discussing student interest in our classes, we were struck by the fact that students are drawn to emotional or controversial topics such as whale hunting and tsunami hazard and that these kinds of topics are a great vehicle for introducing more complex concepts such as wave propagation, ocean upwelling and marine chemistry. Thus, we have developed an approach to introductory oceanography that presents students with real-world issues in the ocean sciences and requires them to explore the science behind them in order to improve overall ocean science literacy among non-majors and majors at 2 and 4 year colleges. We have designed a project-based curriculum built around topics that include, but are not limited to: tsunami hazard, whale migration, ocean fertilization, ocean territorial claims, rapid climate change, the pacific trash patch, overfishing, and ocean acidification. Each case study or project consists of three weeks of class time and is structured around three elements: 1) a media analysis; 2) the role of ocean science in addressing the issue; 3) human impact/response. Content resources range from textbook readings, popular or current print news, documentary film and television, and data available on the world wide web from a range of sources. We employ a variety of formative assessments for each case study in order to monitor student access and understanding of content and include a significant component of in-class student discussion and brainstorming guided by faculty input to develop the case study. Each study culminates in summative assessments ranging from exams to student posters to presentations, depending on the class size and environment. We envision this approach for a range of classroom environments including large group face-to-face instruction as well as hybrid
International Nuclear Information System (INIS)
Pinilla Agudelo, Gabriel A; Rodriguez Sandoval, Erasmo A; Camacho Botero, Luis A
2014-01-01
A methodological proposal for estimating environmental flows in large projects approved by the Agencia Nacional de Licencias Ambientales (ANLA) in Colombian rivers was developed. The project is the result of an agreement between the MADS and the Universidad Nacional de Colombia, Bogota (UNC). The proposed method begins with an evaluation of hydrological criteria, continues with a hydraulic and water quality validation, and follows with the determination of habitat integrity. This is an iterative process that compares conditions before and after project construction and yields the magnitude of a monthly flow that, besides preserving the ecological functions of the river, guarantees the water uses downstream. Regarding the biotic component, the proposal includes the establishment and monitoring of biotic integrity indices for four aquatic communities (periphyton, macroinvertebrates, riparian vegetation, and fish). The effects that flow reduction may produce in the medium and long term can be assessed with these indices. We present the results of applying the methodology to several projects licensed by the MADS.
International Nuclear Information System (INIS)
Lee, S.H.; Moon, B.S.; Lee, J.H.
2014-01-01
The Earned Value Management System (EVMS) is a project management technique for measuring project performance and progress, and for projecting them forward, through the integrated management and control of cost and schedule. This research reviewed the concept of the EVMS method and proposes two Planned Value estimation methods for potential application to succeeding NPP construction projects, using historical data from preceding NPP projects. This paper introduces a solution to the problems caused by the absence of a management system incorporating schedule and cost, which has been a repeated issue in NPP construction project management. (author)
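For readers unfamiliar with EVMS, the core indices it integrates can be sketched as below. The figures are hypothetical, and the snippet shows only the standard textbook definitions, not the Planned Value estimation methods proposed in the paper.

```python
def evms_metrics(pv, ev, ac, bac):
    # pv: Planned Value, ev: Earned Value, ac: Actual Cost,
    # bac: Budget At Completion.
    spi = ev / pv    # Schedule Performance Index (< 1: behind schedule)
    cpi = ev / ac    # Cost Performance Index (< 1: over cost)
    sv = ev - pv     # Schedule Variance
    cv = ev - ac     # Cost Variance
    eac = bac / cpi  # Estimate At Completion (CPI-based forecast)
    return {"SPI": spi, "CPI": cpi, "SV": sv, "CV": cv, "EAC": eac}

# Hypothetical mid-project status (all figures in million USD).
m = evms_metrics(pv=120.0, ev=110.0, ac=125.0, bac=500.0)
print(m)  # SPI < 1 and CPI < 1: behind schedule and over cost
```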
Energy Technology Data Exchange (ETDEWEB)
Lee, S.H.; Moon, B.S., E-mail: gustblast@khnp.co.kr, E-mail: moonbs@khnp.co.kr [Korea Hydro & Nuclear power co.,Ltd., Central Research Inst., Daejeon (Korea, Republic of); Lee, J.H., E-mail: ljh@kkprotech.com [Kong Kwan Protech Co.,Ltd., Seoul (Korea, Republic of)
2014-07-01
The Earned Value Management System (EVMS) is a project management technique for measuring project performance and progress, and then forward projection through the integrated management and control of cost and schedule. This research reviewed the concept of the EVMS method, and proposes two Planned Value estimation methods for the potential application to succeeding NPP construction projects by using the historical data from the proceeding NPP projects. This paper is to introduce the solution for the problems caused by the absence of relevant management system incorporating schedule and cost, which has arisen as repeated issues in NPP construction project management. (author)
International Nuclear Information System (INIS)
Schaller, A.; Lokner, V.; Subasic, D.
2003-01-01
The expenses needed for development of the low- and intermediate-level radioactive waste (LILW) repository project in Croatia include: (a) preliminary activities, (b) preparatory activities, and (c) preparation of an environmental impact study. The first group of expenses refers to project leading activities, project plan updating, build-up of required infrastructure, preparation of licensing documentation, site investigations, the data acquisition programme, pre-operational radio-ecological monitoring, modelling, safety analysis (first iteration) and public-related activities. Preparatory activities refer to purchasing of land for the repository and preparatory activities for carrying out on-site investigations, while the third group of expenses relates to preparation and validation of the environmental impact study. It was found that about 50% of total expenses refer to build-up of infrastructure. An additional 25% relates to radio-ecological monitoring, site investigations and development of calculations and models, while the remaining 25% of the total estimated sum is expected to be spent on repository safety assessment, public relations, purchasing and preparing the on-site terrain for construction, etc. It was calculated that 607 EUR per m3 of LILW will be needed up to site license acquisition. In line with world-wide practice, extrapolating the additional expenses necessary for construction of the repository and acquisition of an operational license gives a cost of 1,723 EUR per m3 of LILW for a shallow-ground repository and 2,412 EUR per m3 of LILW for a tunnel repository. The estimated expenses for Croatia are within the range of expenses for the same purpose in the countries considered. Expected duration of the project up to acquisition of the site license is 4 years and 3 months. (author)
Simon, Patrick; Schneider, Peter
2017-08-01
In weak gravitational lensing, weighted quadrupole moments of the brightness profile in galaxy images are a common way to estimate gravitational shear. We have employed general adaptive moments (GLAM) to study causes of shear bias on a fundamental level and for a practical definition of an image ellipticity. The GLAM ellipticity has useful properties for any chosen weight profile: the weighted ellipticity is identical to that of isophotes of elliptical images, and in absence of noise and pixellation it is always an unbiased estimator of reduced shear. We show that moment-based techniques, adaptive or unweighted, are similar to a model-based approach in the sense that they can be seen as imperfect fit of an elliptical profile to the image. Due to residuals in the fit, moment-based estimates of ellipticities are prone to underfitting bias when inferred from observed images. The estimation is fundamentally limited mainly by pixellation which destroys information on the original, pre-seeing image. We give an optimised estimator for the pre-seeing GLAM ellipticity and quantify its bias for noise-free images. To deal with images where pixel noise is prominent, we consider a Bayesian approach to infer GLAM ellipticity where, similar to the noise-free case, the ellipticity posterior can be inconsistent with the true ellipticity if we do not properly account for our ignorance about fit residuals. This underfitting bias, quantified in the paper, does not vary with the overall noise level but changes with the pre-seeing brightness profile and the correlation or heterogeneity of pixel noise over the image. Furthermore, when inferring a constant ellipticity or, more relevantly, constant shear from a source sample with a distribution of intrinsic properties (sizes, centroid positions, intrinsic shapes), an additional, now noise-dependent bias arises towards low signal-to-noise if incorrect prior densities for the intrinsic properties are used. We discuss the origin of this
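A minimal numerical sketch of the weighted-quadrupole ellipticity discussed above, using a fixed circular Gaussian weight rather than the adaptive GLAM weight, and ignoring noise and the PSF; all sizes and scales below are illustrative assumptions.

```python
import numpy as np

def weighted_ellipticity(img, sigma):
    # Quadrupole moments of a pixelated brightness profile, using a fixed
    # circular Gaussian weight of scale sigma centred on the flux-weighted
    # centroid (a simplification: GLAM adapts the weight to the image).
    y, x = np.indices(img.shape, dtype=float)
    xc = (img * x).sum() / img.sum()
    yc = (img * y).sum() / img.sum()
    w = img * np.exp(-((x - xc) ** 2 + (y - yc) ** 2) / (2.0 * sigma ** 2))
    q11 = (w * (x - xc) ** 2).sum() / w.sum()
    q22 = (w * (y - yc) ** 2).sum() / w.sum()
    q12 = (w * (x - xc) * (y - yc)).sum() / w.sum()
    # Complex ellipticity chi = (Q11 - Q22 + 2i Q12) / (Q11 + Q22).
    return (q11 - q22 + 2j * q12) / (q11 + q22)

# Noise-free elliptical Gaussian test image, elongated along x.
y, x = np.indices((64, 64), dtype=float)
img = np.exp(-(((x - 32) / 8.0) ** 2 + ((y - 32) / 4.0) ** 2) / 2.0)
chi = weighted_ellipticity(img, sigma=20.0)
print(chi)
```

For this x-elongated image the real part of chi is positive and the imaginary part is essentially zero; with a weight, the magnitude of chi depends on sigma, which is one face of the weight-induced bias the paper analyses.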
International Nuclear Information System (INIS)
Palmer, P.A.
1995-01-01
The Lasagna project is the first of what we expect will be several large cooperative projects between industry consortia and government to develop improved remediation technologies. In 1992, Monsanto Company began contacting other major corporations to see if they were experiencing similar difficulties in applying cost-effective, or even workable technologies for industrial site remediation. Both General Electric and DuPont were early participants in the effort to develop a meeting with the EPA to discuss technical problems faced in cleanup, research needs, and ways to accelerate development of more cost-effective techniques. This paper provides some background on how this cooperative process came to reality, what the Lasagna process is and how the cooperative arrangements and financing are structured
Prague’s Sewerage System in the 1930s and the General Sewerage Project (1933–1936)
Directory of Open Access Journals (Sweden)
K. Drnek
2010-01-01
Prague’s sewerage system was built at the end of the era of the monarchy, in the united town that Prague had been transformed into. The system was soon overloaded and was not able to remove all the sewage produced by the citizens. To deal with this hygienic threat, the city council and the management of the wastewater services undertook several actions to build a new system or improve the existing one. The most ambitious and extensive measure was the general project carried out between 1933 and 1936. The project was intended to resolve the problem once and for all by introducing new ideas and by settling the question of siting a new sewage plant to replace the old one. For the present-day observer it also offers a range of spectacular and interesting ideas on urban wastewater treatment.
SECON - A tool for estimation of storage costs and storage project revenue
International Nuclear Information System (INIS)
Hall, O.
1997-01-01
The SECON (Storage ECONomics) model is useful for gas suppliers, storage operators, gas distributors and consumers when investigating new storage possibilities. SECON has been used within the Sydkraft group to compare costs for different types of storage and to identify the market niche for lined rock cavern (LRC) storage. In the model, costs for the different storage types, salt caverns, LNG, and LRC, can be compared. By using input according to market needs, each storage type can be evaluated for a specific service, e.g. peak shaving, seasonal storage or balancing. The project revenue can also be calculated. SECON includes three models for income calculation: US storage service, Trading, and Avoided Supply Contract Costs. The income models calculate annual turnover, pay-off time, net present value, internal rate of return and maximum liquidity shortfall for the project. SECON facilitates sensitivity analysis, both regarding costs for different services and different storage types and, on the income side, by using different scenarios. At the poster session SECON will be presented live and the delegates will have the opportunity to test the model. (au)
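Generic versions of the income-model outputs listed above (net present value, pay-off time, internal rate of return) can be sketched as follows. The cash flows are invented for illustration and do not reflect SECON's internals or any real storage project.

```python
def npv(rate, cashflows):
    # Net present value of annual cash flows (year 0 first).
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def payback_year(cashflows):
    # First year in which cumulative (undiscounted) cash flow turns positive.
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0.0:
            return t
    return None

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    # Internal rate of return by bisection (assumes a single sign change).
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cashflows) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical storage project: 100 MEUR build cost, then 18 MEUR/yr for 15 years.
flows = [-100.0] + [18.0] * 15
print(round(npv(0.08, flows), 1))  # 54.1
print(payback_year(flows))         # 6
print(round(irr(flows), 4))
```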
Directory of Open Access Journals (Sweden)
Kevin V Lemley
Most predictive models of kidney disease progression have not incorporated structural data. If structural variables have been used in models, they have generally been only semi-quantitative. We examined the predictive utility of quantitative structural parameters measured on the digital images of baseline kidney biopsies from the NEPTUNE study of primary proteinuric glomerulopathies. These variables were included in longitudinal statistical models predicting the change in estimated glomerular filtration rate (eGFR) over up to 55 months of follow-up. The participants were fifty-six pediatric and adult subjects from the NEPTUNE longitudinal cohort study who had measurements made on their digital biopsy images; 25% were African-American, 70% were male and 39% were children; 25 had focal segmental glomerular sclerosis, 19 had minimal change disease, and 12 had membranous nephropathy. We considered four different sets of candidate predictors, each including four quantitative structural variables (for example, mean glomerular tuft area, cortical density of patent glomeruli, and two of the principal components from the correlation matrix of six fractional cortical areas: interstitium, atrophic tubule, intact tubule, blood vessel, sclerotic glomerulus, and patent glomerulus) along with 13 potentially confounding demographic and clinical variables (such as race, age, diagnosis, baseline eGFR, quantitative proteinuria and BMI). We used longitudinal linear models based on these 17 variables to predict the change in eGFR over up to 55 months. All 4 models had a leave-one-out cross-validated R2 of about 62%. Several combinations of quantitative structural variables were significantly and strongly associated with changes in eGFR. The structural variables were generally stronger than any of the confounding variables, other than baseline eGFR. Our findings suggest that quantitative assessment of diagnostic renal biopsies may play a role in estimating the baseline
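The leave-one-out cross-validated R² reported above can be computed as sketched here on synthetic data; the predictors merely stand in for structural variables, and nothing below comes from the NEPTUNE dataset.

```python
import numpy as np

def loo_cv_r2(X, y):
    # Leave-one-out cross-validated R^2 for ordinary least squares:
    # each observation is predicted from a model fit without it.
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        preds[i] = X[i] @ beta
    ss_res = ((y - preds) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot

# Synthetic cohort of 56 subjects: eGFR change driven by two predictors.
rng = np.random.default_rng(0)
n = 56
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.8, size=n)
print(round(loo_cv_r2(X, y), 2))
```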
Bourgain, Pascaline
2015-04-01
Bridging science and society has now become a necessity for scientists seeking to develop new partnerships with local communities and to raise public interest in scientific activities. The French-Greenlandic educational project called "Angalasut" reflects this desire to create a bridge between science, local people and the general public. This program was set up during the 2012-2013 school year, as part of an international scientific program dedicated to studying the interactions between the ocean and glaciers on the western coast of Greenland, in the Uummannaq fjord. Greenlandic and French school children were involved in educational activities, in classrooms and out in the field, associated with the scientific observations conducted in Greenland (glacier flow, ocean chemical composition and circulation, instrumentation...). In Greenland, the children had the opportunity to come on board the scientific sailing boat, and in France, several meetings were organized between the children and the scientists of the expedition. In the small village of Ikerasak, the children interviewed Elders about sea ice evolution in the area. These activities, coupled with the organization of public conferences and the creation of a trilingual website for the project (French, Greenlandic, English), aimed at explaining why scientists come to study Greenland's environment. This was an opportunity for scientists to talk with villagers who could testify to their changing environment over the past decades, and a first step toward a future collaboration between scientists and villagers that deserves further development. The project Angalasut was also an opportunity for Greenlandic and French school children to exchange about their culture and their environment through Skype communications, the exchange of mail (drawings, shells...), and the creation of a board game about European fauna and flora. A meeting in France between the two groups of children is being considered, possibly in summer 2015
Simonsen, Lone; Spreeuwenberg, Peter; Lustig, Roger; Taylor, Robert J; Fleming, Douglas M; Kroneman, Madelon; Van Kerkhove, Maria D; Mounts, Anthony W; Paget, W John
2013-11-01
Assessing the mortality impact of the 2009 influenza A H1N1 virus (H1N1pdm09) is essential for optimizing public health responses to future pandemics. The World Health Organization reported 18,631 laboratory-confirmed pandemic deaths, but the total pandemic mortality burden was substantially higher. We estimated the 2009 pandemic mortality burden through statistical modeling of mortality data from multiple countries. We obtained weekly virology and underlying cause-of-death mortality time series for 2005-2009 for 20 countries covering ∼35% of the world population. We applied a multivariate linear regression model to estimate pandemic respiratory mortality in each collaborating country. We then used these results plus ten country indicators in a multiple imputation model to project the mortality burden in all world countries. Between 123,000 and 203,000 pandemic respiratory deaths were estimated globally for the last 9 mo of 2009. The majority (62%-85%) were attributed to persons under 65 y of age. We observed a striking regional heterogeneity, with almost 20-fold higher mortality in some countries in the Americas than in Europe. The model attributed 148,000-249,000 respiratory deaths to influenza in an average pre-pandemic season, with only 19% in persons under 65 y of age. Limitations include the limited representation of low-income countries among the single-country estimates and an inability to study subsequent pandemic waves (2010-2012). We estimate that 2009 global pandemic respiratory mortality was ∼10-fold higher than the World Health Organization's laboratory-confirmed mortality count. Although the pandemic mortality estimate was similar in magnitude to that of seasonal influenza, a marked shift toward mortality among persons under 65 y of age occurred, with mortality in parts of the Americas far exceeding that in Europe. A collaborative network to collect and analyze mortality and hospitalization surveillance data is needed to rapidly establish the severity of future pandemics. Please see later in the article for the Editors' Summary.
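A toy version of the regression approach described above, with one seasonal harmonic plus an influenza virology proxy; the data are entirely synthetic and stand in for the study's multi-country surveillance series.

```python
import numpy as np

def attributable_deaths(weeks, deaths, flu_proxy):
    # Regress weekly respiratory deaths on a seasonal baseline plus an
    # influenza virology proxy; deaths attributed to influenza are the
    # fitted flu term summed over the whole series.
    t = 2.0 * np.pi * weeks / 52.0
    X = np.column_stack([np.ones_like(t), np.sin(t), np.cos(t), flu_proxy])
    beta, *_ = np.linalg.lstsq(X, deaths, rcond=None)
    return beta[-1] * flu_proxy.sum()

# Synthetic surveillance series: 5 years of weekly deaths with a winter
# influenza season (illustrative only, not real data).
rng = np.random.default_rng(1)
weeks = np.arange(260, dtype=float)
flu = np.where(weeks % 52 < 8, rng.uniform(0.2, 1.0, 260), 0.0)
deaths = (500.0 + 50.0 * np.cos(2.0 * np.pi * weeks / 52.0)
          + 30.0 * flu + rng.normal(scale=2.0, size=260))
print(round(attributable_deaths(weeks, deaths, flu)))
```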
International Nuclear Information System (INIS)
Warren, R.N.
1998-01-01
In 1997, the SNFP developed a baseline change request (BCR) and submitted it to DOE-RL for approval. The schedule was formally evaluated to have a 19% probability of success [Williams, 1998]. In December 1997, DOE-RL Manager John Wagoner approved the BCR contingent upon a subsequent independent review of the new baseline. The SNFP took several actions during the first quarter of 1998 to prepare for the independent review. The project developed the Estimating Requirements and Implementation Guide [DESH, 1998] and trained cost account managers (CAMs) and other personnel involved in the estimating process in activity-based cost (ABC) estimating techniques. The SNFP then applied ABC estimating techniques to develop the basis for the December Baseline (DB) and documented that basis in Basis of Estimate (BOE) books. These BOEs were provided to DOE in April 1998. DOE commissioned Professional Analysis, Inc. (PAI) to perform a critical analysis (CA) of the DB. PAI's review formally began on April 13. PAI performed the CA, provided three sets of findings to the SNFP contractor, and initiated reconciliation meetings. During the course of PAI's review, DOE directed the SNFP to develop a new baseline with a higher probability of success. The contractor transmitted the new baseline, referred to as the High Probability Baseline (HPB), to DOE on April 15, 1998 [Williams, 1998]. The HPB was estimated to approach a 90% confidence level for the start of fuel movement [Williams, 1998]. This higher probability resulted in an increased cost and a schedule extension. To implement the new baseline, the contractor initiated 26 BCRs with supporting BOEs. PAI's scope was revised on April 28 to add review of the HPB and the associated BCRs and BOEs.
Mishra-Kalyani, Pallavi S.; Johnson, Brent A.; Glass, Jonathan D.; Long, Qi
2016-09-01
Clinical disease registries offer a rich collection of valuable patient information but also pose challenges that require special care and attention in statistical analyses. The goal of this paper is to propose a statistical framework that allows for estimating the effect of surgical insertion of a percutaneous endogastrostomy (PEG) tube for patients living with amyotrophic lateral sclerosis (ALS) using data from a clinical registry. Although all ALS patients are informed about PEG, only some patients agree to the procedure, which leads to the potential for selection bias. Assessing the effect of PEG is further complicated by the aggressively fatal disease, such that time to death competes directly with both the opportunity to receive PEG and clinical outcome measurements. Our proposed methodology handles the “censoring by death” phenomenon through principal stratification and selection bias for PEG treatment through generalized propensity scores. We develop a fully Bayesian modeling approach to estimate the survivor average causal effect (SACE) of PEG on BMI, a surrogate outcome measure of nutrition and quality of life. The use of propensity score methods within the principal stratification framework demonstrates a significant and positive effect of PEG treatment, particularly when time of treatment is included in the treatment definition.
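The propensity-score ingredient of such an analysis can be sketched with a logistic model fit by Newton-Raphson. The covariates and coefficients below are invented, not the registry's variables:

```python
import numpy as np

# Toy cohort: covariates drive the probability of choosing PEG treatment
rng = np.random.default_rng(1)
n = 500
age = rng.normal(60, 10, n)
bmi0 = rng.normal(25, 4, n)
X = np.column_stack([np.ones(n), (age - 60) / 10, (bmi0 - 25) / 4])
logit_p = X @ np.array([-0.2, 0.5, -0.4])
treated = rng.random(n) < 1 / (1 + np.exp(-logit_p))

# Fit a logistic regression by Newton-Raphson to estimate propensity scores
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (treated - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

propensity = 1 / (1 + np.exp(-X @ beta))
# Inverse-probability weights that balance treated and untreated groups
w = np.where(treated, 1 / propensity, 1 / (1 - propensity))
```

The estimated propensity scores (or the weights derived from them) are then used to adjust comparisons between treated and untreated patients.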
International Nuclear Information System (INIS)
Maraman, W.J.
1980-04-01
This formal monthly report covers the studies related to the use of ²³⁸PuO₂ in radioisotopic power systems carried out for the Advanced Nuclear Systems and Projects Division of the Los Alamos Scientific Laboratory. The two programs involved are the general-purpose heat source development and space nuclear safety and fuels. Most of the studies discussed here are of a continuing nature. Results and conclusions described may change as the work continues. Published reference to the results cited in this report should not be made without the explicit permission of the person in charge of the work.
Directory of Open Access Journals (Sweden)
Xiuchun Li
2013-01-01
When the parameters of both drive and response systems are all unknown, an adaptive sliding mode controller, strongly robust to exotic perturbations, is designed for realizing generalized function projective synchronization. Sliding mode surface is given and the controlled system is asymptotically stable on this surface with the passage of time. Based on the adaptation laws and Lyapunov stability theory, an adaptive sliding controller is designed to ensure the occurrence of the sliding motion. Finally, numerical simulations are presented to verify the effectiveness and robustness of the proposed method even when both drive and response systems are perturbed with external disturbances.
Energy Technology Data Exchange (ETDEWEB)
Zhang, Hong; Kong, Vic [Department of Radiation Oncology, Georgia Regents University, Augusta, Georgia 30912 (United States); Ren, Lei; Giles, William; Zhang, You [Department of Radiation Oncology, Duke University, Durham, North Carolina 27710 (United States); Jin, Jian-Yue, E-mail: jjin@gru.edu [Department of Radiation Oncology, Georgia Regents University, Augusta, Georgia 30912 and Department of Radiology, Georgia Regents University, Augusta, Georgia 30912 (United States)
2016-01-15
Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least-squares regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench top CBCT system. Results: In the simulated SMOG experiment, the SNRs were increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting method) for a projection and the reconstructed 3D image, respectively, suggesting that IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors have successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the
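The fusion step can be illustrated as a weighted least-squares estimate of a single blocked value. The numbers and weighting scheme below are invented for illustration, not taken from the paper:

```python
import numpy as np

def fuse_blocked_signal(observations, weights):
    """Weighted least-squares estimate of a blocked pixel value from
    several approximate observations (e.g. unblocked pixels at the
    adjacent gantry angles).  With a constant-only design matrix the
    WLS solution reduces to the weighted mean of the observations."""
    obs = np.asarray(observations, dtype=float)
    w = np.asarray(weights, dtype=float)
    return (w * obs).sum() / w.sum()

# Three candidate observations of one blocked pixel; the weights might
# reflect geometric proximity of the sampling positions (hypothetical)
estimate = fuse_blocked_signal([102.0, 98.0, 110.0], [0.5, 0.3, 0.2])
```

Here the fused value is about 102.4, pulled toward the most heavily weighted observation; with equal weights the estimate is the plain mean.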
International Nuclear Information System (INIS)
Zhang, Hong; Kong, Vic; Ren, Lei; Giles, William; Zhang, You; Jin, Jian-Yue
2016-01-01
Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least-squares regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench top CBCT system. Results: In the simulated SMOG experiment, the SNRs were increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting method) for a projection and the reconstructed 3D image, respectively, suggesting that IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors have successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the
The BSM-AI project: SUSY-AI-generalizing LHC limits on supersymmetry with machine learning
Energy Technology Data Exchange (ETDEWEB)
Caron, Sascha [Radboud Universiteit, Institute for Mathematics, Astro- and Particle Physics IMAPP, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Kim, Jong Soo [UAM/CSIC, Instituto de Fisica Teorica, Madrid (Spain); Rolbiecki, Krzysztof [UAM/CSIC, Instituto de Fisica Teorica, Madrid (Spain); University of Warsaw, Faculty of Physics, Warsaw (Poland); Ruiz de Austri, Roberto [IFIC-UV/CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Stienen, Bob [Radboud Universiteit, Institute for Mathematics, Astro- and Particle Physics IMAPP, Nijmegen (Netherlands)
2017-04-15
A key research question at the Large Hadron Collider is the test of models of new physics. Testing if a particular parameter set of such a model is excluded by LHC data is a challenge: it requires time-consuming generation of scattering events, simulation of the detector response, event reconstruction, cross section calculations and analysis code to test against several hundred signal regions defined by the ATLAS and CMS experiments. In the BSM-AI project we approach this challenge with a new idea. A machine learning tool is devised to predict within a fraction of a millisecond if a model is excluded or not directly from the model parameters. A first example is SUSY-AI, trained on the phenomenological supersymmetric standard model (pMSSM). About 300,000 pMSSM model sets - each tested against 200 signal regions by ATLAS - have been used to train and validate SUSY-AI. The code is currently able to reproduce the ATLAS exclusion regions in 19 dimensions with an accuracy of at least 93%. It has been validated further within the constrained MSSM and the minimal natural supersymmetric model, again showing high accuracy. SUSY-AI and its future BSM derivatives will help to solve the problem of recasting LHC results for any model of new physics. SUSY-AI can be downloaded from http://susyai.hepforge.org/. An on-line interface to the program for quick testing purposes can be found at http://www.susy-ai.org/. (orig.)
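The core idea, classifying model points as excluded or allowed directly from their parameters, can be sketched with a toy 1-nearest-neighbour classifier. The boundary and data are invented; this is not the pMSSM or SUSY-AI's actual method:

```python
import numpy as np

# Toy stand-in for the approach: learn an exclusion boundary in a
# 2-parameter plane from labelled model points (invented data)
rng = np.random.default_rng(2)
params = rng.uniform(0, 1000, size=(2000, 2))          # e.g. two masses in GeV
excluded = (params[:, 0] + 0.5 * params[:, 1] < 800)   # hypothetical "true" boundary

def predict(point, train_x=params, train_y=excluded):
    """Label a new parameter point like its closest training point --
    a millisecond-scale prediction, as in the abstract."""
    idx = np.argmin(((train_x - point) ** 2).sum(axis=1))
    return bool(train_y[idx])

print(predict(np.array([100.0, 100.0])))   # deep inside the excluded region -> True
print(predict(np.array([900.0, 900.0])))   # far outside it -> False
```

Any classifier trained on labelled parameter points works the same way in principle; the production tool uses a trained machine learning model rather than nearest-neighbour lookup.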
The BSM-AI project: SUSY-AI-generalizing LHC limits on supersymmetry with machine learning
International Nuclear Information System (INIS)
Caron, Sascha; Kim, Jong Soo; Rolbiecki, Krzysztof; Ruiz de Austri, Roberto; Stienen, Bob
2017-01-01
A key research question at the Large Hadron Collider is the test of models of new physics. Testing if a particular parameter set of such a model is excluded by LHC data is a challenge: it requires time-consuming generation of scattering events, simulation of the detector response, event reconstruction, cross section calculations and analysis code to test against several hundred signal regions defined by the ATLAS and CMS experiments. In the BSM-AI project we approach this challenge with a new idea. A machine learning tool is devised to predict within a fraction of a millisecond if a model is excluded or not directly from the model parameters. A first example is SUSY-AI, trained on the phenomenological supersymmetric standard model (pMSSM). About 300,000 pMSSM model sets - each tested against 200 signal regions by ATLAS - have been used to train and validate SUSY-AI. The code is currently able to reproduce the ATLAS exclusion regions in 19 dimensions with an accuracy of at least 93%. It has been validated further within the constrained MSSM and the minimal natural supersymmetric model, again showing high accuracy. SUSY-AI and its future BSM derivatives will help to solve the problem of recasting LHC results for any model of new physics. SUSY-AI can be downloaded from http://susyai.hepforge.org/. An on-line interface to the program for quick testing purposes can be found at http://www.susy-ai.org/. (orig.)
International Nuclear Information System (INIS)
Hampel, G.; Poss, G.; Frohlich, H.K.
1989-10-01
The objective of the project was to draw up an instrumentation plan for the French core melting programme PHEBUS FP. This instrumentation plan was essentially to include proven and reliable instruments for recording various thermohydraulic, aerosol and hydrogen phenomena. The candidate measuring methods, which are known mainly from reactor safety programmes, have been described and examined for their usefulness in PHEBUS. Each method and instrument has been described in detail under various aspects such as measuring principle, measuring range, technical design, evaluation model, calibration procedure, accuracy, previous experience and commercial availability. Special attention has been paid to the behaviour of the measuring transducers when exposed to radiation. First, the performance of the instruments was compared with the requirements of PHEBUS. The results of this comparison served as the basis for a measuring concept in tabular form, giving the locations of the measurements, the measuring tasks, and the number and kind of instruments recommended. Redundancy and cost-benefit aspects have been taken into account in qualitative terms.
The General Education Astronomy Source (GEAS) Project: Extending the Reach of Astronomy Education
Vogt, N. P.; Muise, A. S.
2014-07-01
We present a set of NASA and NSF sponsored resources to aid in teaching astronomy remotely and in the classroom at the college level, with usage results for pilot groups of students. Our goal is to increase the accessibility of general education science coursework to underserved populations nationwide. Our materials are available for use without charge, and we are actively looking for pilot instructors. Primary components of our program include an interactive online tutorial program with over 12,000 questions, an instructor review interface, a set of hands-on and imaging- and spectra-driven laboratory exercises, including video tutorials, and interviews with diverse individuals working in STEM fields to help combat stereotypes. We discuss learning strategies often employed by students without substantial scientific training and suggest ways to incorporate them into a framework based on the scientific method and techniques for data analysis, and we compare cohorts of in-class and distance-education students.
International Nuclear Information System (INIS)
Setty, D.S.; Rameswara Roa, A.; Hemantha Rao, G.V.S.; Jaya Raj, R.N.
2008-01-01
In Pressurized Heavy Water Reactor (PHWR) fuel manufacturing, zirconium alloy appendages such as spacer and bearing pads are welded to the thin-wall zirconium alloy fuel tubes using the resistance projection welding process. Of the many joining processes available, resistance welding is reliable, environment friendly and best suited to mass production applications. In the fuel assembly, spacer pads provide the required inter-element spacing and bearing pads provide the required load-bearing surface for the fuel assembly. Performance of the fuel assembly in the reactor is greatly influenced by the quality of these weld joints. Phase transformation from the α to the β phase is not acceptable while welding these tiny appendages. At present only a destructive metallography test is available for this purpose. This can also be verified by measuring the weld nugget temperature, the phase transformation temperature for zirconium alloy material being 853 °C. The temperature distribution during resistance welding of tiny parts cannot be measured by conventional methods due to the very small space and short weld times involved in the process. Shear strength, dimensional accuracy and weld microstructures are some of the key parameters used to measure the quality of appendage weld joints. Weld parameters were optimized with the help of industrial experimentation methodology. Individual projections are welded using a split-electrode concept, and while welding on an empty tube, firm support is achieved on the inner side of the tube by using an expandable pneumatic mandrel. In the present paper, an attempt was made to measure the weld nugget temperature by thermography and to correlate it with standard microstructures of zirconium alloy material. The temperature profiles in the welding process are presented for different welding conditions. This technique has helped in measuring the weld nugget temperature more accurately. It was observed that in the present appendage welding
Global prevalence of diabetes: estimates for 2000 and projections for 2030
DEFF Research Database (Denmark)
Wild, Sarah; Roglic, Gojka; Green, Anders
2004-01-01
OBJECTIVE — The goal of this study was to estimate the prevalence of diabetes and the number of people of all ages with diabetes for years 2000 and 2030. RESEARCH DESIGN AND METHODS — Data on diabetes prevalence by age and sex from a limited number of countries were extrapolated to all 191 World Health Organization member states. The most important demographic change to diabetes prevalence across the world appears to be the increase in the proportion of people >65 years of age. CONCLUSIONS — These findings indicate that the “diabetes epidemic” will continue even if levels of obesity remain constant. Given the increasing prevalence of obesity, it is likely that these figures provide an underestimate of future diabetes prevalence.
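The extrapolation logic, holding age-specific prevalence rates fixed while the population's age structure shifts, can be sketched with invented numbers (not the WHO data):

```python
# Illustrative prevalence projection: apply fixed age-band prevalence
# rates to population counts for two years (all numbers invented)
rates = {"20-44": 0.02, "45-64": 0.08, "65+": 0.18}      # prevalence by age band
pop_2000 = {"20-44": 2.0e9, "45-64": 1.0e9, "65+": 0.4e9}
pop_2030 = {"20-44": 2.3e9, "45-64": 1.5e9, "65+": 0.9e9}

def cases(pop):
    """Total cases implied by fixed rates and a given age structure."""
    return sum(rates[band] * pop[band] for band in rates)

print(f"{cases(pop_2000) / 1e6:.0f} M")   # 192 M
print(f"{cases(pop_2030) / 1e6:.0f} M")   # 328 M
```

Even with rates held constant, total cases rise sharply because the over-65 band grows fastest, which is the demographic effect the abstract describes.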
The Dunedin Dementia Risk Awareness Project: a convenience sample of general practitioners.
Barak, Yoram; Rapsey, Charlene; Fridman, Dana; Scott, Kate
2018-05-04
Recent recommendations of US and UK governmental and academic agencies suggest that up to 35% of dementia cases are preventable. We aimed to appraise general practitioners' (GPs) awareness of risk and protective factors associated with dementia and their intentions to act within the context of the Health Beliefs Model. We canvassed degree of dementia awareness, using the modified Lifestyle for Brain Health (LIBRA) scale, among a convenience sample of local GPs. Thirty-five GPs, mean age 56.7 ± 6.8 years (range: 43-72), participated. There were 19 women and 16 men, all New Zealand European. Genetics was the most commonly cited risk for dementia and exercise the most commonly cited protective factor. More than 80% of participants correctly identified 8/12 LIBRA factors. Factors not identified were: renal dysfunction, obesity, Mediterranean diet and high cognitive activity. The majority of participants felt they were at risk of suffering from dementia, that lifestyle changes would help reduce their risk, and wished to start these changes soon. GPs are knowledgeable about dementia risk and protective factors. They reported optimism in their ability to modify their own risk factors through lifestyle interventions. This places GPs in a unique position to help disseminate this knowledge to their clients.
Sperber, K. R.; Palmer, T. N.
1996-11-01
The interannual variability of rainfall over the Indian subcontinent, the African Sahel, and the Nordeste region of Brazil has been evaluated in 32 models for the period 1979-88 as part of the Atmospheric Model Intercomparison Project (AMIP). The interannual variations of Nordeste rainfall are the most readily captured, owing to the intimate link with Pacific and Atlantic sea surface temperatures. The precipitation variations over India and the Sahel are less well simulated. Additionally, an Indian monsoon wind shear index was calculated for each model. Evaluation of the interannual variability of a wind shear index over the summer monsoon region indicates that the models exhibit greater fidelity in capturing the large-scale dynamic fluctuations than the regional-scale rainfall variations. A rainfall/SST teleconnection quality control was used to objectively stratify model performance. Skill scores improved for those models that qualitatively simulated the observed rainfall/El Niño-Southern Oscillation SST correlation pattern. This subset of models also had a rainfall climatology that was in better agreement with observations, indicating a link between systematic model error and the ability to simulate interannual variations. A suite of six European Centre for Medium-Range Weather Forecasts (ECMWF) AMIP runs (differing only in their initial conditions) has also been examined. As observed, all-India rainfall was enhanced in 1988 relative to 1987 in each of these realizations. All-India rainfall variability during other years showed little or no predictability, possibly due to internal chaotic dynamics associated with intraseasonal monsoon fluctuations and/or unpredictable land surface process interactions. The interannual variations of Nordeste rainfall were best represented. The State University of New York at Albany/National Center for Atmospheric Research Genesis model was run in five initial condition realizations. In this model, the Nordeste rainfall
International Nuclear Information System (INIS)
Friedrich, R.; Krewitt, W.; Mayerhofer, P.; Trukenmueller, A.; Gressmann, A.; Runte, K.-H.; Kortum, G.; Weltschev, M.
1994-01-01
principal objectives: - to quantify the external costs and benefits of the major fuel cycles for electricity generation and conservation, using the best available methods and information; - to adopt a common framework for assessment of fuel cycles, in order that a fair comparison can be made between them, and - to make recommendations on areas in which further research is required in order that future estimates of damages can be made with greater confidence. Within the study the following fuel cycles for electricity generation will be assessed: coal, uranium, lignite, oil, gas, wind, photovoltaics, biomass, small scale hydroelectric projects and energy conservation. The project started by considering the coal and the nuclear fuel cycles. The methodological framework established during the work on these two fuel cycles now has to be modified and transferred to other fuel cycles to demonstrate the general applicability of the accounting framework and to guarantee a consistent analysis of various fuel cycles
International Nuclear Information System (INIS)
Friedrich, R.; Krewitt, W.; Mayerhofer, P.; Trukenmueller, A.; Gressmann, A.
1994-01-01
principal objectives: - to quantify the external costs and benefits of the major fuel cycles for electricity generation and conservation, using the best available methods and information, - to adopt a common framework for assessment of fuel cycles, in order that a fair comparison can be made between them, and - to make recommendations on areas in which further research is required in order that future estimates of damages can be made with greater confidence. Within the study the following fuel cycles for electricity generation will be assessed: coal, uranium, lignite, oil, gas, wind, photovoltaics, biomass, small scale hydroelectric projects and energy conservation. The project started by considering the coal and the nuclear fuel cycles. The methodological framework established during the work on these two fuel cycles now has to be modified and transferred to other fuel cycles to demonstrate the general applicability of the accounting framework and to guarantee a consistent analysis of various fuel cycles
Lifetime estimation of a time projection chamber x-ray polarimeter
Hill, Joanne E.; Black, J. Kevin; Brieda, Lubos; Dickens, Patsy L.; Montt de Garcia, Kristina; Hawk, Douglas L.; Hayato, Asami; Jahoda, Keith; Mohammed, Jelila
2013-09-01
The Gravity and Extreme Magnetism Small Explorer (GEMS) X-ray Polarimeter Instrument (XPI) was designed to measure the polarization of 23 sources over the course of its 9-month mission. The XPI design consists of two telescopes, each with a polarimeter assembly at the focus of a grazing-incidence mirror. To make sensitive polarization measurements, the GEMS Polarimeter Assembly (PA) employed a gas detection system based on a Time Projection Chamber (TPC) technique. Gas detectors are inherently at risk of degraded performance arising from contamination from outgassing of internal detector components or from loss of gas. This paper describes the design and the materials used to build a prototype of the flight polarimeter with the required GEMS lifetime. We report the results from outgassing measurements of the polarimeter subassemblies and assemblies, enclosure seal tests, life tests, and performance tests that demonstrate that the GEMS lifetime is achievable. Finally, we report performance measurements and the lifetime enhancement from the use of a getter.
Prandi, F.; Magliocchetti, D.; Poveda, A.; De Amicis, R.; Andreolli, M.; Devigili, F.
2016-06-01
Forests represent an important economic resource for mountainous areas, being the main form of income for several regions and mountain communities. However, wood chain management in these contexts differs from traditional schemes due to the limits imposed by terrain morphology, both in operation planning and in hardware requirements. In fact, forest organizational and technical problems require a wider strategic and more detailed level of planning to reach the productivity of forest operation techniques applied on flatlands. In particular, precise knowledge of forest inventories improves long-term management sustainability and efficiency by allowing a better understanding of forest ecosystems. However, this knowledge is usually based on historical parcel information, with only a few cases of remote sensing information from satellite imagery. This is not enough to fully exploit the benefit of mountain-area forest stocks, where the economic and ecological value of each single parcel depends on single-tree characteristics. The work presented in this paper, based on the results of the SLOPE (Integrated proceSsing and controL systems fOr sustainable forest Production in mountain arEas) project, investigates the capability to generate, manage and visualize detailed virtual forest models using geospatial information, combining data acquired from traditional on-the-field laser scanning surveys with new aerial surveys using UAV systems. These models are then combined with interactive 3D virtual globes for continuous assessment of resource characteristics, harvesting planning and real-time monitoring of the whole production.
Directory of Open Access Journals (Sweden)
F. Prandi
2016-06-01
Forests represent an important economic resource for mountainous areas, being the main form of income for several regions and mountain communities. However, wood chain management in these contexts differs from traditional schemes due to the limits imposed by terrain morphology, both in operation planning and in hardware requirements. In fact, forest organizational and technical problems require a wider strategic and more detailed level of planning to reach the productivity of forest operation techniques applied on flatlands. In particular, precise knowledge of forest inventories improves long-term management sustainability and efficiency by allowing a better understanding of forest ecosystems. However, this knowledge is usually based on historical parcel information, with only a few cases of remote sensing information from satellite imagery. This is not enough to fully exploit the benefit of mountain-area forest stocks, where the economic and ecological value of each single parcel depends on single-tree characteristics. The work presented in this paper, based on the results of the SLOPE (Integrated proceSsing and controL systems fOr sustainable forest Production in mountain arEas) project, investigates the capability to generate, manage and visualize detailed virtual forest models using geospatial information, combining data acquired from traditional on-the-field laser scanning surveys with new aerial surveys using UAV systems. These models are then combined with interactive 3D virtual globes for continuous assessment of resource characteristics, harvesting planning and real-time monitoring of the whole production.
2014-03-01
The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG emissions associated with the construction and maintenance of transportation projects. This phase of development included techniques for estimating emiss...
Performance Estimation of Networked Business Models: Case Study on a Finnish eHealth Service Project
Directory of Open Access Journals (Sweden)
Marikka Heikkilä
2014-08-01
Purpose: The objective of this paper is to propose and demonstrate a framework for estimating performance in a networked business model. Design/methodology/approach: Our approach is design science, utilising action research in studying a case of four independent firms in the Health & Wellbeing sector aiming to jointly provide a new service for business and private customers. The duration of the research study is 3 years. Findings: We propose that a balanced set of performance indicators can be defined by paying attention to all main components of the business model, enriched with measures of network collaboration. The results highlight the importance of measuring all main components of the business model and also the business network partners’ view on trust, contracts and fairness. Research implications: This article contributes to the business model literature by combining business modelling with performance evaluation. The article points out that it is essential to create metrics that can be applied to evaluate and improve the business model blueprints, but it is also important to measure business collaboration aspects. Practical implications: Companies have already adopted the Business Model Canvas or similar business model tools to innovate new business models. We suggest that companies continue their business model innovation work by agreeing on a set of performance metrics, building on the business model components enriched with measures of network collaboration. Originality/value: This article contributes to the business model literature and praxis by combining business modelling with performance evaluation.
A critical review of the ESCAPE project for estimating long-term health effects of air pollution.
Lipfert, Frederick W
2017-02-01
The European Study of Cohorts for Air Pollution Effects (ESCAPE) is a 13-nation study of long-term health effects of air pollution based on subjects pooled from up to 22 cohorts that were intended for other purposes. Twenty-five papers have been published on associations of various health endpoints with long-term exposures to NOx, NO2, traffic indicators, PM10, PM2.5 and PM constituents including absorbance (elemental carbon). Seven additional ESCAPE papers found moderate correlations (R2=0.3-0.8) between measured air quality and the land-use regression estimates that were used; personal exposures were not considered. I found no project summaries or comparisons across papers; here I conflate the 25 ESCAPE findings in the context of other recent European epidemiology studies. Because one ESCAPE cohort contributed about half of the subjects, I consider it and the other 18 cohorts separately to compare their contributions to the combined risk estimates. I emphasize PM2.5 and confirm the published hazard ratio of 1.14 (1.04-1.26) per 10 μg/m3 for all-cause mortality. The ESCAPE papers found 16 statistically significant (p<0.05) risks among the 125 pollutant-endpoint combinations: 4 each for PM2.5 and PM10, 1 for PM absorbance, 5 for NO2, and 2 for traffic. No PM constituent was consistently significant. No significant associations were reported for cardiovascular mortality; low birthrate was significant for all pollutants except PM absorbance. Based on associations with PM2.5, I find large differences between all-cause death estimates and the sum of specific-cause death estimates. Scatterplots of PM2.5 mortality risks by cause show no consistency across the 18 cohorts, ostensibly because of the relatively few subjects. Overall, I find the ESCAPE project inconclusive, and I question whether the efforts required to estimate exposures for small cohorts were worthwhile. I suggest that detailed studies of the large cohort using historical exposures and additional
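Under the log-linear exposure-response assumption commonly used in such studies, a hazard ratio quoted per 10 μg/m3 can be rescaled to other exposure contrasts; the 1.14 figure is the one quoted above, and the rescaling rule is a standard convention rather than something specific to ESCAPE:

```python
# Rescaling a hazard ratio quoted per 10 ug/m^3 to another increment:
# HR(delta) = HR_10 ** (delta / 10) under a log-linear exposure-response
hr_10 = 1.14   # all-cause mortality HR per 10 ug/m^3 PM2.5, as quoted

def hr_for(delta):
    """Hazard ratio for an exposure contrast of delta ug/m^3."""
    return hr_10 ** (delta / 10)

print(round(hr_for(5), 3))    # HR for a 5 ug/m^3 contrast, about 1.068
print(round(hr_for(20), 3))   # HR for a 20 ug/m^3 contrast (1.14 squared)
```

This makes risk estimates quoted per different increments directly comparable across studies.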
Adams, Vanessa M.; Setterfield, Samantha A.
2013-06-01
Financial mechanisms such as offsets are one strategy to abate greenhouse gas emissions, and the carbon market is expanding with a growing demand for offset products. However, in the case of carbon offsets, if the carbon is released due to intentional or unintentional reversal through environmental events such as fire, the financial liability to replace lost offsets will likely fall on the provider. This liability may have implications for future participation in programmes, but common strategies such as buffer pool and insurance products can be used to minimize this liability. In order for these strategies to be effective, an understanding of the spatial and temporal distributions of expected reversals is needed. We use the case study of savanna burning, an approved greenhouse gas abatement methodology under the Carbon Farming Initiative in Australia, to examine potential risks to carbon markets in northern Australia and quantify the financial risks. We focus our analysis on the threat of Andropogon gayanus (gamba grass) to savanna burning due to its documented impacts of increased fuel loads and altered fire regimes. We assess the spatial and financial extent to which gamba grass poses a risk to savanna burning programmes in northern Australia. We find that 75% of the eligible area for savanna burning is spatially coincident with the high suitability range for gamba grass. Our analysis demonstrates that the presence of gamba grass seriously impacts the financial viability of savanna burning projects. For example, in order to recuperate the annual costs of controlling 1 ha of gamba grass infestation, 290 ha of land must be enrolled in annual carbon abatement credits. Our results show an immediate need to contain gamba grass to its current extent to avoid future spread into large expanses of land, which are currently profitable for savanna burning.
International Nuclear Information System (INIS)
Adams, Vanessa M; Setterfield, Samantha A
2013-01-01
Financial mechanisms such as offsets are one strategy to abate greenhouse gas emissions, and the carbon market is expanding with a growing demand for offset products. However, in the case of carbon offsets, if the carbon is released due to intentional or unintentional reversal through environmental events such as fire, the financial liability to replace lost offsets will likely fall on the provider. This liability may have implications for future participation in programmes, but common strategies such as buffer pool and insurance products can be used to minimize this liability. In order for these strategies to be effective, an understanding of the spatial and temporal distributions of expected reversals is needed. We use the case study of savanna burning, an approved greenhouse gas abatement methodology under the Carbon Farming Initiative in Australia, to examine potential risks to carbon markets in northern Australia and quantify the financial risks. We focus our analysis on the threat of Andropogon gayanus (gamba grass) to savanna burning due to its documented impacts of increased fuel loads and altered fire regimes. We assess the spatial and financial extent to which gamba grass poses a risk to savanna burning programmes in northern Australia. We find that 75% of the eligible area for savanna burning is spatially coincident with the high suitability range for gamba grass. Our analysis demonstrates that the presence of gamba grass seriously impacts the financial viability of savanna burning projects. For example, in order to recuperate the annual costs of controlling 1 ha of gamba grass infestation, 290 ha of land must be enrolled in annual carbon abatement credits. Our results show an immediate need to contain gamba grass to its current extent to avoid future spread into large expanses of land, which are currently profitable for savanna burning. (letter)
DEFF Research Database (Denmark)
Rasmussen, Troels A.; Merritt, Timothy R.
2017-01-01
CNC cutting machines have become essential tools for designers and architects enabling rapid prototyping, model-building and production of high quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would have been previously thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts reducing material waste. Contextual interviews and ideation sessions led to a deeper...
Bird, D. N.; Kunda, M.; Mayer, A.; Schlamadinger, B.; Canella, L.; Johnston, M.
2008-04-01
Some climate scientists are questioning whether the practice of converting non-forest lands to forest land (afforestation or reforestation) is an effective climate change mitigation option. The discussion focuses particularly on areas where the new forest is primarily coniferous and there is a significant amount of snow, since the increased climate forcing due to the change in albedo may counteract the decreased climate forcing due to carbon dioxide removal. In this paper, we develop a stand-based model that combines changes in surface albedo, solar radiation, latitude, cloud cover and carbon sequestration. As well, we develop a procedure to convert carbon stock changes to equivalent climatic forcing, or climatic forcing to equivalent carbon stock changes. Using the model, we investigate the sensitivity of the combined effects of changes in surface albedo and carbon stock to model parameters. The model is sensitive to the amount of cloud, atmospheric absorption, timing of canopy closure, and carbon sequestration rate, among other factors. The sensitivity of the model is investigated at one Canadian site, and then the model is tested at numerous sites across Canada. In general, we find that the change in albedo reduces the carbon sequestration benefits by approximately 30% over 100 years, but this is not drastic enough to suggest that one should not use afforestation or reforestation as a climate change mitigation option. This occurs because the forests grow in places where there is a significant amount of cloud in winter. As well, variations in sequestration rate seem to be counterbalanced by the amount and timing of canopy closure. We close by speculating that the effects of albedo may also be significant at lower latitudes, where there are fewer clouds and where there are extended dry seasons. These conditions make grasses light coloured, and when irrigated crops, dark forests or other vegetation such as biofuels replace the grasses, the change in carbon
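One common ingredient of forcing-equivalence conversions of this kind is the simplified expression for CO2 radiative forcing, RF = 5.35 ln(C/C0) W m⁻². This is a generic illustration, not the paper's own stand-based conversion procedure, and the concentrations below are assumed values:

```python
import math

# Simplified IPCC expression for CO2 radiative forcing: RF = 5.35 * ln(C/C0) W/m2.
# A generic ingredient of forcing-equivalence conversions; concentrations are
# illustrative, not taken from the paper.
def co2_forcing(c_ppm, c0_ppm=278.0):
    """Radiative forcing (W/m2) relative to a pre-industrial baseline c0_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(round(co2_forcing(400.0), 2))   # ≈ 1.95 W/m2
```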
Laurence, Caroline O; Heywood, Troy; Bell, Janice; Atkinson, Kaye; Karnon, Jonathan
2018-03-27
Health workforce planning models have been developed to estimate future health workforce requirements for the population they serve and have been used to inform policy decisions. To adapt and further develop a need-based GP workforce simulation model to incorporate current and estimated geographic distribution of patients and GPs. A need-based simulation model that estimates the supply of GPs and levels of services required in South Australia (SA) was adapted and applied to the Western Australian (WA) workforce. The main outcome measure was the difference between the number of full-time equivalent (FTE) GPs supplied and required from 2013 to 2033. The base scenario estimated a shortage of GPs in WA from 2019 onwards, reaching 493 FTE GPs in 2033, while for SA, estimates showed an oversupply over the projection period. The WA urban and rural models estimated an urban shortage of GPs over this period. A reduced international medical graduate recruitment scenario resulted in estimated shortfalls of GPs by 2033 for both WA and SA. The WA-specific scenarios of lower population projections and registrar work value resulted in a reduced shortage of FTE GPs in 2033, while unfilled training places increased the shortfall of FTE GPs in 2033. The simulation model incorporates contextual differences into its structure, allowing within- and cross-jurisdictional comparisons of workforce estimations. It also provides greater insight into the drivers of supply and demand and the impact of changes in workforce policy, promoting more informed decision-making.
Petitta, Marcello; Wagner, Jochen; Costa, Armin; Monsorno, Roberto; Innerebner, Markus; Moser, David; Zebisch, Marc
2014-05-01
The scientific community has in recent years been widely discussing the concept of "climate services". Several definitions have been used, but it remains a rather open concept. We used climate data from analysis and reanalysis products to create a daily and hourly model of atmospheric turbidity in order to account for the effect of the atmosphere on incoming solar radiation, with the final aim of estimating electricity production from photovoltaic (PV) modules in the Alps. Renewable energy production in the Alpine region is dominated by hydroelectricity, but the potential for photovoltaic energy production is gaining momentum. Especially the southern part of the Alps and inner Alpine regions offer good conditions for PV energy production: the combination of high irradiance values and cold air temperature in mountainous regions is well suited for solar cells. To enable more widespread adoption of PV plants, PV has to become an important part of regional planning. To provide regional authorities and private stakeholders with a high-quality PV energy yield climatology for the provinces of Bolzano/Bozen - South Tyrol (Italy) and Tyrol (Austria), the research project Solar Tyrol was started in 2012. Several methods are used to calculate very high resolution maps of solar radiation; most of these approaches use climatological values. In this project we reconstructed the last 10 years of atmospheric turbidity using reanalysis and operational data in order to better estimate incoming solar radiation in the Alpine region. Our method is divided into three steps: i) clear-sky radiation: to estimate the atmospheric effect on solar radiation we calculated the Linke turbidity factor using aerosol optical depth (AOD), surface albedo, atmospheric pressure, and total water content from ECMWF and MACC analyses; ii) shadows: we calculated shadows of mountains and buildings using a 2-metre-resolution digital elevation model of the area and the GIS module r.sun, modified to fit our specific needs; iii
Global Mortality Estimates for the 2009 Influenza Pandemic from the GLaMOR Project: A Modeling Study
Simonsen, Lone; Spreeuwenberg, Peter; Lustig, Roger; Taylor, Robert J.; Fleming, Douglas M.; Kroneman, Madelon; Van Kerkhove, Maria D.; Mounts, Anthony W.; Paget, W. John
2013-01-01
Background Assessing the mortality impact of the 2009 influenza A H1N1 virus (H1N1pdm09) is essential for optimizing public health responses to future pandemics. The World Health Organization reported 18,631 laboratory-confirmed pandemic deaths, but the total pandemic mortality burden was substantially higher. We estimated the 2009 pandemic mortality burden through statistical modeling of mortality data from multiple countries. Methods and Findings We obtained weekly virology and underlying cause-of-death mortality time series for 2005–2009 for 20 countries covering ∼35% of the world population. We applied a multivariate linear regression model to estimate pandemic respiratory mortality in each collaborating country. We then used these results plus ten country indicators in a multiple imputation model to project the mortality burden in all world countries. Between 123,000 and 203,000 pandemic respiratory deaths were estimated globally for the last 9 mo of 2009. The majority (62%–85%) were attributed to persons under 65 y of age. We observed a striking regional heterogeneity, with almost 20-fold higher mortality in some countries in the Americas than in Europe. The model attributed 148,000–249,000 respiratory deaths to influenza in an average pre-pandemic season, with only 19% in persons <65 y. We estimate that 2009 global pandemic respiratory mortality was ∼10-fold higher than the World Health Organization's laboratory-confirmed mortality count. Although the pandemic mortality estimate was similar in magnitude to that of seasonal influenza, a marked shift toward mortality among persons <65 y of age occurred, so that many more life-years were lost. Better mortality and hospitalization surveillance data are needed to rapidly establish the severity of future pandemics. PMID:24302890
International Nuclear Information System (INIS)
Ghaderi, A.; Landro, M.; Ghaderi, A.
2005-01-01
Carbon dioxide (CO2) is being injected into a shallow sand formation at around 1,000 metres depth at the Sleipner Field in the North Sea. It is expected that the CO2, injected at the bottom of the formation, will form a plume consisting of CO2 accumulating in thin lenses during migration up through the reservoir. Several studies have been published using stacked seismic data from 1994, 1999, 2001 and 2002. A thorough analysis of post-stack seismic data from the Sleipner CO2-Sequestration Pilot Project was conducted. Interpretation of seismic data is usually done on post-stack data. For a given subsurface reflection point, seismic data are acquired for various incidence angles, typically 40 angles. These 40 seismic signals are stacked together in order to reduce noise; the term pre-stack refers to seismic data prior to this step. For hydrocarbon-related 4-dimensional seismic studies, travel-time shift estimations have been used. This paper compared pre-stack and post-stack estimation of average velocity changes based on measured 4-dimensional travel-time shifts. It is more practical to compare estimated velocity changes than the actual travel-time changes, since the time shifts vary with offset for pre-stack time-lapse seismic analysis. It was concluded that the pre-stack method gives smaller velocity changes when estimated between two key horizons. Therefore, pre-stack travel-time analysis in addition to conventional post-stack analysis is recommended. 6 refs., 12 figs
Directory of Open Access Journals (Sweden)
Amlan Kumar Patra
2014-04-01
Full Text Available This study presents trends and projected estimates of methane and nitrous oxide emissions from livestock of India vis-à-vis the world and developing countries over the period 1961 to 2010, estimated based on IPCC guidelines. World enteric methane emission (EME) increased by 54.3% (61.5 to 94.9 ×10⁹ kg annually) from 1961 to 2010, and the highest annual growth rate (AGR) was noted for goat (2.0%), followed by buffalo (1.57%) and swine (1.53%). Global EME is projected to increase to 120×10⁹ kg by 2050. The percentage increase in EME by Indian livestock was greater than for world livestock (70.6% vs 54.3%) between 1961 and 2010, and AGR was highest for goat (1.91%), followed by buffalo (1.55%), swine (1.28%), sheep (1.25%) and cattle (0.70%). In India, total EME is projected to grow by 18.8×10⁹ kg in 2050. Global methane emission from manure (MEM) increased from 6.81×10⁹ kg in 1961 to 11.4×10⁹ kg in 2010 (an increase of 67.6%), and is projected to grow to 15×10⁹ kg by 2050. In India, the annual MEM increased from 0.52×10⁹ kg to 1.1×10⁹ kg (with an AGR of 1.57%) in this period, and could increase to 1.54×10⁹ kg in 2050. Nitrous oxide emission from manure in India could be 21.4×10⁶ kg in 2050, up from 15.3×10⁶ kg in 2010. The AGR of global GHG emissions changed only a small extent (0.11%) for developed countries, but increased drastically (1.23%) for developing countries between 1961 and 2010. Major contributions to world GHG came from cattle (79.3%), swine (9.57%) and sheep (7.40%), and for developing countries from cattle (68.3%), buffalo (13.7%) and goat (5.4%). The increase of GHG emissions by Indian livestock was less (74% vs 82%) over the period 1961 to 2010 than for the developing countries. With this trend, world GHG emissions could reach 3,520×10⁹ kg CO2-eq by 2050 due to animal population growth driven by increased demand for meat and dairy products in the world.
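The reported 54.3% rise implies a modest overall compound annual growth rate; a quick arithmetic check (start/end values are taken from the abstract, the overall-AGR figure is my derivation, not a number the paper reports):

```python
# Quick check of the reported enteric methane emission (EME) trend;
# start/end values from the abstract, in units of 10^9 kg per year.
start, end = 61.5, 94.9
years = 2010 - 1961
pct_increase = (end / start - 1) * 100           # total growth over the period
agr = ((end / start) ** (1 / years) - 1) * 100   # compound annual growth rate
print(round(pct_increase, 1))   # → 54.3, matching the reported 54.3%
print(round(agr, 2))            # → 0.89 (% per year, overall)
```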
Barth, Timothy J.
2014-01-01
This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: Dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, Sparse tensorization methods[2] utilizing node-nested hierarchies, Sampling methods[4] for high-dimensional random variable spaces.
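As a rough illustration of the sampling-based branch of such uncertainty quantification (not NASA's package; the output quantity q is invented for the demo), a Monte Carlo estimate of an output statistic together with a CLT-type sampling error bound looks like:

```python
import numpy as np

# Monte Carlo estimate of an output statistic E[q] with a central-limit-theorem
# sampling error bound. The output quantity q(xi) is a hypothetical stand-in
# for a CFD output; this sketches only the statistics-integral error, not the
# per-realization discretization error discussed above.
rng = np.random.default_rng(0)
xi = rng.normal(size=(100_000, 10))     # 10-dimensional uncertain inputs
q = np.sin(xi).sum(axis=1) ** 2         # hypothetical scalar output quantity
mean = q.mean()                         # estimated statistic E[q]
half_width = 1.96 * q.std(ddof=1) / np.sqrt(q.size)  # ~95% sampling error bound
print(f"E[q] estimate: {mean:.3f} +/- {half_width:.3f}")
```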
Directory of Open Access Journals (Sweden)
Konstantine K. Mamberger
2012-11-01
Aims: This paper treats general problems of metrology and indirect measurement methods in cardiology. It is aimed at identifying error estimation criteria for indirect measurements of heart cycle phase durations. Materials and methods: A comparative analysis of an ECG of the ascending aorta recorded with the Hemodynamic Analyzer Cardiocode (HDA) lead versus conventional V3, V4, V5, V6 lead system ECGs is presented herein. Criteria for heart cycle phase boundaries are identified with graphic mathematical differentiation. Stroke volumes of blood (SV) calculated on the basis of the HDA phase duration measurements are compared with echocardiography data. Results: The comparative data obtained in the study show an averaged difference at the level of 1%. An innovative noninvasive measuring technology originally developed by a Russian R&D team allows stroke volume (SV) to be measured with high accuracy. Conclusion: In practice, it is necessary to take into account possible measurement errors caused by hardware. Special attention should be paid to systematic errors.
Directory of Open Access Journals (Sweden)
Lone Simonsen
2013-11-01
Assessing the mortality impact of the 2009 influenza A H1N1 virus (H1N1pdm09) is essential for optimizing public health responses to future pandemics. The World Health Organization reported 18,631 laboratory-confirmed pandemic deaths, but the total pandemic mortality burden was substantially higher. We estimated the 2009 pandemic mortality burden through statistical modeling of mortality data from multiple countries. We obtained weekly virology and underlying cause-of-death mortality time series for 2005-2009 for 20 countries covering ∼35% of the world population. We applied a multivariate linear regression model to estimate pandemic respiratory mortality in each collaborating country. We then used these results plus ten country indicators in a multiple imputation model to project the mortality burden in all world countries. Between 123,000 and 203,000 pandemic respiratory deaths were estimated globally for the last 9 mo of 2009. The majority (62%-85%) were attributed to persons under 65 y of age. We observed a striking regional heterogeneity, with almost 20-fold higher mortality in some countries in the Americas than in Europe. The model attributed 148,000-249,000 respiratory deaths to influenza in an average pre-pandemic season, with only 19% in persons <65 y. Limitations include lack of representation of low-income countries among single-country estimates and an inability to study subsequent pandemic waves (2010-2012). We estimate that 2009 global pandemic respiratory mortality was ∼10-fold higher than the World Health Organization's laboratory-confirmed mortality count. Although the pandemic mortality estimate was similar in magnitude to that of seasonal influenza, a marked shift toward mortality among persons <65 y of age occurred, so that many more life-years were lost. The burden varied greatly among countries, corroborating early reports of far greater pandemic severity in the Americas than in Australia, New Zealand, and Europe.
Energy Technology Data Exchange (ETDEWEB)
Kurnik, Charles W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Violette, Daniel M. [Navigant, Boulder, CO (United States); Rathbun, Pamela [Tetra Tech, Madison, WI (United States)
2017-11-02
This chapter focuses on the methods used to estimate net energy savings in evaluation, measurement, and verification (EM&V) studies for energy efficiency (EE) programs. The chapter provides a definition of net savings, which remains an unsettled topic both within the EE evaluation community and across the broader public policy evaluation community, particularly in the context of attributing savings to a program. The chapter differs from the measure-specific Uniform Methods Project (UMP) chapters in both its approach and its work product. Unlike other UMP resources that provide recommended protocols for determining gross energy savings, this chapter describes and compares current industry practices for determining net energy savings but does not prescribe methods.
Majumdar, A. K.; Hedayat, A.
2015-01-01
This paper describes the experience of the authors in using the Generalized Fluid System Simulation Program (GFSSP) in teaching Design of Thermal Systems class at University of Alabama in Huntsville. GFSSP is a finite volume based thermo-fluid system network analysis code, developed at NASA/Marshall Space Flight Center, and is extensively used in NASA, Department of Defense, and aerospace industries for propulsion system design, analysis, and performance evaluation. The educational version of GFSSP is freely available to all US higher education institutions. The main purpose of the paper is to illustrate the utilization of this user-friendly code for the thermal systems design and fluid engineering courses and to encourage the instructors to utilize the code for the class assignments as well as senior design projects.
Mededovic Thagard, Selma; Stratton, Gunnar R.; Dai, Fei; Bellona, Christopher L.; Holsen, Thomas M.; Bohl, Douglas G.; Paek, Eunsu; Dickenson, Eric R. V.
2017-01-01
contributions from the three general mechanisms, it was determined that surface concentration is the dominant factor determining a compound's treatability. These insights indicate that PWT would be most viable for the treatment of surfactant-like contaminants. This article features invited work from early-career researchers working within the scope of J. Phys. D, as part of the Journal of Physics series' 50th anniversary celebrations in 2017; Selma Mededovic Thagard was selected by the Editorial Board of J. Phys. D as an Emerging Leader.
Vichit-Vadakan, Nuntavarn; Vajanapoom, Nitaya; Ostro, Bart
2008-09-01
Air pollution data in Bangkok, Thailand, indicate that levels of particulate matter with aerodynamic diameter <10 μm (PM10) are relatively high; we examined the association between daily mortality and PM10 air pollution in Bangkok, Thailand. The study period extended from 1999 to 2003, for which the Ministry of Public Health provided the mortality data. Measures of air pollution were derived from air monitoring stations, and information on temperature and relative humidity was obtained from the weather station in central Bangkok. The statistical analysis followed the common protocol of the multicity PAPA (Public Health and Air Pollution in Asia) project in using a natural cubic spline model with smooths of time and weather. The excess risk for non-accidental mortality was 1.3% [95% confidence interval (CI), 0.8-1.7] per 10 microg/m(3) of PM(10), with higher excess risks for cardiovascular mortality and mortality above age 65 of 1.9% (95% CI, 0.8-3.0) and 1.5% (95% CI, 0.9-2.1), respectively. In addition, the effects of PM(10) appear to be consistent in multipollutant models. The results suggest strong associations between several different mortality outcomes and PM(10). In many cases, the effect estimates were higher than those typically reported in Western industrialized nations.
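Excess risks of this kind come from log-linear time-series models, where the reported percentage and the underlying coefficient are related by ER = (exp(10β) − 1) × 100. A small round-trip sketch (variable names are mine, not the paper's):

```python
import math

# Round-trip between the reported excess risk (ER, percent per 10 ug/m3 PM10)
# and the coefficient beta of the underlying log-linear model:
#   ER = (exp(10 * beta) - 1) * 100
er_pct = 1.3                                # non-accidental mortality (abstract)
beta = math.log(1 + er_pct / 100) / 10.0    # implied coefficient per ug/m3
er_back = (math.exp(10 * beta) - 1) * 100   # converts back to 1.3%
```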
International Nuclear Information System (INIS)
Hurtado, A.; Eguilior, S.; Recreo, F.
2015-01-01
Starting from the observation that contemporary society depends on complex, high-level technologies with a high intrinsic level of uncertainty, and from the relationship of that uncertainty with risk assessment, this analysis, conducted in late 2014, examines the assessment that led the Secretary of State for the Environment to the Resolution of 29 May 2014, by which the Environmental Impact Statement of the Exploratory Drilling Project under the hydrocarbon research permits 'Canarias 1-9' was set out and published in the Spanish Official State Gazette number 196 on 13 August 2014. The aim of the present study is to analyze how suitably the probability associated with the worst case is identified and defined, and how it relates to the total risk estimate for a blowout. Its interest stems from the fact that all risk management methodologically rests on two pillars: sound risk analysis and sound risk evaluation. These determine the selection of management tools in relation to the project's level of complexity, its phase, and its potential impacts on health, safety and environmental contamination.
Directory of Open Access Journals (Sweden)
Kotsyuba Oleksiy S.
2018-02-01
The article is concerned with the methodology of economic substantiation of real investments when information on possible fluctuations of initial parameters, and on the resulting risk, is substantially lacking. Sensitivity analysis as the main instrument for accounting for risk in this problem situation is the focus of the presented research. On the basis of the apparatus of interval mathematics, a set of models for comparative estimation of the economic attractiveness (efficiency) of alternative investment projects under uncertainty and risk is formulated using sensitivity analysis. The developed instrumentarium supports both mono- and poly-interval versions of the sensitivity analysis. As the risk component in the constructed models, some models use values of a specially developed sensitivity coefficient, while others use the worst values based on the interval estimations of the partial criteria of efficiency. The sensitivity coefficient, according to the approach proposed in the publication, is the ratio of the target semi-range of variation to the increase (economy) of efficiency that is achieved when the basic level of the analyzed partial criterion of economic attractiveness reaches some threshold (limit) value.
Directory of Open Access Journals (Sweden)
Plebankiewicz E.
2015-09-01
The article briefly presents several methods of working time estimation. Three methods of task duration assessment were selected to investigate working time in a real construction project, using data collected from observing workers laying terrazzo flooring in staircases. The first estimation was done by fitting a normal and a triangular distribution. The next method, which receives the greatest attention here, is PERT. The article presents a way to standardize the results and the procedure algorithm allowing determination of the characteristic values for the method. Times to perform each individual sub-task, as well as the whole task, have been determined from the collected data at a reliability level of 85%. The completion time of the same works has also been calculated with the use of the KNR. The obtained result is much higher than the actual time needed to execute the task calculated with the previous method. The authors argue that PERT is the best of the three methods, because it takes into account the randomness of the entire task duration and it can be based on the actual execution times known from research.
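The PERT estimate mentioned above combines optimistic, most-likely and pessimistic durations. A minimal sketch; the a, m, b values are hypothetical durations for one sub-task, and the 85% reliability level uses a normal approximation (z ≈ 1.036), which is one common convention rather than necessarily the paper's exact procedure:

```python
# PERT three-point estimate. a, m, b are hypothetical optimistic / most-likely /
# pessimistic durations (hours); z = 1.036 is the standard-normal quantile
# corresponding to ~85% reliability.
def pert(a, m, b, z=1.036):
    mean = (a + 4 * m + b) / 6      # PERT expected duration
    sd = (b - a) / 6                # PERT standard deviation
    return mean, sd, mean + z * sd  # duration achievable with ~85% reliability

mean, sd, t85 = pert(2.0, 3.0, 6.0)
print(round(mean, 2), round(sd, 2), round(t85, 2))   # → 3.33 0.67 4.02
```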
Xiaodong Zhuge; Palenstijn, Willem Jan; Batenburg, Kees Joost
2016-01-01
In this paper, we present a novel iterative reconstruction algorithm for discrete tomography (DT) named total variation regularized discrete algebraic reconstruction technique (TVR-DART) with automated gray value estimation. This algorithm is more robust and automated than the original DART algorithm, and is aimed at imaging of objects consisting of only a few different material compositions, each corresponding to a different gray value in the reconstruction. By exploiting two types of prior knowledge of the scanned object simultaneously, TVR-DART solves the discrete reconstruction problem within an optimization framework inspired by compressive sensing to steer the current reconstruction toward a solution with the specified number of discrete gray values. The gray values and the thresholds are estimated as the reconstruction improves through iterations. Extensive experiments from simulated data, experimental μCT, and electron tomography data sets show that TVR-DART is capable of providing more accurate reconstruction than existing algorithms under noisy conditions from a small number of projection images and/or from a small angular range. Furthermore, the new algorithm requires less effort on parameter tuning compared with the original DART algorithm. With TVR-DART, we aim to provide the tomography society with an easy-to-use and robust algorithm for DT.
Verhulst, Kristal R.; Karion, Anna; Kim, Jooil; Salameh, Peter K.; Keeling, Ralph F.; Newman, Sally; Miller, John; Sloop, Christopher; Pongetti, Thomas; Rao, Preeti; Wong, Clare; Hopkins, Francesca M.; Yadav, Vineet; Weiss, Ray F.; Duren, Riley M.; Miller, Charles E.
2017-07-01
We report continuous surface observations of carbon dioxide (CO2) and methane (CH4) from the Los Angeles (LA) Megacity Carbon Project during 2015. We devised a calibration strategy, methods for selection of background air masses, calculation of urban enhancements, and a detailed algorithm for estimating uncertainties in urban-scale CO2 and CH4 measurements. These methods are essential for understanding carbon fluxes from the LA megacity and other complex urban environments globally. We estimate background mole fractions entering LA using observations from four extra-urban sites including two marine sites located south of LA in La Jolla (LJO) and offshore on San Clemente Island (SCI), one continental site located in Victorville (VIC), in the high desert northeast of LA, and one continental/mid-troposphere site located on Mount Wilson (MWO) in the San Gabriel Mountains. We find that a local marine background can be established to within ˜ 1 ppm CO2 and ˜ 10 ppb CH4 using these local measurement sites. Overall, atmospheric carbon dioxide and methane levels are highly variable across Los Angeles. Urban and suburban sites show moderate to large CO2 and CH4 enhancements relative to a marine background estimate. The USC (University of Southern California) site near downtown LA exhibits median hourly enhancements of ˜ 20 ppm CO2 and ˜ 150 ppb CH4 during 2015 as well as ˜ 15 ppm CO2 and ˜ 80 ppb CH4 during mid-afternoon hours (12:00-16:00 LT, local time), which is the typical period of focus for flux inversions. The estimated measurement uncertainty is typically better than 0.1 ppm CO2 and 1 ppb CH4 based on the repeated standard gas measurements from the LA sites during the last 2 years, similar to Andrews et al. (2014). The largest component of the measurement uncertainty is due to the single-point calibration method; however, the uncertainty in the background mole fraction is much larger than the measurement uncertainty. The background uncertainty for the marine
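At its core, the enhancement calculation described above subtracts a background estimate from the urban record. A toy sketch with synthetic hourly data; the distributions and magnitudes are invented and are not the project's numbers:

```python
import numpy as np

# Urban "enhancement" = urban mole fraction minus a background estimate.
# Synthetic stand-ins for one month of hourly CO2 data (ppm); values invented.
rng = np.random.default_rng(1)
hours = 24 * 30
background_co2 = 400.0 + rng.normal(0.0, 1.0, hours)      # marine baseline
urban_co2 = background_co2 + rng.gamma(2.0, 10.0, hours)  # urban site, elevated
enhancement = urban_co2 - background_co2                  # hourly enhancement
median_enh = float(np.median(enhancement))
print(f"median hourly enhancement: {median_enh:.1f} ppm")
```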
Rosenblum, Michael; van der Laan, Mark J.
2010-01-01
Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy-to-compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636
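The target of inference here, the marginal log rate ratio, can always be estimated nonparametrically in a randomized trial as the log of the ratio of arm-specific mean outcomes; in a Poisson working model containing only an intercept and the treatment indicator, the MLE of the treatment coefficient reduces to exactly this quantity. The sketch below uses synthetic event counts, not trial data.

```python
# Sketch: the marginal log rate ratio as log(mean outcome, treated arm /
# mean outcome, control arm). Event counts below are synthetic.
import math

treated = [2, 3, 1, 4, 2, 3, 5, 2]   # per-subject event counts, treatment arm
control = [1, 2, 1, 2, 1, 0, 2, 1]   # per-subject event counts, control arm

rate_ratio = (sum(treated) / len(treated)) / (sum(control) / len(control))
log_rr = math.log(rate_ratio)        # equals the intercept+treatment Poisson MLE coef
```

The paper's point is that adding baseline covariates as main terms to the working model leaves this estimand asymptotically unbiased while improving precision.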
Stritzinger, M. D.; Taddia, F.; Burns, C. R.; Phillips, M. M.; Bersten, M.; Contreras, C.; Folatelli, G.; Holmbo, S.; Hsiao, E. Y.; Hoeflich, P.; Leloudas, G.; Morrell, N.; Sollerman, J.; Suntzeff, N. B.
2018-02-01
We aim to improve upon contemporary methods to estimate host-galaxy reddening of stripped-envelope (SE) supernovae (SNe). To this end the Carnegie Supernova Project (CSP-I) SE SN photometry data release, consisting of nearly three dozen objects, is used to identify a minimally reddened sub-sample for each traditionally defined spectroscopic sub-type (i.e., SNe IIb, SNe Ib, SNe Ic). Inspection of the optical and near-infrared (NIR) colors and color evolution of the minimally reddened sub-samples reveals a high degree of homogeneity, particularly between 0 d and +20 d relative to B-band maximum. This motivated the construction of intrinsic color-curve templates, which when compared to the colors of reddened SE SNe, yield an entire suite of optical and NIR color excess measurements. Comparison of optical/optical vs. optical/NIR color excess measurements indicates the majority of the CSP-I SE SNe suffer relatively low amounts of reddening (i.e., E(B-V)host < 0.20 mag). Fitting the colors of the more highly reddened (E(B-V)host > 0.20 mag) objects with the Fitzpatrick (1999, PASP, 111, 63) reddening law model provides robust estimates of the host visual extinction AVhost and RVhost. In the case of the SE SNe with relatively low amounts of reddening, a preferred value of RVhost is adopted for each sub-type, resulting in estimates of AVhost through Fitzpatrick (1999) reddening law model fits to the observed color excess measurements. Our analysis suggests SE SNe reside in galaxies characterized by a range of dust properties. We also find evidence that SNe Ic are more likely to occur in regions characterized by larger RVhost values compared to SNe IIb/Ib, and they also tend to suffer more extinction. The latter finding is consistent with work in the literature suggesting SNe Ic tend to occur in regions of ongoing star formation. Based on observations collected at Las Campanas Observatory.
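The color-excess and extinction arithmetic underlying this approach is simple: the color excess is the observed color minus the intrinsic template color, and the visual extinction follows from the reddening-law slope. The magnitudes and the Milky-Way-like RV = 3.1 below are illustrative assumptions, not values from the paper.

```python
# Sketch of the color-excess/extinction arithmetic described above:
#   E(B-V) = (B-V)_observed - (B-V)_intrinsic template
#   A_V    = R_V * E(B-V)
# All numbers are made-up illustrative values.
observed_BV = 0.85   # measured B-V color of a reddened SE SN (mag)
template_BV = 0.35   # intrinsic color from the minimally reddened template (mag)
R_V_host    = 3.1    # assumed host reddening-law slope (Milky Way-like)

E_BV = observed_BV - template_BV   # color excess, mag
A_V_host = R_V_host * E_BV         # host visual extinction, mag
```

Comparing optical/optical against optical/NIR color excesses, as the authors do, is what constrains RVhost rather than assuming it.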
Directory of Open Access Journals (Sweden)
Jacobo Pardo-Seco
BACKGROUND: Mitochondrial DNA (mtDNA) variation (i.e., haplogroups) has been analyzed with regard to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability of rejecting the null hypothesis of homogeneity between cases and controls. METHODS/PRINCIPAL FINDINGS: We critically review previous approaches to the estimation of statistical power, which are based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of the a priori statistical power in variable scenarios of case-control study design, or, for example, the number of controls needed to reach a fixed effect size. CONCLUSIONS/SIGNIFICANCE: The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special (but not exclusive) application to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool, mitPower, that can be used in both retrospective and prospective case-control disease studies.
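The effect of enlarging the control group at a fixed number of cases can be illustrated with a Monte Carlo sketch for the simplest 2×2 case (one haplogroup vs. the rest). This is not mitPower's analytic procedure; the carrier frequencies, sample sizes, and significance level are illustrative assumptions.

```python
# Monte Carlo sketch of a priori power for a 2x2 case-control haplogroup test,
# illustrating that more controls per case raises power at fixed case count.
# Frequencies, sample sizes, and alpha are illustrative, not from the paper.
import random

def chi2_stat_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

def power(n_cases, n_controls, p_case, p_control, sims=2000, crit=3.8415, seed=1):
    """Fraction of simulated studies rejecting homogeneity at alpha = 0.05 (df = 1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        a = sum(rng.random() < p_case for _ in range(n_cases))        # carrier cases
        c = sum(rng.random() < p_control for _ in range(n_controls))  # carrier controls
        if chi2_stat_2x2(a, n_cases - a, c, n_controls - c) > crit:
            hits += 1
    return hits / sims

# Same 200 cases; tripling the controls increases the estimated power.
p_equal = power(200, 200, p_case=0.30, p_control=0.20)
p_more_controls = power(200, 600, p_case=0.30, p_control=0.20)
```

Extending this to 2×k tables means simulating multinomial haplogroup counts and using the chi-square critical value for k-1 degrees of freedom; the power loss with growing k that the authors report falls out of the larger critical value.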
Directory of Open Access Journals (Sweden)
Gabriel A. Pinilla Agudelo
2014-01-01
ABSTRACT A methodological proposal for estimating environmental flows in large projects licensed by the Agencia Nacional de Licencias Ambientales (ANLA) in Colombian rivers was developed. The project is the result of an agreement between the Ministerio de Ambiente y Desarrollo Sostenible (MADS) and the Universidad Nacional de Colombia, Bogotá (UNC). The proposed method begins with an evaluation of hydrological criteria, continues with a hydraulic and water-quality validation, and follows with the determination of habitat integrity. This is an iterative process that compares conditions before and after project construction and makes it possible to obtain the magnitude of a monthly flow that, besides preserving the ecological functions of the river, guarantees the water uses downstream. Regarding the biotic component, the proposal includes the establishment and monitoring of biotic integrity indices for four aquatic communities (periphyton, macroinvertebrates, riparian vegetation, and fish). The effects that flow reduction may produce in the medium and long term can be assessed with these indices. We present the results of applying the methodology to several projects licensed by the MADS.
Pandit, J J; Andrade, J; Bogod, D G; Hitchman, J M; Jonker, W R; Lucas, N; Mackay, J H; Nimmo, A F; O'Connor, K; O'Sullivan, E P; Paul, R G; Palmer, J H MacG; Plaat, F; Radcliffe, J J; Sury, M R J; Torevell, H E; Wang, M; Hainsworth, J; Cook, T M
2014-10-01
We present the main findings of the 5th National Audit Project on accidental awareness during general anaesthesia. Incidences were estimated using reports of accidental awareness as the numerator, and a parallel national anaesthetic activity survey to provide denominator data. The incidence of certain/probable and possible accidental awareness cases was ~1:19 600 anaesthetics (95% CI 1:16 700-23 450). However, there was considerable variation across subtypes of techniques or subspecialties. The incidence with neuromuscular blockade was ~1:8200 (1:7030-9700), and without it was ~1:135 900 (1:78 600-299 000). The cases of accidental awareness during general anaesthesia reported to the 5th National Audit Project were overwhelmingly cases of unintended awareness during neuromuscular blockade. The incidence of accidental awareness during caesarean section was ~1:670 (1:380-1300). Two thirds (82, 66%) of cases of accidental awareness arose in the dynamic phases of anaesthesia, namely induction of and emergence from anaesthesia. During induction of anaesthesia, contributory factors included: use of thiopental; rapid sequence induction; obesity; difficult airway management; neuromuscular blockade; and interruptions of anaesthetic delivery during movement from anaesthetic room to theatre. During emergence from anaesthesia, residual paralysis was perceived by patients as accidental awareness, and commonly related to a failure to ensure full return of motor capacity. One third (43, 33%) of accidental awareness events arose during the maintenance phase of anaesthesia, most due to problems at induction or towards the end of anaesthesia. Factors increasing the risk of accidental awareness included: female sex; age (younger adults, but not children); obesity; anaesthetist seniority (junior trainees); previous awareness; out-of-hours operating; emergencies; type of surgery (obstetric, cardiac, thoracic); and use of neuromuscular blockade. The following factors were
Colby, Sandra L.; Ortman, Jennifer M.
2015-01-01
Between 2014 and 2060, the U.S. population is projected to increase from 319 million to 417 million, reaching 400 million in 2051. The U.S. population is projected to grow more slowly in future decades than in the recent past, as these projections assume that fertility rates will continue to decline and that there will be a modest decline in the…
International Nuclear Information System (INIS)
GOLDMANN, L.H.
1999-01-01
This document is presented to demonstrate the MCO's compliance with the major design criteria invoked on the MCO. It is broken down into a section evaluating the MCO against the sixteen divisions of the DOE Order 6430.1A General Design Criteria, followed by an evaluation of the MCO against HNF-SD-SNF-DB-005, ''Spent Nuclear Fuel Project Multi-Canister Overpack Additional NRC Requirements.'' The compliance assessment is presented as a matrix in tabular form. The MCO is the primary container for the K Basins' spent nuclear fuel from the time it leaves the basin pools through the 40-year interim storage at the Canister Storage Building (CSB). The MCO and its components interface with the K Basins; the shipping cask and transportation system; the Cold Vacuum Drying facility's individual process bays and equipment; and the CSB facility, including the MCO handling machine (MHM), the storage tubes, and the MCO work stations where sampling, welding, and inspection of the MCO are performed. As the MCO is the primary boundary for handling, processing, and storage, its main goals are to minimize the spread of its radiological contents outside the MCO and to provide nuclear criticality control. The MCO contains personnel radiation shielding only on its upper end, in the form of a shield plug, where the process interfaces are located. Shielding beyond the shield plug is the responsibility of the using facilities. The design of the MCO and its components is depicted in drawings H-2-828040 through H-2-828075. Not every drawing number in the sequence is used. The first drawing number, H-2-828040, is the drawing index for the MCO. The design performance specification for the MCO is HW-S-0426; it was reviewed and approved by the interfacing design authorities; the safety, regulatory, and operations groups; and the local DOE office. The current revision of the design performance specification is revision 5. The designs of the MCO have been reviewed and approved in a similar way, and the reports
DEFF Research Database (Denmark)
Thyssen, Jacob Pontoppidan; Uter, Wolfgang; Schnuch, Axel
2007-01-01
case') assumptions were based on patch test reading data in combination with an estimate of the number of persons eligible for patch testing each year based on sales data of the 'standard series'. The estimated 10-year prevalence of contact allergy ranged between 7.3% and 12.9% for adult Danes older...
Directory of Open Access Journals (Sweden)
Bazhenov Viktor Ivanovich
2015-09-01
The opening of tender procedures in Russia to foreign suppliers makes it worthwhile to develop economic methods for comparing technical solutions in the construction field. The article describes an example of practical Life Cycle Cost (LCC) evaluation with respect to Present Value (PV) determination. This makes it possible for an investor to assess long-term projects (here, 25 years) as commercially profitable, taking into account the inflation rate, interest rate, and real discount rate (here, 5 %). For the economic analysis, the air-blower station of a wastewater treatment plant (WWTP) was selected as a significant energy consumer. The technical variants compared are blower types: 1 - multistage without control; 2 - multistage with VFD control; 3 - single-stage with double-vane control. The result of the LCC estimation shows the last variant as the most attractive, or cost-effective, for investment, with savings of 17.2 % (versus variant 1) and 21.0 % (versus variant 2) under the adopted duty conditions and evaluations of capital costs (Cic + Cin) together with related annual expenditure (Ce + Co + Cm). The adopted duty conditions include daily and seasonal fluctuations of air flow. This was the reason for the adopted energy consumption figures, in kW·h: 2158 (variant 1), 1743-2201 (variant 2), and 1058-1951 (variant 3). The article refers to the Europump guide tables in order to simplify the search for sophisticated factors (Cp/Cn, df), which can be useful for economic analyses in Russia. An example of evaluations connected with energy-efficient solutions is given, but the approach also extends to cases involving resource savings, such as all types of fuel. In conclusion, the LCC indicator should be used jointly with the method of discounted cash flows, which will satisfy the investor's need for an interest source in technical and economic comparisons.
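The present-value LCC comparison described above can be sketched in a few lines. Only the 25-year horizon and the 5 % real discount rate come from the text; all cost figures and the variant labels' numbers below are illustrative placeholders, not the article's data.

```python
# Sketch: life-cycle cost (LCC) as capital cost plus the present value of
# annual expenditures (energy + operation + maintenance) over the horizon.
# LCC = (Cic + Cin) + sum_t (Ce + Co + Cm) / (1 + r)^t,  t = 1..years.

def present_value_lcc(capital, annual_cost, years=25, real_rate=0.05):
    """Capital cost plus discounted annual costs; `real_rate` is the
    inflation-adjusted (real) discount rate."""
    pv_annual = sum(annual_cost / (1 + real_rate) ** t for t in range(1, years + 1))
    return capital + pv_annual

# Hypothetical variants: (capital Cic + Cin, annual Ce + Co + Cm), arbitrary units.
variants = {
    "1: multistage, no control":       (100_000, 40_000),
    "2: multistage, VFD control":      (130_000, 35_000),
    "3: single-stage, vane control":   (150_000, 28_000),
}
for name, (cap, ann) in variants.items():
    print(f"{name}: LCC = {present_value_lcc(cap, ann):,.0f}")
```

With a constant annual cost, the discounted sum equals the standard annuity factor (1 - (1 + r)^-n) / r times the annual cost, which is the form the Europump tables tabulate.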
Directory of Open Access Journals (Sweden)
Anas Altaleb
2017-03-01
The aim of this work is to synthesize 8×8 substitution boxes (S-boxes) for block ciphers. The confusion-creating potential of an S-box depends on its construction technique. In the first step, we apply the algebraic action of the projective general linear group PGL(2, GF(2^8)) on the Galois field GF(2^8). In the second step, we use permutations of the symmetric group S256 to construct a new kind of S-box. To explain the proposed extension scheme, we give an example and construct one new S-box. The strength of the extended S-box is computed, and insight is given into calculating its confusion-creating potency. To analyze the security of the S-box, some popular algebraic and statistical attacks are performed as well. The proposed S-box has been analyzed by the bit independence criterion, linear approximation probability test, non-linearity test, strict avalanche criterion, differential approximation probability test, and majority logic criterion. A comparison of the proposed S-box with existing S-boxes shows that the analyses of the extended S-box are comparatively better.