Making statistical inferences about software reliability
Miller, Douglas R.
1988-01-01
Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.
Statistical models and methods for reliability and survival analysis
Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo
2013-01-01
Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications, providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical…
Notes on numerical reliability of several statistical analysis programs
Landwehr, J.M.; Tasker, Gary D.
1999-01-01
This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.
Meat-consumption statistics: reliability and discrepancy
Pål Börjesson
2013-07-01
Interest in meat consumption and its impact on the environment and health has grown markedly over the last few decades, and this upsurge has led to greater demand for reliable data. This article aims to describe methods for producing meat-consumption statistics and discuss their limitations and strengths; to identify uncertainties in the statistics and to estimate their individual impact; to outline how relevant data are produced and presented at the national (Swedish), regional (Eurostat), and international (FAOSTAT) levels; to analyze the consequences of identified discrepancies and uncertainties for estimating the environmental and health effects of meat consumption; and to suggest recommendations for improved production, presentation, and use of meat-consumption statistics. We demonstrate many inconsistencies in how meat-consumption data are produced and presented. Of special importance are assumptions on bone weight, food losses and waste, weight losses during cooking, and nonmeat ingredients. Depending on the methods employed to handle these ambiguous factors, per capita meat-consumption levels may differ by a factor of two or more. This finding illustrates that knowledge concerning limitations, uncertainties, and discrepancies in data is essential for a correct understanding, interpretation, and use of meat-consumption statistics in, for instance, dietary recommendations related to health and environmental issues.
DISTRIBUTED MONITORING SYSTEM RELIABILITY ESTIMATION WITH CONSIDERATION OF STATISTICAL UNCERTAINTY
Yi Pengxing; Yang Shuzi; Du Runsheng; Wu Bo; Liu Shiyuan
2005-01-01
Taking into account the whole system structure and the uncertainty of component reliability estimation, a system reliability estimation method based on probability and statistical theory for distributed monitoring systems is presented. The variance and confidence intervals of the system reliability estimate are obtained by expressing system reliability as a linear sum of products of higher-order moments of component reliability estimates when the number of component or system survivals follows a binomial distribution. The characteristic function of the binomial distribution is used to determine the moments of the component reliability estimates, and a symbolic matrix which facilitates the search for explicit system reliability estimates is proposed. Furthermore, a case of application is used to illustrate the procedure, and with the help of this example, various issues such as the applicability of this estimation model and measures to improve the system reliability of monitoring systems are discussed.
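The binomial starting point of such a scheme can be sketched as follows: a component's reliability is estimated from survival counts, with a normal-approximation confidence interval. This is only a minimal illustration (the paper's higher-order moment propagation to the system level is not reproduced, and the counts below are invented):

```python
import math

def reliability_ci(survivals, trials, z=1.96):
    """Point estimate and normal-approximation 95% confidence interval
    for component reliability from binomial survival counts."""
    r_hat = survivals / trials
    var = r_hat * (1.0 - r_hat) / trials          # variance of the estimator
    half = z * math.sqrt(var)
    return r_hat, max(0.0, r_hat - half), min(1.0, r_hat + half)

# hypothetical component: 48 survivals observed in 50 demands
r, lo, hi = reliability_ci(48, 50)
print(round(r, 3), round(lo, 3), round(hi, 3))
```

The interval is truncated to [0, 1]; for counts near the boundary an exact (Clopper-Pearson) interval would be preferable.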
Statistical analysis on reliability and serviceability of caterpillar tractor
WANG Jinwu; LIU Jiafu; XU Zhongxiang
2007-01-01
To further the understanding of tractor reliability and serviceability and to furnish scientific and technical theories for their promotion and application, experiments and statistical analyses of the reliability (reliability and MTBF) and serviceability (service and MTTR) of the Dongfanghong-1002 and Dongfanghong-802 were conducted. The results showed that the mean times between failures of these two tractors were 182.62 h and 160.2 h, respectively, and that the weakest assembly of both was the engine.
Exponential order statistic models of software reliability growth
Miller, D. R.
1986-01-01
Failure times of a software reliability growth process are modeled as order statistics of independent, nonidentically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but the class contains many additional examples as well. Various characterizations, properties and examples of this class of models are developed and presented.
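The model class described above can be simulated directly: draw one exponential detection time per latent fault and sort. A minimal sketch (the rates below are invented; equal rates correspond to the Jelinski-Moranda special case named in the abstract):

```python
import random

def eos_failure_times(rates, seed=0):
    """Simulate one realization of an Exponential Order Statistic model:
    each latent fault i has an independent Exp(rate_i) detection time,
    and the observed failure times are those times in sorted order."""
    rng = random.Random(seed)
    times = [rng.expovariate(r) for r in rates]
    return sorted(times)

# equal rates: the Jelinski-Moranda special case with 5 faults
times = eos_failure_times([0.5] * 5, seed=42)
print(len(times), all(a <= b for a, b in zip(times, times[1:])))
```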
Statistical Analysis of Human Reliability of Armored Equipment
LIU Wei-ping; CAO Wei-guo; REN Jing
2007-01-01
Human errors of seven types of armored equipment, which occur during the course of field test, are statistically analyzed. The human error-to-armored equipment failure ratio is obtained. The causes of human errors are analyzed. The distribution law of human errors is acquired. The ratio of human errors and human reliability index are also calculated.
Monitoring Software Reliability using Statistical Process Control: An MMLE Approach
Bandla Sreenivasa Rao
2011-11-01
This paper considers an MMLE (Modified Maximum Likelihood Estimation) based scheme to estimate software reliability using the exponential distribution. The MMLE is one of the generalized frameworks of software reliability models of Non-Homogeneous Poisson Processes (NHPPs). The MMLE gives analytical estimators rather than an iterative approximation to estimate the parameters. In this paper we propose an SPC (Statistical Process Control) chart mechanism to determine software quality using inter-failure times data. The control charts can be used to measure whether the software process is statistically under control or not.
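The control-chart idea can be sketched with a generic individuals (X) chart over inter-failure times. Note this is not the paper's MMLE-based scheme, and the sample values are invented:

```python
def control_limits(samples):
    """Individuals control chart limits from the average moving range:
    center line at the mean, limits at +/- 3 sigma, with sigma estimated
    as mean moving range / d2 (d2 = 1.128 for subgroups of size 2)."""
    n = len(samples)
    xbar = sum(samples) / n
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128
    return xbar - 3 * sigma, xbar, xbar + 3 * sigma

# invented inter-failure times (hours)
lcl, cl, ucl = control_limits([30, 42, 28, 51, 39, 33, 47, 36])
print(lcl < cl < ucl)
```

A point falling outside (lcl, ucl) would signal that the failure process is not statistically under control; for exponential inter-failure times a normalizing transform is often applied first.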
Reliability Evaluation of Concentric Butterfly Valve Using Statistical Hypothesis Test
Chang, Mu Seong; Choi, Jong Sik; Choi, Byung Oh; Kim, Do Sik [Korea Institute of Machinery and Materials, Daejeon (Korea, Republic of)]
2015-12-15
A butterfly valve is a type of flow-control device typically used to regulate a fluid flow. This paper presents an estimation of the shape parameter of the Weibull distribution, characteristic life, and B10 life for a concentric butterfly valve based on a statistical analysis of the reliability test data taken before and after the valve improvement. The difference in the shape and scale parameters between the existing and improved valves is reviewed using a statistical hypothesis test. The test results indicate that the shape parameter of the improved valve is similar to that of the existing valve, and that the scale parameter of the improved valve is found to have increased. These analysis results are particularly useful for a reliability qualification test and the determination of the service life cycles.
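The B10 life reported in such an analysis follows directly from the fitted Weibull shape and scale parameters; a sketch with invented parameter values (not the paper's data):

```python
import math

def b_life(shape, scale, p=0.10):
    """B(p) life of a Weibull distribution: the time by which a fraction
    p of units is expected to have failed, scale * (-ln(1-p))**(1/shape)."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

# hypothetical parameters for illustration only (cycles)
b10_existing = b_life(shape=1.8, scale=50000.0)
b10_improved = b_life(shape=1.8, scale=65000.0)  # same shape, larger scale
print(b10_improved > b10_existing)
```

This matches the abstract's finding qualitatively: with an unchanged shape parameter, an increased scale parameter translates into a proportionally longer B10 life.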
Statistics Report on TEQSA Registered Higher Education Providers
Australian Government Tertiary Education Quality and Standards Agency, 2015
2015-01-01
This statistics report provides a comprehensive snapshot of national statistics on all parts of the sector for the year 2013, by bringing together data collected directly by TEQSA with data sourced from the main higher education statistics collections managed by the Australian Government Department of Education and Training. The report provides…
Bring Your Own Device - Providing Reliable Model of Data Access
Stąpór Paweł
2016-10-01
The article presents a model of Bring Your Own Device (BYOD) as a network model that provides the user with reliable access to network resources. BYOD is a dynamically developing model that can be applied in many areas. A research network was launched in order to carry out the test, in which the Work Folders service was used as a service of the BYOD model. This service allows the user to synchronize files between the device and the server. Access to the network is provided through wireless communication using the 802.11n standard. The obtained results are shown and analyzed in this article.
Lim, T. J.; Byun, S. S.; Han, S. H.; Lee, H. J.; Lim, J. S.; Oh, S. J.; Park, K. Y.; Song, H. S. [Soongsil Univ., Seoul (Korea)]
2001-04-01
This project has been performed in order to construct I and C part reliability databases for detailed analysis of the plant protection system and to develop a methodology for analysing trip set point drifts. A reliability database for the I and C parts of the plant protection system is required to perform the detailed analysis. First, we developed an electronic part reliability prediction code based on MIL-HDBK-217F. We then collected generic reliability data for the I and C parts in the plant protection system, developed a statistical analysis procedure to process the data, and constructed the generic reliability database. We also collected plant-specific reliability data for the I and C parts in the plant protection systems of the YGN 3,4 and UCN 3,4 units, and developed a plant-specific reliability database for the I and C parts by a Bayesian procedure. We also developed a statistical analysis procedure for set point drift and performed an analysis of drift effects for the trip set point. The reliability database for the PPS I and C parts provides the basis for the detailed analysis. The safety of the KSNP and succeeding NPPs can be demonstrated by reducing the uncertainty of the PSA, and economic and efficient operation of the NPP is possible by optimizing the test period to reduce the utility's burden. 14 refs., 215 figs., 137 tabs. (Author)
Engineer’s estimate reliability and statistical characteristics of bids
Fariborz M. Tehrani
2016-12-01
The objective of this report is to provide a methodology for examining bids and evaluating the performance of engineer's estimates in capturing the true cost of projects. This study reviews the cost development for transportation projects in addition to two sources of uncertainty in a cost estimate: modeling errors and inherent variability. Sample projects are highway maintenance projects with a similar scope of work, size, and schedule. Statistical analysis of engineering estimates and bids examines the adaptability of statistical models for the sample projects. Further, the variation of engineering cost estimates from inception to implementation is presented and discussed for selected projects. Moreover, the applicability of extreme value theory is assessed for the available data. The results indicate that the performance of the engineer's estimate is best evaluated against a trimmed average of bids, excluding discordant bids.
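The trimmed average of bids recommended above can be sketched as follows; the 20% trim fraction and the bid values are illustrative assumptions, not the study's data:

```python
def trimmed_mean(bids, trim=0.2):
    """Mean of bids after dropping the lowest and highest `trim`
    fraction, damping the influence of discordant bids."""
    xs = sorted(bids)
    k = int(len(xs) * trim)
    kept = xs[k:len(xs) - k] if k else xs
    return sum(kept) / len(kept)

bids = [1.02e6, 1.10e6, 1.15e6, 1.21e6, 2.40e6]   # last bid is discordant
print(trimmed_mean(bids))
```

Here the trimmed mean is about 1.15 million, whereas the plain mean (about 1.38 million) is pulled far upward by the single discordant bid.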
ROLE OF STATISTICAL VIS-A-VIS PHYSICS-OF-FAILURE METHODS IN RELIABILITY ENGINEERING
P. V. Varde
2009-01-01
Traditionally, statistical or, more specifically, probabilistic methods form the basic framework for assessing the reliability characteristics of components. However, the recent trend for predicting the reliability or life of a component involves the application of physics-of-failure methods. This rather new approach is finding wider application as it is based on the fundamentals of science and thereby provides an improved framework for understanding the failure mechanism. Since accelerated testing of components forms part of this approach, the prediction of time-to-failure of components is more accurate than with existing methods, which depend only on historical data and its evaluation using probabilistic methods. The new approach is all the more relevant for assessing the reliability of new components, for which the traditional probabilistic approach is not adequate because it depends on historical data. In view of the above, this paper investigates the role of the statistical or probabilistic approach and the physics-of-failure approach for reliability assessment of engineering components in general and electronic components in particular.
PROVIDING RELIABILITY OF HUMAN RESOURCES IN PRODUCTION MANAGEMENT PROCESS
Anna MAZUR
2014-07-01
People are the most valuable asset of an organization, and the results of a company mostly depend on them. The human factor can also be a weak link in the company and a cause of high risk for many of its processes. The reliability of the human factor in the manufacturing process depends on many factors. The authors include aspects of human error, safety culture, knowledge, communication skills, teamwork and the leadership role in the developed model of the reliability of human resources in the management of the production process. Based on a case study and the results of research and observation, the authors present risk areas identified in a specific manufacturing process and the results of an evaluation of the reliability of human resources in that process.
The Reliability of Single Subject Statistics for Biofeedback Studies.
Bremner, Frederick J.; And Others
To test the usefulness of single subject statistical designs for biofeedback, three experiments were conducted comparing biofeedback to meditation, and to a compound stimulus recognition task. In a statistical sense, this experimental design is best described as one experiment with two replications. The apparatus for each of the three experiments…
Fault maintenance trees: reliability centered maintenance via statistical model checking
Ruijters, Enno; Guck, Dennis; Drolenga, Peter; Stoelinga, Mariëlle
2016-01-01
The current trend in infrastructural asset management is towards risk-based (a.k.a. reliability centered) maintenance, promising better performance at lower cost. By maintaining crucial components more intensively than less important ones, dependability increases while costs decrease. This requires
Recent Advances in System Reliability Signatures, Multi-state Systems and Statistical Inference
Frenkel, Ilia
2012-01-01
Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications. The topics include: concepts and different definitions of signatures (D-spectra), their properties and applications to reliability of coherent systems and network-type structures; Lz-transform of Markov stochastic process and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...
Reliable detection of directional couplings using rank statistics.
Chicharro, Daniel; Andrzejak, Ralph G
2009-08-01
To detect directional couplings from time series various measures based on distances in reconstructed state spaces were introduced. These measures can, however, be biased by asymmetries in the dynamics' structure, noise color, or noise level, which are ubiquitous in experimental signals. Using theoretical reasoning and results from model systems we identify the various sources of bias and show that most of them can be eliminated by an appropriate normalization. We furthermore diminish the remaining biases by introducing a measure based on ranks of distances. This rank-based measure outperforms existing distance-based measures concerning both sensitivity and specificity for directional couplings. Therefore, our findings are relevant for a reliable detection of directional couplings from experimental signals.
Reliability Analysis of Production System of Fully-Mechanized Face Based on Output Statistic
CAI Qing-xiang; LI Nai-liang
2005-01-01
The production system of a fully-mechanized face is a complicated system composed of human, machine and environment, and is influenced by various random factors. Analyzing the reliability of such a system requires plentiful data obtained from statistics on system faults. Based on the viewpoint that the shift output of a fully-mechanized face is the result of the combined influence of various random factors, this paper derives the procedure for analyzing its reliability using probability theory, statistics theory and system reliability theory, combined with a concrete case study. The method is shown to be feasible and valuable.
Towards consistent and reliable Dutch and international energy statistics for the chemical industry
Neelis, M.L.; Pouwelse, J.W.
2008-01-01
Consistent and reliable energy statistics are of vital importance for proper monitoring of energy-efficiency policies. In recent studies, irregularities have been reported in the Dutch energy statistics for the chemical industry. We studied in depth the company data that form the basis of the energy
Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz
2015-03-01
FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10 mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power.
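An a priori power analysis of the kind provided by such a tool can be sketched with the standard normal-approximation sample-size formula for a two-group comparison. The effect sizes below are arbitrary illustrations, not the paper's values, and this is not the published tool:

```python
import math

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Approximate per-group sample size for a two-sided two-sample
    comparison, n = 2*(z_{1-alpha/2} + z_{power})^2 / d^2 (normal approx.)."""
    def z(p):
        # inverse normal CDF via Newton iteration on Phi(x) - p
        x = 0.0
        for _ in range(60):
            phi = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
            pdf = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
            x -= (phi - p) / pdf
        return x
    za, zb = z(1.0 - alpha / 2.0), z(power)
    return math.ceil(2.0 * (za + zb) ** 2 / effect_size ** 2)

print(n_per_group(0.5))   # medium standardized effect
```

A t-distribution correction would add roughly one subject per group; the normal approximation suffices for a first design sketch.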
Olivero, Jesús; Márquez, Ana L; Real, Raimundo
2013-01-01
This study uses the amphibian species of the Mediterranean basin to develop a consistent procedure based on fuzzy sets with which biogeographic regions and biotic transition zones can be objectively detected and reliably mapped. Biogeographical regionalizations are abstractions of the geographical organization of life on Earth that provide frameworks for cataloguing species and ecosystems, for answering basic questions in biogeography, evolutionary biology, and systematics, and for assessing priorities for conservation. On the other hand, limits between regions may form sharply defined boundaries along some parts of their borders, whereas elsewhere they may consist of broad transition zones. The fuzzy set approach provides a heuristic way to analyse the complexity of the biota within an area; significantly different regions are detected whose mutual limits are sometimes fuzzy, sometimes clearly crisp. Most of the regionalizations described in the literature for the Mediterranean biogeographical area present a certain degree of convergence when they are compared within the context of fuzzy interpretation, as many of the differences found between regionalizations are located in transition zones, according to our case study. Compared with other classification procedures based on fuzzy sets, the novelty of our method is that both fuzzy logic and statistics are used together in a synergy in order to avoid arbitrary decisions in the definition of biogeographic regions and transition zones.
Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis
Zhou, Z.-G.; Tang, P.; Zhou, M.
2016-06-01
Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover, caused by events such as forest fire, flood, deforestation, and plant diseases, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most of the present methods label the detection results only with "Change/No change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks for Additive Seasonal and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and overall accuracy up to 90%.
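Step (3), scoring a detected disturbance by a confidence level, can be sketched with a normal prediction interval built from historical residuals. BFAST itself is not reproduced here, and the NDVI-like values are invented:

```python
import math

def disturbance_reliability(history, new_value):
    """Score a new observation against the historical spread: return its
    z-score and the two-sided normal confidence level P(|Z| < z) at which
    it still falls outside the prediction interval."""
    n = len(history)
    mean = sum(history) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in history) / (n - 1))
    zscore = abs(new_value - mean) / sd
    confidence = math.erf(zscore / math.sqrt(2.0))
    return zscore, confidence

# invented stable vegetation-index history, then a flood-like drop
z, conf = disturbance_reliability([0.61, 0.63, 0.60, 0.62, 0.64, 0.61], 0.30)
print(conf > 0.95)   # far outside the historical range: high reliability
```

In the paper's framework the history would be the BFAST model's forecast residuals rather than raw values, but the interval logic is the same.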
The Impact of Process Capability on Service Reliability for Critical Infrastructure Providers
Houston, Clemith J., Jr.
2013-01-01
This study investigated the relationship between organizational processes that have been identified as promoting resiliency and their impact on service reliability within the scope of critical infrastructure providers. The importance of critical infrastructure to the nation is evident from the body of research and is supported by instances where…
A statistical test on the reliability of the non-coevality of stars in binary systems
Valle, G.; Dell'Omodarme, M.; Prada Moroni, P. G.; Degl'Innocenti, S.
2016-03-01
comparing the confidence interval of the age of the two stars. We also found that the distribution of W is, for almost all the examined cases, well approximated by beta distributions. Conclusions: The proposed method improves upon the techniques that are commonly adopted for judging the coevality of an observed system. It also provides a result founded on reliable statistics that simultaneously accounts for all the observational uncertainties. On-line calculator available at http://astro.df.unipi.it/stellar-models/W/Full Table 2 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/587/A31
Dars, P.; Ternisien D'Ouville, T.; Mingam, H.; Merckel, G.
1988-01-01
Statistical analysis of asymmetry in LDD NMOSFET electrical characteristics shows the influence of implantation angles on the non-overlap variation observed on devices fabricated on a 100 mm wafer and across the wafers of a batch. The study of the consequences of this dispersion on aging behaviour illustrates the importance of this parameter for reliability and the necessity of taking it into account for accurate analysis of stress results.
Statistical Degradation Models for Reliability Analysis in Non-Destructive Testing
Chetvertakova, E. S.; Chimitova, E. V.
2017-04-01
In this paper, we consider the application of statistical degradation models for reliability analysis in non-destructive testing. Such models make it possible to estimate the reliability function (the dependence of the non-failure probability on time) for a fixed critical level using information on the degradation paths of tested items. The most widely used models are the gamma and Wiener degradation models, in which the gamma or normal distribution, respectively, is assumed for the degradation increments. Using computer simulation, we have analysed the accuracy of the reliability estimates obtained for the considered models. The number of increments can be enlarged by increasing the sample size (the number of tested items) or by increasing the frequency of measuring degradation. It has been shown that the sample size has a greater influence on the accuracy of the reliability estimates than the measuring frequency. Moreover, it has been shown that another important factor influencing the accuracy of reliability estimation is the duration of observing the degradation process.
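Under the Wiener degradation model mentioned above, the reliability at a fixed critical level has a simple closed form for the marginal distribution of the degradation path. This is only a sketch (it ignores first-passage excursions above the threshold before time t, and the parameter values are invented):

```python
import math

def wiener_reliability(drift, sigma, threshold, t):
    """R(t) = P(X(t) < threshold) for a Wiener degradation process
    X(t) = drift*t + sigma*B(t), i.e. X(t) ~ N(drift*t, sigma^2 * t)."""
    z = (threshold - drift * t) / (sigma * math.sqrt(t))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(round(wiener_reliability(drift=0.1, sigma=0.3, threshold=2.0, t=10.0), 3))
```

An estimation study like the paper's would fit drift and sigma from measured degradation increments and then evaluate this function; accounting for first passage replaces the normal CDF with the inverse Gaussian distribution.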
Chu, Tsong-Lun [Brookhaven National Lab. (BNL), Upton, NY (United States); Varuttamaseni, Athi [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, Joo-Seok [Brookhaven National Lab. (BNL), Upton, NY (United States)
2016-11-01
The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
Statistical analysis of the reliability of complex systems for maintenance planning
Pedersen, Thomas Espelund
2003-01-01
is to analyze failure and maintenance data using mathematical and statistical models in order to improve maintenance procedures in the Danish Defence. The first part of the report introduces the maintenance planning problem and presents an overview of models for reliability, failure processes, and maintenance...... planning. This overview is structured to highlight the process of choosing a proper model for a given data set, focusing on different measures of time and the data requirements for the different models. The second part of the report describes the analysis of two data sets from the Danish Defence. The data...
Statistical reliability and path diversity based PageRank algorithm improvements
Hong, Dohy
2012-01-01
In this paper we present new ideas for improving the original PageRank algorithm. The first is to introduce an evaluation of the statistical reliability of the ranking score of each node based on local graph properties, and the second is to introduce the notion of path diversity. Path diversity can be exploited to dynamically modify the increment value of each node in the random-surfer model or to dynamically adapt the damping factor. We illustrate the impact of such modifications through examples and simple simulations.
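The per-node damping adaptation can be sketched on top of plain power-iteration PageRank. The reliability and path-diversity scores themselves are not reproduced; here the damping vector happens to be uniform, but any per-node values could be supplied:

```python
def pagerank(links, damping, n_iter=100):
    """Power-iteration PageRank where each node carries its own damping
    factor (a sketch of the per-node adaptation idea)."""
    nodes = sorted(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(n_iter):
        new = {u: (1.0 - damping[u]) / n for u in nodes}
        for u in nodes:
            for v in links[u]:                     # distribute u's score
                new[v] += damping[u] * rank[u] / len(links[u])
        rank = new
    return rank

links = {"a": ["b"], "b": ["c"], "c": ["a", "b"]}
damping = {u: 0.85 for u in links}   # uniform here; could vary per node
r = pagerank(links, damping)
print(abs(sum(r.values()) - 1.0) < 1e-6)
```

With uniform damping and no dangling nodes the scores stay normalized; genuinely node-dependent damping would require renormalizing after each sweep.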
Nielsen, Martin Krarup; Vidyashankar, Anand N.; Hanlon, Bret
A statistical model was therefore developed for analysis of FECRT data from multiple farms. Horse age, gender, zip code and pre-treatment egg count were incorporated into the model. Horses and farms were kept as random effects. Resistance classifications were based on model-based 95% lower confidence limit (LCL) values of predicted mean efficacies, and cutoff values were justified statistically. The model was used to evaluate the efficacy of pyrantel embonate paste on 64 Danish horse farms. Of 1644 horses, 614 had egg counts > 200 eggs per gram (EPG) and were treated. The cutoff LCL values used for classifying … arithmetic calculations classified nine farms (14.1%) as resistant and 11 farms (17.2%) as suspect resistant. Using 10000 Monte Carlo simulated data sets, our methodology provides a reliable classification of farms into different resistance categories with a false discovery rate of 1.02%. The methodology …
Time-Variant Reliability Analysis of FPSO Hull Girder Considering Corrosion Based on Statistics
ZHANG Dao-kun; TANG Wen-yong; ZHANG Sheng-kun
2007-01-01
An FPSO is an important production platform in the offshore oil and gas industry, with the unique characteristic of being moored at sea for long periods. Since it cannot be inspected and maintained thoroughly in dock like other ships, the reliability of the FPSO hull girder over the whole service life deserves particular attention. Based on a recent corrosion database and a rational corrosion model, the ultimate strength of one FPSO is calculated under slight, moderate and severe corrosion conditions. The results not only provide the reliability under different corrosion conditions but also support further inspection and maintenance research, giving a necessary foundation for deciding inspection intervals and maintenance measures and thus helping to improve the general safety level of ocean engineering.
An Exercise in Exploring Big Data for Producing Reliable Statistical Information.
Rey-Del-Castillo, Pilar; Cardeñosa, Jesús
2016-06-01
The availability of copious data about many human, social, and economic phenomena is considered an opportunity for the production of official statistics. National statistical organizations and other institutions are more and more involved in new projects for developing what is sometimes seen as a possible change of paradigm in the way statistical figures are produced. Nevertheless, there are hardly any systems in production using Big Data sources. Arguments of confidentiality, data ownership, representativeness, and others make it a difficult task to get results in the short term. Using Call Detail Records from Ivory Coast as an illustration, this article shows some of the issues that must be dealt with when producing statistical indicators from Big Data sources. A proposal of a graphical method to evaluate one specific aspect of the quality of the computed figures is also presented, demonstrating that the visual insight provided improves the results obtained using other traditional procedures.
MING Zhimao; TAO Junyong; ZHANG Yunan; YI Xiaoshan; CHEN Xun
2009-01-01
New armament systems must deal with multi-stage reliability-growth statistical problems across diverse populations in order to improve reliability before mass production begins. Because testing during complex-system development is expensive and sample sizes are small, this paper studies specific methods for processing Bayesian reliability-growth statistical information from diverse populations. Firstly, according to the characteristics of reliability growth during product development, the Bayesian method is used to integrate multi-stage test information with the order relations of the distribution parameters. A Gamma-Beta prior distribution is then proposed, based on the non-homogeneous Poisson process (NHPP) corresponding to the reliability-growth process. The posterior distribution of the reliability parameters is obtained for the different product stages, and the reliability parameters are evaluated from the posterior distribution. Finally, the proposed Bayesian approach for multi-stage reliability-growth testing is applied to a small-sample test process in the astronautics field. The results of a numerical example show that the presented model can synthesize diverse information and pave the way for applying the Bayesian model to multi-stage reliability-growth test evaluation with small sample sizes. The method is useful for evaluating multi-stage system reliability and for making reliability-growth plans rationally.
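The paper's Gamma-Beta/NHPP machinery is more elaborate than fits in a snippet, but the conjugate Gamma-Poisson update that underlies such Bayesian reliability models can be sketched. The prior parameters and test data below are invented for illustration:

```python
def gamma_poisson_update(a, b, failures, test_time):
    """Conjugate Bayesian update: Gamma(a, b) prior on a constant failure
    intensity lambda (failures per hour), Poisson likelihood for the observed
    failure count. Posterior is Gamma(a + failures, b + test_time)."""
    return a + failures, b + test_time

# invented prior (mean a/b = 2/100 = 0.02 failures/hour) and one test stage
a_post, b_post = gamma_poisson_update(2.0, 100.0, failures=1, test_time=400.0)
posterior_mean = a_post / b_post   # updated estimate of the failure intensity
```

Chaining such updates stage by stage, with the prior of each stage informed by the posterior of the previous one, is the basic idea behind multi-stage Bayesian reliability-growth evaluation with small samples.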
Chu, Tsong-Lun [Brookhaven National Lab. (BNL), Upton, NY (United States); Varuttamaseni, Athi [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, Joo-Seok [Brookhaven National Lab. (BNL), Upton, NY (United States)
2016-11-01
The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability.
Amin Salem
2016-03-01
The present investigation provides a detailed relationship between powder composition and the reliability of random ceramic beds. This evaluation is important both for service in liquid-gas contactors and for predicting lifetime. It is still unclear whether the normal distribution is the most suitable function for estimating failure. With the growing application of ceramic beds in chemical plants, special attention has been paid to screening strength distributions. To this end, an experimental-theoretical study of the compressive strength distribution is presented. The powder compositions were prepared according to statistical response surface methodology and then formed into Raschig rings by single-screw extrusion. The compressive strength of the specimens was measured, and the strength data sets were evaluated with normal and Weibull distributions. The results were analyzed by the Akaike information criterion and the Anderson-Darling test, and the accuracy of the distributions in predicting fracture was discussed.
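A hedged sketch of the kind of comparison the abstract describes: fit normal and Weibull distributions to strength data by maximum likelihood and compare their AIC values (lower is better). The data here are simulated, and the Weibull shape is found by bisection on the standard MLE estimating equation; the paper's exact fitting procedure is not specified.

```python
import math, random

def normal_aic(x):
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n                # MLE variance
    loglik = -0.5 * n * (math.log(2 * math.pi * var) + 1)
    return 2 * 2 - 2 * loglik                              # AIC, 2 parameters

def weibull_aic(x):
    n = len(x)
    logs = [math.log(v) for v in x]
    mlog = sum(logs) / n

    def g(k):  # MLE estimating equation for the shape k (increasing in k)
        sk = sum(v ** k for v in x)
        skl = sum(v ** k * lv for v, lv in zip(x, logs))
        return skl / sk - 1.0 / k - mlog

    lo, hi = 0.05, 50.0
    for _ in range(100):                                   # bisection for the root
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if g(mid) > 0 else (mid, hi)
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / n) ** (1.0 / k)        # scale MLE given k
    loglik = (n * math.log(k) - n * k * math.log(lam)
              + (k - 1) * sum(logs) - sum((v / lam) ** k for v in x))
    return 2 * 2 - 2 * loglik, k, lam                      # AIC, 2 parameters

random.seed(1)
# simulated strengths from a Weibull(scale=1.0, shape=1.5), stand-in data only
strengths = [random.weibullvariate(1.0, 1.5) for _ in range(300)]
aic_w, k_hat, lam_hat = weibull_aic(strengths)
aic_n = normal_aic(strengths)
```

For skewed Weibull-like data the Weibull AIC comes out lower; in practice one would add the Anderson-Darling test, as the paper does, since AIC alone ranks models without checking absolute goodness of fit.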
Rafdzah Zaki
2013-06-01
Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify the statistical methods used to measure the reliability of equipment measuring continuous variables. The study also aims to highlight inappropriate statistical methods used in reliability analyses and their implications for medical practice. Materials and Methods: In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified; 282 were potentially related, and 42 met the inclusion criteria. Results: The intra-class correlation coefficient (ICC) was the most popular method, used in 25 (60%) studies, followed by comparison of means (8 studies, 19%). Of the 25 studies using the ICC, only 7 (28%) reported confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: The intra-class correlation coefficient is the most popular method for assessing the reliability of medical instruments measuring continuous outcomes, but some studies apply and interpret statistical methods inappropriately. Medical researchers should be aware of this issue and perform reliability analyses correctly.
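For readers unfamiliar with the review's most common method, a minimal ICC(2,1) computation (two-way random effects, absolute agreement, single rater) can be sketched from the standard ANOVA mean squares; the rating table below is invented:

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: list of rows (subjects), each a list of scores (one per rater)."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    smeans = [sum(row) / k for row in ratings]                    # subject means
    rmeans = [sum(row[j] for row in ratings) / n for j in range(k)]  # rater means
    msr = k * sum((m - grand) ** 2 for m in smeans) / (n - 1)     # between subjects
    msc = n * sum((m - grand) ** 2 for m in rmeans) / (k - 1)     # between raters
    sse = sum((ratings[i][j] - smeans[i] - rmeans[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                               # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

table = [[9, 10], [8, 7], [4, 5], [6, 6]]   # invented: 4 subjects x 2 raters
icc = icc_2_1(table)
```

The review's point about reporting the ICC type matters here: ICC(1,1), ICC(2,1), and ICC(3,1) use different denominators and can give noticeably different values on the same table.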
Gearbox Reliability Collaborative Update (Presentation)
Sheng, S.; Keller, J.; Glinsky, C.
2013-10-01
This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.
Statistics and Analysis on Reliability of HVDC Transmission Systems of SGCC
Xu
2010-01-01
Reliability level of HVDC power transmission systems becomes an important factor impacting the entire power grid. The author analyzes the reliability of HVDC power transmission systems owned by SGCC since 2003 in respect of forced outage times, forced energy unavailability, scheduled energy unavailability and energy utilization efficiency. The results show that the reliability level of HVDC power transmission systems owned by SGCC is improving. By analyzing different reliability indices of HVDC power transmission system, the maximum asset benefits of power grid can be achieved through building a scientific and reasonable reliability evaluation system.
Crossett, Ben; Edwards, Alistair V G; White, Melanie Y; Cordwell, Stuart J
2008-01-01
Standardized methods for the solubilization of proteins prior to proteomics analyses incorporating two-dimensional gel electrophoresis (2-DE) are essential for providing reproducible data that can be subjected to rigorous statistical interrogation for comparative studies investigating disease-genesis. In this chapter, we discuss the imaging and image analysis of proteins separated by 2-DE, in the context of determining protein abundance alterations related to a change in biochemical or biophysical conditions. We then describe the principles behind 2-DE gel statistical analysis, including subtraction of background noise, spot detection, gel matching, spot quantitation for data comparison, and statistical requirements to create meaningful gel data sets. We also emphasize the need to develop reproducible and robust protocols for protein sample preparation and 2-DE itself.
Kimura, Y. [Positron Medical Center, Tokyo Metropolitan Institute of Gerontology, Naka, Itabashi, Tokyo (Japan); Senda, M. [Foundation for Biomedical Research and Innovation, 7F Chamber of Commerce, Minatojima-Nakamachi, Chuo, Kobe (Japan); Alpert, N.M. [Division of Nuclear Medicine, Massachusetts General Hospital, Boston, MA (United States)]. E-mail: alpert@pet.mgh.harvard.edu
2002-02-07
Formation of parametric images requires voxel-by-voxel estimation of rate constants, a process sensitive to noise and computationally demanding. A model-based clustering method for a two-parameter model (CAKS) was extended to the FDG three-parameter model. The concept was to average voxels with similar kinetic signatures to reduce noise. Voxel kinetics were categorized by the first two principal components of the tissue time-activity curves for all voxels. k2 and k3 were estimated cluster-by-cluster, and K1 was estimated voxel-by-voxel within clusters. When CAKS was applied to simulated images with noise levels similar to brain FDG scans, estimation bias was well suppressed, and estimation errors were substantially smaller (1.3 times for Ki and 1.5 times for k3) than those of conventional voxel-based estimation. The statistical reliability of voxel-level estimation by CAKS was comparable with ROI analysis including 100 voxels. CAKS was applied to clinical cases with Alzheimer's disease (ALZ) and corticobasal degeneration (CBD). In ALZ, the affected regions had low Ki (= K1*k3/(k2+k3)) and low k3. In CBD, Ki was low, but k3 was preserved. These results were consistent with ROI-based kinetic analysis. Because CAKS decreased the number of invoked estimations, the calculation time was reduced substantially. In conclusion, CAKS has been extended to allow parametric imaging of a three-compartment model. The method is computationally efficient, with low bias and excellent noise properties. (author)
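The macro-parameter quoted in the abstract, Ki = K1*k3/(k2+k3), is a one-line computation once the micro-parameters are estimated; the rate-constant values below are illustrative only, not taken from the paper:

```python
def net_uptake_ki(K1, k2, k3):
    """Ki = K1*k3/(k2 + k3): net FDG uptake rate constant of the
    irreversible three-parameter compartment model (units follow K1)."""
    return K1 * k3 / (k2 + k3)

ki_preserved = net_uptake_ki(0.10, 0.12, 0.06)  # illustrative rate constants
ki_low_k3 = net_uptake_ki(0.10, 0.12, 0.03)     # reduced k3, as in ALZ regions
```

This also shows why the ALZ/CBD contrast is informative: Ki can be low either because k3 is low (ALZ pattern) or because delivery-related terms change while k3 is preserved (CBD pattern), so reporting both Ki and k3 separates the two cases.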
Providing peak river flow statistics and forecasting in the Niger River basin
Andersson, Jafet C. M.; Ali, Abdou; Arheimer, Berit; Gustafsson, David; Minoungou, Bernard
2017-08-01
Flooding is a growing concern in West Africa. Improved quantification of discharge extremes and associated uncertainties is needed to improve infrastructure design, and operational forecasting is needed to provide timely warnings. In this study, we use discharge observations, a hydrological model (Niger-HYPE) and extreme value analysis to estimate peak river flow statistics (e.g. the discharge magnitude with a 100-year return period) across the Niger River basin. To test the model's capacity to predict peak flows, we compared 30-year maximum discharge and peak flow statistics derived from the model with those derived from nine observation stations. The results indicate that the model simulates peak discharge reasonably well (on average +20%). However, the peak flow statistics have a large uncertainty range, which ought to be considered in infrastructure design. We then applied the methodology to derive basin-wide maps of peak flow statistics and their associated uncertainty. The results indicate that the method is applicable across the hydrologically active part of the river basin, and that the uncertainty varies substantially depending on location. Subsequently, we used the most recent bias-corrected climate projections to analyze potential changes in peak flow statistics in a changed climate. The results are generally ambiguous, with consistent changes only in very few areas. To test the forecasting capacity, we ran Niger-HYPE with a combination of meteorological data sets for the 2008 high-flow season and compared the results with observations. The results indicate reasonable forecasting capacity (on average 17% deviation), but additional years should also be evaluated. We finish by presenting a strategy and pilot project to develop an operational flood monitoring and forecasting system based on in-situ data, earth observations, modelling, and extreme statistics. In this way we aim to build capacity to ultimately improve resilience toward floods, protecting lives and …
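The abstract does not state which extreme-value procedure was used; as one common, simple choice, a Gumbel distribution fitted to annual maxima by the method of moments yields return levels such as the 100-year discharge. A sketch with invented annual maxima:

```python
import math

def gumbel_return_level(annual_maxima, T):
    """T-year return level from a Gumbel fit by the method of moments.
    One common simple choice; the paper's exact procedure is not specified."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi     # Gumbel scale
    mu = mean - 0.5772 * beta               # location (Euler-Mascheroni constant)
    p = 1 - 1 / T                           # annual non-exceedance probability
    return mu - beta * math.log(-math.log(p))

maxima = [1200, 950, 1100, 1300, 1050, 990, 1250, 1150, 1020, 1400]  # invented, m3/s
q100 = gumbel_return_level(maxima, 100)     # 100-year return level
```

The large uncertainty the authors stress is visible here too: with only a decade of maxima, the 100-year level is an extrapolation far beyond the data, so confidence intervals (e.g. from bootstrapping) should accompany any such estimate.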
From eggs to bites: do ovitrap data provide reliable estimates of Aedes albopictus biting females?
Mattia Manica
2017-03-01
Values obtained by introducing these estimates in risk models were similar to those based on females/HLC (R0 > 1 in 86% and 40% of sampling dates for Chikungunya and Zika, respectively). An epidemiological risk (R0 > 1) for Chikungunya is also to be expected when few/no eggs/day are collected by ovitraps. This work provides the first evidence of the possibility to predict the mean number of adult biting Ae. albopictus females from the mean number of eggs, and to compute the threshold of eggs/ovitrap associated with epidemiological risk of arbovirus transmission in the study area. Overall, however, the large confidence intervals in the model predictions represent a caveat regarding the reliability of monitoring schemes based exclusively on ovitrap collections to estimate numbers of biting females and plan control interventions.
A New Approach to Provide Reliable Data Systems Without Using Space-Qualified Electronic Components
Häbel, W.
This paper describes the present situation and the expected trends with regard to the availability of electronic components, their quality levels, technology trends and sensitivity to the space environment. Many recognized vendors have already discontinued their MIL production lines, and state-of-the-art components will in many cases not be offered at this quality level because of the shrinking market. It therefore becomes obvious that new methods need to be considered for how to build reliable data systems for space applications without high-reliability parts. One of the most promising approaches is the identification, masking and suppression of faults by developing fault-tolerant computer systems, which is described in this paper.
Markerless motion capture can provide reliable 3D gait kinematics in the sagittal and frontal plane.
Sandau, Martin; Koblauch, Henrik; Moeslund, Thomas B; Aanæs, Henrik; Alkjær, Tine; Simonsen, Erik B
2014-09-01
Estimating 3D joint rotations in the lower extremities accurately and reliably remains unresolved in markerless motion capture, despite extensive studies in the past decades. The main problems have been ascribed to the limited accuracy of the 3D reconstructions. Accordingly, the purpose of the present study was to develop a new approach based on highly detailed 3D reconstructions in combination with a translational and rotational unconstrained articulated model. The highly detailed 3D reconstructions were synthesized from an eight camera setup using a stereo vision approach. The subject specific articulated model was generated with three rotational and three translational degrees of freedom for each limb segment and without any constraints to the range of motion. This approach was tested on 3D gait analysis and compared to a marker based method. The experiment included ten healthy subjects in whom hip, knee and ankle joint were analysed. Flexion/extension angles as well as hip abduction/adduction closely resembled those obtained from the marker based system. However, the internal/external rotations, knee abduction/adduction and ankle inversion/eversion were less reliable.
Bohlin, J; Skjerve, E; Ussery, David
2008-01-01
BACKGROUND: The increasing number of sequenced prokaryotic genomes contains a wealth of genomic data that needs to be effectively analysed. A set of statistical tools exists for such analysis, but their strengths and weaknesses have not been fully explored. The statistical methods we are concerned with … or be based on specific statistical distributions. Advantages with these statistical methods include measurements of phylogenetic relationship with relatively small pieces of DNA sampled from almost anywhere within genomes, detection of foreign/conserved DNA, and homology searches. Our aim was to explore … measure was a good measure to detect horizontally transferred regions, and when used to compare the phylogenetic relationships between plasmids and hosts, significant correlation (R2 = 0.4) was found with genomic GC content and intra-chromosomal homogeneity. CONCLUSION: The statistical methods examined …
Spelten, Oliver; Fiedler, Fritz; Schier, Robert; Wetsch, Wolfgang A; Hinkelbein, Jochen
2017-02-01
Hyper- or hypoventilation may have serious clinical consequences in critically ill patients and should generally be avoided, especially in neurosurgical patients. Therefore, monitoring of carbon dioxide partial pressure by intermittent arterial blood gas analysis (PaCO2) has become standard in intensive care units (ICUs). However, several additional methods are available to determine PCO2, including end-tidal (PETCO2) and transcutaneous (PTCCO2) measurements. The aim of this study was to compare the accuracy and reliability of different methods to determine PCO2 in mechanically ventilated patients in the ICU. After approval of the local ethics committee, PCO2 was determined in n = 32 consecutive ICU patients requiring mechanical ventilation: (1) arterial PaCO2 blood gas analysis with a Radiometer ABL 625 (ABL; gold standard), (2) arterial PaCO2 analysis with an Immediate Response Mobile Analyzer (IRMA), (3) end-tidal PETCO2 by a Propaq 106 EL monitor and (4) transcutaneous PTCCO2 determination by a Tina TCM4. The Bland-Altman method was used for statistical analysis (p …). The analysis revealed good correlation between PaCO2 by IRMA and ABL (R2 = 0.766; p …), with a bias and precision of 2.0 ± 3.7 mmHg for the IRMA, 2.2 ± 5.7 mmHg for transcutaneous, and -5.5 ± 5.6 mmHg for end-tidal measurement. Arterial CO2 partial pressure by IRMA (PaCO2) and PTCCO2 provided greater accuracy relative to the reference measurement (ABL) than the end-tidal CO2 measurements in critically ill, mechanically ventilated patients.
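The Bland-Altman statistics reported above (bias ± precision, plus limits of agreement) reduce to the mean and standard deviation of the paired differences; a minimal sketch with invented paired readings:

```python
def bland_altman(a, b):
    """Bias (mean difference), precision (SD of differences), and 95%
    limits of agreement between two paired measurement methods."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

# invented paired PaCO2 readings (mmHg): reference device vs test device
bias, sd, (lo, hi) = bland_altman([40, 42, 44, 46], [38, 41, 43, 44])
```

This is why Bland-Altman analysis, rather than correlation alone, is the right tool here: two devices can correlate almost perfectly yet disagree by a clinically relevant constant or proportional offset, which the bias and limits of agreement expose directly.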
Williamson, Laura D; Brookes, Kate L; Scott, Beth E; Graham, Isla M; Bradbury, Gareth; Hammond, Philip S; Thompson, Paul M; McPherson, Jana
2016-01-01
...‐based visual surveys. Surveys of cetaceans using acoustic loggers or digital cameras provide alternative methods to estimate relative density that have the potential to reduce cost and provide a verifiable record of all detections...
Thompson, Bruce; Snyder, Patricia A.
1998-01-01
Investigates two aspects of research analyses in quantitative research studies reported in the 1996 issues of "Journal of Counseling & Development" (JCD). Acceptable methodological practice regarding significance testing and evaluation of score reliability has evolved considerably. Contemporary thinking on these issues is described; practice as…
Wolforth, Joan
2012-01-01
This paper discusses issues regarding the validity and reliability of psychoeducational assessments provided to Disability Services Offices at Canadian Universities. Several vignettes illustrate some current issues and the potential consequences when university students are given less than thorough disability evaluations and ascribed diagnoses.…
David Rogosa
2005-04-01
The body of this report consists of a fairly thorough effort to discredit the empirical assertions and methodological prescriptions of Kane and Staiger (KS). The four main sections of content that follow this (lengthy) Preamble are: Section 1, Accuracy of Group Summaries: Exact results are obtained for the accuracy of grade-level scores (for n=68), which are then compared with the reliability-style calculations reported in KS for North Carolina data. Also, accuracy properties of California API school-level scores are presented, and, to compare with KS assertions, the reliability coefficients for these scores are calculated. KS find high volatility even when accuracy is very good, and KS find extreme absence of volatility even when accuracy is moderate to poor. Section 2, Accuracy of Improvement: Precision of improvement is contrasted with KS-style reliability of improvement. Analytic and empirical examples for accuracy of improvement reinforce the basic message: reliability is not precision. Most importantly, precision, which is what matters, can be low while reliability is still high, and vice versa. Also, school-level California API data display no relation between amount of improvement and uncertainty in the scores (Figures 2.1-2.3), refuting a key KS assertion about school size. Section 3, Persistence of Change: The KS correlation of consecutive changes, and thus the KS estimate of "proportion of variance in changes due to nonpersistent factors", is shown to be a function of the reliability of the difference score. KS determinations of persistence of change are shown to be without value in accountability systems. Common-sense definitions of consistency of improvement and empirical demonstrations using artificial data are presented. Section 4, California Academic Performance Index Award Programs: Discussion of appropriate methods for describing the properties of Award Programs (e.g., determinations of false positives and false negatives) is contrasted with the incorrect …
Livers provide a reliable matrix for real-time PCR confirmation of avian botulism.
Le Maréchal, Caroline; Ballan, Valentine; Rouxel, Sandra; Bayon-Auboyer, Marie-Hélène; Baudouard, Marie-Agnès; Morvan, Hervé; Houard, Emmanuelle; Poëzevara, Typhaine; Souillard, Rozenn; Woudstra, Cédric; Le Bouquin, Sophie; Fach, Patrick; Chemaly, Marianne
2016-04-01
Diagnosis of avian botulism is based on clinical symptoms, which are indicative but not specific. Laboratory investigations are therefore required to confirm clinical suspicions and establish a definitive diagnosis. Real-time PCR methods have recently been developed for the detection of Clostridium botulinum group III producing type C, D, C/D or D/C toxins. However, no study has been conducted to determine which types of matrices should be analyzed for laboratory confirmation using this approach. This study reports on the comparison of different matrices (pooled intestinal contents, livers, spleens and cloacal swabs) for PCR detection of C. botulinum. Between 2013 and 2015, 63 avian botulism suspicions were tested and 37 were confirmed as botulism. Analysis of livers using real-time PCR after enrichment led to the confirmation of 97% of the botulism outbreaks. Using the same method, spleens led to the confirmation of 90% of botulism outbreaks, cloacal swabs of 93% and pooled intestinal contents of 46%. Liver appears to be the most reliable type of matrix for laboratory confirmation using real-time PCR analysis.
Sørensen, John Dalsgaard; Enevoldsen, I.
1993-01-01
… a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical application the behaviour is probably rather infrequent. A simple example is shown as illustration and to exemplify that for second-order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.
Kozine, Igor; Krymsky, V.G.
2009-01-01
The application of interval-valued statistical models is often hindered by the rapid growth in imprecision that occurs when intervals are propagated through models. Is this deficiency inherent in the models? If so, what is the underlying cause of imprecision in mathematical terms? What kind of additional information can be incorporated to make the bounds tighter? The present paper gives an account of the source of this imprecision that prevents interval-valued statistical models from being widely applied. Firstly, the mathematical approach to building interval-valued models (discrete …
Pearson, E.; Smith, M. W.; Klaar, M. J.; Brown, L. E.
2017-09-01
High resolution topographic surveys such as those provided by Structure-from-Motion (SfM) contain a wealth of information that is not always exploited in the generation of Digital Elevation Models (DEMs). In particular, several authors have related sub-metre scale topographic variability (or 'surface roughness') to sediment grain size by deriving empirical relationships between the two. In fluvial applications, such relationships permit rapid analysis of the spatial distribution of grain size over entire river reaches, providing improved data to drive three-dimensional hydraulic models, allowing rapid geomorphic monitoring of sub-reach river restoration projects, and enabling more robust characterisation of riverbed habitats. However, comparison of previously published roughness-grain-size relationships shows substantial variability between field sites. Using a combination of over 300 laboratory and field-based SfM surveys, we demonstrate the influence of inherent survey error, irregularity of natural gravels, particle shape, grain packing structure, sorting, and form roughness on roughness-grain-size relationships. Roughness analysis from SfM datasets can accurately predict the diameter of smooth hemispheres, though natural, irregular gravels result in a higher roughness value for a given diameter and different grain shapes yield different relationships. A suite of empirical relationships is presented as a decision tree which improves predictions of grain size. By accounting for differences in patch facies, large improvements in D50 prediction are possible. SfM is capable of providing accurate grain size estimates, although further refinement is needed for poorly sorted gravel patches, for which c-axis percentiles are better predicted than b-axis percentiles.
Tomislav Car
1989-12-01
The statistical approach is one of the methods offering quantitative indications of structural safety, and it provides a basic techno-economic criterion for the application of structures. The basis of this approach is the determination of the probability of error occurrence in different interactive relationships between natural forces and structural properties. The results of this analysis form the basis for a quantitative definition of the techno-economic optimum for the application of structures and systems (the paper is published in Croatian).
A statistical test on the reliability of the non-coevality of stars in binary systems
Valle, G; Moroni, P G Prada; Degl'Innocenti, S
2016-01-01
We develop a statistical test on the expected difference in age estimates of two coeval stars in detached double-lined eclipsing binary systems caused solely by observational uncertainties. We focus on stars in the mass range [0.8; 1.6] Msun and in the main-sequence phase. The ages were obtained by means of the maximum-likelihood SCEPtER technique. The observational constraints used in the recovery procedure are stellar mass, radius, effective temperature, and metallicity [Fe/H]. We defined the statistic W, computed as the ratio of the absolute difference of the estimated ages of the two stars over the age of the older one, and determined the critical values of this statistic above which coevality can be rejected. The median expected difference in the reconstructed ages of the coeval stars of a binary system -- caused by observational uncertainties alone -- shows a strong dependence on the evolutionary stage. This ranges from about 20% for an evolved primary star to about 75% for a near Z...
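The W statistic defined in the abstract reduces to a simple ratio. A minimal sketch (the function name is illustrative; in the study the ages would come from the SCEPtER recovery described above):

```python
def coevality_statistic(age1, age2):
    """W = |age1 - age2| / max(age1, age2): relative discrepancy between
    the estimated ages of the two components of a presumed-coeval binary.
    Coevality is rejected when W exceeds the tabulated critical value."""
    older = max(age1, age2)
    return abs(age1 - age2) / older

# Example: a 4.0 Gyr primary recovered alongside a 3.0 Gyr secondary
w = coevality_statistic(4.0, 3.0)
```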
Matusiewicz, Alexis K; Carter, Anne E; Landes, Reid D; Yi, Richard
2013-11-01
Delay discounting (DD) and probability discounting (PD) refer to the reduction in the subjective value of outcomes as a function of delay and uncertainty, respectively. Elevated measures of discounting are associated with a variety of maladaptive behaviors, and confidence in the validity of these measures is imperative. The present research examined (1) the statistical equivalence of discounting measures when rewards were hypothetical or real, and (2) their 1-week reliability. While previous research has partially explored these issues using the low threshold of nonsignificant difference, the present study fully addressed this issue using the more compelling threshold of statistical equivalence. DD and PD measures were collected from 28 healthy adults using real and hypothetical $50 rewards during each of two experimental sessions, one week apart. Analyses using area-under-the-curve measures revealed a general pattern of statistical equivalence, indicating equivalence of real/hypothetical conditions as well as 1-week reliability. Exceptions are identified and discussed.
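Area under the curve for discounting data is commonly computed from normalized indifference points via trapezoids (the Myerson, Green, and Warusawitharana approach). The sketch below is a generic illustration under assumed data, not the authors' exact pipeline:

```python
def discounting_auc(delays, indiff_points, max_amount):
    """Area under the discounting curve: x = delay / max delay,
    y = indifference point / reward amount; summed trapezoids.
    AUC near 1 indicates shallow discounting, near 0 steep discounting."""
    max_delay = max(delays)
    xs = [d / max_delay for d in delays]
    ys = [v / max_amount for v in indiff_points]
    auc = 0.0
    for i in range(1, len(xs)):
        auc += (xs[i] - xs[i - 1]) * (ys[i] + ys[i - 1]) / 2.0
    return auc

# Hypothetical $50 reward: value falls from $50 now to $0 at the longest delay
auc = discounting_auc([0, 10, 100], [50, 25, 0], 50)
```

By convention the delay-0 point (full value) is included so that a non-discounter scores exactly 1.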
Improving the statistical reliability of stream heat assimilation prediction. Final report
McLay, R.W.; Hundal, M.S.; Lamborn, K.R.
1975-06-01
In response to increased public interest in water quality, a large effort has been mounted to develop mathematical models for predicting heat assimilation in bodies of water. The accuracy of these models has recently come under scrutiny due to the need for temperature predictions within 1 °C of ambient. This work is an evaluation of existing one-dimensional stream temperature prediction techniques for accuracy and precision. The approach is through error estimates on a general model that encompasses all of the models presently used. A sensitivity analysis of this general model is used in conjunction with statistical methods to determine the solution errors. (GRA)
The reliability of fishing statistics as a source for catches and fish stocks in antiquity
Jacobsen, A. Lif Lund
2005-01-01
In 1985, T.W. Gallant published an influential essay on the potential productivity of fishing in the ancient world. He concluded that: "the role of fishing in the diet and economy would have been, on the whole, subordinate and supplementary…" His methodological approach was original in using modern fishery data to estimate the productivity of ancient fisheries. Unfortunately his work suffered from several severe misunderstandings about ecosystems, the nature of a fishery and its biological interaction with its environment. The purpose of this paper is to discuss the statistical background for Gallant's conclusions about fishery and the usefulness of modern catch data for historical fishery research. In order to do so, the author adopts the viewpoint of marine-environmental history, with some reference to other authors' work on ancient fisheries.
Harrou, Fouzi
2017-09-18
This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of the one-diode model and those of the univariate and multivariate exponentially weighted moving average (EWMA) charts to better detect faults. Specifically, we generate the array's residuals of current, voltage and power using measured temperature and irradiance. These residuals capture the difference between the measurements and the maximum power point (MPP) predictions of current, voltage and power from the one-diode model, and we use them as fault indicators. Then, we apply the multivariate EWMA (MEWMA) monitoring chart to the residuals to detect faults. However, a MEWMA scheme cannot identify the type of fault. Once a fault is detected in the MEWMA chart, the univariate EWMA chart based on the current and voltage indicators is used to identify the type of fault (e.g., short-circuit, open-circuit and shading faults). We applied this strategy to real data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria. Results show the capacity of the proposed strategy to monitor the DC side of PV systems and detect partial shading.
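A univariate EWMA monitoring chart of the kind applied to the current and voltage residuals can be sketched as follows; the smoothing weight `lam` and control-limit width `L` are illustrative defaults, not the study's tuned values:

```python
import math

def ewma_chart(residuals, lam=0.2, L=3.0, sigma=1.0, mu=0.0):
    """Exponentially weighted moving average chart.
    z_t = lam * x_t + (1 - lam) * z_{t-1}, started at the target mu.
    Sample t is flagged when |z_t - mu| exceeds L * sigma_z(t),
    using the exact time-varying standard deviation of the EWMA statistic."""
    z = mu
    alarms = []
    for t, x in enumerate(residuals, start=1):
        z = lam * x + (1 - lam) * z
        sigma_z = sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(z - mu) > L * sigma_z:
            alarms.append(t)
    return alarms

# In-control residuals raise no alarm; a sustained 3-sigma shift is caught quickly
alarms = ewma_chart([0.0] * 10 + [3.0] * 10)
```

The small default `lam` gives the chart memory, which is what makes EWMA sensitive to the small, persistent shifts typical of PV degradation and shading faults.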
Statistical Physics Methods Provide the Exact Solution to a Long-Standing Problem of Genetics.
Samal, Areejit; Martin, Olivier C
2015-06-12
Analytic and computational methods developed within statistical physics have found applications in numerous disciplines. In this Letter, we use such methods to solve a long-standing problem in statistical genetics. The problem, posed by Haldane and Waddington [Genetics 16, 357 (1931)], concerns so-called recombinant inbred lines (RILs) produced by repeated inbreeding. Haldane and Waddington derived the probabilities of RILs when considering two and three genes but the case of four or more genes has remained elusive. Our solution uses two probabilistic frameworks relatively unknown outside of physics: Glauber's formula and self-consistent equations of the Schwinger-Dyson type. Surprisingly, this combination of statistical formalisms unveils the exact probabilities of RILs for any number of genes. Extensions of the framework may have applications in population genetics and beyond.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Accounting providing of statistical analysis of intangible assets renewal under marketing strategy
I.R. Polishchuk
2016-12-01
The article analyzes the content of the Regulations on accounting policies of the surveyed enterprises in terms of operations concerning the amortization of intangible assets against the following criteria: assessment on admission, determination of useful life, the period of depreciation, residual value, depreciation method, reflection in the financial statements, unit of account, revaluation, and formation of fair value. The factors affecting accounting policies and determining the mechanism for evaluating the completeness and timeliness of intangible assets renewal are characterized. An algorithm for selecting the method of intangible assets amortization is proposed. The knowledge base for statistical analysis of the timeliness and completeness of intangible assets renewal is expanded in terms of the developed internal reporting. Statistical indicators to assess the effectiveness of the amortization policy for intangible assets are proposed. Marketing strategies depending on the condition and amount of intangible assets, in relation to increasing marketing potential for continuity of economic activity, are described.
McShane, Blakeley B; 10.1214/10-AOAS398
2011-01-01
Predicting historic temperatures based on tree rings, ice cores, and other natural proxies is a difficult endeavor. The relationship between proxies and temperature is weak and the number of proxies is far larger than the number of target data points. Furthermore, the data contain complex spatial and temporal dependence structures which are not easily captured with simple models. In this paper, we assess the reliability of such reconstructions and their statistical significance against various null models. We find that the proxies do not predict temperature significantly better than random series generated independently of temperature. Furthermore, various model specifications that perform similarly at predicting temperature produce extremely different historical backcasts. Finally, the proxies seem unable to forecast the high levels of and sharp run-up in temperature in the 1990s either in-sample or from contiguous holdout blocks, thus casting doubt on their ability to predict such phenomena if in fact they ...
Assuring reliability program effectiveness.
Ball, L. W.
1973-01-01
An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Stern, David L; Clemens, Jan; Coen, Philip; Calhoun, Adam J; Hogenesch, John B; Arthur, Ben J; Murthy, Mala
2017-09-12
From 1980 to 1992, a series of influential papers reported on the discovery, genetics, and evolution of a periodic cycling of the interval between Drosophila male courtship song pulses. The molecular mechanisms underlying this periodicity were never described. To reinitiate investigation of this phenomenon, we previously performed automated segmentation of songs but failed to detect the proposed rhythm [Arthur BJ, et al. (2013) BMC Biol 11:11; Stern DL (2014) BMC Biol 12:38]. Kyriacou et al. [Kyriacou CP, et al. (2017) Proc Natl Acad Sci USA 114:1970-1975] report that we failed to detect song rhythms because (i) our flies did not sing enough and (ii) our segmenter did not identify many of the song pulses. Kyriacou et al. manually annotated a subset of our recordings and reported that two strains displayed rhythms with genotype-specific periodicity, in agreement with their original reports. We cannot replicate this finding and show that the manually annotated data, the original automatically segmented data, and a new dataset provide no evidence for either the existence of song rhythms or song periodicity differences between genotypes. Furthermore, we have reexamined our methods and analysis and find that our automated segmentation method was not biased to prevent detection of putative song periodicity. We conclude that there is no evidence for the existence of Drosophila courtship song rhythms.
Starke, Michael R [ORNL; Kirby, Brendan J [ORNL; Kueck, John D [ORNL; Todd, Duane [Alcoa; Caulfield, Michael [Alcoa; Helms, Brian [Alcoa
2009-02-01
Demand response is the largest underutilized reliability resource in North America. Historic demand response programs have focused on reducing overall electricity consumption (increasing efficiency) and shaving peaks but have not typically been used for immediate reliability response. Many of these programs have been successful but demand response remains a limited resource. The Federal Energy Regulatory Commission (FERC) report, 'Assessment of Demand Response and Advanced Metering' (FERC 2006) found that only five percent of customers are on some form of demand response program. Collectively they represent an estimated 37,000 MW of response potential. These programs reduce overall energy consumption, lower green house gas emissions by allowing fossil fuel generators to operate at increased efficiency and reduce stress on the power system during periods of peak loading. As the country continues to restructure energy markets with sophisticated marginal cost models that attempt to minimize total energy costs, the ability of demand response to create meaningful shifts in the supply and demand equations is critical to creating a sustainable and balanced economic response to energy issues. Restructured energy market prices are set by the cost of the next incremental unit of energy, so that as additional generation is brought into the market, the cost for the entire market increases. The benefit of demand response is that it reduces overall demand and shifts the entire market to a lower pricing level. This can be very effective in mitigating price volatility or scarcity pricing as the power system responds to changing demand schedules, loss of large generators, or loss of transmission. As a global producer of alumina, primary aluminum, and fabricated aluminum products, Alcoa Inc., has the capability to provide demand response services through its manufacturing facilities and uniquely through its aluminum smelting facilities. For a typical aluminum smelter
EMU test operation reliability statistics and evaluation
李庭芳; 李敬雅; 夏丹锋; 周汛
2013-01-01
This paper introduces the sources of EMU trial-operation reliability data and the definitions of fault grades and reliability indices. By carrying out reliability statistics and evaluation, the weak links of the product are identified; through closed-loop reliability management, the reliability targets are ultimately achieved.
Islam, M Mofizul; Topp, Libby; Conigrave, Katherine M; van Beek, Ingrid; Maher, Lisa; White, Ann; Rodgers, Craig; Day, Carolyn A
2012-01-01
Research with injecting drug users (IDUs) suggests greater willingness to report sensitive and stigmatised behaviour via audio computer-assisted self-interviewing (ACASI) methods than during face-to-face interviews (FFIs); however, previous studies were limited in verifying this within the same individuals at the same time point. This study examines the relative willingness of IDUs to report sensitive information via ACASI and during a face-to-face clinical assessment administered in health services for IDUs. During recruitment for a randomised controlled trial undertaken at two IDU-targeted health services, assessments were undertaken as per clinical protocols, followed by referral of eligible clients to the trial, in which baseline self-report data were collected via ACASI. Five questions about sensitive injecting and sexual risk behaviours were administered to participants during both clinical interviews and baseline research data collection. "Percentage agreement" determined the magnitude of concordance/discordance in responses across interview methods, while tests appropriate to data format assessed the statistical significance of this variation. Results for all five variables suggest that, relative to ACASI, FFI elicited responses that may be perceived as more socially desirable. Discordance was statistically significant for four of the five variables examined. Participants who reported a history of sex work were more likely to provide discordant responses to at least one socially sensitive item. In health services for IDUs, information collection via ACASI may elicit more reliable and valid responses than FFI. Adoption of a universal precautionary approach to complement individually tailored assessment of and advice regarding health risk behaviours for IDUs may address this issue.
Haebel, Wolfgang
2004-08-01
This paper describes the present situation and the expected trends with regard to the availability of electronic components, their quality levels, technology trends and sensitivity to the space environment. Many recognized vendors have already discontinued their MIL production line and state of the art components will in many cases not be offered in this quality level because of the shrinking market. It becomes therefore obvious that new methods need to be considered "How to build reliable Data Systems for space applications without High-Rel parts". One of the most promising approaches is the identification, masking and suppression of faults by developing fault-tolerant computer systems which is described in this paper.
Antczak, K; Wilczyńska, U
1980-01-01
Part II presents a statistical model devised by the authors for evaluating the results of toxicological analyses. The model includes: 1. Establishment of a reference value, based on our own measurements taken by two independent analytical methods. 2. Selection of laboratories, based on the deviation of the obtained values from the reference ones. 3. Evaluation of subsequent quality controls and of the particular laboratories by means of analysis of variance, Student's t-test and a test of differences.
Orton, Larry
2009-01-01
This document outlines the definitions and the typology now used by Statistics Canada's Centre for Education Statistics to identify, classify and delineate the universities, colleges and other providers of postsecondary and adult education in Canada for which basic enrollments, graduates, professors and finance statistics are produced. These new…
Bendell, A
1986-01-01
Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo
Nielsen, Martin Krarup; Vidyashankar, Anand N.; Hanlon, Bret
...statistical model was therefore developed for analysis of FECRT data from multiple farms. Horse age, gender, zip code and pre-treatment egg count were incorporated into the model. Horses and farms were kept as random effects. Resistance classifications were based on model-based 95% lower confidence limit (LCL) values of predicted mean efficacies, and cutoff values were justified statistically. The model was used to evaluate the efficacy of pyrantel embonate paste on 64 Danish horse farms. Of 1644 horses, 614 had egg counts > 200 eggs per gram (EPG) and were treated. The cutoff LCL values used for classifying ... was shown to be unaffected by single outlier horses on the farms, while traditional calculations were strongly biased. The statistical model combines information between farms to distinguish between variability and genuine reduction in efficacy and can be adapted to handle FECRT data obtained from other ...
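For context, the traditional farm-level FECRT estimate that the hierarchical model improves upon is a simple percent reduction in mean egg counts; a minimal sketch (illustrative only, not the authors' model):

```python
def fecr_percent(pre_counts, post_counts):
    """Traditional FECRT estimate: percent reduction in the arithmetic
    mean egg count (EPG) from pre- to post-treatment.  A single outlier
    horse can dominate this estimate -- the motivation for the model-based
    approach described above."""
    pre_mean = sum(pre_counts) / len(pre_counts)
    post_mean = sum(post_counts) / len(post_counts)
    return 100.0 * (1.0 - post_mean / pre_mean)

# Hypothetical farm: one resistant outlier drags an apparently effective
# treatment (two horses cleared) down to a 60% "reduction"
biased = fecr_percent([500, 500, 500], [0, 0, 600])
```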
Hartzell, Allyson L; Shea, Herbert R
2010-01-01
This book focuses on the reliability and manufacturability of MEMS at a fundamental level. It demonstrates how to design MEMS for reliability and provides detailed information on the different types of failure modes and how to avoid them.
Suzuki, Toshiyuki [Department of Surgery, Tokai University School of Medicine, Kanagawa (Japan); Sadahiro, Sotaro, E-mail: sadahiro@is.icc.u-tokai.ac.jp [Department of Surgery, Tokai University School of Medicine, Kanagawa (Japan); Tanaka, Akira; Okada, Kazutake; Kamata, Hiroko; Kamijo, Akemi [Department of Surgery, Tokai University School of Medicine, Kanagawa (Japan); Murayama, Chieko [Department of Clinical Pharmacology, Tokai University School of Medicine, Kanagawa (Japan); Akiba, Takeshi; Kawada, Shuichi [Department of Radiology, Tokai University School of Medicine, Kanagawa (Japan)
2013-04-01
Purpose: Preoperative chemoradiation therapy (CRT) significantly decreases local recurrence in locally advanced rectal cancer. Various biomarkers in biopsy specimens obtained before CRT have been proposed as predictors of response. However, reliable biomarkers remain to be established. Methods and Materials: The study group comprised 101 consecutive patients with locally advanced rectal cancer who received preoperative CRT with oral uracil/tegafur (UFT) or S-1. We evaluated histologic findings on hematoxylin and eosin (H and E) staining and immunohistochemical expressions of Ki67, p53, p21, and apoptosis in biopsy specimens obtained before CRT and 7 days after starting CRT. These findings were contrasted with the histologic response and the degree of tumor shrinkage. Results: In biopsy specimens obtained before CRT, histologic marked regression according to the Japanese Classification of Colorectal Carcinoma (JCCC) criteria and the degree of tumor shrinkage on barium enema examination (BE) were significantly greater in patients with p21-positive tumors than in those with p21-negative tumors (P=.04 and P<.01, respectively). In biopsy specimens obtained 7 days after starting CRT, pathologic complete response, histologic marked regression according to both the tumor regression criteria and JCCC criteria, and T downstaging were significantly greater in patients with apoptosis-positive and p21-positive tumors than in those with apoptosis-negative (P<.01, P=.02, P=.01, and P<.01, respectively) or p21-negative tumors (P=.03, P<.01, P<.01, and P=.02, respectively). The degree of tumor shrinkage on both BE as well as MRI was significantly greater in patients with apoptosis-positive and with p21-positive tumors than in those with apoptosis-negative or p21-negative tumors, respectively. Histologic changes in H and E-stained biopsy specimens 7 days after starting CRT significantly correlated with pathologic complete response and marked regression on both JCCC and tumor
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-06-01
Case-study methodology is often used to study change in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification to the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. The application of this approach to assess change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, that is, the lack of analysis methods for noncontinuous data (such as counts, rates, and proportions of events) that may be used in case-study designs.
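The standard Reliable Change Index of Jacobson and Truax (1991), which the note modifies for binary outcomes, can be sketched as follows (the binary modification itself is not reproduced here; all values are illustrative):

```python
import math

def reliable_change_index(x1, x2, s1, r):
    """Jacobson & Truax (1991) Reliable Change Index.
    x1, x2: pre- and post-treatment scores for one case
    s1:     standard deviation of the pretest population
    r:      test-retest reliability of the measure
    |RC| > 1.96 indicates change unlikely (p < .05) to be
    measurement error alone."""
    se = s1 * math.sqrt(1.0 - r)       # standard error of measurement
    s_diff = math.sqrt(2.0 * se ** 2)  # SD of the difference score
    return (x2 - x1) / s_diff

# Hypothetical stuttering-severity score dropping from 40 to 25 after treatment
rc = reliable_change_index(x1=40.0, x2=25.0, s1=7.5, r=0.88)
reliable = abs(rc) > 1.96
```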
Electronics reliability calculation and design
Dummer, Geoffrey W A; Hiller, N
1966-01-01
Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea
Höing, Andrea; Quinten, Marcel C; Indrawati, Yohana Maria; Cheyne, Susan M; Waltert, Matthias
2013-02-01
Estimating population densities of key species is crucial for many conservation programs. Density estimates provide baseline data and enable monitoring of population size. Several different survey methods are available, and the choice of method depends on the species and study aims. Few studies have compared the accuracy and efficiency of different survey methods for large mammals, particularly for primates. Here we compare estimates of density and abundance of Kloss' gibbons (Hylobates klossii) using two of the most common survey methods: line transect distance sampling and triangulation. Line transect surveys (survey effort: 155.5 km) produced a total of 101 auditory and visual encounters and a density estimate of 5.5 gibbon clusters (groups or subgroups of primate social units)/km(2). Triangulation conducted from 12 listening posts during the same period revealed a similar density estimate of 5.0 clusters/km(2). Coefficients of variation of cluster density estimates were slightly higher from triangulation (0.24) than from line transects (0.17), resulting in a lack of precision in detecting changes in cluster densities with triangulation; nevertheless, the triangulation method also may be appropriate.
Bahuguna, Rajeev Nayan; Joshi, Rohit; Shukla, Alok; Pandey, Mayank; Kumar, J
2012-08-01
A novel pathogen defense strategy by thiamine priming was evaluated for its efficacy against sheath blight pathogen, Rhizoctonia solani AG-1A, of rice and compared with that of systemic fungicide, carbendazim (BCM). Seeds of semidwarf, high yielding, basmati rice variety Vasumati were treated with thiamine (50 mM) and BCM (4 mM). The pot cultured plants were challenge inoculated with R. solani after 40 days of sowing and effect of thiamine and BCM on rice growth and yield traits was examined. Higher hydrogen peroxide content, total phenolics accumulation, phenylalanine ammonia lyase (PAL) activity and superoxide dismutase (SOD) activity under thiamine treatment displayed elevated level of systemic resistance, which was further augmented under challenging pathogen infection. High transcript level of phenylalanine ammonia lyase (PAL) and manganese superoxide dismutase (MnSOD) validated mode of thiamine primed defense. Though minimum disease severity was observed under BCM treatment, thiamine produced comparable results, with 18.12 per cent lower efficacy. Along with fortifying defense components and minor influence on photosynthetic pigments and nitrate reductase (NR) activity, thiamine treatment significantly reduced pathogen-induced loss in photosynthesis, stomatal conductance, chlorophyll fluorescence, NR activity and NR transcript level. Physiological traits affected under pathogen infection were found signatory for characterizing plant's response under disease and were detectable at early stage of infection. These findings provide a novel paradigm for developing alternative, environmentally safe strategies to control plant diseases.
陈丽娟; 李霞
2012-01-01
Based on the statistics of reliability indices of nationwide 220 kV and above power transmission and transformation facilities in 2010, especially the operating data of the three main kinds of facilities, namely transformers, circuit breakers and overhead lines, the main causes of scheduled and unscheduled outages were analyzed and evaluated. Through analysis of the reliability indices of recent years, the trend in reliability was discussed, providing a useful basis for improving the reliability of power transmission and transformation facilities in the coming years and technical support for system planning, design and manufacturing, equipment selection, installation and commissioning, operation, and maintenance.
Cho, Hyeyoung; Kim, Hyosil; Na, Dokyun; Kim, So Youn; Jo, Deokyeon; Lee, Doheon
2016-03-04
Biomarkers that are identified from a single study often appear to be biologically irrelevant or false positives. Meta-analysis techniques allow integrating data from multiple studies that are related but independent in order to identify biomarkers across multiple conditions. However, existing biomarker meta-analysis methods tend to be sensitive to the dataset being analyzed. Here, we propose a meta-analysis method, iMeta, which integrates t-statistic and fold change ratio for improved robustness. For evaluation of predictive performance of the biomarkers identified by iMeta, we compare our method with other meta-analysis methods. As a result, iMeta outperforms the other methods in terms of sensitivity and specificity, and especially shows robustness to study variance increase; it consistently shows higher classification accuracy on diverse datasets, while the performance of the others is highly affected by the dataset being analyzed. Application of iMeta to 59 drug-induced liver injury studies identified three key biomarker genes: Zwint, Abcc3, and Ppp1r3b. Experimental evaluation using RT-PCR and qRT-PCR shows that their expressional changes in response to drug toxicity are concordant with the result of our method. iMeta is available at http://imeta.kaist.ac.kr/index.html.
Smith, Dianna M; Pearce, Jamie R; Harland, Kirk
2011-03-01
Models created to estimate neighbourhood level health outcomes and behaviours can be difficult to validate as prevalence is often unknown at the local level. This paper tests the reliability of a spatial microsimulation model, using a deterministic reweighting method, to predict smoking prevalence in small areas across New Zealand. The difference in the prevalence of smoking between those estimated by the model and those calculated from census data is less than 20% in 1745 out of 1760 areas. The accuracy of these results provides users with greater confidence to utilize similar approaches in countries where local-level smoking prevalence is unknown.
Reliability Considerations for the Operation of Large Accelerator User Facilities
Willeke, F J
2016-01-01
The lecture provides an overview of considerations relevant for achieving highly reliable operation of accelerator-based user facilities. The article starts with an overview of the statistical reliability formalism, followed by high-reliability design considerations with examples, and closes with operational aspects of high reliability such as preventive maintenance and spares inventory.
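Two of the standard identities underlying the statistical reliability formalism such a lecture surveys are the series and parallel (redundant) combinations of independent components; a minimal sketch:

```python
def series_reliability(rs):
    """A series system works only if every component works:
    R = product of the component reliabilities."""
    r = 1.0
    for ri in rs:
        r *= ri
    return r

def parallel_reliability(rs):
    """A parallel (redundant) system fails only if every component fails:
    R = 1 - product of the component unreliabilities."""
    q = 1.0
    for ri in rs:
        q *= (1.0 - ri)
    return 1.0 - q

# Two 90%-reliable subsystems: chaining them hurts, duplicating them helps
chained = series_reliability([0.9, 0.9])      # lower than either alone
redundant = parallel_reliability([0.9, 0.9])  # higher than either alone
```

This asymmetry is why redundancy and spares inventory, rather than ever-better single components, dominate high-availability facility design.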
Weller, G.H.
2001-07-15
Utility load management programs--including direct load control and interruptible load programs--were employed by utilities in the past as system reliability resources. With electricity industry restructuring, the context for these programs has changed; the market that was once controlled by vertically integrated utilities has become competitive, raising the question: can existing load management programs be modified so that they can effectively participate in competitive energy markets? In the short run, modified and/or improved operation of load management programs may be the most effective form of demand-side response available to the electricity system today. However, in light of recent technological advances in metering, communication, and load control, utility load management programs must be carefully reviewed in order to determine appropriate investments to support this transition. This report investigates the feasibility of and options for modifying an existing utility load management system so that it might provide reliability services (i.e. ancillary services) in the competitive markets that have resulted from electricity industry restructuring. The report is a case study of Southern California Edison's (SCE) load management programs. SCE was chosen because it operates one of the largest load management programs in the country and it operates them within a competitive wholesale electricity market. The report describes a wide range of existing and soon-to-be-available communication, control, and metering technologies that could be used to facilitate the evolution of SCE's load management programs and systems to provision of reliability services. The fundamental finding of this report is that, with modifications, SCE's load management infrastructure could be transitioned to provide critical ancillary services in competitive electricity markets, employing currently or soon-to-be available load control technologies.
Osler, James Edward, II
2015-01-01
This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This form of inquiry is a form of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…
Sahin, Sevnaz; Mandiracioglu, Aliye; Tekin, Nil; Senuzun, Fisun; Akcicek, Fehmi
2012-01-01
The population aged above 65 years is increasing rapidly as life expectancy rises, leading to high demand for health care services. Health care for the elderly should be provided by teams trained in this field, and the success of the service rendered depends on the knowledge, skills and attitudes toward elderly health of team members from different professional groups (doctors, nurses, social workers, psychologists, etc.). The aim of this study is to establish the Turkish validity and reliability of the 14-question UCLA-GA scale, whose validity and reliability have been proven and which is the most frequently used of the scales assessing health care providers' attitudes toward the elderly. A total of 256 people, 150 of them post-graduates and 106 of them pre-graduates, were involved in the study at the Ege University medical faculty between December 2010 and February 2011. The majority of the participants (63.67%) were women and in the 18-29 age group (58.3%). The proportion who had undergone geriatric education was 38.2%. The Kaiser-Meyer-Olkin (KMO) sampling adequacy measure for the 14 items of the scale was 0.72, indicating high correlation among the items. The Cronbach alpha value of the scale was 0.67 and considered satisfactory. Examination with Tukey's test of additivity showed that the items of the scale have additive quality (F = 85.25). The Turkish version of the scale can thus be considered a valid and reliable instrument for assessing health care providers' attitudes toward the elderly in geriatrics.
陈丽娟; 胡小正
2011-01-01
According to the statistics on reliability indices of nation-wide power transmission and transformation facilities at 220 kV and above in 2010, especially the operation data of three main kinds of facilities, namely transformers, circuit breakers and overhead lines, the main causes of scheduled and unscheduled outages were analyzed and evaluated. Through the analysis of reliability indices in recent years, the trend in reliability was discussed, providing a useful basis for improving the reliability of power transmission and transformation facilities in the coming years.
Zakir, Ali; Bengtsson, Marie; Sadek, Medhat M; Hansson, Bill S; Witzgall, Peter; Anderson, Peter
2013-09-01
Animals depend on reliable sensory information for accurate behavioural decisions. For herbivorous insects it is crucial to find host plants for feeding and reproduction, and these insects must be able to differentiate suitable from unsuitable plants. Volatiles are important cues for insect herbivores to assess host plant quality. It has previously been shown that female moths of the Egyptian cotton leafworm, Spodoptera littoralis (Lepidoptera: Noctuidae), avoid oviposition on damaged cotton, Gossypium hirsutum, which may be mediated by herbivore-induced plant volatiles (HIPVs). Among the HIPVs, some volatiles are released following any type of damage while others are synthesized de novo and released by the plants only in response to herbivore damage. In behavioural experiments we here show that oviposition by S. littoralis on undamaged cotton plants was reduced by adding volatiles collected from plants with ongoing herbivory. Gas chromatography-electroantennographic detection (GC-EAD) recordings revealed that antennae of mated S. littoralis females responded to 18 compounds from a collection of headspace volatiles of damaged cotton plants. Among these compounds, a blend of the seven de novo synthesized volatile compounds was found to reduce oviposition in S. littoralis on undamaged plants under both laboratory and ambient (field) conditions in Egypt. Volatile compounds that are not produced de novo by the plants did not affect oviposition. Our results show that ovipositing females respond specifically to the de novo synthesized volatiles released from plants under herbivore attack. We suggest that these volatiles provide reliable cues for ovipositing females to detect plants that could provide reduced-quality food for their offspring and an increased risk of competition and predation.
Sathyachandran, S. K.; Roy, D. P.; Boschetti, L.
2014-12-01
The Fire Radiative Power (FRP) [MW] is a measure of the rate of biomass combustion and can be retrieved from ground-based and satellite observations using middle infra-red measurements. The temporal integral of FRP is the Fire Radiative Energy (FRE) [MJ], which is related linearly to the total biomass consumption and thus to pyrogenic emissions. Satellite-derived biomass consumption and emissions estimates have conventionally been derived by computing the summed total FRP, or the average FRP (arithmetic average of FRP retrievals), over spatial geographic grids for fixed time periods. These two methods are prone to estimation bias, especially under irregular sampling conditions such as those provided by polar-orbiting satellites, because the FRP can vary rapidly in space and time as a function of the fire behavior. Linear temporal integration of FRP, taking into account when the FRP values were observed and using the trapezoidal rule for numerical integration, has been suggested as an alternative FRE estimation method. In this study FRP data measured rapidly with a dual-band radiometer over eight prescribed fires are used to compute eight FRE values using the sum, mean and trapezoidal estimation approaches under a variety of simulated irregular sampling conditions. The estimated values are compared to biomass consumption measurements for each of the eight fires to provide insights into which method provides more accurate and precise biomass consumption estimates. The three methods are also applied to continental MODIS FRP data to study their differences using polar-orbiting satellite data. The research findings indicate that trapezoidal FRP numerical integration provides the most reliable estimator.
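The contrast between the mean-FRP estimator and temporal trapezoidal integration described above can be sketched as follows. This is only an illustration of the two estimators, not the authors' code; the function names and sample values are invented for the example (1 MW·s = 1 MJ):

```python
def fre_trapezoid(times_s, frp_mw):
    """FRE [MJ] by trapezoidal integration of FRP [MW] over observation times [s]."""
    return sum((t1 - t0) * (f0 + f1) / 2.0
               for t0, t1, f0, f1 in zip(times_s, times_s[1:], frp_mw, frp_mw[1:]))

def fre_mean(times_s, frp_mw):
    """FRE from the arithmetic-mean FRP multiplied by the total observation window."""
    return (times_s[-1] - times_s[0]) * sum(frp_mw) / len(frp_mw)

# Irregular sampling: most retrievals fall early in the burn, before the fire decays.
times = [0.0, 10.0, 100.0]   # s
frp   = [10.0, 10.0, 0.0]    # MW
```

Under this irregular sampling the trapezoidal estimator gives 550 MJ while the mean-FRP estimator gives about 667 MJ, illustrating how the sampling pattern biases the simpler estimator when FRP varies over time.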
Radulović, Niko S; Blagojević, Polina D
2013-08-02
Plant volatiles have been repeatedly shown to provide valuable insight into the evolutionary relationships among plant taxa on various taxonomical levels. The number of variables available from GC-MS analyses of these plant metabolites usually represents a large data set. The comparison of such data sets requires the use of multivariate statistical analyses (MSA) but with several serious shortcomings. In order to make multivariate statistical comparison of essential oils more applicable, reliable and faster, this work was set to explore the suitability of a complementary use of relative abundances of m/z values of the average mass scan of the total GC chromatograms instead of the traditionally used variables-percentages (peak areas) of individual oil constituents. To achieve this, essential oils extracted from 12 different Artemisia species were analyzed using GC-FID and GC-MS. Almost 500 different constituents were successfully identified. Average mass scans of the total GC chromatograms (AMS) and chemical compositions (relative percentages) of the analyzed oils were separately compared using two MSA methods: agglomerative hierarchical cluster analysis and principal component analysis. This approach was applied to an additional set of essential oil compositional data (representatives of a number of different genera/families; data from the literature) using both types of variables. The obtained results strongly suggest that MSA of complex volatile mixtures, using the corresponding directly obtainable AMS, could be considered as a promising time saving tool for easy and reliable comparison purposes. The AMS approach gives comparable or even better results than the traditional method - it reflected the natural relationships between observations within both studied groups of oils.
The rating reliability calculator
Solomon David J
2004-04-01
Full Text Available Abstract Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, which the program will upload to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program.
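The Spearman-Brown prophecy step mentioned in the results reduces to a one-line formula; the sketch below illustrates the standard formula only and is not the PHP utility's source code:

```python
def spearman_brown(r_single, k):
    """Predicted reliability of the average of k ratings,
    given the reliability r_single of a single rating."""
    return k * r_single / (1.0 + (k - 1.0) * r_single)
```

For example, a single-rating reliability of 0.5 rises to about 0.67 when two ratings are averaged, and averaging more judges raises it further.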
Application of a truncated normal failure distribution in reliability testing
Groves, C., Jr.
1968-01-01
The truncated normal distribution is applied as a time-to-failure distribution function in equipment reliability estimation. Age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
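As a sketch of how a truncated normal time-to-failure model yields a reliability (survival) function, the snippet below truncates a normal distribution below at a (so failure times are nonnegative when a = 0). The parameter values and function names are illustrative assumptions, not from the report:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def reliability(t, mu, sigma, a=0.0):
    """R(t) = P(T > t | T >= a) for a normal time-to-failure T
    with mean mu and std sigma, truncated below at a (requires t >= a)."""
    tail = 1.0 - normal_cdf((a - mu) / sigma)
    return (1.0 - normal_cdf((t - mu) / sigma)) / tail
```

With mu = 100 h and sigma = 20 h the truncation at zero is negligible, so R(100) is essentially 0.5, and R(t) decreases with age t as expected of a time-to-failure model.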
Statistical methods of economic and reliability benefits in outage management%停电管理的经济效益与可靠性效益统计方法
杜力; 黎灿兵; 熊浩清; 杜亮; 程子霞; 曹丽华
2011-01-01
Scientific outage management can achieve great economic and reliability benefits. The macro-statistical method for assessing outage management benefits is analyzed: economic and reliability benefits are described by increased electricity sales and by the number of outage hour-households, respectively. The measures in outage management that can increase supply and enlarge marketing are studied, and the practical effect of these statistical methods is analyzed from a microcosmic angle. Finally, indicators for evaluating economic and reliability benefits in outage management are put forward. These indicators can provide decision support for the management layer and enable more accurate performance assessment, which can improve the fine management level of power supply enterprises and benefit both the supply enterprise and the energy consumer. Practice proves that the proposed methods are highly practical. This work is supported by the National High Technology Research and Development Program of China (863 Program) (No. SQ2010AA0500165004).
Wallace, Lorraine S; Chisolm, Deena J; Abdel-Rasoul, Mahmoud; DeVoe, Jennifer E
2013-08-01
This study examined adults' self-reported understanding and formatting preferences of medical statistics, confidence in self-care and ability to obtain health advice or information, and perceptions of patient-health-care provider communication, measured through dual survey modes (random digit dial and mail). Even while controlling for sociodemographic characteristics, significant differences in adults' responses to survey variables emerged as a function of survey mode. While the analyses do not allow us to pinpoint the underlying causes of the differences observed, they do suggest that mode of administration should be carefully adjusted for and considered.
Norén, Patrik
2013-01-01
Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges, and therefore new algebraic results often need to be developed. This interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...
Reliability Generalization: "Lapsus Linguae"
Smith, Julie M.
2011-01-01
This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-01-01
Purpose: Case-study methodology studying change is often used in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable…
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
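Several of the measures named above (sensitivity, specificity, and the κ statistic for agreement) reduce to short formulas over 2×2 counts. A minimal illustrative sketch, with invented function and variable names:

```python
def sensitivity(tp, fn):
    """True-positive rate: diseased subjects correctly identified."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: healthy subjects correctly identified."""
    return tn / (tn + fp)

def cohens_kappa(a, b, c, d):
    """Kappa for a 2x2 agreement table [[a, b], [c, d]] between two raters:
    chance-corrected agreement (p_obs - p_exp) / (1 - p_exp)."""
    n = a + b + c + d
    p_obs = (a + d) / n                                      # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # agreement expected by chance
    return (p_obs - p_exp) / (1.0 - p_exp)
```

For instance, two raters who agree on 80 of 100 cases (table [[40, 10], [10, 40]]) have 80% raw agreement but κ = 0.6 once chance agreement is removed.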
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Kroneman, M.W.; Essen, G.A. van; Tacken, M.A.J.B.; Paget, W.J.; Verheij, R.
2004-01-01
All European countries have recommendations for influenza vaccination among the elderly and chronically ill. However, only a few countries are able to provide data on influenza uptake among these groups. The aim of our study is to investigate whether a population survey is an effective method of
查静; 宁毅
2015-01-01
The paper sets out to analyze the reliability and validity of automated essay scoring with E-scorer. It begins with a review of previous empirical research on the reliability and validity of automated essay scoring systems. An evidence-based approach is used to examine the reliability of automated scoring and to compare the automated and human scores. The relevance test, reliability statistics, repeated measures and independent-samples t-test show that the automated scoring system is highly reliable when the proportion of preliminary scores is below 50%. However, the statistical and qualitative analyses show that the results from the automated scoring system cannot provide a good interpretation of practical and informative writing ability.
Sadovnick A Dessa
2009-02-01
Full Text Available Abstract Background Multiple sclerosis (MS) is a complex trait in which genes in the MHC class II region exert the single strongest effect on genetic susceptibility. The principal MHC class II haplotypes that increase MS risk in individuals of Northern European descent are those that bear HLA-DRB1*15. However, several other HLA-DRB1 alleles have been positively and negatively associated with MS, and each of the main allelotypes is composed of many sub-allelotypes with slightly different sequence composition. Given the role of this locus in antigen presentation, it has been suggested that variations in the peptide binding site of the allele may underlie allelic variation in disease risk. Methods In an investigation of 7,333 individuals from 1,352 MS families, we assessed the nucleotide sequence of HLA-DRB1 for any effects on disease susceptibility, extending a recently published method of statistical analysis for family-based association studies to the particular challenges of hyper-variable genetic regions. Results We found that amino acid 60 of the HLA-DRB1 peptide sequence, which had previously been postulated based on structural features, is unlikely to play a major role. Instead, empirical evidence based on sequence information suggests that MS susceptibility arises primarily from amino acid 13. Conclusion Identifying a single amino acid as a major risk factor has major practical implications for risk and for the exploration of mechanisms, although the mechanism by which amino acid 13 of the HLA-DRB1 sequence is involved in MS, as well as the identity of additional variants on MHC haplotypes that influence risk, remains to be uncovered.
Perez, G. L.; Larour, E. Y.; Halkides, D. J.; Cheng, D. L. C.
2015-12-01
The Virtual Ice Sheet Laboratory (VISL) is a Cryosphere outreach effort by scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, Earth and Space Research (ESR) in Seattle, WA, and the University of California at Irvine (UCI), with the goal of providing interactive lessons for K-12 and college level students, while conforming to STEM guidelines. At the core of VISL is the Ice Sheet System Model (ISSM), an open-source project developed jointly at JPL and UCI whose main purpose is to model the evolution of the polar ice caps in Greenland and Antarctica. By using ISSM, VISL students have access to state-of-the-art modeling software that is being used to conduct scientific research by users all over the world. However, providing this functionality is by no means simple. The modeling of ice sheets in response to sea and atmospheric temperatures, among many other possible parameters, requires significant computational resources. Furthermore, this service needs to be responsive and capable of handling burst requests produced by classrooms of students. Cloud computing providers represent a burgeoning industry. With major investments by tech giants like Amazon, Google and Microsoft, it has never been easier or more affordable to deploy computational elements on-demand. This is exactly what VISL needs and ISSM is capable of. Moreover, this is a promising alternative to investing in expensive and rapidly devaluing hardware.
Mathematical and statistical analysis
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Runeson, Bo
2015-12-08
Identifying individuals at risk of future suicide or suicide attempts is of clinical importance. Instruments have been developed to facilitate the assessment of the risk of future suicidal acts. A systematic review was conducted using the standard methods of the Swedish Council on Health Technology Assessment (SBU). The ability of the instrument to predict risk for future suicide/suicide attempt was assessed at follow up. The methodological quality of eligible studies was assessed; studies with moderate or low risk of bias were analysed in accordance with GRADE. None of the included studies provided scientific evidence to support that any instrument had sufficient accuracy to predict future suicidal behaviour. There is strong evidence to support that the SAD PERSONS Scale has very low sensitivity; most persons who make future suicidal acts are not identified.
Gaus, Wilhelm
2014-09-02
The US National Toxicology Program (NTP) is assessed by a statistician. In the NTP program, groups of rodents are fed for a certain period of time with different doses of the substance being investigated. Then the animals are sacrificed and all organs are examined pathologically. Such an investigation makes many statistical tests possible. Technical Report TR 578 on Ginkgo biloba is used as an example. More than 4800 statistical tests are possible with the investigations performed. A simple thought experiment therefore leads us to expect more than 240 falsely significant tests. In actuality, 209 significant pathological findings were reported. The readers of Toxicology Letters should carefully distinguish between confirmative and explorative statistics. A confirmative interpretation of a significant test rejects the null hypothesis and delivers "statistical proof". It is only allowed if (i) a precise hypothesis was established independently from the data used for the test and (ii) the computed p-values are adjusted for multiple testing if more than one test was performed. Otherwise an explorative interpretation generates a hypothesis. We conclude that NTP reports - including TR 578 on Ginkgo biloba - deliver explorative statistics, i.e. they generate hypotheses, but do not prove them.
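The thought experiment behind the expected count of falsely significant tests is elementary: if all null hypotheses are true, each test at significance level α comes out falsely significant with probability α. A minimal sketch, assuming the conventional α = 0.05 (the simple Bonferroni adjustment shown alongside is one standard multiplicity correction, not necessarily the one the NTP uses):

```python
def expected_false_positives(n_tests, alpha=0.05):
    """Expected number of falsely significant results
    when every null hypothesis is true."""
    return n_tests * alpha

def bonferroni_threshold(alpha, n_tests):
    """Per-test significance level under a simple Bonferroni adjustment."""
    return alpha / n_tests
```

With 4800 possible tests this gives 240 expected false positives at α = 0.05, matching the article's figure, while a Bonferroni adjustment would demand p below roughly 1e-5 per test.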
Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were proposed
Kottner, Jan; Audigé, Laurent; Brorson, Stig;
2011-01-01
Results of reliability and agreement studies are intended to provide information about the amount of error inherent in any diagnosis, score, or measurement. The level of reliability and agreement among users of scales, instruments, or classifications is widely unknown. Therefore, there is a need for rigorously conducted interrater and intrarater reliability and agreement studies. Information about sample selection, study design, and statistical analysis is often incomplete. Because of inadequate reporting, interpretation and synthesis of study results are often difficult. Widely accepted criteria, standards, or guidelines for reporting reliability and agreement in the health care and medical field are lacking. The objective was to develop guidelines for reporting reliability and agreement studies.
Joaquín Hernández-Palazón
2015-03-01
Full Text Available Background: Anxiety is an emotional state characterized by apprehension and fear resulting from anticipation of a threatening event. Objectives: The present study aimed to analyze the incidence and level of preoperative anxiety in the patients scheduled for cardiac surgery by using a Visual Analogue Scale for Anxiety (VAS-A and Amsterdam Preoperative Anxiety and Information Scale (APAIS and to identify the influencing clinical factors. Patients and Methods: This prospective, longitudinal study was performed on 300 cardiac surgery patients in a single university hospital. The patients were assessed regarding their preoperative anxiety level using VAS-A, APAIS, and a set of specific anxiety-related questions. Their demographic features as well as their anesthetic and surgical characteristics (ASA physical status, EuroSCORE, preoperative Length of Stay (LoS, and surgical history were recorded, as well. Then, one-way ANOVA and t-test were applied along with odds ratio for risk assessment. Results: According to the results, 94% of the patients presented preoperative anxiety, with 37% developing high anxiety (VAS-A ≥ 7. Preoperative LoS > 2 days was the only significant risk factor for preoperative anxiety (odds ratio = 2.5, CI 95%, 1.3 - 5.1, P = 0.009. Besides, a positive correlation was found between anxiety level (APAISa and requirement of knowledge (APAISk. APAISa and APAISk scores were greater for surgery than for anesthesia. Moreover, the results showed that the most common anxieties resulted from the operation, waiting for surgery, not knowing what is happening, postoperative pain, awareness during anesthesia, and not awakening from anesthesia. Conclusions: APAIS and VAS-A provided a quantitative assessment of anxiety and a specific qualitative questionnaire for preoperative anxiety in cardiac surgery. According to the results, preoperative LoS > 2 days and lack of information related to surgery were the risk factors for high anxiety levels.
The Accelerator Reliability Forum
Lüdeke, Andreas; Giachino, R
2014-01-01
High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.
Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol
2015-09-02
A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, that 209 of them were statistically significant (p < 0.05), and that therefore the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed, since he incorrectly assumes that the NTP uses no correction for multiple comparisons and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision-making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors.
Antczak, K; Wilczyńska, U
1980-01-01
Two statistical models for evaluating the results of toxicological studies are presented. Model I, after R. Hoschek and H. J. Schittke (2), involves: 1. elimination of values deviating from most results, by Grubbs' method (2); 2. analysis of the differences between the results obtained by the participants of the action and a tentatively assumed value; 3. evaluation of significant differences between the reference value and the average value for a given series of measurements; 4. thorough evaluation of laboratories based on the evaluation coefficient fx. In Model II, after Keppler et al., the median is taken as the criterion for evaluating the results. Individual laboratories are evaluated on the basis of: 1. an adjusted t-test; 2. a linear regression test.
Kopáček Jaroslav
2016-01-01
Full Text Available This paper focuses on the importance of reliability assessment, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, together with their application to calculations for serial, parallel and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
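The serial, parallel and backed-up system calculations mentioned above follow directly from elementary probability. A minimal sketch, assuming constant failure rates (exponential model); the component values are hypothetical, not taken from the paper:

```python
import math

def reliability(failure_rate, t):
    """R(t) = exp(-lambda * t) for a constant failure rate."""
    return math.exp(-failure_rate * t)

def series_system(reliabilities):
    """A series system works only if every element works."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel_system(reliabilities):
    """A parallel (redundant) system fails only if every element fails."""
    q = 1.0
    for ri in reliabilities:
        q *= (1.0 - ri)
    return 1.0 - q

# Example: three pneumatic-circuit elements observed over t = 1000 h
rates = [1e-4, 2e-4, 5e-5]          # failures per hour (hypothetical)
rs = [reliability(lam, 1000.0) for lam in rates]
print(series_system(rs))            # lower than any single element
print(parallel_system(rs))          # higher than any single element
```

Note that for exponential components in series, the system is again exponential with the summed failure rate, which the first printout reflects.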
Introduction to quality and reliability engineering
Jiang, Renyan
2015-01-01
This book presents the state-of-the-art in quality and reliability engineering from a product life cycle standpoint. Topics in reliability include reliability models, life data analysis and modeling, design for reliability and accelerated life testing, while topics in quality include design for quality, acceptance sampling and supplier selection, statistical process control, production tests such as screening and burn-in, warranty and maintenance. The book provides comprehensive insights into two closely related subjects, and includes a wealth of examples and problems to enhance reader comprehension and link theory and practice. All numerical examples can be easily solved using Microsoft Excel. The book is intended for senior undergraduate and post-graduate students in related engineering and management programs such as mechanical engineering, manufacturing engineering, industrial engineering and engineering management programs, as well as for researchers and engineers in the quality and reliability fields. D...
Reliability-based optimization of engineering structures
Sørensen, John Dalsgaard
2008-01-01
The theoretical basis for reliability-based structural optimization within the framework of Bayesian statistical decision theory is briefly described. Reliability-based cost-benefit problems are formulated and exemplified with structural optimization. The basic reliability-based optimization prob...
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Power electronics reliability analysis.
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
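The fault-tree approach described in the report, deriving system reliability from component reliability, can be sketched for independent components with AND/OR gates. The gate layout and failure probabilities below are invented for illustration, not taken from the report:

```python
# Minimal fault-tree sketch: system failure probability from component
# failure probabilities, assuming independent components.

def and_gate(probs):
    """Gate output occurs only if all inputs occur (redundant parts)."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(probs):
    """Gate output occurs if any input occurs (single points of failure)."""
    p = 1.0
    for x in probs:
        p *= (1.0 - x)
    return 1.0 - p

# Fictitious power-electronics device: the system fails if the controller
# fails, or if both redundant power stages fail.
p_controller = 0.01
p_stage = 0.05
p_system = or_gate([p_controller, and_gate([p_stage, p_stage])])
print(p_system)    # ~0.012475
```

The baseline model built this way can then be re-evaluated with improved component numbers to decide where reliability investment pays off, which is the optimization step the report describes.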
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.
2010-01-01
A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
吕峰
2011-01-01
Statistics were compiled on equipment operation and failures at the Tianhuangping Pumped-Storage Power Station over its 12 years of power generation. This paper analyzes the main factors influencing the level of equipment fault elimination and healthy operation, as well as the reliability index, proposes corresponding countermeasures, and suggests priorities for equipment operation monitoring.
Algora, Carlos; Espinet-Gonzalez, Pilar; Vazquez, Manuel; Bosco, Nick; Miller, David; Kurtz, Sarah; Rubio, Francisca; McConnell, Robert
2016-04-15
This chapter describes the accumulated knowledge on CPV reliability with its fundamentals and qualification. It explains the reliability of solar cells, modules (including optics) and plants. The chapter discusses the relevant statistical distributions, namely the exponential, normal and Weibull. The treatment of solar cell reliability covers issues in accelerated aging tests of CPV solar cells, types of failure, and failures in real-time operation. The chapter explores accelerated life tests, namely qualitative life tests (mainly HALT) and quantitative accelerated life tests (QALT). It examines other well-proven and extensively field-tested PV cells and/or semiconductor devices that share similar semiconductor materials, manufacturing techniques or operating conditions, namely III-V space solar cells and light-emitting diodes (LEDs). It addresses each of the identified reliability issues and presents the current state-of-the-art knowledge for their testing and evaluation. Finally, the chapter summarizes the CPV qualification and reliability standards.
Circuit design for reliability
Cao, Yu; Wirth, Gilson
2015-01-01
This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units. The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.
Reliability and Its Quantitative Measures
Alexandru ISAIC-MANIU
2010-01-01
Full Text Available This article opens up the subject of software reliability through a wide range of statistical indicators, which are designed from information collected during operation or testing (samples). The reliability treatment is also developed for the main reliability laws (exponential, normal, Weibull), which, once validated for a particular system, allow some reliability indicators to be calculated with a higher degree of accuracy and trustworthiness.
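For the exponential law named in the abstract, the basic indicators (MTBF, failure rate, reliability function) can be estimated from sample data along the following lines; the failure times here are hypothetical, not from the article:

```python
import math

failure_times = [120.0, 340.0, 95.0, 410.0, 230.0]  # hours, illustrative

mtbf = sum(failure_times) / len(failure_times)       # mean time between failures
lam = 1.0 / mtbf                                     # estimated failure rate

def r(t):
    """Reliability function R(t) under the exponential law."""
    return math.exp(-lam * t)

print(mtbf)      # 239.0
print(r(mtbf))   # ~0.3679: for the exponential law, R(MTBF) = 1/e
```

Fitting the normal or Weibull laws proceeds similarly but requires estimating two parameters from the sample rather than one.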
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Improving machinery reliability
Bloch, Heinz P
1998-01-01
This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.
Lazzaroni, Massimo
2012-01-01
This book gives a practical guide for designers and users in Information and Communication Technology context. In particular, in the first Section, the definition of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In ICT context, the failure rate for a given system can be
Reliability design method for steam turbine blades
Jinyuan SHI
2008-01-01
Based on theories of probability and statistics, and taking static stresses, dynamic stresses, endurance strength, safety ratios, vibration frequencies and exciting force frequencies of blades as random variables, a reliability design method for steam turbine blades is presented. The purport and calculation method for blade reliability are expounded. The distribution parameters of random variables are determined after analysis and numerical calculation of test data. The fatigue strength and the vibration design reliability of turbine blades are determined with the aid of a probabilistic design method and by interference models for stress distribution and strength distribution. Some blade reliability design calculation formulas for a dynamic stress design method, a safety ratio design method for fatigue strength, and a vibration reliability design method for the first and second types of tuned blades and a packet of blades on a disk connected closely, are given together with some practical examples. With these methods, the design reliability of steam turbine blades can be guaranteed in the design stage. This research may provide some scientific basis for reliability design of steam turbine blades.
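The interference model for stress and strength distributions mentioned above has a closed form when both are normal: reliability is Phi(beta) with safety index beta = (mu_strength - mu_stress) / sqrt(sd_strength^2 + sd_stress^2). A sketch with invented blade-stress numbers, not the paper's data:

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

mu_strength, sd_strength = 600.0, 40.0   # MPa, hypothetical
mu_stress,   sd_stress   = 450.0, 30.0   # MPa, hypothetical

# Safety index: distance between the distributions in combined std devs
beta = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
blade_reliability = phi(beta)
print(beta)               # 3.0
print(blade_reliability)  # ~0.99865
```

For non-normal laws (e.g. Weibull strength) the interference integral generally has no closed form and is evaluated numerically.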
Common pitfalls in statistical analysis: Clinical versus statistical significance
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
In clinical research, study results that are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects their impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754
Freund, Rudolf J; Wilson, William J
2010-01-01
Statistical Methods, 3e provides students with a working introduction to statistical methods offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a
Algebraic statistics computational commutative algebra in statistics
Pistone, Giovanni; Wynn, Henry P
2000-01-01
Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model.As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.
Faerch, Kristine; Lau, Cathrine; Tetens, Inge; Pedersen, Oluf Borbye; Jørgensen, Torben; Borch-Johnsen, Knut; Glümer, Charlotte
2005-05-01
Most studies analyzing diet-disease relations focus on single dietary factors rather than combining different nutrients into the same statistical model. The objective of this study was to identify dietary factors associated with the probability of having diabetes identified by screening (SDM) in Danish men and women aged 30-60 y. A specific objective was to examine whether an alternative statistical approach could provide additional information to already existing statistical approaches used in nutritional epidemiology. Baseline data from the Danish population-based Inter99 study were used. The dietary intake of 262 individuals with SDM was compared with that of 4627 individuals with normal glucose tolerance (NGT) using 2 different types of multiple logistic regression models adjusted for potential confounders. The first model included single dietary factors, whereas the second model was based on substitution of macronutrients. In the models with single dietary factors, high intakes of carbohydrates, dietary fiber, and coffee were inversely associated with SDM (P < 0.01), whereas high intakes of total fat and saturated fat were positively associated with SDM (P < 0.05). A modest U-shaped association was found between alcohol consumption and SDM (P = 0.10) [corrected] Results from the substitution model showed that when 3% of energy (En%) as carbohydrate replaced 3 En% fat or alcohol, the probability of having SDM decreased by 9 and 10%, respectively (P < 0.01) [corrected] No other macronutrient substitutions resulted in significant associations. Hence, the statistical approach based on substitution of macronutrients provided additional information to the model analyzing single dietary factors.
Reliability in automotive ethernet networks
Soares, Fabio L.; Campelo, Divanilson R.; Yan, Ying;
2015-01-01
This paper provides an overview of in-vehicle communication networks and addresses the challenges of providing reliability in automotive Ethernet in particular.
Reliability Centered Maintenance - Methodologies
Kammerer, Catherine C.
2009-01-01
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
Cosmic Statistics of Statistics
Szapudi, I.; Colombi, S.; Bernardeau, F.
1999-01-01
The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
Ying He (Vattenfall Research and Development AB, Stockholm (SE))
2007-09-15
In risk analysis of a power system, the risk that the system fails to supply power is calculated from knowledge of the reliability data of individual system components. Meaningful risk analysis requires reasonable and acceptable data; the quality of the data is of fundamental importance for the analysis. However, valid data are expensive to collect, and component reliability performance statistics are not easy to obtain. This report documents the distribution equipment reliability data developed by the project 'Component Reliability Data for Risk Analysis of Distribution Systems' within the Elforsk R&D program 'Risk Analysis 06-10'. The project analyzed a large sample of distribution outages recorded by more than a hundred power utilities in Sweden during 2004-2005, and derived equipment reliability data nationwide. Detailed summaries of these data are presented in the appendices of the report. Component reliability was also investigated at a number of power utilities including Vattenfall Eldistribution AB, Goeteborg Energi Naet AB, E.ON Elnaet Sverige AB, Fortum Distribution, and Linde Energi AB, and reliability data were derived for the individual utilities. Detailed data lists and failure statistics are summarized in the appendices for each participating company. The data provided in this report are based on a large sample of field outage records and can therefore be used as generic data in system risk analysis and reliability studies. In order to provide further references and complementary data, the equipment reliability surveys conducted by IEEE were studied in the project. The most significant results obtained by the IEEE surveys are provided in the report, and a summary of the reliability data surveyed by IEEE is presented in the appendix; these data are suggested for use in the absence of better data. The reliability data estimates were derived for sustained failure rates
2017-01-17
testing for reliability prediction of devices exhibiting multiple failure mechanisms. Also presented was an integrated accelerating and measuring ...
Statistical modeling for degradation data
Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru
2017-01-01
This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.
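One basic degradation-data technique of the kind the book covers is to fit a degradation path per unit and extrapolate a pseudo failure time where the path crosses a failure threshold. A linear-path sketch with made-up measurements; the data, model and threshold are all illustrative assumptions:

```python
def fit_line(ts, ys):
    """Least-squares slope and intercept for one unit's degradation path."""
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    a = my - b * mt
    return a, b

def pseudo_failure_time(ts, ys, threshold):
    """Time at which the fitted path reaches the failure threshold."""
    a, b = fit_line(ts, ys)
    return (threshold - a) / b

# One unit measured at four inspection times (wear in mm, hypothetical)
ts = [0.0, 100.0, 200.0, 300.0]
ys = [0.0, 0.5, 1.0, 1.5]
print(pseudo_failure_time(ts, ys, threshold=3.0))   # ~600.0
```

Repeating this over many units yields a sample of pseudo failure times to which a lifetime distribution (e.g. Weibull) can then be fitted.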
On reliable discovery of molecular signatures
Björkegren Johan
2009-01-01
Full Text Available Abstract Background Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubts on their reliability. Results We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy are relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis testing framework that for the first time provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
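The FDR control the authors call for is most commonly implemented with the Benjamini-Hochberg step-up procedure; a generic sketch of that standard method, not the authors' own framework, with illustrative p-values:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Return indices of hypotheses rejected at FDR level q (BH step-up)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k = rank                    # largest rank passing its threshold
    return sorted(order[:k])            # reject all hypotheses up to rank k

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals, q=0.05))   # [0, 1]
```

Note the step-up character: every hypothesis ranked at or below the largest passing rank is rejected, even if its own p-value missed its threshold.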
Nuclear weapon reliability evaluation methodology
Wright, D.L. [Sandia National Labs., Albuquerque, NM (United States)]
1993-06-01
This document provides an overview of the activities normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of review opportunities that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.
Statistical significance test for transition matrices of atmospheric Markov chains
Vautard, Robert; Mo, Kingtse C.; Ghil, Michael
1990-01-01
Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
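The Monte Carlo significance test described above can be sketched as a permutation test: shuffle the observed regime sequence to destroy its temporal structure, recount transitions, and compare the observed count with this null distribution. The sequence below is synthetic, for illustration only:

```python
import random

def transition_count(seq, a, b):
    """Count transitions from regime a to regime b in a sequence."""
    return sum(1 for x, y in zip(seq, seq[1:]) if x == a and y == b)

def mc_pvalue(seq, a, b, n_sim=2000, seed=42):
    """One-sided Monte Carlo p-value for an excess of a->b transitions."""
    rng = random.Random(seed)
    observed = transition_count(seq, a, b)
    hits = 0
    for _ in range(n_sim):
        shuffled = seq[:]
        rng.shuffle(shuffled)           # null hypothesis: no persistence
        if transition_count(shuffled, a, b) >= observed:
            hits += 1
    return (hits + 1) / (n_sim + 1)     # add-one to avoid p = 0

# A persistent two-regime sequence: A->A transitions should be
# significantly more frequent than chance.
seq = list("AAAAABBBBB" * 5)
p = mc_pvalue(seq, "A", "A")
print(p)   # small p-value: the persistence is significant
```

The same machinery applies to small clusters, where the empirical formulae mentioned in the abstract break down, simply by testing the relevant matrix element.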
Industrial psychology students’ attitudes towards statistics
Sanet Coetzee
2010-03-01
Full Text Available Orientation: The attitude of students toward statistics may influence their enrolment, achievement and motivation in the subject of research and Industrial Psychology. Research purpose: The aims of this study were to determine the reliability and validity of the survey of attitudes toward statistics (SATS-36 for a South African sample and to determine whether biographical variables influence students’ attitudes. Motivation for study: Students could be better prepared for, and guided through, a course in statistics if more is known about their attitudes towards statistics. Research design, approach and method: A cross-sectional survey design was used and the SATS-36 was administered to a sample of convenience consisting of 235 students enrolled in Industrial and Organisational Psychology at a large tertiary institution in South Africa. Main findings: Results revealed that even though students perceive statistics to be technical, complicated and difficult to master, they are interested in the subject and believe statistics to be of value. The degree to which students perceived themselves to be competent in mathematics was related to the degree to which they felt confident in their own ability to master statistics. Males displayed slightly more positive feelings toward statistics than females. Older students perceived statistics to be less difficult than younger students and also displayed slightly more positive feelings concerning statistics. Practical implications: It seems that in preparing students for statistics, their perception regarding their mathematical competence could be managed as well. Contribution: This study provides the first preliminary evidence for the reliability and validity of the SATS-36 for a sample of South African students.
The Highest & Lowest Reliability Achievable with Redundancy
Becker, Peter W.
1977-01-01
-dependent, it is difficult to assess the reliability of the system. The paper describes the ¿-transformation by which the highest and lowest reliability achievable can be determined for a configuration using components with specified reliabilities. As a by-product we become able to pinpoint the statistical relationships...
School Violence: Data & Statistics
The first ... fact sheet provides up-to-date data and statistics on youth violence. Data Sources: Indicators of School ...
Saiz, P; Rocha, R; Andreeva, J
2007-01-01
We are offering a system to track the efficiency of different components of the GRID. We can study the performance of both the WMS and the data transfers. At the moment, we have set up different parts of the system for ALICE, ATLAS, CMS and LHCb. None of the components that we have developed are VO-specific, so it would be very easy to deploy them for any other VO. Our main goal is to improve the reliability of the GRID. The main idea is to discover the different problems that have happened as soon as possible, and to inform the responsible parties. Since we study the jobs and transfers issued by real users, we see the same problems that users see. In fact, we see even more problems than the end user does, since we are also interested in following up the errors that GRID components can overcome by themselves (for instance, resubmitting a failed job to a different site). This kind of information is very useful to site and VO administrators. They can find out the efficien...
Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview.
Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee
2013-04-01
Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction to reliability estimates and different study designs and statistical analysis is given for future studies in Ayurveda.
A whistle-stop tour of statistics
Everitt, Brian S
2011-01-01
""I think that Everitt has been quite successful. All the standard topics, such as probability, estimation, hypothesis testing, analysis of variance, and regression, are discussed. The discussion is short, to the point, readable, and reliable. In addition, there are the more 'advanced' topics of logistic regression, survival analysis, longitudinal data analysis, and multivariate analysis, thus providing the reader with a short introduction to a wide array of the methods in the statistical arsenal without getting bogged down in detail. … a brief, but good, introduction to statistics. … excellen
Calibrating ensemble reliability whilst preserving spatial structure
Jonathan Flowerdew
2014-03-01
Full Text Available Ensemble forecasts aim to improve decision-making by predicting a set of possible outcomes. Ideally, these would provide probabilities which are both sharp and reliable. In practice, the models, data assimilation and ensemble perturbation systems are all imperfect, leading to deficiencies in the predicted probabilities. This paper presents an ensemble post-processing scheme which directly targets local reliability, calibrating both climatology and ensemble dispersion in one coherent operation. It makes minimal assumptions about the underlying statistical distributions, aiming to extract as much information as possible from the original dynamic forecasts and support statistically awkward variables such as precipitation. The output is a set of ensemble members preserving the spatial, temporal and inter-variable structure from the raw forecasts, which should be beneficial to downstream applications such as hydrological models. The calibration is tested on three leading 15-d ensemble systems, and their aggregation into a simple multimodel ensemble. Results are presented for 12 h, 1° scale over Europe for a range of surface variables, including precipitation. The scheme is very effective at removing unreliability from the raw forecasts, whilst generally preserving or improving statistical resolution. In most cases, these benefits extend to the rarest events at each location within the 2-yr verification period. The reliability and resolution are generally equivalent or superior to those achieved using a Local Quantile-Quantile Transform, an established calibration method which generalises bias correction. The value of preserving spatial structure is demonstrated by the fact that 3×3 averages derived from grid-scale precipitation calibration perform almost as well as direct calibration at 3×3 scale, and much better than a similar test neglecting the spatial relationships. Some remaining issues are discussed regarding the finite size of the output
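As a rough illustration of the kind of calibration this record compares against, here is a minimal quantile-quantile (quantile-mapping) transform in Python. It is a sketch under assumed synthetic climatologies, not the paper's Local Quantile-Quantile Transform or its structure-preserving scheme; all names and data are illustrative.

```python
import numpy as np

def qq_transform(forecast, model_clim, obs_clim):
    """Quantile-quantile transform: map each forecast value to the
    observed climatology via its empirical quantile in the model
    climatology. (The paper's LQQT operates locally per grid point.)"""
    q = np.searchsorted(np.sort(model_clim), forecast) / len(model_clim)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_clim, q)

rng = np.random.default_rng(0)
model_clim = rng.normal(2.0, 1.0, 10_000)  # model runs 2 degrees too warm
obs_clim = rng.normal(0.0, 1.0, 10_000)    # observed climatology
calibrated = qq_transform(np.array([1.0, 2.0, 3.0]), model_clim, obs_clim)
print(calibrated)  # warm bias removed: roughly [-1, 0, 1]
```

Because the mapping is applied value-by-value, it corrects climatology but, unlike the paper's scheme, does nothing by itself to preserve spatial or inter-variable structure.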
师雪姣; 常春起; 胡南; 孙兵
2015-01-01
Purpose: To investigate the activity of the language network with a brain functional connectivity analysis method based on the order statistics correlation coefficient using MRI, to explore its temporal reliability and functional lateralization, and to provide a theoretical foundation for clinical research on the resting-state language network. Materials and Methods: Twenty-five healthy volunteers were scanned three times in the resting state. All data were processed using 32-bit Matlab 7.11.0 and DPARSF. The two main language functional areas of the left hemisphere, Broca's and Wernicke's areas, were selected as regions of interest, and the functional connectivity of the language network was analyzed with the order statistics correlation coefficient. Results: From the seed-based functional connectivity maps, asymmetry index and intra-class correlation maps were obtained. The functional connectivity of the resting-state language network analyzed with the order statistics correlation coefficient was highly similar to that obtained with the traditional correlation coefficient method, so the method can be used in studies of similar network connectivity. Conclusion: The temporal reliability of the resting-state language network offers a reference for clinical research on language disorders, and its functional lateralization and temporal reliability can inform the clinical diagnosis and treatment of language or psychiatric disorders caused by abnormal lateralization of the language network.
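This record does not define the order statistics correlation coefficient precisely; one natural reading is a rank-based (Spearman-style) correlation between region-of-interest time series. The sketch below makes that assumption explicit; the ROI names and synthetic data are purely illustrative.

```python
import numpy as np

def rank_correlation(x, y):
    """Pearson correlation of the ranks (order-statistic positions) of
    the two series, i.e. Spearman-style rank correlation. No tie
    handling, for brevity; continuous fMRI time series rarely tie."""
    rx = np.argsort(np.argsort(x)) - (len(x) - 1) / 2.0
    ry = np.argsort(np.argsort(y)) - (len(y) - 1) / 2.0
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

rng = np.random.default_rng(1)
broca = rng.normal(size=200)                   # ROI 1 time series (synthetic)
wernicke = 0.7 * broca + rng.normal(size=200)  # correlated ROI 2 (synthetic)
print(rank_correlation(broca, wernicke))       # clearly positive
```

A rank-based statistic like this is robust to monotone distortions of the BOLD signal, which may be why it tracks the traditional correlation coefficient so closely in the study.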
Shapiro, Andrew A.
2006-01-01
Ultra reliable systems are critical to NASA, particularly as consideration is being given to extended lunar missions and manned missions to Mars. NASA has formulated a program designed to improve the reliability of NASA systems. The long-term goal of the NASA ultra-reliability program is to improve NASA systems by an order of magnitude. The approach outlined in this presentation involves the steps used in developing a strategic plan to achieve the long-term objective of ultra reliability. Consideration is given to: complex systems, hardware (including aircraft, aerospace craft and launch vehicles), software, human interactions, long life missions, infrastructure development, and cross-cutting technologies. Several NASA-wide workshops have been held, identifying issues for reliability improvement and providing mitigation strategies for these issues. In addition to representation from all of the NASA centers, experts from government (NASA and non-NASA), universities and industry participated. Highlights of a strategic plan, which is being developed using the results from these workshops, will be presented.
NONE
2010-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry.Explores
Basu, Asit P; Basu, Sujit K
1998-01-01
This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul
新家, 健精
2013-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography
Statistics Essentials For Dummies
Rumsey, Deborah
2010-01-01
Statistics Essentials For Dummies not only provides students enrolled in Statistics I with an excellent high-level overview of key concepts, but it also serves as a reference or refresher for students in upper-level statistics courses. Free of review and ramp-up material, Statistics Essentials For Dummies sticks to the point, with content focused on key course topics only. It provides discrete explanations of essential concepts taught in a typical first semester college-level statistics course, from odds and error margins to confidence intervals and conclusions. This guide is also a perfect re
Eliazar, Iddo
2017-05-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
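A minimal simulation of the harmonic Poisson process this record describes: under intensity λ(t) = c/t, the expected number of points in [a, b] is c·ln(b/a), and given the count the points are i.i.d. log-uniform on [a, b]. This is a sketch of the standard construction for a Poisson process with that intensity, not code from the paper.

```python
import numpy as np

def harmonic_poisson(c, a, b, rng):
    """Sample a Poisson process on [a, b] with harmonic intensity
    lambda(t) = c / t: draw the count from Poisson(c * ln(b/a)), then
    place the points i.i.d. with density proportional to 1/t, i.e.
    t = a * (b/a)**U for U uniform on [0, 1] (log-uniform)."""
    n = rng.poisson(c * np.log(b / a))
    u = rng.random(n)
    return np.sort(a * (b / a) ** u)

rng = np.random.default_rng(2)
pts = harmonic_poisson(c=3.0, a=1.0, b=np.e ** 4, rng=rng)  # mean count 12
print(len(pts), pts[:3])
```

Note the scale invariance mentioned in the abstract: the expected count in [a, ka] is c·ln(k), independent of a.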
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better
On reliability analysis of multi-categorical forecasts
J. Bröcker
2008-08-01
Full Text Available Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: firstly, a uniform rank histogram is but a necessary condition for reliability. Secondly, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria, and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a Goodness-Of-Fit statistic is discussed which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium range forecasts for 2 m-temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score. Those forecasts which feature a high expected score turn out to be particularly unreliable.
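A hedged sketch of the two ingredients discussed in this record: computing a rank (Talagrand) histogram, and measuring its deviation from uniformity with a Pearson chi-square statistic, used here as a simple stand-in for the Ignorance-based goodness-of-fit statistic the paper actually proposes. All data are synthetic.

```python
import numpy as np

def rank_histogram(obs, ens):
    """Rank of each observation within its m-member ensemble
    (Talagrand diagram); returns counts over the m+1 possible ranks.
    obs has shape (n,); ens has shape (n, m)."""
    ranks = (ens < obs[:, None]).sum(axis=1)
    return np.bincount(ranks, minlength=ens.shape[1] + 1)

def chi2_uniformity(counts):
    """Pearson chi-square distance from a flat histogram: a crude
    indication of deviation beyond what pure randomness would give."""
    expected = counts.sum() / len(counts)
    return float(((counts - expected) ** 2 / expected).sum())

rng = np.random.default_rng(3)
n, m = 5000, 9
obs = rng.normal(size=n)
reliable_ens = rng.normal(size=(n, m))      # drawn from the same law as obs
narrow_ens = 0.5 * rng.normal(size=(n, m))  # under-dispersive ensemble
print(chi2_uniformity(rank_histogram(obs, reliable_ens)))  # small
print(chi2_uniformity(rank_histogram(obs, narrow_ens)))    # large, U-shaped
```

The under-dispersive ensemble piles observations into the extreme ranks, giving the familiar U-shaped histogram; the paper's point is that such checks should also be run per forecast stratum, not only on the pooled histogram.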
Quantitative metal magnetic memory reliability modeling for welded joints
Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng
2016-03-01
Metal magnetic memory(MMM) testing has been widely used to detect welded joints. However, load levels, environmental magnetic field, and measurement noises make the MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along the longitudinal and horizontal lines by TSC-2M-8 instrument in the tensile fatigue experiments. The X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect the hidden crack earlier than X-ray testing. Moreover, the MMM gradient vector sum K vs is sensitive to the damage degree, especially at early and hidden damage stages. Considering the dispersion of MMM data, the K vs statistical law is investigated, which shows that K vs obeys Gaussian distribution. So K vs is the suitable MMM parameter to establish reliability model of welded joints. At last, the original quantitative MMM reliability model is first presented based on the improved stress strength interference theory. It is shown that the reliability degree R gradually decreases with the decreasing of the residual life ratio T, and the maximal error between prediction reliability degree R 1 and verification reliability degree R 2 is 9.15%. This presented method provides a novel tool of reliability testing and evaluating in practical engineering for welded joints.
Expert system aids reliability
Johnson, A.T. [Tennessee Gas Pipeline, Houston, TX (United States)
1997-09-01
Quality and reliability are key requirements in the energy transmission industry. Tennessee Gas Co., a division of El Paso Energy, has applied Gensym's G2, an object-oriented expert system programming language, as a standard tool for maintaining and improving quality and reliability in pipeline operation. Tennessee created a small team of gas controllers and engineers to develop a Proactive Controller's Assistant (ProCA) that provides recommendations for operating the pipeline more efficiently, reliably and safely. The controllers' pipeline operating knowledge is recreated in G2 in the form of rules and procedures in ProCA. Two G2 programmers supporting the gas control room add information to the ProCA knowledge base daily. The result is a dynamic, constantly improving system that supports not only the pipeline controllers in their operations, but also the measurement and communications departments' requests for special studies. The Proactive Controller's Assistant development focuses on the following areas: alarm management; pipeline efficiency; reliability; fuel efficiency; and controller development.
Eugster, P.; Guerraoui, R.; Kouznetsov, P.
2001-01-01
This paper presents a new, non-binary measure of the reliability of broadcast algorithms, called Delta-Reliability. This measure quantifies the reliability of practical broadcast algorithms that, on the one hand, were devised with some form of reliability in mind, but, on the other hand, are not considered reliable according to the "traditional" notion of broadcast reliability [HT94]. Our specification of Delta-Reliability suggests a further step towards bridging the gap between theory and...
Frontiers in statistical quality control
Wilrich, Peter-Theodor
2004-01-01
This volume treats the four main categories of Statistical Quality Control: General SQC Methodology, On-line Control including Sampling Inspection and Statistical Process Control, Off-line Control with Data Analysis and Experimental Design, and, fields related to Reliability. Experts with international reputation present their newest contributions.
Reliability computation from reliability block diagrams
Chelson, P. O.; Eckstein, E. Y.
1975-01-01
Computer program computes system reliability for very general class of reliability block diagrams. Four factors are considered in calculating probability of system success: active block redundancy, standby block redundancy, partial redundancy, and presence of equivalent blocks in the diagram.
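Some of the factors this record lists have closed-form block combinations; the sketch below covers series blocks, active (parallel) redundancy, and k-out-of-n partial redundancy. Standby redundancy and equivalent blocks need the fuller treatment of the program itself; this is an illustrative sketch, not the NASA program's code.

```python
from math import comb, prod

def series(*blocks):
    """Series structure: every block must work, so multiply reliabilities."""
    return prod(blocks)

def parallel(*blocks):
    """Active redundancy: the group fails only if every block fails."""
    return 1.0 - prod(1.0 - r for r in blocks)

def k_of_n(k, n, p):
    """Partial redundancy: at least k of n identical blocks must work
    (binomial tail with per-block reliability p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Example diagram: a single block, a 2-way active-redundant pair,
# and a 2-out-of-3 voting group, all in series.
r_system = series(0.99, parallel(0.9, 0.9), k_of_n(2, 3, 0.95))
print(r_system)  # about 0.973
```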
Creep-rupture reliability analysis
Peralta-Duran, A.; Wirsching, P. H.
1985-01-01
A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time temperature parameters (TTP) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTP's. A method is developed for evaluating material dependent constants for TTP's. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long term behavior. Uncertainty in predicting long term behavior from short term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state of the art reliability methods to the design of components under creep.
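As a worked example of one time-temperature parameter named in this record, the Larson-Miller parameter LMP = T(C + log10 t_r) trades test time against temperature: equal LMP values are taken to represent equivalent creep exposure. The constant C = 20 below is a common textbook default, not the paper's optimized material constant.

```python
from math import log10

def larson_miller(T_kelvin, hours, C=20.0):
    """Larson-Miller parameter: LMP = T * (C + log10(t_r)),
    with temperature in kelvin and rupture time in hours."""
    return T_kelvin * (C + log10(hours))

def rupture_time(T_kelvin, lmp, C=20.0):
    """Invert the parameter to extrapolate rupture life at another
    temperature from the same master-curve value."""
    return 10.0 ** (lmp / T_kelvin - C)

lmp = larson_miller(900.0, 100.0)  # a 100 h test at 900 K -> LMP = 19800
print(rupture_time(800.0, lmp))    # equivalent life at 800 K: ~56,000 h
```

The long extrapolation in this tiny example is exactly where the paper's point bites: the uncertainty in C and in the master curve dominates long-term predictions.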
Medicare Provider Data - Hospice Providers
U.S. Department of Health & Human Services — The Hospice Utilization and Payment Public Use File provides information on services provided to Medicare beneficiaries by hospice providers. The Hospice PUF...
Low-Complex Reliable Communications between Wireless Network-Nodes
Alois Goiser
2015-11-01
Full Text Available We present a low-complex blind interference reduction scheme embedded in the receiver to enhance correlative data detection. The key element is a statistically controlled adaptive nonlinearity prior to correlation. This add-on feature guarantees reliable communication between network-nodes in any type of noise and/or interference. Additionally, the concept provides accurate position measurements to make services flexible and intelligent.
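A hedged sketch of the idea this record describes: a statistically controlled nonlinearity ahead of a correlator. Here a MAD-based clipper limits impulsive interference before correlation against a known code; the clip level and scale estimator are illustrative choices, not the authors' exact scheme.

```python
import numpy as np

def robust_correlate(received, template, k=3.0):
    """Correlation detection with a statistically controlled clipper in
    front: samples are limited to k estimated noise deviations (scale
    from the median absolute deviation), so impulsive interference
    cannot swamp the correlator."""
    scale = 1.4826 * np.median(np.abs(received - np.median(received)))
    clipped = np.clip(received, -k * scale, k * scale)
    return float(np.dot(clipped, template))

rng = np.random.default_rng(4)
template = rng.choice([-1.0, 1.0], size=512)        # known spreading code
noise = rng.normal(0.0, 1.0, 512)
impulses = np.zeros(512)
impulses[rng.choice(512, 5)] = 200.0                # bursty interference
rx = template + noise + impulses
print(robust_correlate(rx, template))               # strong peak survives
```

Without the clipper, a single 200-amplitude impulse can contribute as much to the correlator output as the entire 512-chip signal; with it, each impulse is capped at a few noise deviations.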
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Histoplasmosis Statistics. How common is histoplasmosis? In the United States, an estimated 60% to ...
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Reliability and construction control
Sherif S. AbdelSalam
2016-06-01
Full Text Available The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulae. From the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.
Methodological difficulties of conducting agroecological studies from a statistical perspective
Bianconi, A.; Dalgaard, Tommy; Manly, Bryan F J
2013-01-01
Statistical methods for analysing agroecological data might not be able to help agroecologists to solve all of the current problems concerning crop and animal husbandry, but such methods could well help agroecologists to assess, tackle, and resolve several agroecological issues in a more reliable and accurate manner. Therefore, our goal in this paper is to discuss the importance of statistical tools for alternative agronomic approaches, because alternative approaches, such as organic farming, should not only be promoted by encouraging farmers to deploy agroecological techniques, but also by providing agroecologists with robust analyses based on rigorous statistical procedures.
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Lyons, L
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Verweij, Jan F.
1993-01-01
Several issues regarding VLSI reliability research in Europe are discussed. Organizations involved in stimulating the activities on reliability by exchanging information or supporting research programs are described. Within one such program, ESPRIT, a technical interest group on IC reliability was
Ross, Sheldon M
2005-01-01
In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writin
Ross, Sheldon M
2010-01-01
In this 3rd edition revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Wannier, Gregory H
2010-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
王炳兴
2012-01-01
The maximum likelihood estimation and generalized confidence interval of the system reliability of a general multi-component stress-strength model are derived when the component strengths and the stress are independent exponential random variables. The goodness-of-fit test of the model is discussed. The coverage probability of the proposed generalized confidence interval and the power of the test are evaluated by Monte Carlo simulation. The simulation results show that the proposed generalized confidence interval and test are very satisfactory even for small and moderate samples. Finally, an example is shown to illustrate the proposed procedures.
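For the single-component special case of the stress-strength model in this record, R = P(X > Y) for independent exponentials has the closed form mx/(mx + my) in terms of the two means, and the MLE follows by plugging in sample means. The sketch below illustrates that special case only, not the paper's multi-component system or its generalized confidence intervals.

```python
import numpy as np

def stress_strength_reliability(strengths, stresses):
    """MLE of R = P(strength > stress) for independent exponential
    samples: R = mx / (mx + my), with mx and my estimated by the
    sample means of strength and stress respectively."""
    mx = np.mean(strengths)  # mean strength
    my = np.mean(stresses)   # mean stress
    return mx / (mx + my)

rng = np.random.default_rng(5)
x = rng.exponential(scale=4.0, size=2000)  # strengths, mean 4
y = rng.exponential(scale=1.0, size=2000)  # stresses, mean 1
print(stress_strength_reliability(x, y))   # true R = 4 / (4 + 1) = 0.8
```

The closed form follows from integrating P(X > y) against the stress density, which is the kind of exponential-model structure the paper exploits for its exact inference.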
Optimization techniques in statistics
Rustagi, Jagdish S
1994-01-01
Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
Reliability of the NINDS Myotatic Reflex Scale.
Litvan, I; Mangone, C A; Werden, W; Bueri, J A; Estol, C J; Garcea, D O; Rey, R C; Sica, R E; Hallett, M; Bartko, J J
1996-10-01
The assessment of deep tendon reflexes is useful for localization and diagnosis of neurologic disorders, but only a few studies have evaluated their reliability. We assessed the reliability of four neurologists, instructed in two different countries, in using the National Institute of Neurological Disorders and Stroke (NINDS) Myotatic Reflex Scale. To evaluate the role of training in using the scale, the neurologists randomly and blindly evaluated a total of 80 patients, 40 before and 40 after a training session. Inter- and intraobserver reliability were measured with kappa statistics. Our results showed substantial to near-perfect intraobserver reliability, and moderate-to-substantial interobserver reliability of the NINDS Myotatic Reflex Scale. The reproducibility was better for reflexes in the lower than in the upper extremities. Neither educational background nor the training session influenced the reliability of our results. The NINDS Myotatic Reflex Scale has sufficient reliability to be adopted as a universal scale.
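The kappa statistics used in the reflex-scale study above can be illustrated with a minimal Cohen's kappa for two raters; the reflex grades below are invented for the example, not the study's data.

```python
import numpy as np

def cohens_kappa(r1, r2, levels):
    """Cohen's kappa for two raters: observed agreement corrected for
    the agreement expected by chance from each rater's marginal rates,
    kappa = (po - pe) / (1 - pe)."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)
    pe = sum((r1 == v).mean() * (r2 == v).mean() for v in levels)
    return (po - pe) / (1.0 - pe)

# two raters grading the same 10 reflexes on a 0-4 scale (invented data)
a = [0, 1, 2, 2, 3, 3, 4, 1, 2, 0]
b = [0, 1, 2, 3, 3, 3, 4, 1, 1, 0]
print(cohens_kappa(a, b, levels=range(5)))  # 0.75: substantial agreement
```

For intraobserver reliability the same formula is applied to one rater's repeated gradings; weighted variants additionally credit near-misses on an ordinal scale like this one.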
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics, which until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
New Approaches to Reliability Assessment
Ma, Ke; Wang, Huai; Blaabjerg, Frede
2016-01-01
Power electronics are facing continuous pressure to be cheaper and smaller, have a higher power density, and, in some cases, also operate at higher temperatures. At the same time, power electronics products are expected to have reduced failures because it is essential for reducing the cost of energy. New approaches for reliability assessment are being taken in the design phase of power electronics systems based on the physics-of-failure in components. In this approach, many new methods, such as multidisciplinary simulation tools, strength testing of components, translation of mission profiles, and statistical analysis, are involved to enable better prediction and design of reliability for products. This article gives an overview of the new design flow in the reliability engineering of power electronics from the system-level point of view and discusses some of the emerging needs for the technology...
Analysis of time-dependent reliability of degenerated reinforced concrete structure
Zhang Hongping
2016-07-01
Durability deterioration of a structure is a highly random process, and the maintenance of a degenerated structure involves calculating its time-dependent reliability. This study introduced a resistance decrease model for reinforced concrete structures and the related statistical parameters of uncertainty, analyzed the resistance decrease rules of corroded bending elements of reinforced concrete structures, and finally calculated the time-dependent reliability of the corroded bending element, aiming to provide a specific theoretical basis for the application of time-dependent reliability theory.
Statistical methods in astronomy
Long, James P.; de Souza, Rafael S.
2017-01-01
We present a review of data types and statistical methods often encountered in astronomy. The aim is to provide an introduction to statistical applications in astronomy for statisticians and computer scientists. We highlight the complex, often hierarchical, nature of many astronomy inference problems and advocate for cross-disciplinary collaborations to address these challenges.
RELIABILITY AND DESIGN OF EXPERIMENTS
Adrian Stere PARIS
2013-05-01
Mechanical reliability uses many statistical tools to find the factors of influence and their levels in the optimization of parameters on the basis of experimental data. Design of Experiments (DOE) techniques enable designers to determine simultaneously the individual and interactive effects of many factors that could affect the output results in any design. The state of the art in the domain implies extended use of software and a basic mathematical knowledge, mainly applying ANOVA and the regression analysis of experimental data.
Sampling for assurance of future reliability
Klauenberg, Katy; Elster, Clemens
2017-02-01
Ensuring measurement trueness, compliance with regulations and conformity with standards are key tasks in metrology which are often considered at the time of an inspection. Current practice does not always verify quality after or between inspections, calibrations, laboratory comparisons, conformity assessments, etc. Statistical models describing behavior over time may ensure reliability, i.e. they may give the probability of functioning, compliance or survival until some future point in time. It may not always be possible or economic to inspect a whole population of measuring devices or other units. Selecting a subset of the population according to statistical sampling plans and inspecting only these allows conclusions about the quality of the whole population with a certain confidence. Combining these issues of sampling and aging raises questions such as: How many devices need to be inspected, and at least how many of them must conform, so that one can be sure that more than 100p% of the population will comply until the next inspection? This research is to raise awareness and offer a simple answer to such time- and sample-based quality statements in metrology and beyond. Reliability demonstration methods, such as the prevailing Weibull binomial model, quantify the confidence in future reliability on the basis of a sample. We adapt the binomial model to be applicable to sampling without replacement and simplify the Weibull model so that sampling plans may be determined on the basis of existing ISO standards. Provided the model is suitable, no additional information and no software are needed; and yet, the consumer is protected against future failure. We establish new sampling plans for utility meter surveillance, which are required by a recent modification of German law. These sampling plans are given in tables similar to the previous ones, which demonstrates their suitability for everyday use.
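In the zero-failure special case of such demonstration plans, the question "how many units must all conform so that reliability exceeds p with confidence 1 − α?" reduces, under the binomial model with replacement, to the smallest n with pⁿ ≤ α, i.e. n ≥ ln α / ln p. A sketch of that textbook result (function name is illustrative; the paper's actual plans also handle sampling without replacement and Weibull aging, which this does not):

```python
import math

def zero_failure_sample_size(p, alpha):
    """Smallest n such that observing n conforming units with no failures
    demonstrates reliability > p with confidence 1 - alpha, assuming the
    binomial model with replacement (success-run / zero-failure test)."""
    if not (0 < p < 1 and 0 < alpha < 1):
        raise ValueError("p and alpha must lie in (0, 1)")
    return math.ceil(math.log(alpha) / math.log(p))
```

For example, demonstrating 90% reliability at 95% confidence requires 29 consecutive conforming units, and 99% reliability at 95% confidence requires 299.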
2009-01-01
Ericsson is a global provider of telecommunications systems equipment and related services for mobile and fixed network operators. 3Gsim is a tool used by Ericsson in tests of the 3G RNC node. In order to validate the tests, statistics are constantly gathered within 3Gsim, and users can access them over telnet using some system-specific 3Gsim commands. The statistics can be retrieved but are unstructured for the human eye and need parsing and arranging to be readable. The statist...
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Jana, Madhusudan
2015-01-01
Statistical mechanics is self sufficient, written in a lucid manner, keeping in mind the exam system of the universities. Need of study this subject and its relation to Thermodynamics is discussed in detail. Starting from Liouville theorem gradually, the Statistical Mechanics is developed thoroughly. All three types of Statistical distribution functions are derived separately with their periphery of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of Liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomenon - thermal and electrical conductivity, Hall effect, Magneto resistance, viscosity, diffusion, etc. are discussed. Basic understanding of Ising model is given to explain the phase transition. The book ends with a detailed coverage to the method of ensembles (namely Microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
CMS Statistics Reference Booklet
U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...
U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient...
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Virtual private networks can provide reliable IT connections.
Kabachinski, Jeff
2006-01-01
A VPN is a private network that uses a public network, such as the Internet, to connect remote sites and users. Instead of a dedicated hard-wired connection, as in a trusted connection or leased lines, a VPN uses a virtual connection routed through the Internet from the organization's private network to the remote site or employee. Typical VPN services allow for security in terms of data encryption as well as means to authenticate, authorize, and account for all the traffic. VPN services allow the organization to use whatever network operating system it wishes, as they also encapsulate data into the protocols needed to transport it across public lines. The intention of this IT World article was to give the reader an introduction to VPNs. Keep in mind that there are no standard models for a VPN. You're likely to come across many vendors presenting the virtues of their VPN applications and devices when you Google "VPN." However, the general uses, concepts, and principles outlined here should give you a fighting chance to read through the marketing language in the online ads and "white papers."
Reliability of novel postural sway task test
Milan Sedliak
2013-07-01
The purpose of this study was to examine the reliability of parameters obtained from a novel postural sway task test based on body movements controlled by visual feedback. Fifty-nine volunteers were divided into two groups. The first group consisted of young individuals (n = 32, 16 females and 16 males, age: 25.2 ± 3.4 years) and the second group of elderly individuals (n = 27, 17 females and 10 males, age: 75.7 ± 6.9 years). Participants stood in parallel stance on a computer-based stabilographic platform with the feet approximately a shoulder width apart, the toes slightly pointing outwards, and the hands placed on the hips. The computer screen was placed approximately 1.5 meters from the platform at the height of the subjects' eyes. Instantaneous visual feedback of the participant's centre of pressure (COP) was given in the form of a blue cross visible on the screen. Participants were instructed to keep the blue cross, driven by movements of their hips, as close as possible to a predefined curve flowing on the screen. Of the 6 parameters studied, only the average distance of the COP from the curve line and the sum of the COP crossings through the curve line showed high reliability. The correlation between these two highly reliable parameters was -0.89. There was also a statistical difference (p < 0.001) between young and elderly in both the average distance of the COP from the curve line and the sum of the COP crossings through the curve. To conclude, the novel postural sway task provides a simple tool with a relatively low time burden for testing. The suggested output parameters are highly reliable and easy to interpret.
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products
Gallavotti, Giovanni
2011-01-01
C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.
Reliability and Probabilistic Risk Assessment - How They Play Together
Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang
2015-01-01
PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicles risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA conducted a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will
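The fault-tree arithmetic underlying such PRA models reduces, when basic events are assumed independent, to simple AND/OR gate combinations of event probabilities. A minimal sketch of that standard combinatorics (generic, not from the NASA studies; function names are illustrative):

```python
def or_gate(probs):
    """Probability that at least one independent basic event occurs
    (an OR gate in a fault tree): 1 - prod(1 - p_i)."""
    complement = 1.0
    for p in probs:
        complement *= (1.0 - p)
    return 1.0 - complement

def and_gate(probs):
    """Probability that all independent basic events occur
    (an AND gate in a fault tree): prod(p_i)."""
    product = 1.0
    for p in probs:
        product *= p
    return product
```

Real PRA tools additionally handle dependent failures, common-cause groups, and uncertainty distributions on each basic-event probability, which this sketch omits.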
Reliability-based condition assessment of steel containment and liners
Ellingwood, B.; Bhattacharya, B.; Zheng, R. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Civil Engineering
1996-11-01
Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs.
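The Markov model the report mentions tracks a component's damage condition as a probability distribution over discrete states, propagated through a transition matrix at each inspection interval. A generic sketch of that propagation (states and matrix are hypothetical, not from the report):

```python
def damage_distribution(p0, P, steps):
    """Propagate a damage-state distribution through a discrete-time
    Markov chain.
    p0:    initial probabilities over damage states (sums to 1)
    P:     row-stochastic transition matrix, P[i][j] = P(i -> j)
    steps: number of time intervals to advance"""
    p = list(p0)
    n = len(p)
    for _ in range(steps):
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return p
```

For a two-state chain with an absorbing "failed" state, e.g. P = [[0.9, 0.1], [0.0, 1.0]], the probability of remaining intact after k steps is 0.9**k.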
Reliability analysis in intelligent machines
Mcinroy, John E.; Saridis, George N.
1990-01-01
Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.
Combination of structural reliability and interval analysis
Zhiping Qiu; Di Yang; Isaac Elishakoff
2008-01-01
In engineering applications, probabilistic reliability theory appears to be presently the most important method; however, in many cases precise probabilistic reliability theory cannot be considered an adequate and credible model of the real state of actual affairs. In this paper, we developed a hybrid of probabilistic and non-probabilistic reliability theory, which describes the structural uncertain parameters as interval variables when statistical data are found insufficient. By using interval analysis, a new method for calculating the interval of the structural reliability as well as the reliability index is introduced, and the traditional probabilistic theory is incorporated with the interval analysis. Moreover, the new method preserves the useful part of the traditional probabilistic reliability theory but removes the restriction of its strict requirement on data acquisition. An example is presented to demonstrate the feasibility and validity of the proposed theory.
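One simple way to see how interval parameters propagate into an interval-valued reliability index: for the classical index β = (μ_R − μ_S)/√(σ_R² + σ_S²), interval-valued means produce interval bounds on β by monotonicity. A minimal sketch of this idea (this is a textbook special case with known standard deviations, not the paper's general method; names are illustrative):

```python
import math

def reliability_index_interval(mu_R, mu_S, sigma_R, sigma_S):
    """Interval of the reliability index beta = (mu_R - mu_S) / sqrt(sR^2 + sS^2)
    when the resistance mean mu_R and load mean mu_S are only known as
    intervals (lo, hi), with fixed standard deviations.
    beta is increasing in mu_R and decreasing in mu_S, so the extremes
    occur at opposite interval endpoints."""
    denom = math.sqrt(sigma_R ** 2 + sigma_S ** 2)
    beta_lo = (mu_R[0] - mu_S[1]) / denom
    beta_hi = (mu_R[1] - mu_S[0]) / denom
    return beta_lo, beta_hi
```

A narrow interval on β then brackets the structural reliability without requiring full distributions for the uncertain means, which is the spirit of the hybrid approach described above.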
Enlightenment on Computer Network Reliability From Transportation Network Reliability
Hu Wenjun; Zhou Xizhao
2011-01-01
Referring to transportation network reliability problem, five new computer network reliability definitions are proposed and discussed. They are computer network connectivity reliability, computer network time reliability, computer network capacity reliability, computer network behavior reliability and computer network potential reliability. Finally strategies are suggested to enhance network reliability.
International migration statistics in Mexico.
Garcia Y Griego, M
1987-01-01
During the past decade, Mexico has experienced both large-scale emigration directly, mostly to the US, and the mass immigration of Central American refugees. The implementation of the US Immigration and Control Act of 1986 and the possible escalation of armed conflicts in Central America may result in expanded inflows either of returning citizens or of new refugee waves. To develop appropriate policy responses, Mexico needs reliable information on international migration flows. This research note reviews available sources of that information--arrival and departure statistics, population censuses, refugee censuses, and survey data--and concludes that most of them are relatively weak. Currently, the published data on entries and departures provide little information on the demographic impact of legal migration, although they suggest that the inflow of foreigners is small. The census corroborates such findings, but it yields inadequate demographic detail. The movement of Mexican nationals, on the other hand, is poorly reflected by both sources. The void they leave has been palliated somewhat by surveys, but the only nationally representative survey on emigration was carried out in the late 1970s and might be a less than ideal basis for current policy formulation. It is hoped that as the relevance of international migration becomes more evident, steps towards the improvement of existing statistical systems may be undertaken. In the absence of such measures, policy-makers and researchers will have to continue relying on ad hoc surveys to answer the most pressing questions on the subject.
Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment
Kostandyan, Erik; Sørensen, John Dalsgaard
2012-01-01
The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings, where failure is defined by the threshold model. Structural reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation by means of structural...
Natrella, Mary Gibbons
2005-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Official Statistics and Statistics Education: Bridging the Gap
Gal Iddo
2017-03-01
This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.
Reliable Eigenspectra for New Generation Surveys
Budavari, Tamas; Szalay, Alexander S; Dobos, Laszlo; Yip, Ching-Wa
2008-01-01
We present a novel technique to overcome the limitations of the applicability of Principal Component Analysis to typical real-life data sets, especially astronomical spectra. Our new approach addresses the issues of outliers, missing information, large number of dimensions and the vast amount of data by combining elements of robust statistics and recursive algorithms that provide improved eigensystem estimates step-by-step. We develop a generic mechanism for deriving reliable eigenspectra without manual data censoring, while utilising all the information contained in the observations. We demonstrate the power of the methodology on the attractive collection of the VIMOS VLT Deep Survey spectra that manifest most of the challenges today, and highlight the improvements over previous workarounds, as well as the scalability of our approach to collections with sizes of the Sloan Digital Sky Survey and beyond.
Meneghetti, M; Dahle, H; Limousin, M
2013-01-01
The existence of an arc statistics problem has been at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs produced by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out which helped to significantly improve our knowledge of strong-lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and frequency of strong-lensing events such as gravitational arcs turned out to be a potentially very powerful tool to trace structure formation. However, given the limited size of observational and theoretical data-sets, the power of arc statistics as a cosmological tool has been only minimally exploited so far. On the other hand, recent years have been characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...
Human Reliability Program Overview
Bodin, Michael
2012-09-25
This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.
Modern engineering statistics, solutions manual
Ryan, Thomas P
2012-01-01
An introductory perspective on statistical applications in the field of engineering Modern Engineering Statistics presents state-of-the-art statistical methodology germane to engineering applications. With a nice blend of methodology and applications, this book provides and carefully explains the concepts necessary for students to fully grasp and appreciate contemporary statistical techniques in the context of engineering. With almost thirty years of teaching experience, many of which were spent teaching engineering statistics courses, the author has successfully developed a
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
This presentation focuses on reliability and trust for the user portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?
Medicare and Medicaid Statistical Supplement
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics (OEDA) produced an annual Medicare and Medicaid Statistical Supplement report providing detailed statistical...
National Center for Health Statistics
... Submit Search The CDC National Center for Health Statistics Note: Javascript is disabled or is not supported ... Survey of Family Growth Vital Records National Vital Statistics System National Death Index Provider Surveys National Health ...
Viking Lander reliability program
Pilny, M. J.
1978-01-01
The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.
Quantum information theory and quantum statistics
Petz, D. [Alfred Renyi Institute of Mathematics, Budapest (Hungary)
2008-07-01
Based on lectures given by the author, this book focuses on providing reliable introductory explanations of key concepts of quantum information theory and quantum statistics - rather than on results. The mathematically rigorous presentation is supported by numerous examples and exercises and by an appendix summarizing the relevant aspects of linear analysis. Assuming that the reader is familiar with the content of standard undergraduate courses in quantum mechanics, probability theory, linear algebra and functional analysis, the book addresses graduate students of mathematics and physics as well as theoretical and mathematical physicists. Conceived as a primer to bridge the gap between statistical physics and quantum information, a field to which the author has contributed significantly himself, it emphasizes concepts and thorough discussions of the fundamental notions to prepare the reader for deeper studies, not least through the selection of well chosen exercises. (orig.)
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts present only frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
Statistical Methods for Astronomy
Feigelson, Eric D
2012-01-01
This review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests and point estimation for use in the analysis of modern astronomical data. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are treated. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered. Applied statistics relevant to astronomical research are briefly discussed: nonparametric methods for use when little is known about the behavior of the astronomical populations or processes; data smoothing with kernel density estimation and nonparametric regression; unsupervised clustering and supervised classification procedures for multivariate problems; survival analysis for astronomical datasets with nondetections; time- and frequency-domain time series analysis for light curves; and spatial statistics to interpret the spati...
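The bootstrap mentioned in this abstract is straightforward to sketch. Below is an illustrative percentile-bootstrap confidence interval in plain Python; the data values and the choice of the median as the statistic are hypothetical, not drawn from the review:

```python
import random
import statistics

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for an arbitrary statistic,
    useful when the statistic's sampling distribution is unknown."""
    rng = random.Random(seed)
    n = len(data)
    reps = []
    for _ in range(n_boot):
        resample = [data[rng.randrange(n)] for _ in range(n)]  # sample with replacement
        reps.append(stat(resample))
    reps.sort()
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

fluxes = [2.1, 2.5, 1.9, 3.2, 2.8, 2.2, 2.7, 3.0, 1.8, 2.4]  # hypothetical measurements
lo, hi = bootstrap_ci(fluxes, statistics.median)
print(lo, hi)
```

The same function works unchanged for any statistic (mean, trimmed mean, a correlation), which is exactly the appeal the review describes.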
Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine
Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon
2013-01-01
as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and the conduct of clinical studies. Several reliability studies have been conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine... to reliability estimates and different study designs and statistical analysis is given for future studies in Ayurveda...
Preskill, J
1997-01-01
The new field of quantum error correction has developed spectacularly since its origin less than two years ago. Encoded quantum information can be protected from errors that arise due to uncontrolled interactions with the environment. Recovery from errors can work effectively even if occasional mistakes occur during the recovery procedure. Furthermore, encoded quantum information can be processed without serious propagation of errors. Hence, an arbitrarily long quantum computation can be performed reliably, provided that the average probability of error per quantum gate is less than a certain critical value, the accuracy threshold. A quantum computer storing about 10^6 qubits, with a probability of error per quantum gate of order 10^{-6}, would be a formidable factoring engine. Even a smaller, less accurate quantum computer would be able to perform many useful tasks. (This paper is based on a talk presented at the ITP Conference on Quantum Coherence and Decoherence, 15-18 December 1996.)
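The scale of the numbers quoted above (about 10^6 qubits with error of order 10^{-6} per gate) can be illustrated with a back-of-the-envelope calculation: without error correction, the chance that a computation completes with no error at all decays geometrically in the number of gates. An illustrative sketch, not from the paper:

```python
import math

def success_probability(p_gate, n_gates):
    """Probability that n_gates independent gates all succeed,
    given per-gate error probability p_gate (no error correction)."""
    return (1.0 - p_gate) ** n_gates

# With p = 1e-6 per gate, an unencoded 10^6-gate computation already
# succeeds only about e^-1 of the time, which is why fault tolerance
# is needed once computations grow beyond roughly 1/p gates.
print(success_probability(1e-6, 10**6))  # ≈ exp(-1) ≈ 0.368
```

This is the motivation for the accuracy threshold: encoding lets arbitrarily long computations succeed even though the unencoded success probability vanishes.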
2012-01-01
In 1975 John Tukey proposed a multivariate median which is the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...
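In two dimensions, the halfspace depth described above can be approximated directly by scanning line orientations; the brute-force sketch below is simply the definition made executable (exact algorithms exist and are faster):

```python
import math

def halfspace_depth(z, data, n_dirs=720):
    """Approximate Tukey halfspace depth of point z in a 2-D data cloud:
    the minimum fraction of points in a closed halfplane bounded by a
    line through z, scanned over n_dirs line orientations."""
    best = len(data)
    for k in range(n_dirs):
        theta = math.pi * k / n_dirs
        ux, uy = math.cos(theta), math.sin(theta)
        proj = [(x - z[0]) * ux + (y - z[1]) * uy for x, y in data]
        upper = sum(1 for p in proj if p >= 0)  # points in one closed halfplane
        lower = sum(1 for p in proj if p <= 0)  # and in the opposite one
        best = min(best, upper, lower)
    return best / len(data)

square = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
print(halfspace_depth((0.5, 0.5), square))  # 0.5: the centre is deepest
print(halfspace_depth((5.0, 5.0), square))  # 0.0: far outside the cloud
```

The Tukey median is then the point (or set of points) maximizing this depth over z.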
Sheffield, Scott
2009-01-01
In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Statistics of football dynamics
Mendes, R S; Anteneodo, C
2007-01-01
We investigate the dynamics of football matches. Our goal is to characterize statistically the temporal sequence of ball movements in this collective sport game, searching for traits of complex behavior. Data were collected over a variety of matches in South American, European and World championships throughout 2005 and 2006. We show that the statistics of ball touches presents power-law tails and can be described by $q$-gamma distributions. To explain such behavior we propose a model that provides information on the characteristics of football dynamics. Furthermore, we discuss the statistics of duration of out-of-play intervals, not directly related to the previous scenario.
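The q-gamma distributions referred to in this abstract are built on the Tsallis q-exponential, whose power-law tail (for q > 1) is what captures the heavy-tailed statistics of ball touches. A hedged sketch of the density; the parameter values below are hypothetical, not fitted values from the paper:

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential e_q(x); recovers exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_gamma_pdf(t, q, beta, theta):
    """Unnormalized q-gamma density ~ t^(beta-1) * e_q(-t/theta).
    For q > 1 the e_q factor decays as a power law rather than
    exponentially, producing heavy tails."""
    return t ** (beta - 1.0) * q_exponential(-t / theta, q)

# Hypothetical parameters, purely for illustration:
print(q_gamma_pdf(2.0, q=1.2, beta=2.0, theta=1.5))
```

For q = 1 the ordinary gamma density is recovered, so the family nests the standard model the paper compares against.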
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Approximating Stationary Statistical Properties
Xiaoming WANG
2009-01-01
It is well known that physical laws for large chaotic dynamical systems are revealed statistically. Many times these statistical properties of the system must be approximated numerically. The main contribution of this manuscript is to provide simple and natural criteria on numerical methods (temporal and spatial discretization) that are able to capture the stationary statistical properties of the underlying dissipative chaotic dynamical systems asymptotically. The result on temporal approximation is a recent finding of the author, and the result on spatial approximation is a new one. Applications to the infinite Prandtl number model for convection and the barotropic quasi-geostrophic model are also discussed.
2010-01-01
Abstract Background For years the Robert Koch Institute (RKI) has been annually pooling and reviewing the data from the German population-based cancer registries and evaluating them together with the cause-of-death statistics provided by the statistical offices. Traditionally, the RKI periodically estimates the number of new cancer cases in Germany on the basis of the available data from the regional cancer registries in which registration is complete; this figure, in turn, forms the basis fo...
Business statistics I essentials
Clark, Louise
2014-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t
U.S. Department of Health & Human Services — This reference provides significant summary information about health expenditures and the Centers for Medicare & Medicaid Services' (CMS) programs. The...
Reliability Analysis of DOOF for Weibull Distribution
陈文华; 崔杰; 樊小燕; 卢献彪; 相平
2003-01-01
A hierarchical Bayesian method for estimating the failure probability under DOOF, taking the quasi-Beta distribution as the prior distribution, is proposed in this paper. The weighted least-squares estimation method was used to obtain the formula for computing reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of electrical connector accelerated life test data.
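The least-squares idea behind such Weibull parameter estimation can be sketched by linearizing the Weibull CDF. The illustrative Python below uses ordinary (unweighted) least squares with Bernard's median-rank approximation; the failure times are synthetic, and the paper's weighted variant and quasi-Beta prior are not reproduced here:

```python
import math
import random

def weibull_lsq(failure_times):
    """Median-rank regression estimate of Weibull shape k and scale theta:
    F(t) = 1 - exp(-(t/theta)^k)  =>  ln(-ln(1 - F)) = k*ln t - k*ln theta,
    a straight line fitted here by ordinary least squares."""
    t = sorted(failure_times)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        f = (i - 0.3) / (n + 0.4)  # Bernard's median-rank approximation
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - f)))
    mx, my = sum(xs) / n, sum(ys) / n
    k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    theta = math.exp(mx - my / k)
    return k, theta

# Synthetic failure times from Weibull(shape=2, scale=10) via inverse transform:
rng = random.Random(42)
times = [10.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(200)]
print(weibull_lsq(times))
```

On the linearized scale, each planning assumption (prior, censoring scheme) enters only through the plotting positions, which is what makes weighted refinements like the paper's straightforward to bolt on.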
Sylvie Fortier
2005-06-01
The purpose of this study was twofold: (a) to examine whether kinetic and kinematic parameters of the sprint start could differentiate elite from sub-elite sprinters and (b) to investigate whether providing feedback (FB) about selected parameters could improve starting-block performance of intermediate sprinters over a 6-week training period. Twelve male sprinters, assigned to an elite or a sub-elite group, participated in Experiment 1. Eight intermediate sprinters participated in Experiment 2. All athletes were required to perform three sprint starts at maximum intensity followed by a 10-m run. To detect differences between the elite and sub-elite groups, comparisons were made using t-tests for independent samples. Parameters reaching a significant group difference were retained for the linear discriminant analysis (LDA). The LDA yielded four discriminative kinetic parameters. Feedback about these selected parameters was given to sprinters in Experiment 2. For this experiment, data acquisition was divided into three periods. The first six sessions were without specific FB, whereas the following six sessions were enriched by kinetic FB. Finally, athletes underwent a retention session (without FB) 4 weeks after the twelfth session. Even though differences were found in the time to front peak force, the time to rear peak force, and the front peak force in the retention session, the results of the present study showed that providing FB about selected kinetic parameters differentiating elite from sub-elite sprinters did not improve the starting-block performance of intermediate sprinters.
Network-based statistical comparison of citation topology of bibliographic databases
Lovro Šubelj; Dalibor Fiala; Marko Bajec
2014-01-01
Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant i...
Energy Statistics Manual [Arabic version
NONE
2011-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Energy Statistics Manual [Chinese version
NONE
2007-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Energy Statistics Manual; Handbuch Energiestatistik
NONE
2005-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
MEMS reliability: The challenge and the promise
Miller, W.M.; Tanner, D.M.; Miller, S.L.; Peterson, K.A.
1998-05-01
MicroElectroMechanical Systems (MEMS) that think, sense, act and communicate will open up a broad new array of cost effective solutions only if they prove to be sufficiently reliable. A valid reliability assessment of MEMS has three prerequisites: (1) statistical significance; (2) a technique for accelerating fundamental failure mechanisms, and (3) valid physical models to allow prediction of failures during actual use. These already exist for the microelectronics portion of such integrated systems. The challenge lies in the less well understood micromachine portions and its synergistic effects with microelectronics. This paper presents a methodology addressing these prerequisites and a description of the underlying physics of reliability for micromachines.
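For the "technique for accelerating fundamental failure mechanisms" prerequisite, the canonical model for thermally activated mechanisms is the Arrhenius acceleration factor. The abstract does not name a specific model; Arrhenius is shown here only as the standard example, with hypothetical parameter values:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(ea_ev, t_use_k, t_stress_k):
    """Arrhenius acceleration factor: how much faster a thermally activated
    failure mechanism with activation energy ea_ev proceeds at the stress
    temperature than at the use temperature (both in kelvin)."""
    return math.exp((ea_ev / K_B_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

# Hypothetical: Ea = 0.7 eV, 55 degC use vs 125 degC stress
print(arrhenius_af(0.7, 328.15, 398.15))
```

An acceleration factor of this kind is what lets a short stress test stand in for years of field use, provided the third prerequisite (a valid physical model) actually holds for the micromachine failure mode in question.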
Decision theory in structural reliability
Thomas, J. M.; Hanagud, S.; Hawk, J. D.
1975-01-01
Some fundamentals of reliability analysis as applicable to aerospace structures are reviewed, and the concept of a test option is introduced. A decision methodology, based on statistical decision theory, is developed for determining the most cost-effective design factor and method of testing for a given structural assembly. The method is applied to several Saturn V and Space Shuttle structural assemblies as examples. It is observed that the cost and weight features of the design have a significant effect on the optimum decision.
Students' attitudes towards learning statistics
Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah
2015-05-01
Positive attitude towards learning is vital in order to master the core content of the subject matter under study. This is no exception in learning a statistics course, especially at the university level. Therefore, this study investigates students' attitudes towards learning statistics. Six variables or constructs have been identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used for the study is a questionnaire adopted and adapted from the reliable Survey of Attitudes towards Statistics (SATS©) instrument. The study was conducted with engineering undergraduate students at a university on the East Coast of Malaysia. The respondents consist of students who were taking the applied statistics course from different faculties. The results are analysed in terms of descriptive analysis and contribute to a descriptive understanding of students' attitudes towards the teaching and learning process of statistics.
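Instrument reliability of the kind claimed for the SATS© questionnaire is commonly quantified with Cronbach's alpha. The abstract does not state which coefficient was used; alpha is shown here only as the standard choice for internal consistency, with made-up scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency. `items` is a list of
    per-item score lists; inner lists are aligned across respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1.0 - sum(var(it) for it in items) / var(totals))

# Made-up Likert responses: three items, five respondents
scores = [[4, 5, 3, 4, 2], [4, 4, 3, 5, 2], [5, 5, 2, 4, 3]]
print(cronbach_alpha(scores))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for attitude constructs like the six listed in the abstract.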
Reliability and safety engineering
Verma, Ajit Kumar; Karanki, Durga Rao
2016-01-01
Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...
Statistics with JMP graphs, descriptive statistics and probability
Goos, Peter
2015-01-01
Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering, and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. A thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic...
Measurement System Reliability Assessment
Kłos Ryszard
2015-06-01
Decision-making in problem situations is based on up-to-date and reliable information. A great deal of information is subject to rapid change, hence it may be outdated or manipulated and enforce erroneous decisions. It is crucial to have the possibility to assess the obtained information. In order to ensure its reliability, it is best to obtain it through one's own measurement process. In such a case, conducting an assessment of measurement system reliability seems to be crucial. The article describes a general approach to assessing the reliability of measurement systems.
Dai, Honghua; Smirnov, Evgueni
2012-01-01
Reliable Knowledge Discovery focuses on theory, methods, and techniques for RKDD, a new sub-field of KDD. It studies the theory and methods to assure the reliability and trustworthiness of discovered knowledge and to maintain the stability and consistency of knowledge discovery processes. RKDD has a broad spectrum of applications, especially in critical domains like medicine, finance, and military. Reliable Knowledge Discovery also presents methods and techniques for designing robust knowledge-discovery processes. Approaches to assessing the reliability of the discovered knowledge are introduced.
Harrison, JM; Robbins, JM; 10.1098/rspa.2010.0254
2011-01-01
Quantum graphs are commonly used as models of complex quantum systems, for example molecules, networks of wires, and states of condensed matter. We consider quantum statistics for indistinguishable spinless particles on a graph, concentrating on the simplest case of abelian statistics for two particles. In spite of the fact that graphs are locally one-dimensional, anyon statistics emerge in a generalized form. A given graph may support a family of independent anyon phases associated with topologically inequivalent exchange processes. In addition, for sufficiently complex graphs, there appear new discrete-valued phases. Our analysis is simplified by considering combinatorial rather than metric graphs -- equivalently, a many-particle tight-binding model. The results demonstrate that graphs provide an arena in which to study new manifestations of quantum statistics. Possible applications include topological quantum computing, topological insulators, the fractional quantum Hall effect, superconductivity and molec...
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the courses given. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors and Gaussian mixture models.
Novel approach for evaluation of service reliability for electricity customers
Kang, ChongQing; Gao, Yan; Jiang, John N.; Zhong, Jin; Xia, Qing
2009-01-01
Understanding the reliability value for electricity customers is important to market-based reliability management. This paper proposes a novel approach to evaluate reliability for electricity customers using an indifference curve between economic compensation for power interruption and service reliability of electricity. The indifference curve is formed by calculating different planning schemes of network expansion for different reliability requirements of customers, which reveals information about the economic values of different reliability levels for electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.
Silva, Valter; Grande, Antonio Jose; Rech, Cassiano Ricardo; Peccin, Maria Stella
2015-01-01
This study analyzes the reliability and validity of obesogenic built environments related to physical activity and chronic noncommunicable diseases through Google Maps in a heterogeneous urban area (i.e., residential and commercial, very poor and very rich) in São Paulo (SP), Brazil. There are no important differences when comparing virtual measures with street audit. Based on Kappa statistic, respectively for validity and reliability, 78% and 80% of outcomes were classified as nearly perfect agreement or substantial agreement. Virtual measures of geoprocessing via Google Maps provided high validity and reliability for assessing built environments.
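The "nearly perfect" and "substantial" agreement bands cited above come from the Kappa statistic; a minimal Cohen's kappa for two raters is sketched below (the labels are illustrative, not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's label frequencies."""
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    categories = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical audit of five street segments: field audit vs Google Maps
street = ["sidewalk", "sidewalk", "no_sidewalk", "sidewalk", "no_sidewalk"]
online = ["sidewalk", "sidewalk", "no_sidewalk", "no_sidewalk", "no_sidewalk"]
print(cohens_kappa(street, online))
```

By the usual Landis-Koch bands, kappa above 0.61 is "substantial" and above 0.81 "almost perfect" agreement, which is the classification the abstract reports for most outcomes.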
Using the Weibull distribution reliability, modeling and inference
McCool, John I
2012-01-01
Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
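A core inferential task such a book covers is fitting the two-parameter Weibull to data. The maximum-likelihood estimate can be sketched with the standard fixed-point iteration for the shape parameter; this is an illustrative sketch on synthetic data, not the book's own treatment:

```python
import math
import random

def weibull_mle(data, tol=1e-10, max_iter=500):
    """MLE for the two-parameter Weibull. The shape k solves
    1/k = sum(x^k ln x)/sum(x^k) - mean(ln x), iterated to a fixed
    point; the scale theta then follows in closed form."""
    mean_log = sum(math.log(x) for x in data) / len(data)
    k = 1.0
    for _ in range(max_iter):
        s0 = sum(x ** k for x in data)
        s1 = sum(x ** k * math.log(x) for x in data)
        k_new = 1.0 / (s1 / s0 - mean_log)
        if abs(k_new - k) < tol:
            k = k_new
            break
        k = k_new
    theta = (sum(x ** k for x in data) / len(data)) ** (1.0 / k)
    return k, theta

# Synthetic sample from Weibull(shape=2, scale=5) via inverse transform:
rng = random.Random(7)
sample = [5.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(300)]
print(weibull_mle(sample))
```

The shape estimate is the interesting one in reliability work: k < 1 suggests infant mortality, k ≈ 1 a constant hazard, and k > 1 wear-out.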
Estimating the Reliability of Electronic Parts in High Radiation Fields
Everline, Chester; Clark, Karla; Man, Guy; Rasmussen, Robert; Johnston, Allan; Kohlhase, Charles; Paulos, Todd
2008-01-01
Radiation effects on materials and electronic parts constrain the lifetime of flight systems visiting Europa. Understanding mission lifetime limits is critical to the design and planning of such a mission. Therefore, the operational aspects of radiation dose are a mission success issue. To predict and manage mission lifetime in a high radiation environment, system engineers need capable tools to trade radiation design choices against system design and reliability, and science achievements. Conventional tools and approaches provided past missions with conservative designs without the ability to predict their lifetime beyond the baseline mission. This paper describes a more systematic approach to understanding spacecraft design margin, allowing better prediction of spacecraft lifetime. This is possible because of newly available electronic parts radiation effects statistics and an enhanced spacecraft system reliability methodology. This new approach can be used in conjunction with traditional approaches for mission design. This paper describes the fundamentals of the new methodology.
Reliability analysis of an associated system
陈长杰; 魏一鸣; 蔡嗣经
2002-01-01
Based on the engineering reliability of large complex systems and the distinct characteristics of soft systems, new concepts and theory concerning medium elements and the associated system are developed, and a reliability logic model of the associated system is provided. Through field investigation of the trial operation, the engineering reliability of the paste fill system in the No.2 mine of Jinchuan Non-ferrous Metallic Corporation is analyzed using the theory of the associated system.
The Concise Encyclopedia of Statistics
Dodge, Yadolah
2008-01-01
The Concise Encyclopedia of Statistics presents the essential information about statistical tests, concepts, and analytical methods in language that is accessible to practitioners and students of the vast community using statistics in medicine, engineering, physical science, life science, social science, and business/economics. The reference is alphabetically arranged to provide quick access to the fundamental tools of statistical methodology and biographies of famous statisticians. The more than 500 entries include definitions, history, mathematical details, limitations, examples, references,
Evdokimov Sergey
2017-01-01
Current trends in construction are aimed at ensuring the reliability and safety of engineering facilities. According to the latest government regulations for construction, a scientific approach to engineering research, design, construction and operation of construction projects is a key priority. The reliability of a road depends on a great number of factors and on the statistical composition of their connections (sequential and parallel). A part of a road with such man-made structures as a bridge or a pipe is considered as a system with a sequential connection of elements; the overall reliability is the product of the reliabilities of these elements. The parameters of engineering structures defined by analytical dependences are highly volatile because of the inaccuracy of the defining factors. Moreover, each physical parameter is statistically unstable, as evaluated by the coefficient of variation of its values; this causes fluctuation in the parameters of engineering structures. Studying these fluctuations may result in changes to general and particular design rules in order to increase reliability. The paper gives the grounds for such changes by the example of a bridge, allowing its optimum length to be calculated for a specified reliability level of water runoff under the bridge.
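The series-system rule invoked in the abstract above (overall reliability as the product of element reliabilities) can be sketched directly; the element values are illustrative only, not figures from the paper:

```python
from functools import reduce

def series_reliability(reliabilities):
    """Reliability of a series system: the system works only if every
    element works, so reliabilities multiply (assuming independence)."""
    return reduce(lambda acc, r: acc * r, reliabilities, 1.0)

# Hypothetical road section modeled as a series system:
# roadway segment, bridge, and drainage pipe.
elements = [0.99, 0.98, 0.97]
print(round(series_reliability(elements), 6))  # ≈ 0.941094
```

Note that the series reliability is always below the weakest element, which is why adding structures to a road section can only lower its overall reliability.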
The design and use of reliability data base with analysis tool
Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.
1996-06-01
With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is not standardized, and the statistical methods used in analyzing these data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possibly dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risks and clustering of failure-repair events. These ideas have been implemented in an analysis tool for analyzing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs.
Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.
2011-01-01
This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific ex
Principles of Bridge Reliability
Thoft-Christensen, Palle; Nowak, Andrzej S.
The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated...
Hawaii Electric System Reliability
Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2012-08-01
This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
Hawaii electric system reliability.
Silva Monroy, Cesar Augusto; Loose, Verne William
2012-09-01
This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
Contemporary Treatment of Reliability and Validity in Educational Assessment
Dimitrov, Dimiter M.
2010-01-01
The focus of this presidential address is on the contemporary treatment of reliability and validity in educational assessment. Highlights on reliability are provided under the classical true-score model using tools from latent trait modeling to clarify important assumptions and procedures for reliability estimation. In addition to reliability,…
Applying reliability models to the maintenance of Space Shuttle software
Schneidewind, Norman F.
1992-01-01
Software reliability models provide the software manager with a powerful tool for predicting, controlling, and assessing the reliability of software during maintenance. We show how a reliability model can be effectively employed for reliability prediction and the development of maintenance strategies using the Space Shuttle Primary Avionics Software Subsystem as an example.
Software Reliability Cases: The Bridge Between Hardware, Software and System Safety and Reliability
Herrmann, D.S.; Peercy, D.E.
1999-01-08
High integrity/high consequence systems must be safe and reliable; hence it is only logical that both software safety and software reliability cases should be developed. Risk assessments in safety cases evaluate the severity of the consequences of a hazard and the likelihood of it occurring. The likelihood is directly related to system and software reliability predictions. Software reliability cases, as promoted by SAE JA 1002 and 1003, provide a practical approach to bridge the gap between hardware reliability, software reliability, and system safety and reliability by using a common methodology and information structure. They also facilitate early insight into whether or not a project is on track for meeting stated safety and reliability goals, while facilitating an informed assessment by regulatory and/or contractual authorities.
Statistical test theory for the behavioral sciences
de Gruijter, Dato N M
2007-01-01
Since the development of the first intelligence test in the early 20th century, educational and psychological tests have become important measurement techniques to quantify human behavior. Focusing on this ubiquitous yet fruitful area of research, Statistical Test Theory for the Behavioral Sciences provides both a broad overview and a critical survey of assorted testing theories and models used in psychology, education, and other behavioral science fields. Following a logical progression from basic concepts to more advanced topics, the book first explains classical test theory, covering true score, measurement error, and reliability. It then presents generalizability theory, which provides a framework to deal with various aspects of test scores. In addition, the authors discuss the concept of validity in testing, offering a strategy for evidence-based validity. In the two chapters devoted to item response theory (IRT), the book explores item response models, such as the Rasch model, and applications, incl...
A Statistical Approach to Provide Individualized Privacy for Surveys.
Esponda, Fernando; Huerta, Kael; Guerrero, Victor M
2016-01-01
In this paper we propose an instrument for collecting sensitive data that allows for each participant to customize the amount of information that she is comfortable revealing. Current methods adopt a uniform approach where all subjects are afforded the same privacy guarantees; however, privacy is a highly subjective property with intermediate points between total disclosure and non-disclosure: each respondent has a different criterion regarding the sensitivity of a particular topic. The method we propose empowers respondents in this respect while still allowing for the discovery of interesting findings through the application of well-known inferential procedures.
Johnson, Norman
This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980's while including some earlier articles not included in Volume I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Combined HW/SW Reliability Models.
1982-04-01
Stone, C. J. (1972). Introduction to Stochastic Processes. New York: Houghton Mifflin. Jelinski, Z. and Moranda, P. (1972). Software reliability...research. Statistical Computer Performance Evaluation, New York: Academic Press, 465-484. Kannan, D. (1979). An Introduction to Stochastic Processes. New
Reliability and Availability of Cloud Computing
Bauer, Eric
2012-01-01
A holistic approach to service reliability and availability of cloud computing Reliability and Availability of Cloud Computing provides IS/IT system and solution architects, developers, and engineers with the knowledge needed to assess the impact of virtualization and cloud computing on service reliability and availability. It reveals how to select the most appropriate design for reliability diligence to assure that user expectations are met. Organized in three parts (basics, risk analysis, and recommendations), this resource is accessible to readers of diverse backgrounds and experience le
Reliability of large and complex systems
Kolowrocki, Krzysztof
2014-01-01
Reliability of Large and Complex Systems, previously titled Reliability of Large Systems, is an innovative guide to the current state and reliability of large and complex systems. In addition to revised and updated content on the complexity and safety of large and complex mechanisms, this new edition looks at the reliability of nanosystems, a key research topic in nanotechnology science. The author discusses the importance of safety investigation of critical infrastructures that have aged or have been exposed to varying operational conditions. This reference provides an asympt
Reports on internet traffic statistics
Hoogesteger, Martijn; Oliveira Schmidt, de Ricardo; Sperotto, Anna; Pras, Aiko
2013-01-01
Internet traffic statistics can provide valuable information to network analysts and researchers about the way nowadays networks are used. In the past, such information was provided by Internet2 in a public website called Internet2 NetFlow: Weekly Reports. The website reported traffic statistics fro
Reports on internet traffic statistics
Hoogesteger, Martijn; de Oliveira Schmidt, R.; Sperotto, Anna; Pras, Aiko
2013-01-01
Internet traffic statistics can provide valuable information to network analysts and researchers about the way nowadays networks are used. In the past, such information was provided by Internet2 in a public website called Internet2 NetFlow: Weekly Reports. The website reported traffic statistics
Solid State Lighting Reliability Components to Systems
Fan, XJ
2013-01-01
Solid State Lighting Reliability: Components to Systems begins with an explanation of the major benefits of solid state lighting (SSL) when compared to conventional lighting systems including but not limited to long useful lifetimes of 50,000 (or more) hours and high efficacy. When designing effective devices that take advantage of SSL capabilities the reliability of internal components (optics, drive electronics, controls, thermal design) take on critical importance. As such a detailed discussion of reliability from performance at the device level to sub components is included as well as the integrated systems of SSL modules, lamps and luminaires including various failure modes, reliability testing and reliability performance. This book also: Covers the essential reliability theories and practices for current and future development of Solid State Lighting components and systems Provides a systematic overview for not only the state-of-the-art, but also future roadmap and perspectives of Solid State Lighting r...
Mission Reliability Estimation for Repairable Robot Teams
Stephen B. Stancliff
2008-11-01
Many of the most promising applications for mobile robots require very high reliability. The current generation of mobile robots is, for the most part, highly unreliable. The few mobile robots that currently demonstrate high reliability achieve this reliability at a high financial cost. In order for mobile robots to be more widely used, it will be necessary to find ways to provide high mission reliability at lower cost. Comparing alternative design paradigms in a principled way requires methods for comparing the reliability of different robot and robot team configurations. In this paper, we present the first principled quantitative method for performing mission reliability estimation for mobile robot teams. We also apply this method to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Using conservative estimates of the cost-reliability relationship, our results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares.
Validation of Land Cover Products Using Reliability Evaluation Methods
Wenzhong Shi
2015-06-01
Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.
Basics of modern mathematical statistics
Spokoiny, Vladimir
2015-01-01
This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.
Photovoltaic system reliability
Maish, A.B.; Atcitty, C. [Sandia National Labs., NM (United States)]; Greenberg, D. [Ascension Technology, Inc., Lincoln Center, MA (United States)]; and others
1997-10-01
This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.
Structural Reliability Methods
Ditlevsen, Ove Dalager; Madsen, H. O.
The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...
Milewski, Emil G
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, chi-square test, and time se
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
Elementary statistical physics
Kittel, Charles
2004-01-01
Noteworthy for the philosophical subtlety of its foundations and the elegance of its problem-solving methods, statistical mechanics can be employed in a broad range of applications - among them, astrophysics, biology, chemistry, nuclear and solid state physics, communications engineering, metallurgy, and mathematics. Geared toward graduate students in physics, this text covers such important topics as stochastic processes and transport theory in order to provide students with a working knowledge of statistical mechanics. To explain the fundamentals of his subject, the author uses the method of
Improved reliability of power modules
Baker, Nick; Liserre, Marco; Dupont, Laurent
2014-01-01
Power electronic systems play an increasingly important role in providing high-efficiency power conversion for adjustable-speed drives, power-quality correction, renewable-energy systems, energy-storage systems, and electric vehicles. However, they are often presented with demanding operating env...... temperature cycling conditions on the system. On the other hand, safety requirements in the aerospace and automotive industries place rigorous demands on reliability....
MOV reliability evaluation and periodic verification scheduling
Bunte, B.D.
1996-12-01
The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature affecting safety-related MOVs.
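The margin-versus-uncertainty comparison described in the abstract above can be illustrated with a short sketch. It assumes a normally distributed design margin, a common simplification; the margin figures are hypothetical, not values from the paper:

```python
import math

def margin_reliability(nominal_margin, margin_std):
    """P(margin > 0) for a design margin assumed normal with the given
    nominal (mean) value and one-sigma uncertainty: Phi(mean / sigma)."""
    z = nominal_margin / margin_std
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical MOV: 20% nominal design margin, 10% one-sigma uncertainty,
# i.e. the nominal margin is two standard deviations above zero.
print(round(margin_reliability(20.0, 10.0), 4))  # prints 0.9772
```

The ratio of nominal margin to its uncertainty plays the role of a reliability index: a margin two sigma above zero corresponds to roughly 97.7% reliability under the normality assumption.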
Operational reliability of standby safety systems
Grant, G.M.; Atwood, C.L.; Gentillon, C.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]; and others
1995-04-01
The Idaho National Engineering Laboratory (INEL) is evaluating the operational reliability of several risk-significant standby safety systems based on the operating experience at US commercial nuclear power plants from 1987 through 1993. The reliability assessed is the probability that the system will perform its Probabilistic Risk Assessment (PRA) defined safety function. The quantitative estimates of system reliability are expected to be useful in risk-based regulation. This paper is an overview of the analysis methods and the results of the high pressure coolant injection (HPCI) system reliability study. Key characteristics include (1) descriptions of the data collection and analysis methods, (2) the statistical methods employed to estimate operational unreliability, (3) a description of how the operational unreliability estimates were compared with typical PRA results, both overall and for each dominant failure mode, and (4) a summary of results of the study.
N. A. Nayak
1960-05-01
The reliability aspect of electronic equipment is discussed. To obtain optimum results, close cooperation between the components engineer, the design engineer and the production engineer is suggested.
Reliability prediction techniques
Whittaker, B.; Worthington, B.; Lord, J.F.; Pinkard, D.
1986-01-01
The paper demonstrates the feasibility of applying reliability assessment techniques to mining equipment. A number of techniques are identified and described and examples of their use in assessing mining equipment are given. These techniques include: reliability prediction; failure analysis; design audit; maintainability; availability and the life cycle costing. Specific conclusions regarding the usefulness of each technique are outlined. The choice of techniques depends upon both the type of equipment being assessed and its stage of development, with numerical prediction best suited for electronic equipment and fault analysis and design audit suited to mechanical equipment. Reliability assessments involve much detailed and time consuming work but it has been demonstrated that the resulting reliability improvements lead to savings in service costs which more than offset the cost of the evaluation.
Reliability in individual monitoring service.
Mod Ali, N
2011-03-01
As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated an overall comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. By having these mechanisms, reliability of the IMS can be assured in the promotion of safe behaviour at all levels of the workforce utilising ionising radiation facilities. The upgrade of the reporting program to a web-based e-SSDL marks a major improvement in the overall reliability of Nuclear Malaysia's IMS. The system is a vital step in providing a user friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring of the IMS, and thus enhances the status of the radiation protection framework of the country.
Reliability of power connections
BRAUNOVIC Milenko
2007-01-01
Despite the use of various preventive maintenance measures, there are still a number of problem areas that can adversely affect system reliability. Also, economic constraints have pushed the designs of power connections closer to the limits allowed by the existing standards. The major parameters influencing the reliability and life of Al-Al and Al-Cu connections are identified. The effectiveness of various palliative measures is determined, and the misconceptions about their effectiveness are dealt with in detail.
Reliable design of electronic equipment an engineering guide
Natarajan, Dhanasekharan
2014-01-01
This book explains reliability techniques with examples from electronics design for the benefit of engineers. It presents the application of de-rating, FMEA, overstress analyses and reliability improvement tests for designing reliable electronic equipment. Adequate information is provided for designing a computerized reliability database system to support the application of the techniques by designers. Pedantic terms and the associated mathematics of the reliability engineering discipline are excluded for the sake of comprehensibility and practical application. This book offers excellent support
Forster, Malcolm R
2011-01-01
Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling “restricted” by their disciplines or thinking “piecemeal” in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.
The Monte Carlo Simulation Method for System Reliability and Risk Analysis
Zio, Enrico
2013-01-01
Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems, as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples are provided in support of the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies are introduced to demonstrate the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
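The core idea described in this abstract — estimating system reliability by sampling component states and evaluating the system's success logic — can be sketched in a few lines. This is an illustrative example only: the series-parallel layout, component names and failure probabilities are invented here, not taken from the book.

```python
import random

def simulate_system(p_fail, n_trials=100_000, seed=42):
    """Monte Carlo estimate of the reliability of a hypothetical system:
    component A in series with a parallel pair (B, C)."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_trials):
        a = rng.random() >= p_fail["A"]   # A must work (series element)
        b = rng.random() >= p_fail["B"]
        c = rng.random() >= p_fail["C"]
        if a and (b or c):                # series-parallel success logic
            survived += 1
    return survived / n_trials

est = simulate_system({"A": 0.05, "B": 0.10, "C": 0.10})
# For this small system an analytic answer exists and serves as a check:
exact = 0.95 * (1 - 0.10 * 0.10)  # R_A * (1 - Q_B * Q_C) = 0.9405
```

For systems this small the analytic formula is preferable; the sampling approach pays off when the success logic or the component lifetime models become too complex for closed-form evaluation.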
Nonparametric statistics for social and behavioral sciences
Kraska-Miller, M
2013-01-01
Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency
Multidisciplinary System Reliability Analysis
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.
POSSIBILITY AND EVIDENCE-BASED RELIABILITY ANALYSIS AND DESIGN OPTIMIZATION
Hong-Zhong Huang
2013-01-01
Engineering design under uncertainty has gained considerable attention in recent years. A great multitude of new design optimization methodologies and reliability analysis approaches have been put forth with the aim of accommodating various uncertainties. Uncertainties in practical engineering applications are commonly classified into two categories, i.e., aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty arises because of unpredictable variation in the performance and processes of systems; it is irreducible even when more data or knowledge is added. On the other hand, epistemic uncertainty stems from lack of knowledge of the system due to limited data, measurement limitations, or simplified approximations in modeling system behavior, and it can be reduced by obtaining more data or knowledge. More specifically, aleatory uncertainty is naturally represented by a statistical distribution whose associated parameters can be characterized by sufficient data. If, however, the data are too limited to be quantified in a statistical sense, an epistemic uncertainty treatment can be considered as an alternative in such a situation. Of the several optional treatments for epistemic uncertainty, possibility theory and evidence theory have proved to be the most computationally efficient and stable for reliability analysis and engineering design optimization. This study first attempts to provide a better understanding of uncertainty in engineering design by giving a comprehensive overview of its classifications, theories and design considerations. Then a review is conducted of general topics such as the foundations and applications of possibility theory and evidence theory. This overview includes the most recent results from theoretical research, computational developments and performance improvement of possibility theory and evidence theory, with an emphasis on revealing the capability and characteristics of quantifying uncertainty from different perspectives
Sensitivity Analysis of Component Reliability
Zhenhua Ge
2004-01-01
In a system, every component occupies a unique position and has its own failure characteristics. When a component's reliability is changed, the effect on system reliability is not equal across components. Component reliability sensitivity is a measure of the effect on system reliability when a component's reliability is changed. In this paper, the definition and the associated matrix of component reliability sensitivity are proposed, and some of their characteristics are analyzed. All of this helps in analysing and improving system reliability.
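One standard formalization of the sensitivity this abstract describes is the Birnbaum importance measure: the partial derivative of system reliability with respect to a component's reliability, computable as the difference between system reliability with that component perfect and with it failed. The sketch below is illustrative only — the system layout and reliability values are invented here, and this is not the matrix formulation proposed in the paper.

```python
def birnbaum_importance(system_rel, rels, i):
    """Birnbaum sensitivity of system reliability to component i:
    R_sys(component i works) - R_sys(component i failed)."""
    up = dict(rels)
    up[i] = 1.0
    down = dict(rels)
    down[i] = 0.0
    return system_rel(up) - system_rel(down)

# Illustrative system (invented): component A in series with a parallel pair (B, C).
def r_sys(r):
    return r["A"] * (1 - (1 - r["B"]) * (1 - r["C"]))

rels = {"A": 0.95, "B": 0.90, "C": 0.90}
# The series component A has far larger sensitivity than either parallel
# component, illustrating that equal reliability changes have unequal effects.
```

The same helper works for any coherent system once its structure function is written as `system_rel`, which is what makes sensitivity ranking useful for deciding where reliability improvement effort pays off most.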
System reliability analysis for kinematic performance of planar mechanisms
ZHANG YiMin; HUANG XianZhen; ZHANG XuFang; HE XiangDong; WEN BangChun
2009-01-01
Based on reliability theory and mechanism kinematic accuracy theory, we propose a general methodology for system reliability analysis of the kinematic performance of planar mechanisms. The loop closure equations are used to estimate the kinematic performance errors of planar mechanisms. Reliability and system reliability theories are introduced to develop the limit state functions (LSF) for failure of kinematic performance qualities. The statistical fourth-moment method and the Edgeworth series technique are used for system reliability analysis of the kinematic performance of planar mechanisms, which relaxes the restrictions on the probability distributions of the design variables. Finally, the practicality, efficiency and accuracy of the proposed method are demonstrated by numerical examples.
S. V. Myamlin
2015-09-01
Purpose. This scientific paper is aimed at disclosing the existing problem of defining the term «reliability» and provides a reasoned definition of the term. The study involves the development of a complex that includes the methodology and the appropriate terminology. Methodology. Currently, reliability theory exists as a special case of probabilistic and statistical theory, which is used to determine or predict the occurrence of object failures. Within the existing theories the term «reliability» is formulated and the parameters related to it are described. Findings. On the basis of the conducted analysis of research within the existing theory of reliability and the authors' own considerations, they formulated a definition of the term «reliability». A methodology for studying the reliability of an object, using the example of a freight car, is proposed. Originality. The authors proposed a new definition of the term «reliability». Namely: reliability is an assessment of the ability of an object to maintain its original properties within established limits and temporal space, in the conditions of storage and transportation, as well as to perform the required functions in predetermined modes of operation with maintenance and repair of the facility. The reliability parameters include reliability, durability, maintainability and safety, which characterize the object. A methodology for developing and studying the reliability of a freight car was also developed, which comprises: 1) design, technological and operational reliability; 2) a scientific experiment, including the modeling of the freight car in different conditions and operating modes; 3) the reliability theory, presented as combined or modified, covering the mathematical and physical foundations; 4) Bayesian statistics that describe the different states of a freight car with a breakdown of its basic components and with the appropriate probability for each component to describe the
Estimation of measurement variance in the context of environment statistics
Maiti, Pulakesh
2015-02-01
The object of environment statistics is to provide information on the environment, on its most important changes over time and across locations, and to identify the main factors that influence them. Ultimately, environment statistics would be required to produce higher-quality statistical information. For this, timely, reliable and comparable data are needed. Lack of proper and uniform definitions and of unambiguous classifications poses serious problems in procuring qualitative data, and these cause measurement errors. We consider the problem of estimating measurement variance so that some measures may be adopted to improve the quality of data on environmental goods and services and on value statements in economic terms. The measurement technique considered here is that of employing personal interviewers, and the sampling design considered is two-stage sampling.
Complexity of software trustworthiness and its dynamical statistical analysis methods
ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing
2009-01-01
Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for measuring and assessing software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the characteristics of the behaviors of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of the complexity of software trustworthiness, and proposes to study software trustworthiness measurement in terms of the complexity of software trustworthiness. Using dynamical statistical analysis methods, the paper advances an invariant-measure-based assessment method of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.
Causality Statistical Perspectives and Applications
Berzuini, Carlo; Bernardinell, Luisa
2012-01-01
A state-of-the-art volume on statistical causality. Causality: Statistical Perspectives and Applications presents a wide-ranging collection of seminal contributions by renowned experts in the field, providing a thorough treatment of all aspects of statistical causality. It covers the various formalisms in current use, methods for applying them to specific problems, and the special requirements of a range of examples from medicine, biology and economics to political science. This book: Provides a clear account and comparison of formal languages, concepts and models for statistical causality. Addr
Statistical methods for ranking data
Alvo, Mayer
2014-01-01
This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.
Demand Response For Power System Reliability: FAQ
Kirby, Brendan J [ORNL
2006-12-01
Demand response is the most underutilized power system reliability resource in North America. Technological advances now make it possible to tap this resource to both reduce costs and improve reliability. Misconceptions concerning response capabilities tend to force loads to provide responses that they are less able to provide, and often prohibit them from providing the most valuable reliability services. Fortunately, this is beginning to change, with some ISOs making more extensive use of load response. This report is structured as a series of short questions and answers that address load response capabilities and power system reliability needs. Its objective is to further the use of responsive load as a bulk power system reliability resource providing the fastest and most valuable ancillary services.
Analysis of the Kinematic Accuracy Reliability of a 3-DOF Parallel Robot Manipulator
Guohua Cui
2015-02-01
Kinematic accuracy reliability is an important performance index in the evaluation of mechanism quality. By using a 3-DOF 3-PUU parallel robot manipulator as the research object, the position and orientation error model was derived by mapping the relation between the input and output of the mechanism. Three error sensitivity indexes that evaluate the kinematic accuracy of the parallel robot manipulator were obtained by adopting the singular value decomposition of the error translation matrix. Considering the influence of controllable and uncontrollable factors on the kinematic accuracy, a mathematical model of reliability based on random probability was employed. A measurement and calculation method for evaluating the mechanism's kinematic reliability level was also provided. By analysing the mechanism's errors and reliability, the law of surface error sensitivity for the location and structure parameters was obtained. The kinematic reliability of the parallel robot manipulator was statistically computed on the basis of the Monte Carlo simulation method. The reliability analysis of kinematic accuracy provides a theoretical basis for design optimization and error compensation.
A reliability measure of protein-protein interactions and a reliability measure-based search engine.
Park, Byungkyu; Han, Kyungsook
2010-02-01
Many methods developed for estimating the reliability of protein-protein interactions are based on the topology of protein-protein interaction networks. This paper describes a new reliability measure for protein-protein interactions, which does not rely on the topology of protein interaction networks, but expresses biological information on functional roles, sub-cellular localisations and protein classes as a scoring schema. The new measure is useful for filtering many spurious interactions, as well as for estimating the reliability of protein interaction data. In particular, the reliability measure can be used to search protein-protein interactions with the desired reliability in databases. The reliability-based search engine is available at http://yeast.hpid.org. We believe this is the first search engine for interacting proteins that has been made available to the public. The search engine and the reliability measure of protein interactions should provide useful information for determining proteins to focus on.
Power Quality and Reliability Project
Attia, John O.
2001-01-01
One area where universities and industry can link is power systems reliability and quality - key concepts in the commercial, industrial and public-sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power/Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, and (c) to provide students with NASA/EP&C shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters. About thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed. Four facilities at Prairie View A&M University were surveyed. Some tests that were performed are (i) earth-ground testing, (ii) voltage, amperage and harmonics measurements of various panels in the buildings, (iii) checking the wire sizes to see if they were right for the load they were carrying, (iv) vibration tests to assess the status of the engines or chillers and water pumps, and (v) infrared testing to detect arcing or misfiring of electrical or mechanical systems.
Reliability of Arctic offshore installations
Bercha, F.G. (Bercha Group, Calgary, AB, Canada); Gudmestad, O.T. (Stavanger Univ., Stavanger, Norway; Statoil, Stavanger, Norway; Norwegian Univ. of Technology, Stavanger, Norway); Foschi, R. (British Columbia Univ., Vancouver, BC, Canada, Dept. of Civil Engineering); Sliggers, F. (Shell International Exploration and Production, Rijswijk, Netherlands); Nikitina, N. (VNIIG, St. Petersburg, Russian Federation); Nevel, D.
2006-11-15
Life threatening and fatal failures of offshore structures can be attributed to a broad range of causes such as fires and explosions, buoyancy losses, and structural overloads. This paper addressed the different severities of failure types, categorized as catastrophic failure, local failure or serviceability failure. Offshore tragedies were also highlighted, namely the failures of P-36, the Ocean Ranger, the Piper Alpha, and the Alexander Kieland which all resulted in losses of human life. P-36 and the Ocean Ranger both failed ultimately due to a loss of buoyancy. The Piper Alpha was destroyed by a natural gas fire, while the Alexander Kieland failed due to fatigue induced structural failure. The mode of failure was described as being the specific way in which a failure occurs from a given cause. Current reliability measures in the context of offshore installations only consider the limited number of causes such as environmental loads. However, it was emphasized that a realistic value of the catastrophic failure probability should consider all credible causes of failure. This paper presented a general method for evaluating all credible causes of failure of an installation. The approach to calculating integrated reliability involves the use of network methods such as fault trees to combine the probabilities of all factors that can cause a catastrophic failure, as well as those which can cause a local failure with the potential to escalate to a catastrophic failure. This paper also proposed a protocol for setting credible reliability targets such as the consideration of life safety targets and escape, evacuation, and rescue (EER) success probabilities. A set of realistic reliability targets for both catastrophic and local failures for representative safety and consequence categories associated with offshore installations was also presented. The reliability targets were expressed as maximum average annual failure probabilities. The method for converting these annual
Maximum phonation time: variability and reliability.
Speyer, Renée; Bogaardt, Hans C A; Passos, Valéria Lima; Roodenburg, Nel P H D; Zumach, Anne; Heijnen, Mariëlle A M; Baijens, Laura W J; Fleskens, Stijn J H M; Brunings, Jan W
2010-05-01
The objective of the study was to determine maximum phonation time reliability as a function of the number of trials, days, and raters in dysphonic and control subjects. Two groups of adult subjects participated in this reliability study: a group of outpatients with functional or organic dysphonia versus a group of healthy control subjects matched by age and gender. Over a period of maximally 6 weeks, three video recordings were made of five subjects' maximum phonation time trials. A panel of five experts was responsible for all measurements, including a repeated measurement of the subjects' first recordings. Patients showed significantly shorter maximum phonation times compared with healthy controls (on average, 6.6 seconds shorter). The averaged intraclass correlation coefficient (ICC) over all raters per trial for the first day was 0.998. The averaged reliability coefficient per rater and per trial for repeated measurements of the first day's data was 0.997, indicating high intrarater reliability. The mean reliability coefficient per day for one trial was 0.939. When using five trials, the reliability increased to 0.987. The reliability over five trials was, for a single day, 0.836; for 2 days, 0.911; and for 3 days, 0.935. To conclude, the maximum phonation time has proven to be a highly reliable measure in voice assessment. A single rater is sufficient to provide highly reliable measurements.
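The reported gain from 0.939 for a single trial to 0.987 for five trials is consistent with the Spearman-Brown prediction formula for the reliability of an average of k parallel trials. The sketch below assumes that reading; the formula is not named in the abstract.

```python
def spearman_brown(r_single, k):
    """Predicted reliability of the mean of k parallel trials,
    given single-trial reliability r_single."""
    return k * r_single / (1 + (k - 1) * r_single)

r5 = spearman_brown(0.939, 5)  # ≈ 0.987, matching the five-trial figure above
```

The formula also shows diminishing returns: most of the gain comes from the first few extra trials, which is why protocols rarely demand many repetitions of a measure that is already highly reliable.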
Telecommunications system reliability engineering theory and practice
Ayers, Mark L
2012-01-01
"Increasing system complexity require new, more sophisticated tools for system modeling and metric calculation. Bringing the field up to date, this book provides telecommunications engineers with practical tools for analyzing, calculating, and reporting availability, reliability, and maintainability metrics. It gives the background in system reliability theory and covers in-depth applications in fiber optic networks, microwave networks, satellite networks, power systems, and facilities management. Computer programming tools for simulating the approaches presented, using the Matlab software suite, are also provided"
Gearbox Reliability Collaborative (GRC) Description and Loading
Oyague, F.
2011-11-01
This document describes simulated turbine load cases in accordance with the IEC 61400-1 Ed. 3 standard, which is representative of the typical wind turbine design process. The information presented herein is intended to provide a broad understanding of the gearbox reliability collaborative 750kW drivetrain and turbine configuration. In addition, fatigue and ultimate strength drivetrain loads resulting from simulations are presented. This information provides the basis for the analytical work of the gearbox reliability collaborative effort.
Reliability and qualification of advanced microelectronics for space applications
Kayali, S.
2003-01-01
This paper provides a discussion of the subject and an approach to establish a reliability and qualification methodology to facilitate the utilization of state-of-the-art advanced microelectronic devices and structures in high reliability applications.
Basic statistics in cell biology.
Vaux, David L
2014-01-01
The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.
Babor, Thomas F; Xuan, Ziming; Proctor, Dwayne
2008-03-01
The purposes of this study were to develop reliable procedures to monitor the content of alcohol advertisements broadcast on television and in other media, and to detect violations of the content guidelines of the alcohol industry's self-regulation codes. A set of rating-scale items was developed to measure the content guidelines of the 1997 version of the U.S. Beer Institute Code. Six focus groups were conducted with 60 college students to evaluate the face validity of the items and the feasibility of the procedure. A test-retest reliability study was then conducted with 74 participants, who rated five alcohol advertisements on two occasions separated by 1 week. Average correlations across all advertisements using three reliability statistics (r, rho, and kappa) were almost all statistically significant and the kappas were good for most items, which indicated high test-retest agreement. We also found high interrater reliabilities (intraclass correlations) among raters for item-level and guideline-level violations, indicating that regardless of the specific item, raters were consistent in their general evaluations of the advertisements. Naïve (untrained) raters can provide consistent (reliable) ratings of the main content guidelines proposed in the U.S. Beer Institute Code. The rating procedure may have future applications for monitoring compliance with industry self-regulation codes and for conducting research on the ways in which alcohol advertisements are perceived by young adults and other vulnerable populations.
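Of the three reliability statistics the study reports (r, rho, and kappa), Cohen's kappa corrects observed rater agreement for agreement expected by chance. A minimal sketch, with invented binary ratings rather than the study's advertisement data:

```python
from collections import Counter

def cohens_kappa(ratings1, ratings2):
    """Chance-corrected agreement between two sets of categorical ratings:
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    assert len(ratings1) == len(ratings2)
    n = len(ratings1)
    p_obs = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    c1, c2 = Counter(ratings1), Counter(ratings2)
    # Chance agreement: product of each rater's marginal category proportions.
    p_exp = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(c1) | set(c2))
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical test-retest data: 1 = guideline violation, 0 = no violation.
kappa = cohens_kappa([1, 1, 0, 1, 0, 0, 1, 0],
                     [1, 1, 0, 1, 0, 1, 1, 0])
```

Kappa is preferred over raw percent agreement for code-violation ratings because two raters who mostly mark "no violation" would agree often by chance alone.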
THE AIRLINE'S RELIABILITY PROGRAM
Тамаргазін, О. А.; Національний авіаційний університет; Власенко, П. О.; Національний авіаційний університет
2013-01-01
The airline's operational structure for Reliability Program implementation - engineering division, reliability division, reliability control division, aircraft maintenance division, quality assurance division - was considered. The airline's Reliability Program structure is shown. The use of the Reliability Program to reduce aircraft maintenance costs is proposed.
Photovoltaic module reliability workshop
Mrig, L. (ed.)
1990-01-01
The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986-1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still needed to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.
Introductory statistics for engineering experimentation
Nelson, Peter R; Coffin, Marie
2003-01-01
The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. This book was created to satisfy the needs of a one-semester course whose purpose is to introduce engineering/scientific students to the most useful statistical methods. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...
Next Generation Reliable Transport Networks
Zhang, Jiang
This thesis focuses on ensuring the reliability of transport networks and carries advantages and experiences from transport networks into networks for particular purposes. Firstly, the challenges of providing reliable multicast services on Multiprotocol Label Switching-Transport Profile (MPLS-TP) ring networks are addressed. Through the proposed protection structure and protection switching schemes, the recovery mechanism is enhanced in terms of recovery label consumption, operation simplicity and fine traffic engineering granularity. Furthermore, the extensions for existing... Owing to criticality and security, there are certain physical or logical segregation requirements between the avionic systems. Such segregations can be implemented on the proposed avionic networks with different hierarchies. In order to fulfill the segregation requirements, a tailored heuristic approach for solving...
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
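The hypothesis test the abstract describes can be sketched as follows. This is an illustrative implementation, not the authors' code: it assumes we are given the issued forecast probabilities and the observed number of events, and asks whether that count is plausible under the Poisson-Binomial distribution implied by the forecasts.

```python
def poisson_binomial_pmf(probs):
    """P(exactly k events) when event i occurs independently
    with probability probs[i] (dynamic-programming convolution)."""
    pmf = [1.0]
    for p in probs:
        nxt = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            nxt[k] += mass * (1.0 - p)      # event i does not occur
            nxt[k + 1] += mass * p          # event i occurs
        pmf = nxt
    return pmf

def reliability_p_value(probs, observed_count):
    """Two-sided p-value: total probability of event counts no more
    likely than the observed count under the Poisson-Binomial model.
    A small p-value suggests the forecasts are unreliable."""
    pmf = poisson_binomial_pmf(probs)
    return sum(mass for mass in pmf if mass <= pmf[observed_count])
```

For example, two forecasts of 0.5 give an event-count pmf of [0.25, 0.5, 0.25]; observing zero events yields a p-value of 0.5, so reliability would not be rejected.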
Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation
Aris Spanos
2011-01-01
Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and the Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily because goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, but one has no way to delineate between the two sources of error and apportion blame. The paper argues that the error statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way with a view to delineating the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error statistical underpinnings.
Milewski, Emil G
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As their name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams and doing homework, and will remain a lasting reference source for students, teachers, and professionals. Topics covered in Statistics I include frequency distributions, numerical methods of describing data, measures of variability, parameters of distributions, probability theory, and distributions.
An Investigation of Software Metrics Affect on COBOL Program Reliability
Day II, Henry Jesse
1996-01-01
The purpose of this research was to predict a COBOL program's reliability from software characteristics that are found in the program's source code. The first step was to select factors based on the human information processing model that are associated with changes in computer program reliability. Then these factors (software metrics) were quantitatively studied to determine which factors affect COBOL program reliability. Then a statistical model was developed that predicts COBOL program rel...
Reliability Analysis of Slope Stability by Central Point Method
Li, Chunge; WU Congliang
2015-01-01
Given the uncertainty and variability of slope stability analysis parameters, this paper proceeds from the perspective of probability theory and statistics, based on reliability theory. Through the central point method of reliability analysis, a performance function for the reliability of slope stability analysis is established. Furthermore, the central point method and conventional limit equilibrium methods are compared through a calculation example. The approach's numerical ...
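For a linear performance function g = R - S with independent, normally distributed resistance R and load S, the central point (mean value, first-order second-moment) method reduces to a reliability index computed from means and standard deviations. A minimal sketch with illustrative values, not the paper's example:

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """Central point method for g = R - S:
    beta = mean(g) / std(g)."""
    mu_g = mu_r - mu_s
    sigma_g = math.sqrt(sigma_r ** 2 + sigma_s ** 2)
    return mu_g / sigma_g

def failure_probability(beta):
    """Pf = Phi(-beta), the standard normal tail, via erfc."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# illustrative slope: resistance 100 +/- 10, load 60 +/- 10
beta = reliability_index(100.0, 10.0, 60.0, 10.0)
```

With these numbers beta is about 2.83, giving a failure probability of roughly 0.23%.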
Technique for Measuring Hybrid Electronic Component Reliability
Green, C.C.; Hernandez, C.L.; Hosking, F.M.; Robinson, D.; Rutherford, B.; Uribe, F.
1999-01-01
Materials compatibility studies of aged, engineered materials and hardware are critical to understanding and predicting component reliability, particularly for systems with extended stockpile life requirements. Nondestructive testing capabilities for component reliability would significantly enhance lifetime predictions. For example, if the detection of crack propagation through a solder joint can be demonstrated, this technique could be used to develop baseline information to statistically determine solder joint lifetimes. This report will investigate high frequency signal response techniques for nondestructively evaluating the electrical behavior of thick film hybrid transmission lines.
MECHANICAL STRENGTH AND RELIABILITY OF SOLID CATALYSTS
Yongdan Li; Dongfang Wu; Y.S. Lin
2004-01-01
The mechanical strength of solid catalysts is one of the key parameters for reliable and efficient performance of a fixed bed reactor. Some recent developments and the basic mechanics within this context are reviewed. The main concepts discussed are brittle fracture, which leads to the mechanical failure of the catalyst pellets; measurement and statistical properties of catalyst strength data; and mechanical reliability of the catalyst pellets and their packed bed. The scientific basis for these issues in catalyst mechanical properties still calls for further elucidation and advancement.
Reliability analysis of DOOF for Weibull distribution
陈文华; 崔杰; 樊晓燕; 卢献彪; 相平
2003-01-01
A hierarchical Bayesian method for estimating the failure probability p_i under DOOF, taking the quasi-Beta distribution B(p_{i-1}, 1, 1, b) as the prior distribution, is proposed in this paper. The weighted least squares estimation method was used to obtain the formula for computing reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of electrical connector accelerated life test data.
Selection of statistical distributions for prediction of steam generator tube degradation
Stavropoulos, K.D.; Gorman, J.A. [Dominion Engr., Inc., McLean, VA (United States); Staehle, R.W. [Univ. of Minnesota, Minneapolis, MN (United States); Welty, C.S. Jr. [Electric Power Research Institute, Palo Alto, CA (United States)
1992-12-31
This paper presents the first part of a project directed at developing methods for characterizing and predicting the progression of degradation of PWR steam generator tubes. This first part covers the evaluation of statistical distributions for use in such analyses. The data used in the evaluation included data for primary water stress corrosion cracking (PWSCC) at roll transitions and U-bends, and intergranular attack/stress corrosion cracking (IGA/SCC) at tube sheet and tube support plate crevices. Laboratory data for PWSCC of reverse U-bends were also used. The review of statistical distributions indicated that the Weibull distribution provides an easy-to-use and effective method. Another statistical function, the log-normal, was found to provide essentially equivalent results. Two-parameter fits, without an initiation time, were found to provide the most reliable predictions.
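A two-parameter Weibull fit of the kind recommended above can be sketched with median-rank regression, a standard graphical-fit technique; this is a generic illustration, not the authors' procedure, and the data are assumed to be complete failure times:

```python
import math

def weibull_fit(times):
    """Two-parameter Weibull fit by median-rank regression:
    regress ln(-ln(1 - F)) on ln(t) using Benard's median ranks.
    Returns (shape beta, scale eta)."""
    n = len(times)
    xs, ys = [], []
    for i, t in enumerate(sorted(times), start=1):
        f = (i - 0.3) / (n + 0.4)                 # Benard's median rank
        xs.append(math.log(t))
        ys.append(math.log(-math.log(1.0 - f)))
    mx = sum(xs) / n
    my = sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)         # slope = shape
    eta = math.exp(mx - my / beta)                # intercept gives scale
    return beta, eta
```

The slope of the regression line is the shape parameter beta; the scale eta follows from the intercept, since ln t = ln eta + y / beta on Weibull paper.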
Computer System Reliability Allocation Method and Supporting Tool
(anonymous)
2001-01-01
This paper presents a computer system reliability allocation method based on statistical theory and Markov chains, which can be used to allocate reliability to subsystems, hybrid systems and software modules. A relevant supporting tool built by the authors is also introduced.
System Reliability Analysis: Foundations.
1982-07-01
Performance formulas for systems subject to preventive maintenance are given. SYSTEM RELIABILITY ANALYSIS: FOUNDATIONS. Richard E... The terminal-pair reliability in this case is P{s can communicate with the terminal t} = h(p), a polynomial in the component reliability p. For undirected networks, the basic reference is A. Satyanarayana and Kevin Wood (1982). For directed networks, the basic reference is Avinash
Register-based statistics statistical methods for administrative data
Wallgren, Anders
2014-01-01
This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi
Troyan, V.N.
1982-01-01
For the first time, material is brought together on statistical methods of processing seismic information, which are increasingly widely used in prospecting for minerals (oil, gas, ore) in regions of complex structure and at great depths. The methods provide reliable identification of useful signals against a background of interference. Fundamentals of constructing the algorithms and programs used in interpretation, and their efficiency, are examined.
On controversial statistical issues in clinical research
Chow SC
2015-05-01
Shein-Chung Chow,1 Fuyu Song2 1Duke University School of Medicine, Durham, NC, USA; 2Peking University Clinical Research Institute, Peking University Health Science Center, Beijing, People's Republic of China Abstract: In clinical development of a test treatment under investigation, clinical trials are often conducted for evaluation of safety and efficacy of the test treatment. To provide an accurate and reliable assessment, adequate and well-controlled clinical trials using valid study designs are necessarily conducted for obtaining substantial evidence of safety and efficacy of the test treatment under investigation. In practice, however, some debatable issues are commonly encountered regardless of compliance with good statistics practice and good clinical practice. These issues include, but are not limited to: (1) appropriateness of statistical hypotheses for clinical investigation; (2) correctness of power analysis assumptions; (3) integrity of randomization and blinding; (4) post hoc endpoint selection; (5) impact of protocol amendments on the characteristics of the trial population; (6) multiplicity in clinical trials; (7) missing data imputation; (8) adaptive design methods; and (9) independence of a data monitoring committee. In this article, these issues are briefly described. The impact of these issues on the evaluation of the safety and efficacy of the test treatment under investigation is discussed, with examples wherever applicable. Some recommendations regarding possible resolutions of these issues are also provided. Keywords: data safety monitoring committee, endpoint selection, integrity of blinding, missing data imputation, multiplicity, protocol amendment, two-stage adaptive designs
RELIABILITY BASED DESIGN OF A GEAR BOX
D.MADHUSEKHAR
2014-08-01
Reliability is the probability that a system, component or device will perform without failure for a specified period of time under specified operating conditions. The concept of reliability is of great importance in the design of various machine members. Conventional engineering design uses a deterministic approach. It disregards the fact that the material properties, the dimensions of the components and the externally applied loads are statistical in nature. In conventional design these uncertainties are covered with a factor of safety, which is not always successful. The growing trend towards reducing uncertainty and increasing reliability is to use the probabilistic approach. In the present work a three-shaft four-speed gear box and a six-speed gear box are designed using reliability principles. For the specified reliability of the system (gear box), component reliability (gear pair) is calculated by considering the system as a series system. The design is considered safe and adequate if the probability of failure of the gear box is less than or equal to a specified quantity in each of the two failure modes. All the parameters affecting the design are considered as random variables, and all the random variables are assumed to follow the normal distribution. A computer program in C++ is developed to calculate the face widths in the bending and surface failure modes; the larger of the two values is adopted. By changing the variations in the design parameters, variations in the face widths are studied.
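The series-system allocation step described above amounts to taking the n-th root of the system reliability target. A small sketch under the series assumption; the gear-pair count and target are illustrative numbers, not the paper's:

```python
def component_reliability(r_system, n_components):
    """Series system of n identical components:
    R_sys = R_c ** n, so each component must meet
    R_c = R_sys ** (1 / n)."""
    return r_system ** (1.0 / n_components)

# e.g. a gear box treated as a series system of 4 gear pairs
# with a system reliability target of 0.99
r_pair = component_reliability(0.99, 4)
```

Note that each gear pair must be noticeably more reliable than the gear box as a whole (about 0.9975 here), which is the motivation for allocating reliability before sizing the face widths.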
Zemstvo Statistics on Public Education.
Abramov, V. F.
1997-01-01
Surveys the general organizational principles and forms of keeping the zemstvo (regional) statistics on Russian public education. Conveys that they were subdivided into three types: (1) the current statistics that continuously monitored schools; (2) basic surveys that provided a comprehensive characterization of a given territory's public…
A Statistical Framework for the Functional Analysis of Metagenomes
Sharon, Itai; Pati, Amrita; Markowitz, Victor; Pinter, Ron Y.
2008-10-01
Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. The authors present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations, which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
Reliability based structural design
Vrouwenvelder, A.C.W.M.
2014-01-01
According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A ke
Reliability based structural design
Vrouwenvelder, A.C.W.M.
2013-01-01
According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A ke
Fosgerau, Mogens; Karlström, Anders
2010-01-01
We derive the value of reliability in the scheduling of an activity of random duration, such as travel under congested conditions. Using a simple formulation of scheduling utility, we show that the maximal expected utility is linear in the mean and standard deviation of trip duration, regardless...
Parametric Mass Reliability Study
Holt, James P.
2014-01-01
The International Space Station (ISS) systems are designed around redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, and the mass of ORU subcomponents to reliability.
Avionics Design for Reliability
1976-03-01
Consultant, P.O. Box 181, Hazelwood, Missouri 63042, U.S.A. Contents: List of Speakers; Introduction and Overview: Reliability Under... ...is paramount, all the more so because in this case the reliability selection procedure is rather inefficient. The distribution of failures follows
1980-01-01
The reliability of a wind energy system depends on the size of the propeller and the size of the back-up energy storage. Design of the optimum system... speed incidents which generate a significant part of the wind energy. A nomogram is presented, based on some continuous wind speed measurements
Visser, M
1997-01-01
The "reliability horizon" for semi-classical quantum gravity quantifies the extent to which we should trust semi-classical quantum gravity, and gives a handle on just where the "Planck regime" resides. The key obstruction to pushing semi-classical quantum gravity into the Planck regime is often the existence of large metric fluctuations, rather than a large back-reaction.
Reliability of semiology description.
Heo, Jae-Hyeok; Kim, Dong Wook; Lee, Seo-Young; Cho, Jinwhan; Lee, Sang-Kun; Nam, Hyunwoo
2008-01-01
Seizure semiology is important for classifying patients' epilepsy. Physicians usually get most of the seizure information from observers, though there have been few reports on the reliability of the observers' descriptions. This study aims at determining the reliability of observers' description of the semiology. We included 92 patients who had their habitual seizures recorded during video-EEG monitoring. We compared the semiology described by the observers with that recorded on the videotape, and reviewed which characteristics of the observers affected the reliability of their reported data. The classification of seizures and the individual components of the semiology based only on the observer description were somewhat discordant compared with the findings from the videotape (correct classification, 85%). The descriptions of some ictal behaviors such as oroalimentary automatism, tonic/dystonic limb posturing, and head versions were relatively accurate, but those of motionless staring and hand automatism were less accurate. The directions specified by the observers were relatively correct. The accuracy of the description was related to the educational level of the observers. Much of the information described by well-educated observers is reliable. However, every physician should keep in mind the limitations of this information and use it cautiously.
High reliability organizations
Gallis, R.; Zwetsloot, G.I.J.M.
2014-01-01
High Reliability Organizations (HROs) are organizations that constantly face serious and complex (safety) risks yet succeed in realising an excellent safety performance. In such situations, acceptable levels of safety cannot be achieved by traditional safety management only. HROs manage safety
Reliability physics and engineering time-to-failure modeling
McPherson, J W
2013-01-01
Reliability Physics and Engineering provides critically important information that is needed for designing and building reliable cost-effective products. Key features include: · Materials/Device Degradation · Degradation Kinetics · Time-To-Failure Modeling · Statistical Tools · Failure-Rate Modeling · Accelerated Testing · Ramp-To-Failure Testing · Important Failure Mechanisms for Integrated Circuits · Important Failure Mechanisms for Mechanical Components · Conversion of Dynamic Stresses into Static Equivalents · Small Design Changes Producing Major Reliability Improvements · Screening Methods · Heat Generation and Dissipation · Sampling Plans and Confidence Intervals This textbook includes numerous example problems with solutions. Also, exercise problems along with the answers are included at the end of each chapter. Relia...
Objectivity, Reliability, and Validity of Search Engine Count Estimates
Dietmar Janetzko
2008-01-01
Count estimates ("hits") provided by Web search engines have received much attention as a yardstick to measure a variety of phenomena of interest as diverse as, e.g., language statistics, popularity of authors, or similarity between words. Common to these activities is the intention to use Web search engines not only for search but for ad hoc measurement. Using search engine count estimates (SECEs) in this way means that a phenomenon of interest, e.g., the popularity of an author, is conceived of as a measurand, and SECEs are taken to be its quantitative measures. However, the data quality of SECEs has not yet been studied systematically, and concerns have been raised against the use of this kind of data. This article examines the data quality of SECEs focusing on classical goodness criteria, i.e., objectivity, reliability, and validity. The results of a series of studies indicate that, with the exception of Boolean queries that use disjunction or negation, objectivity as well as test-retest reliability and parallel-test reliability of SECEs is good for most types of browsers and search engines examined. Estimation of validity required model development (all-subsets regression), revealing satisfying results by using an explorative approach to feature selection. The findings are discussed in the light of previous objections, and perspectives for using Web search count estimates are delineated.
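Test-retest reliability of the kind assessed here is conventionally quantified by correlating two measurement occasions of the same items. A generic sketch, in which the queries and hit counts are made-up numbers, not the article's data:

```python
def pearson_r(x, y):
    """Pearson correlation between two runs of the same queries,
    a common test-retest reliability coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

# hypothetical hit counts for the same five queries, one week apart
run1 = [120_000, 45_000, 9_800, 310_000, 72_000]
run2 = [118_500, 47_200, 9_500, 305_000, 70_900]
```

A coefficient near 1 across runs indicates the count estimates are stable enough to serve as repeatable measurements.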
Mechanical reliability analysis of tubes intended for hydrocarbons
Nahal, Mourad; Khelif, Rabia [Badji Mokhtar University, Annaba (Algeria)
2013-02-15
Reliability analysis constitutes an essential phase in any study concerning reliability. Many industrialists evaluate and improve the reliability of their products during the development cycle - from design to startup (design, manufacture, and exploitation) - to develop their knowledge of the cost/reliability ratio and to control sources of failure. In this study, we obtain results for hardness, tensile, and hydrostatic tests carried out on steel tubes for transporting hydrocarbons, followed by statistical analysis. The results obtained allow us to conduct a reliability study based on resistance and loading. Thus, the reliability index is calculated and the importance of the variables related to the tube is presented. Reliability-based assessment of residual stress effects is applied to underground pipelines under a roadway, with and without active corrosion. Residual stress has been found to greatly increase the probability of failure, especially in the early stages of pipe lifetime.
USING STATISTICAL SURVEY IN ECONOMICS
Delia TESELIOS
2012-01-01
The statistical survey is an effective method of statistical investigation that involves gathering quantitative data. It is often preferred in statistical reports because of the information that can be obtained about an entire population by observing only a part of it. Because of the information they provide, surveys are therefore used in many research areas. In economics, statistics are used in the decision-making process, in choosing competitive strategies, in the analysis of certain economic phenomena, and in the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling is used to analyze the existing parking-space situation in a given locality.
Methods of statistical model estimation
Hilbe, Joseph
2013-01-01
Methods of Statistical Model Estimation examines the most important and popular methods used to estimate parameters for statistical models and provide informative model summary statistics. Designed for R users, the book is also ideal for anyone wanting to better understand the algorithms used for statistical model fitting. The text presents algorithms for the estimation of a variety of regression procedures using maximum likelihood estimation, iteratively reweighted least squares regression, the EM algorithm, and MCMC sampling. Fully developed, working R code is constructed for each method. Th
Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia
2012-09-25
The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.
Predict! Teaching Statistics Using Informational Statistical Inference
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
Statistical analysis of management data
Gatignon, Hubert
2013-01-01
This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.
Reliability in the utility computing era: Towards reliable Fog computing
Madsen, Henrik; Burtschy, Bernard; Albeanu, G.
2013-01-01
This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, as a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.
Kowalski, Jeanne
2008-01-01
A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction of the essential and theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic O's, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...
Measurement and statistics for teachers
Van Blerkom, Malcolm
2008-01-01
Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available.Comprehensive and accessible, Measurement and Statistics for Teachers includes:Short vignettes showing concepts in action Numerous classroom examples Highlighted vocabulary Boxes summarizing related concepts End-of-chapter exercises and problems Six full chapters devoted to the essential topic of Classroom Tests Instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests A five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur...
Improvement of the reliability on nondestructive inspection
Song, Sung Jin; Kim, Young H. [Sungkyunkwan Univ., Suwon (Korea, Republic of); Lee, Hyang Beom [Soongsil Univ., Seoul (Korea, Republic of); Shin, Young Kil [Kunsan National Univ., Gunsan (Korea, Republic of); Jung, Hyun Jo [Wonkwang Univ., Iksan (Korea, Republic of); Park, Ik Keun; Park, Eun Soo [Seoul National Univ., Seoul (Korea, Republic of)
2002-03-15
Maintaining the reliability of nondestructive testing is essential for the lifetime maintenance of nuclear power plants. The nondestructive testing methods most frequently used in nuclear power plants are eddy current testing for the inspection of steam generator tubes and ultrasonic testing for the inspection of weldments. In order to improve the reliability of ultrasonic and eddy current testing, the subjects carried out in this study are as follows: development of a BEM analysis technique for ECT of SG tubes, development of a neural network technique for the intelligent analysis of ECT flaw signals from SG tubes, development of RFECT technology for the inspection of SG tubes, FEM analysis of the ultrasonic scattering field, evaluation of the statistical reliability of PD-RR tests of ultrasonic testing, and development of a multi-Gaussian beam modeling technique to predict accurate single-beam ultrasonic testing signals with computational efficiency.
Pain: A Statistical Account
Thacker, Michael A.; Moseley, G. Lorimer
2017-01-01
Perception is seen as a process that utilises partial and noisy information to construct a coherent understanding of the world. Here we argue that the experience of pain is no different; it is based on incomplete, multimodal information, which is used to estimate potential bodily threat. We outline a Bayesian inference model, incorporating the key components of cue combination, causal inference, and temporal integration, which highlights the statistical problems in everyday perception. It is from this platform that we are able to review the pain literature, providing evidence from experimental, acute, and persistent phenomena to demonstrate the advantages of adopting a statistical account in pain. Our probabilistic conceptualisation suggests a principles-based view of pain, explaining a broad range of experimental and clinical findings and making testable predictions. PMID:28081134
Search Databases and Statistics
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
the vast amounts of raw data. This task is tackled by computational tools implementing algorithms that match the experimental data to databases, providing the user with lists for downstream analysis. Several platforms for such automated interpretation of mass spectrometric data have been developed, each...... having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....
Engsted, Tom
reliable estimates, and I argue that significance tests are useful tools in those cases where a statistical model serves as input in the quantification of an economic model. Finally, I provide a specific example from economics - asset return predictability - where the distinction between statistical......I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot...... be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model...
Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial
Kevin A. Hallgren
2012-02-01
Many research designs require the assessment of inter-rater reliability (IRR) to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen's kappa and intra-class correlations to assess IRR.
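For orientation, Cohen's kappa corrects the observed agreement rate p_o by the agreement p_e expected from independent raters: kappa = (p_o - p_e) / (1 - p_e). The tutorial itself gives SPSS and R syntax; the following is an equivalent Python sketch with invented ratings:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters over the same items: (p_o - p_e) / (1 - p_e)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n     # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    categories = set(c1) | set(c2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in categories)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical binary codings of eight observations by two raters.
r1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
r2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]
print(cohens_kappa(r1, r2))   # 0.5: 75% raw agreement, 50% expected by chance
```

Raw percent agreement (here 75%) overstates reliability because two raters guessing independently would still agree half the time with these marginals; kappa reports only the agreement beyond that chance level.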
Reliability and Assessment Techniques on Ground Excavation
Sanga Tangchawal
2009-05-01
Planning and assessment for the excavation of brittle materials (soil or rock) can be done using machinery and/or explosives. Reliability assessment has been proposed to predict ground failure during the excavation process. Stability planning for soil (rock) faces cut by machinery can be compared between the deterministic and the statistical method. The risk of using explosives for rock excavation must account for damage and environmental impacts after blasting events.
Semiconductor packaging materials interaction and reliability
Chen, Andrea
2012-01-01
In semiconductor manufacturing, understanding how various materials behave and interact is critical to making a reliable and robust semiconductor package. Semiconductor Packaging: Materials Interaction and Reliability provides a fundamental understanding of the underlying physical properties of the materials used in a semiconductor package. The book focuses on an important step in semiconductor manufacturing--package assembly and testing. It covers the basics of material properties and explains how to determine which behaviors are important to package performance. The authors also discuss how
Human Reliability Program Workshop
Landers, John; Rogers, Erin; Gerke, Gretchen
2014-05-18
A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.
Accelerator reliability workshop
Hardy, L.; Duru, Ph.; Koch, J.M.; Revol, J.L.; Van Vaerenbergh, P.; Volpe, A.M.; Clugnet, K.; Dely, A.; Goodhew, D
2002-07-01
About 80 experts attended this workshop, which brought together all accelerator communities: accelerator driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.
Improving Power Converter Reliability
Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon
2014-01-01
The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage...... of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high and low-voltage side of a half-bridge IGBT separately in every fundamental...... is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation...
Power electronics reliability.
Kaplar, Robert James; Brock, Reinhard C.; Marinella, Matthew; King, Michael Patrick; Stanley, James K.; Smith, Mark A.; Atcitty, Stanley
2010-10-01
The project's goals are: (1) use experiments and modeling to investigate and characterize stress-related failure modes of post-silicon power electronic (PE) devices such as silicon carbide (SiC) and gallium nitride (GaN) switches; and (2) seek opportunities for condition monitoring (CM) and prognostics and health management (PHM) to further enhance the reliability of power electronics devices and equipment. CM - detect anomalies and diagnose problems that require maintenance. PHM - track damage growth, predict time to failure, and manage subsequent maintenance and operations in such a way to optimize overall system utility against cost. The benefits of CM/PHM are: (1) operate power conversion systems in ways that will preclude predicted failures; (2) reduce unscheduled downtime and thereby reduce costs; and (3) pioneering reliability in SiC and GaN.
Expansions and Asymptotics for Statistics
Small, Christopher G
2010-01-01
Providing a broad toolkit of analytical methods, this book shows how asymptotics, when coupled with numerical methods, becomes a powerful way to acquire a deeper understanding of the techniques used in probability and statistics. It describes core ideas in statistical asymptotics; covers Laplace approximation, the saddle-point method, and summation of series; and includes vignettes of various people from statistics and mathematics whose ideas have been instrumental in the development of the subject. The author also supplements some topics with relevant Maple commands and provides a list of c...
Research design and statistical analysis
Myers, Jerome L; Lorch Jr, Robert F
2013-01-01
Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data. The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations. Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions. Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations
Scaled CMOS Technology Reliability Users Guide
White, Mark
2010-01-01
The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
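The acceleration-model step described here is typically temperature-driven, and the Arrhenius model is the standard form for thermally activated mechanisms such as retention-time degradation. The sketch below uses a hypothetical activation energy and test outcome (not the report's data) to translate stress-test failures into a use-condition FIT rate:

```python
import math

K_BOLTZMANN_EV = 8.617e-5          # Boltzmann constant in eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between stress and use temperatures."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Hypothetical mechanism: Ea = 0.6 eV, stressed at 125 C, used at 55 C.
af = arrhenius_af(0.6, 55.0, 125.0)

# Illustrative test outcome: 3 failures in 1.2e6 device-hours at stress.
failures, device_hours_at_stress = 3, 1.2e6
fit = failures / (device_hours_at_stress * af) * 1e9   # failures per 1e9 use-hours
print(af, fit)
```

A FIT is one failure per billion device-hours at use conditions; dividing the stress-test device-hours' failure rate by the acceleration factor converts the accelerated result back to the field environment.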
Reliability of indicators of nursing care quality: testing interexaminer agreement and reliability
Dagmar Willamowius Vituri
2014-04-01
OBJECTIVE: this study sought to test the interexaminer agreement and reliability of 15 indicators of nursing care quality. METHODS: this was a quantitative, methodological, experimental, and applied study conducted at a large, tertiary, public teaching hospital in the state of Paraná. For data analysis, the Kappa (k) statistic was applied to the categorical variables (indicators 1 to 11 and 15) and the intraclass correlation coefficient (ICC) to the continuous variables (indicators 12, 13, and 14), with the corresponding 95% confidence intervals. The categorical data were analyzed using the Lee software, elaborated by the Laboratory of Epidemiology and Statistics of the Dante Pazzanese Institute of Cardiology, Brazil, and the continuous data were assessed using BioEstat 5.0. RESULTS: the k-statistic results indicated excellent, statistically significant agreement, and the ICC values denoted excellent and statistically significant reproducibility/agreement for the investigated indicators. CONCLUSION: the investigated indicators exhibited excellent reliability and reproducibility, showing that it is possible to formulate valid and reliable assessment instruments for the management of nursing care.
75 FR 81152 - Interpretation of Protection System Reliability Standard
2010-12-27
... Systems that affect the reliability of the BES. The program shall include: R1.1. Maintenance and testing... provide a complete framework for maintenance and testing of equipment necessary to ensure the reliability... maintenance and testing of Protection Systems affecting the reliability of the Bulk-Power System. 13. If...
Bartsch, R.R.
1995-09-01
Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
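The combination of individual failure probabilities into a system estimate can be sketched as a series system: a shot succeeds only if every component survives, with each component's failure probability growing with shot count according to a Weibull law. All parameter values below are invented for illustration and are not the ATLAS values:

```python
import math

def weibull_survival(n_shots, eta, beta):
    """Probability a component survives n_shots, Weibull in shot count.
    eta is the characteristic life in shots; beta > 1 means wear-out."""
    return math.exp(-((n_shots / eta) ** beta))

# Hypothetical series-system components: (eta, beta) pairs, purely illustrative.
components = {
    "capacitor":       (20000.0, 1.5),
    "railgap":         (5000.0,  2.0),
    "line_insulation": (15000.0, 1.2),
}

def system_reliability(n_shots):
    """Series system: the bank works on a shot only if every component does."""
    r = 1.0
    for eta, beta in components.values():
        r *= weibull_survival(n_shots, eta, beta)
    return r

print(system_reliability(100))    # early-life reliability, well above 0.95
print(system_reliability(5000))   # reliability degrades as shots accumulate
```

The per-shot reliability target (here, greater than 0.95) then translates into a maintenance interval: replace or service a component class once the projected system reliability at the accumulated shot count drops toward the threshold.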
Reliability of Circumplex Axes
Micha Strack
2013-06-01
We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs, namely the Interpersonal Adjective List (IAL), the Interpersonal Adjective Scales (revised; IAS-R), the Inventory of Interpersonal Problems (IIP), the Impact Messages Inventory (IMI), the Circumplex Scales of Interpersonal Values (CSIV), the Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), the Team Role Circle (TRC), the Competing Values Leadership Instrument (CV-LI), Love Styles, the Organizational Culture Assessment Instrument (OCAI), the Customer Orientation Circle (COC), and the System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG), in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments). The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%, and scale-specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large scale-specificity but otherwise works effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.
Measurement Practices for Reliability and Power Quality
Kueck, JD
2005-05-06
This report provides a distribution reliability measurement "toolkit" that is intended to be an asset to regulators, utilities and power users. The metrics and standards discussed range from simple reliability, to power quality, to the new blend of reliability and power quality analysis that is now developing. This report was sponsored by the Office of Electric Transmission and Distribution, U.S. Department of Energy (DOE). Inconsistencies presently exist in commonly agreed-upon practices for measuring the reliability of distribution systems. However, efforts are being made by a number of organizations to develop solutions. In addition, there is growing interest in methods or standards for measuring power quality, and in defining power quality levels that are acceptable to various industries or user groups. The problems and solutions vary widely among geographic areas and among large investor-owned utilities, rural cooperatives, and municipal utilities; but there is still a great degree of commonality. Industry organizations such as the National Rural Electric Cooperative Association (NRECA), the Electric Power Research Institute (EPRI), the American Public Power Association (APPA), and the Institute of Electrical and Electronics Engineers (IEEE) have made tremendous strides in preparing self-assessment templates, optimization guides, diagnostic techniques, and better definitions of reliability and power quality measures. In addition, public utility commissions have developed codes and methods for assessing performance that consider local needs. There is considerable overlap among these various organizations, and we see real opportunity and value in sharing these methods, guides, and standards in this report. This report provides a toolkit containing synopses of noteworthy reliability measurement practices. The toolkit has been developed to address the interests of three groups: electric power users, utilities, and regulators.
Statistical methods in translational medicine.
Chow, Shein-Chung; Tse, Siu-Keung; Lin, Min
2008-12-01
This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.
Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine
Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon
2013-01-01
as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine...... diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction...
NONE
2002-01-01
This paper provides the mechanical reliability and optimization theory underlying a reliability-optimization design method for a new roller orientation clutch. The result of the reliability-optimization design is compared with the result of the conventional design method.
Designing for Reliability and Robustness
Svetlik, Randall G.; Moore, Cherice; Williams, Antony
2017-01-01
Long duration spaceflight has a negative effect on the human body, and exercise countermeasures are used on-board the International Space Station (ISS) to minimize bone and muscle loss, combatting these effects. Given the importance of these hardware systems to the health of the crew, this equipment must continue to be readily available. Designing spaceflight exercise hardware to meet high reliability and availability standards has proven to be challenging throughout the time the crewmembers have been living on ISS, beginning in 2000. Furthermore, restoring operational capability after a failure is clearly time-critical, but can be problematic given the challenges of troubleshooting the problem from 220 miles away. Several best practices have been leveraged in seeking to maximize availability of these exercise systems, including designing for robustness, implementing diagnostic instrumentation, relying on user feedback, and providing ample maintenance and sparing. These factors have enhanced the reliability of hardware systems, and therefore have contributed to keeping the crewmembers healthy upon return to Earth. This paper will review the failure history for three spaceflight exercise countermeasure systems, identifying lessons learned that can help improve future systems. Specifically, the Treadmill with Vibration Isolation and Stabilization System (TVIS), Cycle Ergometer with Vibration Isolation and Stabilization System (CEVIS), and the Advanced Resistive Exercise Device (ARED) will be reviewed and analyzed, and conclusions identified, so as to provide guidance for improving future exercise hardware designs. These lessons learned, paired with thorough testing, offer a path towards reduced system down-time.
Circular Statistics in R
Pewsey, Arthur; Ruxton, Graeme D
2013-01-01
Circular Statistics in R provides the most comprehensive guide to the analysis of circular data in over a decade. Circular data arise in many scientific contexts whether it be angular directions such as: observed compass directions of departure of radio-collared migratory birds from a release point; bond angles measured in different molecules; wind directions at different times of year at a wind farm; direction of stress-fractures in concretebridge supports; longitudes of earthquake epicentres or seasonal and daily activity patterns, for example: data on the times of day at which animals are c
Who Needs Statistics? | Poster
You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.
International petroleum statistics report
NONE
1995-10-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
Statistical mechanics of learning
Engel, Andreas
2001-01-01
The effort to build machines that are able to learn and undertake tasks such as data mining, image processing and pattern recognition has led to the development of artificial neural networks in which learning from examples may be described and understood. The contribution to this subject made over the past decade by researchers applying the techniques of statistical mechanics is the subject of this book. The authors provide a coherent account of various important concepts and techniques that are currently only found scattered in papers, supplement this with background material in mathematics and physics, and include many examples and exercises.
Continuous Reliability Enhancement for Wind (CREW) database :
Hines, Valerie Ann-Peters; Ogilvie, Alistair B.; Bond, Cody R.
2013-09-01
To benchmark the current U.S. wind turbine fleet reliability performance and identify the major contributors to component-level failures and other downtime events, the Department of Energy funded the development of the Continuous Reliability Enhancement for Wind (CREW) database by Sandia National Laboratories. This report is the third annual Wind Plant Reliability Benchmark, publicly reporting CREW findings for the wind industry. The CREW database uses both high resolution Supervisory Control and Data Acquisition (SCADA) data from operating plants and Strategic Power Systems ORAPWind (Operational Reliability Analysis Program for Wind) data, which consist of downtime and reserve event records and daily summaries of various time categories for each turbine. Together, these data are used as inputs into CREW's reliability modeling. The results presented here include: the primary CREW Benchmark statistics (operational availability, utilization, capacity factor, mean time between events, and mean downtime); time accounting from an availability perspective; time accounting in terms of the combination of wind speed and generation levels; power curve analysis; and top system and component contributors to unavailability.
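The primary Benchmark statistics named here reduce to simple time accounting over a reporting window. A hedged sketch with entirely fabricated event records (the real CREW inputs are SCADA streams and ORAP downtime/reserve events):

```python
# Illustrative downtime-event records for one turbine over one reporting window.
PERIOD_HOURS = 24 * 30                      # hypothetical 30-day window
downtime_events = [5.0, 12.5, 2.0, 8.0]     # hours down per event (made up)

total_downtime = sum(downtime_events)
uptime = PERIOD_HOURS - total_downtime

# Availability-style time accounting: fraction of the window spent available,
# average length of a downtime event, and average uptime between events.
operational_availability = uptime / PERIOD_HOURS
mean_downtime = total_downtime / len(downtime_events)
mean_time_between_events = uptime / len(downtime_events)

print(f"availability: {operational_availability:.3f}")
print(f"mean downtime (h): {mean_downtime:.2f}")
print(f"MTBE (h): {mean_time_between_events:.1f}")
```

Capacity factor and utilization need generation data on top of this (energy produced versus nameplate, and time generating versus time available), which is why the benchmark pairs event records with SCADA.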
Small Area Estimation of Poverty Statistics
Villa Juan-Albacea, Zita
2009-01-01
In response to high demands for lower level poverty estimates, the National Statistical Coordination Board releases provincial estimates, in addition to the national and regional, starting with the 1997 FIES. However, estimates of the coefficients of variation (CV) of several provincial estimates indicate that the resulting poverty measures are not reliable. Making a decision based on unreliable poverty statistics is very risky especially if the decision to be made relates to the welfare of p...
Validity and Reliability in Social Science Research
Drost, Ellen A.
2011-01-01
In this paper, the author aims to provide novice researchers with an understanding of the general problem of validity in social science research and to acquaint them with approaches to developing strong support for the validity of their research. She provides insight into these two important concepts, namely (1) validity; and (2) reliability, and…
Statistical Yearbook of Norway 2012
NONE
2012-07-01
The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.
Improved model for statistical alignment
Miklos, I.; Toroczkai, Z. (Zoltan)
2001-01-01
The statistical approach to molecular sequence evolution involves the stochastic modeling of the substitution, insertion and deletion processes. Substitution has been modeled in a reliable way for more than three decades by using finite Markov processes. Insertion and deletion, however, seem to be more difficult to model, and the recent approaches cannot acceptably deal with multiple insertions and deletions. A new method based on a generating function approach is introduced to describe the multiple insertion process. The presented algorithm computes the approximate joint probability of two sequences in O(l³) running time, where l is the geometric mean of the sequence lengths.
The Foundations of Statistics
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
Adrenal Gland Tumors: Statistics
Approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon, and exact statistics are not available for this type of tumor.
Blood Facts and Statistics
Facts about blood needs and blood components, including whole blood and red blood cells.
The Statistical Drake Equation
Maccone, Claudio
2010-12-01
We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
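The lognormal claim above is easy to check numerically: draw each of the seven factors from an arbitrary positive distribution, multiply, and compare the empirical mean with the mean implied by a lognormal fit to log N. The factor ranges below are invented for illustration and are not values from the paper.

```python
import math
import random
import statistics

random.seed(42)

# Invented ranges for the seven Drake factors (illustration only, not the
# paper's values); each factor is an independent uniform random variable.
FACTOR_RANGES = [(1.0, 3.0), (0.2, 0.6), (0.1, 0.4), (0.05, 0.2),
                 (0.1, 0.3), (0.05, 0.15), (1e3, 1e4)]

def sample_N():
    """One draw of N as the product of the seven random factors."""
    n = 1.0
    for lo, hi in FACTOR_RANGES:
        n *= random.uniform(lo, hi)
    return n

# log N is a sum of seven independent terms, so by the CLT it is roughly
# normal, making N itself roughly lognormal.
samples = [sample_N() for _ in range(50_000)]
logs = [math.log(x) for x in samples]
mu, sigma = statistics.fmean(logs), statistics.stdev(logs)

# If N were exactly lognormal(mu, sigma), its mean would be:
lognormal_mean = math.exp(mu + sigma ** 2 / 2)
empirical_mean = statistics.fmean(samples)
print(lognormal_mean, empirical_mean)  # close agreement
```

The agreement is only approximate because a sum of seven log-uniform terms is not exactly Gaussian; it tightens as more (or less skewed) factors are multiplied.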
Lifetimes and Reliabilities of Bevel-Gear Drive Trains
Lewicki, D.; Cox, J.; Savage, M.; Brikmanis, C.
1986-01-01
Statistical methods used to predict system lifetimes from component lifetimes. Report shows how to use information to determine system life of drive train, using methods of probability and statistics. Presents life and reliability model for bevel-gear drive trains. Bevel-gear and support-bearing lives analyzed for each gear and bearing in drive train, with results statistically combined to produce system life for entire drive train. Numerical example included.
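The statistical combination of component lives into a system life described above can be sketched as a strict-series Weibull model: the drive train survives only if every gear and bearing survives. The component parameters below are invented placeholders, not values from the report.

```python
import math

def weibull_R(t, eta, beta):
    """Two-parameter Weibull reliability: R(t) = exp(-(t/eta)^beta)."""
    return math.exp(-(t / eta) ** beta)

# Hypothetical gears and bearings (eta = characteristic life in hours,
# beta = Weibull slope); these numbers are placeholders, not the report's.
components = [(8000.0, 1.5), (12000.0, 1.5), (9000.0, 2.0), (15000.0, 2.0)]

def system_R(t):
    """Strict series system: reliabilities of independent components multiply."""
    r = 1.0
    for eta, beta in components:
        r *= weibull_R(t, eta, beta)
    return r

def system_life(target_R, lo=1.0, hi=1e6):
    """Bisect for the time at which system reliability falls to target_R."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if system_R(mid) > target_R:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

l10 = system_life(0.90)  # 90%-reliability system life
print(round(l10, 1), "hours")
```

As expected for a series system, the 90%-reliability system life comes out well below the corresponding life of any single component.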
Power Industry Reliability Coordination in Asia in a Market Environment
Nikolai I. Voropai
2010-03-01
This paper addresses the problems of power supply reliability in a market environment. The specific features of economic interrelations between the power supply organization and consumers in terms of reliability assurance are examined, and the principles of providing power supply reliability are formulated. The economic mechanisms for coordinating the interests of the power supply organization and consumers to provide power supply reliability are discussed. Reliability in the restructuring of China's power industry is introduced and some reliability data are provided. The data show that the reliability level has increased significantly in the past two decades, and more and more measures are being applied to guarantee the reliability of the restructured power systems. The reliability issues and challenges facing the Chinese power industry are considered. The paper then examines the evolution of power grids in India, the establishment of a regulatory framework, and operational philosophy in reliability aspects of long-, mid- and short-term (operational/outage) planning. Grid security, restoration, and mock trials for black start, etc., are considered from the reliability angle. Related issues of islanding operation to improve service reliability for Thailand's electric power system are then analyzed.
Component Reliability Assessment of Offshore Jacket Platforms
V.J. Kurian
2015-01-01
Oil and gas is one of the most important industries contributing to the Malaysian economy. To extract hydrocarbons, various types of production platforms have been developed. The fixed jacket platform is the earliest type of production structure, widely installed in Malaysia's shallow and intermediate waters. To date, more than 60% of these jacket platforms have operated beyond their initial design life, making re-evaluation and reassessment necessary for these platforms to remain in service. In normal engineering practice, the system reliability of a structure is evaluated as its safety parameter. This method is, however, complicated and time consuming. Assessing component reliability can be an alternative approach that provides assurance about a structure's condition at an early stage. Design codes such as Working Stress Design (WSD) and Load and Resistance Factor Design (LRFD) are well established for component-level assessment. In reliability analysis, a failure function, which consists of strength and load, is used to define the failure event: if the acting load exceeds the capacity of a structure, the structure fails. Calculation of the stress utilization ratio as given in the design codes can predict the reliability of a member and estimate the extent to which a member is being utilized. The basic idea of this ratio is that if it is more than one, the member has failed, and vice versa. The stress utilization ratio is the ratio of the applied stress, which is the output reaction of environmental loadings acting on the structural member, to the design strength, which comes from the member's geometric and material properties. Adopting this ratio as the failure event, the reliability of each component is found. This study reviews and discusses the reliability of selected members of three Malaysian offshore jacket platforms. The First Order Reliability Method (FORM) was used to generate the reliability index and
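A minimal sketch of the utilization-ratio failure event described above, with a Monte Carlo estimate of the failure probability and the closed-form first-order reliability index for a linear load-strength margin. The load and strength distributions are invented for illustration, not data from the study.

```python
import random

random.seed(1)

def utilization_ratio(applied_stress, design_strength):
    """Failure event as in the design codes: ratio > 1 means the member fails."""
    return applied_stress / design_strength

# Hypothetical member: normally distributed load and strength (MPa).
LOAD_MU, LOAD_SD = 200.0, 40.0
STR_MU, STR_SD = 350.0, 25.0

failures = 0
n = 100_000
for _ in range(n):
    load = random.gauss(LOAD_MU, LOAD_SD)
    strength = random.gauss(STR_MU, STR_SD)
    if utilization_ratio(load, strength) > 1.0:
        failures += 1
pf = failures / n
print("probability of failure ~", pf)

# For this linear margin the first-order reliability index has a closed form:
beta = (STR_MU - LOAD_MU) / (LOAD_SD ** 2 + STR_SD ** 2) ** 0.5
print("reliability index beta ~", round(beta, 2))  # 3.18
```

FORM generalizes this closed-form index to non-linear failure functions by linearizing at the most probable failure point.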
Validity and reliability of the Turkish version of the Essentials of Magnetism Scale (EOM II).
Yıldırım, D; Kısa, S; Hisar, F
2012-12-01
To test the validity and reliability of the Turkish version of the Essentials of Magnetism II Scale (EOMII), used by staff nurses to assess factors essential to quality patient care. This study consisted of 385 nurses from four Joint Commission International accredited hospitals. The EOMII scale was translated using a back-translation technique. The statistical analysis used Cronbach's alpha to test the internal consistency of the scale, while the factor analysis used principal component analysis with varimax rotation and Kaiser normalization to test its construct validity. The total mean score of all the items of the scale was 155.33 (minimum 77, maximum 219) with a standard deviation of 29.45. All the items showed a statistically significant correlation (P < 0.01), indicating a high level of reliability. Cronbach's alpha consistencies in subgroups were between 0.70 and 0.87. In this study, job satisfaction and quality results show signs of convergence as in the original scale, which indicates that the scale has high construct validity (P < 0.01). Transcultural differences in the quality of nursing services can only be compared with reliable and valid instruments. This study shows that the Turkish version of the EOMII scale is a valid and reliable instrument to assess nurses' working environment and to support quality patient care in Turkey.
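Cronbach's alpha, used above to test internal consistency, can be computed directly from a response matrix. The tiny data set below is made up for illustration; it is not from the study.

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
    items: one list of scores per item, all equal length (one score per respondent)."""
    k, n = len(items), len(items[0])

    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Made-up response matrix: 3 items, 5 respondents (illustrative only).
items = [[3, 4, 4, 5, 2],
         [3, 5, 4, 4, 2],
         [2, 4, 5, 5, 3]]
print(round(cronbach_alpha(items), 3))  # 0.886
```

Values of 0.70 and above, as reported for the EOMII subgroups, are conventionally read as acceptable internal consistency.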
Improving the Validity and Reliability of a Health Promotion Survey for Physical Therapists
Stephens, Jaca L.; Lowman, John D.; Graham, Cecilia L.; Morris, David M.; Kohler, Connie L.; Waugh, Jonathan B.
2013-01-01
Purpose Physical therapists (PTs) have a unique opportunity to intervene in the area of health promotion. However, no instrument has been validated to measure PTs’ views on health promotion in physical therapy practice. The purpose of this study was to evaluate the content validity and test-retest reliability of a health promotion survey designed for PTs. Methods An expert panel of PTs assessed the content validity of “The Role of Health Promotion in Physical Therapy Survey” and provided suggestions for revision. Item content validity was assessed using the content validity ratio (CVR) as well as the modified kappa statistic. Therapists then participated in the test-retest reliability assessment of the revised health promotion survey, which was assessed using a weighted kappa statistic. Results Based on feedback from the expert panelists, significant revisions were made to the original survey. The expert panel reached at least a majority consensus agreement for all items in the revised survey and the survey-CVR improved from 0.44 to 0.66. Only one item on the revised survey had substantial test-retest agreement, with 55% of the items having moderate agreement and 43% poor agreement. Conclusions All items on the revised health promotion survey demonstrated at least fair validity, but few items had reasonable test-retest reliability. Further modifications should be made to strengthen the validity and improve the reliability of this survey. PMID:23754935
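Lawshe's content validity ratio (CVR), the item statistic used in the study above, has a simple closed form. The panel sizes below are illustrative, not the study's.

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR: (n_e - N/2) / (N/2); ranges from -1 to +1,
    with 0 meaning exactly half the panel rated the item 'essential'."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Illustrative panels: 8 of 10 experts rate an item essential vs. 5 of 10.
print(content_validity_ratio(8, 10))  # 0.6
print(content_validity_ratio(5, 10))  # 0.0
# A survey-level CVR can then be taken as the mean of the item CVRs,
# as in the 0.44 -> 0.66 improvement reported above.
```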
Keywords: statistical analysis; probability; information theory; differential equations; statistical processes; stochastic processes; multivariate analysis; distribution theory; decision theory; measure theory; optimization.
Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.
1998-07-01
The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all 62 imaging plates in the inventory for each 24-hour period in the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times, with batches of plates processed to simulate the temporal constraints imposed by portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so that automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for a proper gray scale display.
Ultimately Reliable Pyrotechnic Systems
Scott, John H.; Hinkel, Todd
2015-01-01
This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of Human Spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing
Universal Grammar, statistics or both?
Yang, Charles D
2004-10-01
Recent demonstrations of statistical learning in infants have reinvigorated the innateness versus learning debate in language acquisition. This article addresses these issues from both computational and developmental perspectives. First, I argue that statistical learning using transitional probabilities cannot reliably segment words when scaled to a realistic setting (e.g. child-directed English). To be successful, it must be constrained by knowledge of phonological structure. Then, turning to the bona fide theory of innateness--the Principles and Parameters framework--I argue that a full explanation of children's grammar development must abandon the domain-specific learning model of triggering, in favor of probabilistic learning mechanisms that might be domain-general but nevertheless operate in the domain-specific space of syntactic parameters.
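The transitional-probability segmentation strategy the article critiques can be sketched in a few lines: within-word syllable transitions are (near-)deterministic, while transitions across word boundaries are not, so dips in transitional probability mark candidate boundaries. The three "words" below are invented toy data in the spirit of classic statistical-learning experiments, not material from the article.

```python
import random
from collections import Counter

random.seed(0)

# Toy "speech stream": 200 tokens drawn from three made-up words,
# each consisting of three two-letter syllables.
words = ["pabiku", "tibudo", "golatu"]
stream = []
for _ in range(200):
    w = random.choice(words)
    stream.extend(w[i:i + 2] for i in range(0, 6, 2))

# Transitional probability TP(x -> y) = count(x followed by y) / count(x).
pair_counts = Counter(zip(stream, stream[1:]))
syl_counts = Counter(stream[:-1])

def tp(x, y):
    return pair_counts[(x, y)] / syl_counts[x]

# Within-word transitions are deterministic (TP = 1); transitions across a
# word boundary hover near 1/3, so TP dips mark candidate word boundaries.
print(tp("pa", "bi"))   # 1.0
print(tp("ku", "ti"))   # roughly 1/3
```

The article's point is that this clean separation collapses when the toy lexicon is replaced by realistic child-directed speech, where phonological constraints are needed to rescue segmentation.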
Ferrite logic reliability study
Baer, J. A.; Clark, C. B.
1973-01-01
Development and use of digital circuits called all-magnetic logic are reported. In these circuits the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual or bimaterial ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high speed operation, makes it attractive for high reliability applications. (Maximum speed of operation approximately 50 kHz.)
Blade reliability collaborative :
Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.
2013-04-01
The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.
More Fact Sheets - SEER Cancer Statistics
Cancer Statistical Fact Sheets are summaries of common cancer types developed to provide an overview of frequently-requested cancer statistics including incidence, mortality, survival, stage, prevalence, and lifetime risk.
Using statistics to understand the environment
Cook, Penny A
2005-01-01
Featuring worked examples covering a wide range of environmental topics, drawings and icons, chapter summaries, a glossary of statistical terms and a further reading section, this book provides an invaluable student friendly introduction to using statistics to understand the environment.
Statistical aspects of determinantal point processes
Lavancier, Frédéric; Møller, Jesper; Rubak, Ege
The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical inference…
Automatic Statistics Extraction for Amateur Soccer Videos
van Gemert, J.C.; Schavemaker, J.G.M.; Bonenkamp, K.; Spink, A.J.; Loijens, L.W.S.; Woloszynowska-Fraser, M.; Noldus, L.P.J.J.
2014-01-01
Amateur soccer statistics have interesting applications such as providing insights to improve team performance, individual coaching, monitoring team progress and personal or team entertainment. Professional soccer statistics are extracted with labor intensive expensive manual effort which is not rea
Design for Reliability of Power Electronic Systems
Wang, Huai; Ma, Ke; Blaabjerg, Frede
2012-01-01
Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high availability, long lifetime, sufficient robustness, low maintenance cost and low cost of energy. However, reliability predictions are still dominantly based on outdated models and terms, such as the MIL-HDBK-217F handbook models, Mean-Time-To-Failure (MTTF), and Mean-Time-Between-Failures (MTBF). A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis is presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study…
Statistical Methods for Integrating Multiple Types of High-Throughput Data
Xie, Yang; Ahn, Chul
2010-01-01
Large-scale sequencing, copy number, mRNA, and protein data have given great promise to the biomedical research, while posing great challenges to data management and data analysis. Integrating different types of high-throughput data from diverse sources can increase the statistical power of data analysis and provide deeper biological understanding. This chapter uses two biomedical research examples to illustrate why there is an urgent need to develop reliable and robust methods for integratin...
The Geometry of Statistical Efficiency and Matrix Statistics
K. Gustafson
2007-01-01
We will place certain parts of the theory of statistical efficiency into the author's operator trigonometry (1967), thereby providing new geometrical understanding of statistical efficiency. Important earlier results of Bloomfield and Watson, Durbin and Kendall, and Rao and Rao will be so interpreted. For example, worst-case relative least squares efficiency corresponds to, and is achieved by, the maximal turning antieigenvectors of the covariance matrix. Some little-known historical perspectives will also be discussed. The overall view will be emphasized.
Statistical inference for financial engineering
Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki
2014-01-01
This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.
Straightforward statistics understanding the tools of research
Geher, Glenn
2014-01-01
Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied research.
Parallel auto-correlative statistics with VTK.
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10] which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by the means of C++ code snippets and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the autocorrelative statistics engine.
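A serial (non-parallel) version of the auto-correlative statistic this engine computes can be sketched in a few lines; the sine-wave input below is synthetic test data, not an example from the report, and the report's engine additionally distributes the computation.

```python
import math

def autocorrelation(xs, lag):
    """Sample autocorrelation of a series at a given lag."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag))
    return cov / var

# A sine wave is strongly correlated with itself one period later and
# anti-correlated half a period later (edge effects keep |r| just under 1).
period = 50
xs = [math.sin(2 * math.pi * i / period) for i in range(1000)]
print(round(autocorrelation(xs, period), 3))       # ~ 0.95
print(round(autocorrelation(xs, period // 2), 3))  # ~ -0.975
```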
Swarm of bees and particles algorithms in the problem of gradual failure reliability assurance
M. F. Anop
2015-01-01
The probability-statistical framework of reliability theory uses models based on the analysis of chance failures. These models are not functional and do not reflect the relation of reliability characteristics to the object's performance. At the same time, a significant part of technical system failures are gradual failures caused by degradation of the internal parameters of the system under the influence of various external factors. The paper shows how to provide the required level of reliability at the design stage using a functional model of a technical object, and describes a method for solving this problem under incomplete initial information, when there is no information about the patterns of technological deviations and parameter degradation, and the considered system model is a "black box". To this end, we formulate the problem of optimal parametric synthesis: choosing the nominal values of the system parameters so as to satisfy the requirements for its operation while taking into account the unavoidable deviations of the parameters from their design values during operation. As the optimization criterion we propose to use, instead of statistical values, a deterministic geometric criterion, the "reliability reserve": the minimum distance, measured along the coordinate directions, from the nominal parameter value to the boundary of the acceptability region. The paper presents the results of applying heuristic swarm intelligence methods to the formulated optimization problem. The efficiency of the particle swarm and bee swarm algorithms is compared with an undirected random search algorithm on a number of test optimal parametric synthesis problems in three respects: reliability, convergence rate and operating time. The study suggests that the use of the bee swarm method for ensuring the gradual-failure reliability of technical systems is preferred because of the greater flexibility of the
Load Control System Reliability
Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)
2015-04-03
This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies; and, power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research includes: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and, the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”
Supply chain reliability modelling
Eugen Zaitsev
2012-03-01
Background: Today it is virtually impossible to operate alone on the international level in the logistics business. This promotes the establishment and development of new integrated business entities, logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum plan of supplies by using an economic criterion and a model for evaluating the probability of non-failure operation of the supply chain. Methods: The mathematical model and algorithms to find the optimum plan of supplies were developed and formulated by using an economic criterion and a model for evaluating the probability of non-failure operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations which should be taken into account during the process of supply planning with the supplier's functional reliability was presented.
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
Discrete Reliability Projection
2014-12-01
…trials, and that failures from individual failure modes are statistically independent and binomially distributed. The probability of success … such that p·n is held constant at λ; the binomial distribution then approximates the Poisson distribution with parameter λ. Unsurfaced failure modes: … the line (dark blue) is the result of numerical integration of beta and binomial distributions; it shows a theoretical prediction of the outcome of the 27 …
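The binomial-to-Poisson limit invoked in the snippet above (p·n held constant at λ as n grows) is easy to verify numerically:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """Binomial probability of exactly k successes in n trials."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """Poisson probability of exactly k events at rate lam."""
    return exp(-lam) * lam ** k / factorial(k)

# Hold lambda = n * p fixed while n grows: the binomial pmf converges
# pointwise to the Poisson pmf.
lam = 2.0
for n in (10, 100, 1000):
    p = lam / n
    max_gap = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam))
                  for k in range(10))
    print(n, round(max_gap, 5))  # the gap shrinks roughly like 1/n
```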
Official statistics and Big Data
Peter Struijs
2014-07-01
The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.
OSS reliability measurement and assessment
Yamada, Shigeru
2016-01-01
This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
Reliability and validity in research.
Roberts, Paula; Priest, Helena
This article examines reliability and validity as ways to demonstrate the rigour and trustworthiness of quantitative and qualitative research. The authors discuss the basic principles of reliability and validity for readers who are new to research.
Jung, Hannes; /DESY; De Roeck, Albert; /CERN; Bartels, Jochen; /Hamburg U., Inst. Theor. Phys. II; Behnke, Olaf; Blumlein, Johannes; /DESY; Brodsky, Stanley; /SLAC /Durham U., IPPP; Cooper-Sarkar, Amanda; /Oxford U.; Deak, Michal; /DESY; Devenish, Robin; /Oxford U.; Diehl, Markus; /DESY; Gehrmann, Thomas; /Zurich U.; Grindhammer, Guenter; /Munich, Max Planck Inst.; Gustafson, Gosta; /CERN /Lund U., Dept. Theor. Phys.; Khoze, Valery; /Durham U., IPPP; Knutsson, Albert; /DESY; Klein, Max; /Liverpool U.; Krauss, Frank; /Durham U., IPPP; Kutak, Krzysztof; /DESY; Laenen, Eric; /NIKHEF, Amsterdam; Lonnblad, Leif; /Lund U., Dept. Theor. Phys.; Motyka, Leszek; /Hamburg U., Inst. Theor. Phys. II /Birmingham U. /Southern Methodist U. /DESY /Piemonte Orientale U., Novara /CERN /Paris, LPTHE /Hamburg U. /Penn State U.
2011-11-10
More than 100 people participated in a discussion session at the DIS08 workshop on the topic What HERA may provide. A summary of the discussion with a structured outlook and list of desirable measurements and theory calculations is given. The HERA accelerator and the HERA experiments H1, HERMES and ZEUS stopped running at the end of June 2007, after 15 years of very successful operation since the first collisions in 1992. A total luminosity of ~500 pb⁻¹ has been accumulated by each of the collider experiments H1 and ZEUS. Over the years, the increasingly better understood and upgraded detectors and HERA accelerator have contributed significantly to this success. The physics program remains in full swing, and plenty of new results were presented at DIS08 which approach the anticipated final precision, fulfilling and exceeding the physics plans and the previsions of the upgrade program. Most of the analyses presented at DIS08 were still based on the so-called HERA I data sample, i.e. data taken until 2000, before the shutdown for the luminosity upgrade. This sample has an integrated luminosity of ~100 pb⁻¹, and the four times larger statistics sample from HERA II is still being analyzed.
Fractional statistics and quantum theory
Khare, Avinash
1997-01-01
This book explains the subtleties of quantum statistical mechanics in lower dimensions and their possible ramifications in quantum theory. The discussion is at a pedagogical level and is addressed to both graduate students and advanced research workers with a reasonable background in quantum and statistical mechanics. The main emphasis is on explaining new concepts. Topics in the first part of the book include the flux tube model of anyons, the braid group, and the quantum and statistical mechanics of a noninteracting anyon gas. The second part of the book provides a detailed discussion about f
Statistical theory of signal detection
Helstrom, Carl Wilhelm; Costrell, L; Kandiah, K
1968-01-01
Statistical Theory of Signal Detection, Second Edition provides an elementary introduction to the theory of statistical testing of hypotheses that is related to the detection of signals in radar and communications technology. This book presents a comprehensive survey of digital communication systems. Organized into 11 chapters, this edition begins with an overview of the theory of signal detection and the typical detection problem. This text then examines the goals of the detection system, which are defined through an analogy with the testing of statistical hypotheses. Other chapters consider
Statistical Analysis with Missing Data
Little, Roderick J A
2002-01-01
Praise for the First Edition of Statistical Analysis with Missing Data: "An important contribution to the applied statistics literature.... I give the book high marks for unifying and making accessible much of the past and current work in this important area." -William E. Strawderman, Rutgers University. "This book...provide[s] interesting real-life examples, stimulating end-of-chapter exercises, and up-to-date references. It should be on every applied statistician's bookshelf." -The Statistician. "The book should be studied in the statistical methods d
Statistical Aspects of Information Integration
2009-12-17
measurements at multiple measurement sites provide potentially valuable predictive information on functional disabilities in epilepsy patients...among the top ten sellers for Springer, the largest and one of the most prominent publishers in statistics. Publications: Books: Kolaczyk, E.D. (2009...Statistical Analysis of Network Data: Methods and Models. New York, Springer. Refereed Journal Articles: Nanai, N., Kolaczyk, E.D., and Kasif, S
Personalized visualization of blog statistics
Durmén Blunt, Tina
2013-01-01
This report documents the research, implementation and result for a master thesis in Media Technology and Engineering at Linkoping University. The aim of the project was to develop a personalized visualization application of blog statistics to be implemented on a web based community for blog authors. The purpose of the application is to provide the users with a tool to explore statistics connected to their own blog. Based on a literature study in usability and information visualization the ap...
Statistics Using R
Purohit, Sudha G; Deshmukh, Shailaja R
2015-01-01
STATISTICS USING R will be useful at different levels, from an undergraduate course in statistics, through graduate courses in biological sciences, engineering, management and so on. The book introduces statistical terminology and defines it for the benefit of a novice. For a practicing statistician, it will serve as a guide to R language for statistical analysis. For a researcher, it is a dual guide, simultaneously explaining appropriate statistical methods for the problems at hand and indicating how these methods can be implemented using the R language. For a software developer, it is a guide in a variety of statistical methods for development of a suite of statistical procedures.
Fault recovery in the reliable multicast protocol
Callahan, John R.; Montgomery, Todd L.; Whetten, Brian
1995-01-01
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages, using an underlying IP Multicast (12, 5) medium, to other group members in a distributed environment even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
Reliability analysis and prediction of mixed mode load using Markov Chain Model
Nikabdullah, N.; Singh, S. S. K.; Alebrahim, R.; Azizi, M. A.; K, Elwaleed A.; Noorani, M. S. M.
2014-06-01
The aim of this paper is to present the reliability analysis and prediction of mixed mode loading by using a simple two-state Markov Chain Model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analyzing and measuring failure in order to increase the design life and to eliminate or reduce the likelihood of failures and safety risk. The mechanical failures of the crankshaft are due to high bending and torsion stress concentrations from high-cycle rotating bending and torsional stress. The Markov Chain was used to model the two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is much more severe than torsional stress, so the probability criterion for the bending state is higher than that for the torsion state. A statistical comparison between the developed Markov Chain Model and field data was done to observe the percentage of error. The reliability analysis and prediction derived from the Markov Chain Model were illustrated through the Weibull probability and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov Chain Model is able to generate data closely matching the field data with a minimal percentage of error and, for practical application, the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed mode loading.
Development and interrater reliability of the UK Mental Health Triage Scale.
Sands, Natisha; Elsom, Stephen; Colgate, Robert; Haylor, Helen; Prematunga, Roshani
2016-08-01
Mental health triage scales are clinical tools used at the point of entry to specialist mental health services to provide a systematic way of categorizing the urgency of clinical presentations, and determining an appropriate service response and an optimal timeframe for intervention. The aim of the present study was to test the interrater reliability of a mental health triage scale developed for use in UK mental health triage and crisis services. An interrater reliability study was undertaken. Triage clinicians from England and Wales (n = 66) used the UK Mental Health Triage Scale (UK MHTS) to rate the urgency of 21 validated mental health triage scenarios derived from real occasions of triage. Interrater reliability was calculated using Kendall's coefficient of concordance (w) and intraclass correlation coefficient (ICC) statistics. The average ICC was 0.997 (95% confidence interval (CI): 0.996-0.999; F(20, 1300) = 394.762, P < 0.001). The single-measure ICC was 0.856 (95% CI: 0.776-0.926; F(20, 1300) = 394.762, P < 0.001). The overall Kendall's w was 0.88 (P < 0.001). The UK MHTS shows substantial levels of interrater reliability. Reliable mental health triage scales employed within effective mental health triage systems offer possibilities not only for improved patient outcomes and experiences, but also for efficient use of finite specialist mental health services.
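Kendall's coefficient of concordance reported in the study can be computed directly from rank data: W = 12S / (m²(n³ - n)) for m raters ranking n items, where S is the sum of squared deviations of the rank sums from their mean. A small Python sketch with hypothetical ratings (not the study's data):

```python
def kendalls_w(ratings):
    """Kendall's coefficient of concordance for m raters ranking n items.

    `ratings` is a list of m rank lists (ranks 1..n, no ties), one per rater.
    """
    m, n = len(ratings), len(ratings[0])
    rank_sums = [sum(r[i] for r in ratings) for i in range(n)]
    mean = m * (n + 1) / 2
    s = sum((rs - mean) ** 2 for rs in rank_sums)
    return 12 * s / (m**2 * (n**3 - n))

# Three hypothetical clinicians ranking the urgency of four scenarios.
agree = kendalls_w([[1, 2, 3, 4]] * 3)                       # perfect agreement
partial = kendalls_w([[1, 2, 3, 4], [2, 1, 3, 4], [1, 3, 2, 4]])
print(f"W (perfect) = {agree:.3f}, W (partial) = {partial:.3f}")
```

W = 1 only when every rater produces the identical ranking; values near the study's 0.88 indicate strong but imperfect concordance.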
Basic statistics an introduction with R
Raykov, Tenko
2012-01-01
Basic Statistics provides an accessible and comprehensive introduction to statistics using the free, state-of-the-art, powerful software program R. This book is designed to both introduce students to key concepts in statistics and to provide simple instructions for using R.Teaches essential concepts in statistics, assuming little background knowledge on the part of the readerIntroduces students to R with as few sub-commands as possible for ease of useProvides practical examples from the educational, behavioral, and social sciencesBasic Statistics will appeal to students and professionals acros
2017 NREL Photovoltaic Reliability Workshop
Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2017-08-15
NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.
Testing for PV Reliability (Presentation)
Kurtz, S.; Bansal, S.
2014-09-01
The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.
LMI approach to reliable H∞ control of linear systems
Yao Bo; Wang Fuzhong
2006-01-01
This paper is concerned with the reliable design problem for linear systems. A more practical model of actuator faults than outage is considered. An LMI approach to designing a reliable controller is presented for actuator faults that can be modeled by a scaling factor. The resulting control systems are reliable in that they provide guaranteed asymptotic stability and H∞ performance when some control component (actuator) faults occur. A numerical example is given to illustrate the design procedure and its effectiveness. Furthermore, the optimal standard controller and the optimal reliable controller are compared to show the necessity of reliable control.
Rapid Serial Auditory Presentation: A New Measure of Statistical Learning in Speech Segmentation.
Franco, Ana; Eberlen, Julia; Destrebecqz, Arnaud; Cleeremans, Axel; Bertels, Julie
2015-01-01
The Rapid Serial Visual Presentation procedure is a method widely used in visual perception research. In this paper we propose an adaptation of this method which can be used with auditory material and enables assessment of statistical learning in speech segmentation. Adult participants were exposed to an artificial speech stream composed of statistically defined trisyllabic nonsense words. They were subsequently instructed to perform a detection task in a Rapid Serial Auditory Presentation (RSAP) stream in which they had to detect a syllable in a short speech stream. Results showed that reaction times varied as a function of the statistical predictability of the syllable: second and third syllables of each word were responded to faster than first syllables. This result suggests that the RSAP procedure provides a reliable and sensitive indirect measure of auditory statistical learning.
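The statistical predictability driving the RSAP effect is usually quantified by transitional probabilities between syllables: TP(x → y) = count(xy) / count(x as a predecessor). A Python sketch with made-up nonsense words in the spirit of the paradigm (not the study's stimuli): pairs inside a word have TP = 1, pairs spanning a word boundary have TP near 1/3.

```python
import random
from collections import Counter

# Hypothetical trisyllabic nonsense words; assumptions for illustration.
WORDS = [("tu", "pi", "ro"), ("go", "la", "bu"), ("bi", "da", "ku")]

def transitional_probabilities(n_words: int, seed: int = 0):
    """TP(x -> y) = count(xy) / count(x-as-predecessor) over a random stream."""
    rng = random.Random(seed)
    stream = [syl for _ in range(n_words) for syl in rng.choice(WORDS)]
    pred = Counter(stream[:-1])               # x observed with a follower
    pairs = Counter(zip(stream, stream[1:]))  # adjacent syllable pairs
    return {xy: c / pred[xy[0]] for xy, c in pairs.items()}

tp = transitional_probabilities(2_000)
print("within-word  TP(tu->pi):", tp[("tu", "pi")])            # always 1.0
print("across-word  TP(ro->go):", round(tp[("ro", "go")], 2))  # near 1/3
```

The faster responses to second and third syllables track exactly this asymmetry: their predecessors predict them with probability 1, whereas word-initial syllables are unpredictable.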
Network-based statistical comparison of citation topology of bibliographic databases
Šubelj, Lovro; Bajec, Marko
2015-01-01
Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, w...
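The local and global graph statistics such comparisons rest on are straightforward to compute. A minimal Python sketch on a toy undirected graph, computing density, mean degree, and mean local clustering (the toy graph is an assumption, not data from the study):

```python
from itertools import combinations

def graph_stats(n, edges):
    """A few of the local/global statistics used to compare networks:
    density, mean degree, and mean local clustering coefficient."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    degs = [len(a) for a in adj]
    density = 2 * len(edges) / (n * (n - 1))
    clustering = []
    for v in range(n):
        k = degs[v]
        if k < 2:
            clustering.append(0.0)
            continue
        # count edges among v's neighbours
        links = sum(1 for a, b in combinations(sorted(adj[v]), 2) if b in adj[a])
        clustering.append(2 * links / (k * (k - 1)))
    return density, sum(degs) / n, sum(clustering) / n

# A toy 'citation network': a triangle plus a pendant node.
print(graph_stats(4, [(0, 1), (1, 2), (0, 2), (2, 3)]))
```

Comparing databases then amounts to computing such statistics per network and ranking the databases by their consistency across statistics, as the critical difference diagram in the abstract does.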
The Attenuation of Correlation Coefficients: A Statistical Literacy Issue
Trafimow, David
2016-01-01
Much of the science reported in the media depends on correlation coefficients. But the size of correlation coefficients depends, in part, on the reliability with which the correlated variables are measured. Understanding this is a statistical literacy issue.
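The attenuation the article refers to follows Spearman's classical formula: the observed correlation equals the true correlation scaled by the square root of the product of the two measures' reliabilities. A small numeric sketch with illustrative numbers:

```python
from math import sqrt

def attenuated_r(r_true: float, rel_x: float, rel_y: float) -> float:
    """Observed correlation after measurement error (Spearman's formula)."""
    return r_true * sqrt(rel_x * rel_y)

def disattenuated_r(r_obs: float, rel_x: float, rel_y: float) -> float:
    """Estimate of the true correlation, corrected for unreliability."""
    return r_obs / sqrt(rel_x * rel_y)

# A true correlation of 0.60 measured with reliabilities 0.70 and 0.80
# shows up as roughly 0.45 in the data.
r_obs = attenuated_r(0.60, 0.70, 0.80)
print(f"observed r = {r_obs:.3f}, "
      f"corrected r = {disattenuated_r(r_obs, 0.70, 0.80):.3f}")
```

This is the literacy point: a modest reported correlation may reflect noisy measurement rather than a weak underlying relationship.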
Statistical Survey of Non-Formal Education
Ondřej Nývlt
2012-12-01
focused on a programme within a regular education system. Labour market flexibility and new requirements on employees create a new domain of education called non-formal education. Is there a reliable statistical source with a good methodological definition for the Czech Republic? The Labour Force Survey (LFS) has been the basic statistical source for time comparison of non-formal education for the last ten years. Furthermore, a special Adult Education Survey (AES) in 2011 focused on the individual components of non-formal education in a detailed way. In general, the goal of the EU is to use data from both internationally comparable surveys for analyses of particular fields of lifelong learning, so that annual LFS data can be enlarged with detailed information from the AES in five-year periods. This article describes the reliability of statistical data about non-formal education. This analysis is usually connected with sampling and non-sampling errors.
Energy Statistics Manual; Manual Statistik Energi
NONE
2005-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect that basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
About Statistical Analysis of Qualitative Survey Data
Stefan Loehnert
2010-01-01
Gathered data is frequently not in a numerical form that allows immediate application of quantitative mathematical-statistical methods. This paper examines some basic aspects of how quantitative-based statistical methodology can be utilized in the analysis of qualitative data sets. The transformation of qualitative data into numeric values is considered the entry point to quantitative analysis. Related publications and the impacts of scale transformations are discussed. Subsequently, it is shown how correlation coefficients can be used in conjunction with data aggregation constraints to construct relationship modelling matrices. For illustration, a case study is referenced in which ordered, ordinal-type qualitative survey answers are allocated to process-defining procedures as aggregation levels. Finally, options for measuring the adherence of the gathered empirical data to such derived aggregation models are introduced, and a statistically based reliability check to evaluate the reliability of the chosen model specification is outlined.
Energy Statistics Manual; Enerji Istatistikleri El Kitabi
NONE
2004-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect that basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Energy Statistics Manual; Manual de Estadisticas Energeticas
NONE
2007-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect that basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Official Statistics In a Modern Society
Ilie DUMITRESCU
2010-10-01
The modern democratic society cannot function efficiently and rigorously in the absence of a solid basis of relevant and reliable statistical data allowing easy and user-friendly access. Representing a “public good” in contemporary society, official statistical information is meant to serve the whole society under conditions of maximum transparency, impartiality and equal treatment of all categories of data users. Official statistics should adapt to the changes taking place in modern society and comply with its increased demands for high-quality information. In turn, this imposes on both national and global statistical systems major tasks of structural change in the production and dissemination of official statistics, as well as in communication with partners upstream in the informational flow, and particularly downstream - the target recipients of statistical data. The article presents a vision of the role, functions and tasks of official statistics in modern society, in the face of the major challenges of transforming statistical information into knowledge, promoting statistical literacy and culture, and ensuring the usefulness and large-scale use of statistical data.
Graham, John W.; And Others
1984-01-01
Describes an evaluation of a self-report questionnaire administered to seventh graders (N=396). Using the test-retest reliability matrix, eight of nine drug-use indices appeared to have acceptable to good reliability. The three measures included in the test-retest reliability matrix provide stronger evidence for good reliability than could any…
The challenge of reliability in MEMS commercialization
Miller, W.M.; Tanner, D.M.; Miller, S.L.
1998-09-01
MicroElectroMechanical Systems (MEMS) that think, sense, act and communicate will open up a broad new array of cost-effective solutions only if MEMS is demonstrated to be sufficiently reliable. This could prove to be a major challenge if it is not addressed concurrently with technology development. There are three requirements for a valid assessment of reliability: statistical significance, identification of fundamental failure mechanisms and development of techniques for accelerating them, and valid physical models to allow prediction of failures during actual use. While these already exist for the microelectronics portion of such integrated systems, the real challenge lies in the less well-understood micromachine portions and its synergistic effects with microelectronics. This requires the elicitation of a methodology focused on MEMS reliability, which the authors discuss. A new testing and analysis infrastructure must also be developed to meet the needs of this methodology. They describe their implementation of this infrastructure and its success in addressing the three requirements for a valid reliability assessment.
Statistical methods for forecasting
Abraham, Bovas
2009-01-01
The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "This book, it must be said, lives up to the words on its advertising cover: 'Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.' It does just that!" -Journal of the Royal Statistical Society. "A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series ...
International petroleum statistics report
NONE
1997-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
Elements of Statistical Mechanics
Sachs, Ivo; Sen, Siddhartha; Sexton, James
2006-05-01
This textbook provides a concise introduction to the key concepts and tools of modern statistical mechanics. It also covers advanced topics such as non-relativistic quantum field theory and numerical methods. After introducing classical analytical techniques, such as cluster expansion and Landau theory, the authors present important numerical methods with applications to magnetic systems, Lennard-Jones fluids and biophysics. Quantum statistical mechanics is discussed in detail and applied to Bose-Einstein condensation and topics in astrophysics and cosmology. In order to describe emergent phenomena in interacting quantum systems, canonical non-relativistic quantum field theory is introduced and then reformulated in terms of Feynman integrals. Combining the authors' many years' experience of teaching courses in this area, this textbook is ideal for advanced undergraduate and graduate students in physics, chemistry and mathematics. Analytical and numerical techniques in one text, including sample codes and solved problems on the web at www.cambridge.org/0521841984 Covers a wide range of applications including magnetic systems, turbulence astrophysics, and biology Contains a concise introduction to Markov processes and molecular dynamics
贾怀勤
2003-01-01
While introducing the core contents of the Manual on Statistics of International Trade in Services, the paper comments on the important methodology used by the authors of the Manual, which sets up a dual framework covering both trade in services between residents and non-residents and services provided through foreign affiliates.
Hybrid reliability model for fatigue reliability analysis of steel bridges
曹珊珊; 雷俊卿
2016-01-01
A kind of hybrid reliability model is presented to solve fatigue reliability problems of steel bridges. The cumulative damage model is one of the models used in fatigue reliability analysis. The parameter characteristics of the model can be described as probabilistic and interval. A two-stage hybrid reliability model is given with a theoretical foundation and a solving algorithm for the hybrid reliability problems. The theoretical foundation is established by the consistency relationships of the interval reliability model and the probability reliability model with normally distributed variables. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. With consideration of the parameter characteristics of the S-N curve, the cumulative damage model with hybrid variables is given based on standards from different countries. Lastly, a steel structure of the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on the AASHTO standard.
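The cumulative damage model mentioned here is commonly instantiated as Miner's linear rule over an S-N curve, N(S) = C / S^m: failure is predicted when the summed damage fractions reach 1. A Python sketch with invented constants and load spectrum (not values from the paper, AASHTO, or any design code):

```python
# Miner's linear cumulative damage rule with a basic S-N curve N(S) = C / S^m.
C, M = 2.0e12, 3.0   # S-N curve constants (assumed, illustrative only)

def cycles_to_failure(stress_range: float) -> float:
    """Cycles to failure at a constant stress range (MPa)."""
    return C / stress_range**M

def miner_damage(spectrum):
    """spectrum: iterable of (stress_range_MPa, applied_cycles) pairs."""
    return sum(n / cycles_to_failure(s) for s, n in spectrum)

spectrum = [(80.0, 2.0e6), (120.0, 5.0e5), (160.0, 1.0e5)]
d = miner_damage(spectrum)
print(f"accumulated damage D = {d:.3f}  ({'failed' if d >= 1 else 'survives'})")
```

The hybrid model in the paper replaces the deterministic C, m, and cycle counts above with probabilistic and interval-valued parameters.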
Modeling and Forecasting (Un)Reliable Realized Covariances for More Reliable Financial Decisions
Bollerslev, Tim; Patton, Andrew J.; Quaedvlieg, Rogier
We propose a new framework for modeling and forecasting common financial risks based on (un)reliable realized covariance measures constructed from high-frequency intraday data. Our new approach explicitly incorporates the effect of measurement errors and time-varying attenuation biases … turnover and statistically superior positions compared to existing procedures. Translating these statistical improvements into economic gains, we find that under empirically realistic assumptions a risk-averse investor would be willing to pay up to 170 basis points per year to shift to using the new class …
Synthesis of Reliable Telecommunication Networks
Dusan Trstensky
2005-01-01
In many applications, the network designer may need to synthesise a reliable telecommunication network. Assume that a network, denoted Gn,e, has n nodes and e edges, and that the operational probability of each edge is known. The system reliability of the network is defined as the probability that every pair of nodes can communicate with each other. The network synthesis problem considered in this paper is to find a network G*n,e that maximises system reliability over the class of all networks, for the classes of networks Gn,n-1, Gn,m and Gn,n+1 respectively. In addition, an upper bound on the maximum reliability for networks with n nodes and e edges (e > n+2) is derived in terms of node. Computational experiments for the reliability upper bound are also presented. The results show that the proposed upper bound is effective.
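The system reliability defined above (the probability that every pair of nodes can communicate) is expensive to compute exactly for general graphs, but it can be estimated by simulation. A minimal sketch, assuming independent edge failures with a common operational probability (not the paper's synthesis algorithm):

```python
# Monte Carlo estimate of all-terminal reliability: the probability
# that the surviving edges keep every pair of nodes connected.
import random

def connected(n, up_edges):
    """Union-find connectivity check over the surviving edges."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in up_edges:
        parent[find(u)] = find(v)
    return len({find(i) for i in range(n)}) == 1

def all_terminal_reliability(n, edges, p, trials=20000, seed=1):
    """edges: list of (u, v) pairs; p: operational probability of each edge."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up = [e for e in edges if rng.random() < p]
        if connected(n, up):
            hits += 1
    return hits / trials

# Example: a 4-node ring, each edge up with probability 0.9.
# Exact value is p^4 + 4 p^3 (1-p) ≈ 0.9477 (connected iff at most one edge fails).
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
R = all_terminal_reliability(4, ring, 0.9)
```

For a fixed node and edge count, a synthesis procedure of the kind the paper describes would search over graph topologies for the one maximising this quantity.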
Mathematical reliability an expository perspective
Mazzuchi, Thomas; Singpurwalla, Nozer
2004-01-01
In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...
Reliability and Failure in NASA Missions: Blunders, Normal Accidents, High Reliability, Bad Luck
Jones, Harry W.
2015-01-01
NASA emphasizes crew safety and system reliability, but several unfortunate failures have occurred. The Apollo 1 fire had, mistakenly, not been anticipated. After that tragedy, the Apollo program gave much more attention to safety. The Challenger accident revealed that NASA had neglected safety and that management had underestimated the high risk of the shuttle. Probabilistic Risk Assessment was adopted to provide more accurate failure probabilities for the shuttle and other missions. NASA's "faster, better, cheaper" initiative and government procurement reform led to the deliberate dismantling of traditional reliability engineering. The Columbia tragedy and Mars mission failures followed. Failures can be attributed to blunders, normal accidents, or bad luck. Achieving high reliability is difficult but possible.
Statistical mechanics of complex networks
Rubi, Miguel; Diaz-Guilera, Albert
2003-01-01
Networks provide a useful model and graphic image for describing a wide variety of web-like structures in the physical and man-made realms, e.g. protein networks, food webs and the Internet. The contributions gathered in the present volume provide both an introduction to, and an overview of, the multifaceted phenomenology of complex networks. Statistical Mechanics of Complex Networks also provides a state-of-the-art picture of current theoretical methods and approaches.
Statistical physics and ecology
Volkov, Igor
This work addresses the applications of the methods of statistical physics to problems in population ecology. A theoretical framework based on stochastic Markov processes for the unified neutral theory of biodiversity is presented, and an analytical solution for the relative species abundance distribution, both in the large meta-community and in the small local community, is obtained. It is shown that the framework of the current neutral theory in ecology can be easily generalized to incorporate symmetric density dependence. An analytically tractable model is studied that provides an accurate description of beta-diversity and exhibits novel scaling behavior that leads to links between ecological measures such as relative species abundance and the species area relationship. We develop a simple framework that incorporates the Janzen-Connell, dispersal and immigration effects and leads to a description of the distribution of relative species abundance, the equilibrium species richness, beta-diversity and the species area relationship, in good accord with data. It is also shown that an ecosystem can be mapped into an unconventional statistical ensemble and is quite generally tuned in the vicinity of a phase transition where bio-diversity and the use of resources are optimized. We also perform a detailed study of the unconventional statistical ensemble, in which, unlike in physics, the total number of particles and the energy are not fixed but bounded. We show that the temperature and the chemical potential play a dual role: they determine the average energy and the population of the levels in the system, and at the same time they act as an imbalance between the energy and population ceilings and the corresponding average values. Different types of statistics (Boltzmann, Bose-Einstein, Fermi-Dirac and one corresponding to the description of a simple ecosystem) are considered. In all cases, we show that the systems may undergo a first or a second order phase transition.
Statistics For Dummies
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics course.
Structural Reliability Sensitivities under Nonstationary Random Vibrations
Rita Greco
2013-01-01
Response sensitivity evaluation is an important element in reliability evaluation and design optimization of structural systems. It has been widely studied under static and dynamic forcing conditions with deterministic input data. In this paper, structural response and reliability sensitivities are determined by means of time-domain covariance analysis in both classically and nonclassically damped linear structural systems. A time integration scheme is proposed for covariance sensitivity. A modulated, filtered white noise input process is adopted to model the stochastic nonstationary loads. The method allows for the evaluation of sensitivity statistics of different quantities of dynamic response with respect to structural parameters. Finally, numerical examples are presented for a multistorey shear-frame building.
Oscillations in counting statistics
Wilk, Grzegorz
2016-01-01
The very large transverse momenta and large multiplicities available in present LHC experiments on pp collisions allow a much closer look at the corresponding distributions. Some time ago we discussed a possible physical meaning of apparent log-periodic oscillations showing up in p_T distributions (suggesting that the exponent of the observed power-like behavior is complex). In this talk we concentrate on another example of oscillations, this time connected with multiplicity distributions P(N). We argue that some combinations of the experimentally measured values of P(N) (satisfying the recurrence relations used in the description of cascade-stochastic processes in quantum optics) exhibit distinct oscillatory behavior, not observed in the usual Negative Binomial Distributions used to fit data. These oscillations provide yet another example of oscillations seen in counting statistics in many different, apparently very disparate branches of physics further demonstrating the universality of this phenomenon.
Analysis on Operation Reliability of Generating Units in 2009
Zhou
2010-01-01
This paper presents data on operation reliability indices, and relevant analyses, for China's conventional power generating units in 2009. The units covered by the statistical analysis include thermal generating units of 100 MW or above, hydro generating units of 40 MW or above, and all nuclear generating units. The reliability indices reported include utilization hours, times and hours of scheduled outages, times and hours of unscheduled outages, equivalent forced outage rate, and equivalent availability factor.
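The last two indices named above have widely used standard definitions; a minimal sketch using the common industry formulas (the report's exact Chinese-standard definitions may differ, and all hour values below are illustrative):

```python
# Sketch of two common generating-unit reliability indices, using the
# standard definitions in terms of period hours (PH), service hours (SH),
# forced outage hours (FOH), planned outage hours (POH), and equivalent
# (forced) derated hours (EFDH / EDH). Inputs are illustrative.

def equivalent_forced_outage_rate(foh, efdh, sh):
    """EFOR = (FOH + EFDH) / (SH + FOH), expressed in percent."""
    return 100.0 * (foh + efdh) / (sh + foh)

def equivalent_availability_factor(ph, poh, foh, edh):
    """EAF = (PH - POH - FOH - EDH) / PH, expressed in percent."""
    return 100.0 * (ph - poh - foh - edh) / ph

# Example for one unit over a calendar year:
efor = equivalent_forced_outage_rate(foh=200, efdh=50, sh=7000)
eaf = equivalent_availability_factor(ph=8760, poh=500, foh=200, edh=80)
```

With these inputs the unit's EFOR is about 3.5% and its EAF about 91%, the kind of fleet-level figures such annual reports tabulate.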
Identifying Useful Statistical Indicators of Proximity to Instability in Stochastic Power Systems
Ghanavati, Goodarz; Lakoba, Taras I
2014-01-01
Prior research has shown that autocorrelation and variance in voltage measurements tend to increase as power systems approach instability. This paper seeks to identify the conditions under which these statistical indicators provide reliable early warning of instability in power systems. First, the paper derives and validates a semi-analytical method for quickly calculating the expected variance and autocorrelation of all voltages and currents in an arbitrary power system model. Building on this approach, the paper describes the conditions under which filtering can be used to detect these signs in the presence of measurement noise. Finally, several experiments show which types of measurements are good indicators of proximity to instability for particular types of state changes. For example, increased variance in voltages can reliably indicate the location of increased stress, while growth of autocorrelation in certain line currents is a reliable indicator of system-wide instability.
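The two indicators the paper studies, rising variance and rising autocorrelation near instability, can be illustrated generically with an AR(1) process, where the autoregressive coefficient approaching 1 plays the role of the system approaching a critical transition. This is a standard critical-slowing-down proxy, not the paper's power-system model:

```python
# Illustration: variance and lag-1 autocorrelation of an AR(1) "measurement"
# series both grow as the coefficient phi approaches 1 (loss of stability).
import random

def ar1_series(phi, n=5000, sigma=1.0, seed=7):
    """Simulate x_t = phi * x_{t-1} + noise."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)
        out.append(x)
    return out

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def lag1_autocorr(xs):
    m = sum(xs) / len(xs)
    num = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(len(xs) - 1))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

far = ar1_series(0.5)    # system far from instability
near = ar1_series(0.95)  # system close to instability
```

In theory the stationary variance is sigma^2 / (1 - phi^2) and the lag-1 autocorrelation is phi, so both indicators increase sharply as phi nears 1; the paper's contribution is determining when such signals remain detectable through measurement noise in realistic power-system models.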
Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues
Ronald Laurids Boring
2010-11-01
This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.
Statistics for scientists and engineers
Shanmugam, Ramalingam
2015-01-01
This book provides the theoretical framework needed to build, analyze and interpret various statistical models. It helps readers choose the correct model, distinguish among the various choices that best capture the data, or solve the problem at hand. This is an introductory textbook on probability and statistics. The authors explain theoretical concepts in a step-by-step manner and provide practical examples. The introductory chapter in this book presents the basic concepts. Next, the authors discuss the measures of location, popular measures of spread, and measures of skewness and kurtosis. Prob
Research and development statistics 2001
2002-01-01
This publication provides recent basic statistics on the resources devoted to R&D in OECD countries. The statistical series are presented for the last seven years for which data are available and cover expenditure by source of funds and type of costs; personnel by occupation and/or level of qualification; both at the national level by performance sector, for enterprises by industry, and for higher education by field of science. The publication also provides information on the output of science and technology (S&T) activities relating to the technology balance of payments.
Beyond the Numbers Making Sense of Statistics
Christmann, Edwin
2011-01-01
Statistics is required coursework within most teacher certification programs. Beyond the Numbers presents a nonthreatening, practical approach to statistics, providing step-by-step instructions for understanding and implementing the essential components of the subject. The basic and understandable explanations in Beyond the Numbers break down complex statistical processes to simple arithmetic computations that can be applied with the confidence that accompanies understanding.
Studying Reliability Using Identical Handheld Lactate Analyzers
Stewart, Mark T.; Stavrianeas, Stasinos
2008-01-01
Accusport analyzers were used to generate lactate performance curves in an investigative laboratory activity emphasizing the importance of reliable instrumentation. Both the calibration and testing phases of the exercise provided students with a hands-on opportunity to use laboratory-grade instrumentation while allowing for meaningful connections…