WorldWideScience

Sample records for accident prediction models

  1. ACCIDENT PREDICTION MODELS FOR UNSIGNALISED URBAN JUNCTIONS IN GHANA

    OpenAIRE

    Mohammed SALIFU, MSc., PhD, MIHT, MGhIE

    2004-01-01

    The main objective of this study was to provide an improved method for safety appraisal in Ghana through the development and application of suitable accident prediction models for unsignalised urban junctions. A case study was designed comprising 91 junctions selected from the two most cosmopolitan cities in Ghana. A wide range of traffic and road data together with the corresponding accident data for each junction for the three-year period 1996-1998 was utilized in the model development p...

  2. Accident prediction model for public highway-rail grade crossings.

    Science.gov (United States)

    Lu, Pan; Tolliver, Denver

    2016-05-01

    Considerable research has focused on roadway accident frequency analysis, but relatively little research has examined safety evaluation at highway-rail grade crossings. Highway-rail grade crossings are critical spatial locations of utmost importance for transportation safety because traffic crashes at these crossings are often catastrophic, with serious consequences. The Poisson regression model has been employed for many years as a good starting point for analyzing vehicle accident frequency. The most commonly applied variations of Poisson regression include the negative binomial and zero-inflated Poisson models. These models are used to deal with common crash data issues such as over-dispersion (sample variance is larger than the sample mean) and a preponderance of zeros (low sample mean and small sample size). On rare occasions traffic crash data have been shown to be under-dispersed (sample variance is smaller than the sample mean), and traditional distributions such as the Poisson or negative binomial cannot handle under-dispersion well. The objective of this study is to investigate and compare alternative highway-rail grade crossing accident frequency models that can handle the under-dispersion issue. The contributions of the paper are two-fold: (1) the application of probability models that deal with under-dispersion issues and (2) insights regarding vehicle crashes at public highway-rail grade crossings.
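    The dispersion issue described above can be checked directly from the data. The sketch below, using hypothetical crossing-level records and the statsmodels library, compares a standard Poisson GLM with a generalized Poisson model whose dispersion parameter can also accommodate under-dispersion; the variable names and simulated data are illustrative, not the paper's.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.discrete_model import GeneralizedPoisson

# Hypothetical crossing-level data: crash counts plus exposure covariates.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "crashes": rng.poisson(0.4, n),                  # low-mean count outcome
    "log_aadt": np.log(rng.uniform(500, 20000, n)),  # highway traffic exposure
    "trains_per_day": rng.uniform(1, 40, n),         # rail traffic exposure
})

# Quick dispersion check: variance < mean suggests under-dispersion.
print("mean:", df["crashes"].mean(), "variance:", df["crashes"].var())

X = sm.add_constant(df[["log_aadt", "trains_per_day"]])
poisson_fit = sm.GLM(df["crashes"], X, family=sm.families.Poisson()).fit()

# The generalized Poisson dispersion parameter can be negative,
# which lets the model capture under-dispersed counts.
gp_fit = GeneralizedPoisson(df["crashes"], X).fit(disp=False)
print("Poisson AIC:", poisson_fit.aic, " Generalized Poisson AIC:", gp_fit.aic)
```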

  3. An Artificial Neural Network Model for Highway Accident Prediction: A Case Study of Erzurum, Turkey

    Directory of Open Access Journals (Sweden)

    Muhammed Yasin Çodur

    2015-06-01

    Full Text Available This study presents an accident prediction model for Erzurum's highways in Turkey using artificial neural network (ANN) approaches. Several ANN models for predicting the number of accidents on highways were developed using 8 years of historical data (2005-2012) comprising 7,780 complete accident reports. The best ANN model was chosen for this task, and the model parameters included years, highway sections, section length (km), annual average daily traffic (AADT), the degree of horizontal curvature, the degree of vertical curvature, traffic accidents involving heavy vehicles (percentage), and traffic accidents that occurred in summer (percentage). In the ANN model development, the sigmoid activation function was employed with the Levenberg-Marquardt algorithm. The performance of the developed ANN model was evaluated by the mean square error (MSE), the root mean square error (RMSE), and the coefficient of determination (R2). The model results indicate that the degree of vertical curvature is the most important parameter affecting the number of accidents on highways.
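    As a rough illustration of the modelling setup described above, the sketch below fits a small feed-forward network with a sigmoid (logistic) hidden layer to synthetic section-level data and reports the MSE, RMSE, and R2 metrics mentioned in the abstract. scikit-learn does not provide the Levenberg-Marquardt algorithm, so L-BFGS is used in its place; the features and data are hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical section-level features (e.g. AADT, length, curvature, heavy-vehicle share).
X = rng.normal(size=(500, 5))
y = 2.0 + X @ np.array([0.5, 1.2, -0.3, 0.8, 0.1]) + rng.normal(scale=0.5, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Sigmoid ("logistic") hidden units as in the paper; L-BFGS stands in for
# Levenberg-Marquardt, which scikit-learn does not provide.
ann = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                   solver="lbfgs", max_iter=2000, random_state=0).fit(X_tr, y_tr)

pred = ann.predict(X_te)
mse = mean_squared_error(y_te, pred)
print("MSE:", mse, "RMSE:", np.sqrt(mse), "R2:", r2_score(y_te, pred))
```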

  4. Predictive model for motorcycle accidents at three-legged priority junctions.

    Science.gov (United States)

    Harnen, S; Umar, R S Radin; Wong, S V; Wan Hashim, W I

    2003-12-01

    In conjunction with a nationwide motorcycle safety program, exclusive motorcycle lanes have been provided to reduce link motorcycle accidents along trunk roads in Malaysia. However, not much work has been done to address accidents involving motorcycles at junctions. This article presents the development of a predictive model for motorcycle accidents at three-legged major-minor priority junctions on urban roads in Malaysia. The generalized linear modeling technique was used to develop the model. The final model reveals that motorcycle accidents are proportional to a power of traffic flow: an increase in nonmotorcycle and motorcycle flows entering the junctions is associated with an increase in motorcycle accidents, and nonmotorcycle flow on major roads had the strongest effect on the probability of motorcycle accidents. Approach speed, lane width, number of lanes, shoulder width, and land use were found to be significant in explaining motorcycle accidents at the three-legged major-minor priority junctions. These findings should enable traffic engineers to design appropriate junction treatment criteria for nonexclusive motorcycle lane facilities.
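    Accident models of this form are typically fitted as log-link Poisson (or negative binomial) GLMs in which the logarithms of the entering flows appear as covariates, so that predicted accidents are proportional to powers of those flows. The sketch below shows that structure with hypothetical junction data and illustrative variable names; it is not the paper's fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical junction-level data (names are illustrative, not the paper's variables).
rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "mc_accidents": rng.poisson(2.0, n),
    "major_flow": rng.uniform(2000, 20000, n),  # non-motorcycle flow, veh/day
    "mc_flow": rng.uniform(200, 4000, n),       # motorcycle flow, veh/day
    "approach_speed": rng.uniform(30, 70, n),
    "lane_width": rng.uniform(2.7, 3.7, n),
})

# Log-link Poisson GLM: log E[accidents] = b0 + a1*log(Q_major) + a2*log(Q_mc) + ...
# so expected accidents are proportional to powers of the entering flows.
model = smf.glm("mc_accidents ~ np.log(major_flow) + np.log(mc_flow)"
                " + approach_speed + lane_width",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary())
```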

  5. Combined Prediction Model of Death Toll for Road Traffic Accidents Based on Independent and Dependent Variables

    Directory of Open Access Journals (Sweden)

    Feng Zhong-xiang

    2014-01-01

    Full Text Available In order to build a combined model that follows the variation in death toll data for road traffic accidents, reflects the influence of multiple factors on traffic accidents, and improves prediction accuracy, a Verhulst model was built from the number of road traffic accident deaths in China from 2002 to 2011, and car ownership, population, GDP, highway freight volume, highway passenger transportation volume, and highway mileage were chosen as the factors for a multivariate linear regression model of the death toll. The two models were then combined into a weighted prediction model, with the Shapley value method applied to calculate the weight coefficients by assessing each model's contribution. Finally, the combined model was used to recalculate the death tolls from 2002 to 2011 and was compared with the Verhulst and multivariate linear regression models. The results showed that the new model not only characterizes the death toll data but also quantifies the degree of influence of each factor on the death toll, and it has high accuracy as well as strong practicability.
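    A minimal sketch of this kind of combination is given below: a Verhulst (logistic) curve is fitted to the series itself, a linear regression is fitted to the influencing factors, and the two forecasts are blended. Simple inverse-error weights stand in here for the paper's Shapley-value allocation, and all numbers and factor names are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# Verhulst (logistic) trend: deaths decline towards a lower asymptote over time.
def verhulst(t, K, a, b):
    return K / (1.0 + a * np.exp(b * t))

# Hypothetical yearly death tolls over a 10-year span (illustrative numbers only)
# and hypothetical influencing factors (car ownership, GDP, highway mileage, ...).
t = np.arange(10)
deaths = verhulst(t, 120000, 0.2, 0.15) + rng.normal(scale=1500, size=10)
factors = np.column_stack([np.linspace(1, 2, 10), np.linspace(10, 25, 10),
                           np.linspace(1.5, 4.0, 10)]) + rng.normal(scale=0.05, size=(10, 3))

# Model 1: Verhulst trend fitted to the series itself.
(K, a, b), _ = curve_fit(verhulst, t, deaths, p0=(deaths[0] * 1.2, 0.5, 0.1), maxfev=10000)
pred_verhulst = verhulst(t, K, a, b)

# Model 2: multivariate linear regression on the influencing factors.
pred_linear = LinearRegression().fit(factors, deaths).predict(factors)

# Inverse-error weights stand in for the paper's Shapley-value allocation.
e1 = np.mean(np.abs(deaths - pred_verhulst))
e2 = np.mean(np.abs(deaths - pred_linear))
w1 = (1 / e1) / (1 / e1 + 1 / e2)
combined = w1 * pred_verhulst + (1.0 - w1) * pred_linear
print("combined-model MAPE:", np.mean(np.abs((deaths - combined) / deaths)))
```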

  6. Prevaba: a Bayesian Model to Predict the Existence of Victims in car accidents

    Directory of Open Access Journals (Sweden)

    TELLES, M.J.

    2015-12-01

    Full Text Available Road safety is an area concerned both with the reduction of accidents and with the care provided to victims. Several initiatives have been proposed to help reduce the number of accidents, such as surveillance, awareness campaigns, and driver-support equipment. Other initiatives for prevention and protection are introduced by vehicle manufacturers in response to requirements from government entities. As a last resort, that is, when an accident occurs and the victim needs medical attention, care should be provided as quickly as possible. To assist in identifying the existence of a victim and the need for medical care, we propose a Bayesian model, called Prevaba, based on Bayesian networks (BN), which aims to predict the existence of victims in traffic accidents. In order to validate the model, we developed a prototype that classified real data from Porto Alegre - RS for the year 2013. The prototype performed the classification based on the previous year's data (2012) and achieved an accuracy above 90%, noting that the misclassifications were cases classified as victimless that actually involved a victim.

  7. Application of Gray Markov SCGM(1,1) c Model to Prediction of Accidents Deaths in Coal Mining.

    Science.gov (United States)

    Lan, Jian-Yi; Zhou, Ying

    2014-01-01

    The prediction of mine accidents is the basis of mine safety assessment and decision making. Grey prediction is suitable for system objects with few data, short time spans, and little fluctuation, while Markov chain theory is suitable for forecasting stochastically fluctuating dynamic processes. Analyzing the human error causes of coal mine accidents and combining the advantages of both grey prediction and Markov theory, an amended Grey Markov SCGM(1,1) c model is proposed. The grey SCGM(1,1) c model is applied to capture the development tendency of mine safety accidents, the amended model is adopted to improve prediction accuracy, and Markov prediction is used to forecast the fluctuation around the tendency. Finally, the new model is applied to coal mine accident deaths in China from 1990 to 2010, and the deaths for 2011-2014 are predicted. The results show that the new model not only captures the trend of the human error accident death toll but also overcomes the random fluctuation of the data that affects precision. It therefore has strong engineering applicability.
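    The grey model underlying this family of methods is straightforward to implement. The sketch below shows a basic GM(1,1) forecast (the paper's SCGM(1,1) c variant and the Markov correction of residuals are not reproduced); the yearly figures are illustrative, not the paper's data.

```python
import numpy as np

def gm11_forecast(x0, horizon):
    """Basic GM(1,1) grey forecast. A sketch of the underlying grey model;
    the SCGM(1,1) c variant and the Markov residual correction are omitted."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # develop coefficient, grey input
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time response of the AGO series
    return np.diff(x1_hat, prepend=0.0)                # inverse AGO back to original scale

# Illustrative (not actual) yearly death tolls; forecast the next four years.
deaths = [7000, 6400, 6000, 5700, 5300, 4900, 4700, 4000, 3800, 3200, 2900]
print(gm11_forecast(deaths, horizon=4)[-4:])
```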

  8. A combined M5P tree and hazard-based duration model for predicting urban freeway traffic accident durations.

    Science.gov (United States)

    Lin, Lei; Wang, Qian; Sadek, Adel W

    2016-06-01

    The duration of freeway traffic accidents is an important factor that affects traffic congestion, environmental pollution, and secondary accidents. Among previous studies, the M5P algorithm has been shown to be an effective tool for predicting incident duration. M5P builds a tree-based model, like the traditional classification and regression tree (CART) method, but with multiple linear regression models as its leaves. The problem with M5P for accident duration prediction, however, is that whereas linear regression assumes that the conditional distribution of accident durations is normal, the distribution for a "time-to-an-event" is almost certainly nonsymmetrical. A hazard-based duration model (HBDM) is a better choice for this kind of "time-to-event" modeling scenario, and HBDMs have previously been applied to analyze and predict traffic accident durations. Previous research, however, has not applied HBDMs for accident duration prediction in association with clustering or classification of the dataset to minimize data heterogeneity. The current paper proposes a novel approach for accident duration prediction, which improves on the original M5P tree algorithm through the construction of an M5P-HBDM model, in which the leaves of the M5P tree are HBDMs instead of linear regression models. Such a model offers the advantage of minimizing data heterogeneity through dataset classification and avoids the incorrect assumption of normality for traffic accident durations. The proposed model was tested on two freeway accident datasets. For each dataset, the first 500 records were used to train the following three models: (1) an M5P tree; (2) an HBDM; and (3) the proposed M5P-HBDM, and the remainder of the data were used for testing. The results show that the proposed M5P-HBDM identified more significant and meaningful variables than either M5P or the HBDM. Moreover, the M5P-HBDM had the lowest overall mean
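    The hybrid idea, partitioning the data with a tree and then fitting a duration model in each partition, can be sketched as follows. A plain CART regression tree stands in for M5P (which additionally prunes and smooths model trees), and a Weibull distribution stands in for the paper's hazard-based duration model; the data and features are simulated.

```python
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)
# Hypothetical incident records: features (lanes blocked, time of day, severity flag, ...).
X = rng.normal(size=(500, 4))
durations = stats.weibull_min.rvs(c=1.5, scale=40 + 10 * (X[:, 0] > 0), size=500)

# Step 1: a regression tree partitions the data to reduce heterogeneity
# (a stand-in for M5P, which additionally prunes and smooths model trees).
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=50, random_state=0).fit(X, durations)
leaf_ids = tree.apply(X)

# Step 2: fit a parametric duration (hazard) model at each leaf instead of a linear model.
leaf_models = {}
for leaf in np.unique(leaf_ids):
    d = durations[leaf_ids == leaf]
    shape, loc, scale = stats.weibull_min.fit(d, floc=0)
    leaf_models[leaf] = (shape, scale)

# Prediction: route a new record to its leaf, then use that leaf's Weibull mean duration.
x_new = rng.normal(size=(1, 4))
shape, scale = leaf_models[tree.apply(x_new)[0]]
print("predicted mean duration:", stats.weibull_min.mean(shape, scale=scale))
```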

  9. Predicting road accidents: Structural time series approach

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and the number of road accidents was predicted using a structural time series approach. The models were developed using a stepwise method, and the residuals at each step were analyzed. The accuracy of the models was assessed using the mean absolute percentage error (MAPE), and the best model was chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model best represents the road accidents; this model allows the level and slope components to vary over time. In addition, the approach provides useful information for improving on conventional time series methods.
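    A local linear trend model lets both the level and the slope of the series evolve as random walks. The sketch below fits such a model with statsmodels' UnobservedComponents on a synthetic yearly series (the paper's Malaysian data are not reproduced) and produces a five-year forecast.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical yearly road-accident counts (the paper uses Malaysian data, 1970-2010).
rng = np.random.default_rng(5)
years = np.arange(1970, 2011)
accidents = 50000 + 5000 * np.arange(len(years)) + rng.normal(scale=8000, size=len(years))

# Local linear trend: both the level and the slope evolve as random walks.
model = sm.tsa.UnobservedComponents(accidents, level="local linear trend")
fit = model.fit(disp=False)
print("AIC:", fit.aic)
print(fit.forecast(steps=5))  # predictions for the next five years
```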

  10. Explaining and predicting workplace accidents using data-mining techniques

    Energy Technology Data Exchange (ETDEWEB)

    Rivas, T., E-mail: trivas@uvigo.e [Dpto. Ingenieria de los Recursos Naturales y Medio Ambiente, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain); Paz, M., E-mail: mpaz.minas@gmail.co [Dpto. Ingenieria de los Recursos Naturales y Medio Ambiente, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain); Martin, J.E., E-mail: jmartin@cippinternacional.co [CIPP International, S.L. Parque Tecnologico de Asturias, Parcela 43, Oficina 11, 33428 Llanera (Spain); Matias, J.M., E-mail: jmmatias@uvigo.e [Dpto. Estadistica e Investigacion Operativa, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain); Garcia, J.F., E-mail: jgarcia@cippinternacional.co [CIPP International, S.L. Parque Tecnologico de Asturias, Parcela 43, Oficina 11, 33428 Llanera (Spain); Taboada, J., E-mail: jtaboada@uvigo.e [Dpto. Ingenieria de los Recursos Naturales y Medio Ambiente, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain)

    2011-07-15

    Current research into workplace risk is mainly conducted using conventional descriptive statistics, which, however, fail to properly identify cause-effect relationships and are unable to construct models that could predict accidents. The authors of the present study modelled incidents and accidents in two companies in the mining and construction sectors in order to identify the most important causes of accidents and develop predictive models. Data-mining techniques (decision rules, Bayesian networks, support vector machines and classification trees) were used to model accident and incident data compiled from the mining and construction sectors and obtained in interviews conducted soon after an incident/accident occurred. The results were compared with those of a classical statistical technique (logistic regression), revealing the superiority of decision rules, classification trees and Bayesian networks in predicting and identifying the factors underlying accidents/incidents.
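    A comparison of this kind is easy to reproduce in outline. The sketch below cross-validates a classification tree, a support vector machine, a naive Bayes classifier (a simple stand-in for a Bayesian network), and logistic regression on synthetic binary risk-factor data; the data and scores are illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import BernoulliNB
from sklearn.linear_model import LogisticRegression

# Hypothetical incident/accident records coded as binary risk factors.
X, y = make_classification(n_samples=600, n_features=12, n_informative=5, random_state=0)
X = (X > 0).astype(int)

models = {
    "classification tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "support vector machine": SVC(),
    "naive Bayes (stand-in for a Bayesian network)": BernoulliNB(),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, clf in models.items():
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validated accuracy
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```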

  11. Review the number of accidents in Tehran over a two-year period and prediction of the number of events based on a time-series model

    Science.gov (United States)

    Teymuri, Ghulam Heidar; Sadeghian, Marzieh; Kangavari, Mehdi; Asghari, Mehdi; Madrese, Elham; Abbasinia, Marzieh; Ahmadnezhad, Iman; Gholizadeh, Yavar

    2013-01-01

    Background: One of the significant dangers that threaten people’s lives is the increased risk of accidents. Annually, more than 1.3 million people die around the world as a result of accidents, and it has been estimated that approximately 300 deaths occur daily due to traffic accidents in the world, with more than 50% of that number being people who were not even passengers in the cars. The aim of this study was to examine traffic accidents in Tehran and forecast the number of future accidents using a time-series model. Methods: The study was a cross-sectional study conducted in 2011. The sample population was all traffic accidents that caused death and physical injuries in Tehran in 2010 and 2011, as registered in the Tehran Emergency ward. The present study used Minitab 15 software to provide a description of accidents in Tehran for the specified time period as well as a forecast for April 2012. Results: The results indicated that the average number of daily traffic accidents in Tehran in 2010 was 187 with a standard deviation of 83.6. In 2011, there was an average of 180 daily traffic accidents with a standard deviation of 39.5. One-way analysis of variance indicated that the average number of accidents in the city differed across months of the year (P < 0.05). Most of the accidents occurred in March, July, August, and September; thus, more accidents occurred in the summer than in the other seasons. The number of accidents was predicted for April 2012 based on an autoregressive moving average (ARMA) model. The number of accidents displayed a seasonal trend. The prediction for the city during April of 2012 indicated that a total of 4,459 accidents would occur, with a mean of 149 accidents per day during these three months. Conclusion: The number of accidents in Tehran displayed a seasonal trend, and the number of accidents differed across seasons of the year. PMID:26120405
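    For reference, an ARMA model of daily accident counts can be fitted directly with statsmodels (an ARMA(p, q) model is an ARIMA(p, 0, q) model). The sketch below uses a synthetic two-year daily series with a mild seasonal pattern and a one-month forecast horizon; the order and numbers are illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical daily accident counts for two years with a seasonal pattern.
rng = np.random.default_rng(6)
days = pd.date_range("2010-01-01", "2011-12-31", freq="D")
counts = (180 + 20 * np.sin(2 * np.pi * days.dayofyear / 365)
          + rng.normal(scale=15, size=len(days)))
series = pd.Series(counts, index=days)

# An ARMA(2, 1) model is an ARIMA(p, 0, q) model; the order is chosen for illustration.
fit = ARIMA(series, order=(2, 0, 1)).fit()
forecast = fit.forecast(steps=30)   # e.g. forecast the next month of daily counts
print(forecast.head())
```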

  12. A dynamic food-chain model and program for predicting the consequences of nuclear accident

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    A dynamic food-chain model and program, DYFOM-95, for predicting the radiological consequences of nuclear accidents has been developed; it is suitable not only for Western food chains but also for the Chinese food chain. The following processes caused by an accidental release, which affect the radionuclide concentration in the edible parts of vegetables, are considered: dry and wet deposition, interception and initial retention, translocation, percolation, root uptake, and tillage. The activity intake rate of animals, the effects of processing, and human activity intake through the ingestion pathway are also considered in the calculations. The effect of the leaf area index (LAI) of vegetables is considered in the dry deposition model. A method for calculating the contribution of rain of different durations and intensities to total wet deposition is established. The program contains one main code and five sub-codes to calculate dry and wet deposition on the surfaces of vegetables and soil, translocation of nuclides in vegetables, nuclide concentration in the edible parts of vegetables and in animal products, and human activity intake.

  13. Predicting Severity and Duration of Road Traffic Accident

    Directory of Open Access Journals (Sweden)

    Fang Zong

    2013-01-01

    Full Text Available This paper presents a model system to predict the severity and duration of traffic accidents by employing an Ordered Probit model and a hazard model, respectively. The models are estimated using traffic accident data collected in Jilin province, China, in 2010. With the developed models, three severity indicators, namely, number of fatalities, number of injuries, and property damage, as well as accident duration, are predicted, and the important influences of related variables are identified. The results indicate that the goodness-of-fit of the Ordered Probit model is higher than that of the SVC model in severity modeling. In addition, accident severity is shown to be an important determinant of duration; that is, more fatalities and injuries in an accident lead to a longer duration. The study results can be applied to the prediction of accident severity and duration, which are two essential steps in the accident management process. By recognizing these key influences, the study also provides suggestive results for governments seeking effective measures to reduce accident impacts and improve traffic safety.
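    An ordered probit treats severity as an ordered categorical outcome driven by a latent continuous variable. The sketch below fits one with statsmodels' OrderedModel on simulated accident records (the variables and data are hypothetical, not the Jilin dataset); a logit link would use distr="logit" instead.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical accident records with an ordered severity outcome
# (0 = property damage only, 1 = injury, 2 = fatality).
rng = np.random.default_rng(7)
n = 400
df = pd.DataFrame({
    "n_vehicles": rng.integers(1, 5, n),
    "night": rng.integers(0, 2, n),
    "speed_limit": rng.choice([40, 60, 80, 100], n),
})
latent = (0.3 * df["n_vehicles"] + 0.5 * df["night"]
          + 0.01 * df["speed_limit"] + rng.normal(size=n))
df["severity"] = pd.cut(latent, bins=[-np.inf, 1.0, 2.0, np.inf], labels=[0, 1, 2]).astype(int)

# Ordered probit: severity thresholds and coefficients estimated jointly.
fit = OrderedModel(df["severity"], df[["n_vehicles", "night", "speed_limit"]],
                   distr="probit").fit(method="bfgs", disp=False)
print(fit.summary())
```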

  14. Correspondence model of occupational accidents

    Directory of Open Access Journals (Sweden)

    Juan C. Conte

    2011-09-01

    Full Text Available We present a new generalized model for the diagnosis and prediction of accidents among the Spanish workforce. Based on observational data on the accident rate in all Spanish companies over eleven years (7,519,732 accidents), we classified the accidents in a new risk-injury contingency table (19×19). Through correspondence analysis, we obtained a structure composed of three axes whose combination identifies three separate risk and injury groups, which we used as a general Spanish pattern. The most likely or frequent relationships between the risks and injuries identified in the pattern facilitate the decision-making process in companies at an early stage of risk assessment. Each risk-injury group has its own characteristics, which are understandable within the phenomenological framework of the accident. The main advantages of this model are its potential application to any other country and the feasibility of contrasting results between countries. One limiting factor, however, is the need for a common classification framework for risks and injuries to enhance comparison, a framework that does not exist today. The model aims to manage work-related accidents automatically at any level.
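    Correspondence analysis of a risk-injury contingency table reduces, computationally, to a singular value decomposition of the standardized residuals. The sketch below implements that decomposition with NumPy on a randomly generated 19×19 table (the Spanish classification itself is not reproduced) and reports the share of inertia carried by the first three axes.

```python
import numpy as np

def correspondence_analysis(table):
    """Minimal correspondence analysis of a risk x injury contingency table
    (a sketch of the technique, not the paper's 19x19 Spanish classification)."""
    P = table / table.sum()                          # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)              # row / column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]      # principal row coordinates
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]   # principal column coordinates
    inertia = sv**2 / (sv**2).sum()                  # share of inertia per axis
    return row_coords, col_coords, inertia

rng = np.random.default_rng(8)
table = rng.integers(1, 500, size=(19, 19)).astype(float)   # illustrative counts
rows, cols, inertia = correspondence_analysis(table)
print("inertia explained by the first three axes:", inertia[:3].sum())
```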

  15. Prediction of road accidents: A Bayesian hierarchical approach

    DEFF Research Database (Denmark)

    Deublein, Markus; Schubert, Matthias; Adey, Bryan T.;

    2013-01-01

    In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data, e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis...

  16. Decision Tree Model for Non-Fatal Road Accident Injury

    Directory of Open Access Journals (Sweden)

    Fatin Ellisya Sapri

    2017-02-01

    Full Text Available Non-fatal road accident injury has become a great concern, as it is associated with injury and sometimes leads to the disability of the victims. Hence, this study aims to develop a model that explains the factors contributing to non-fatal road accident injury severity. A sample of 350 non-fatal road accident cases from the year 2016 was obtained from Kota Bharu District Police Headquarters, Kelantan. The explanatory variables include road geometry, collision type, accident time, accident cause, vehicle type, age, airbag, and gender. The predictive data mining techniques of decision tree modeling and multinomial logistic regression were used to model non-fatal road accident injury severity. Based on accuracy rate, the decision tree with the CART algorithm was found to be more accurate than the logistic regression model. The factors that significantly contribute to non-fatal traffic crash injury severity are accident cause, road geometry, vehicle type, age, and collision type.

  17. Injury risk prediction for traffic accidents in Porto Alegre/RS, Brazil

    OpenAIRE

    Perone, Christian S.

    2015-01-01

    This study describes the experimental application of machine learning techniques to build prediction models that can assess the injury risk associated with traffic accidents. This work uses a freely available data set of traffic accident records that took place in the city of Porto Alegre/RS (Brazil) during 2013. The study also provides an analysis of the most important attributes of a traffic accident that could produce an outcome of injury to the people involved in the accident.

  18. Predictions of structural integrity of steam generator tubes under normal operating, accident, and severe accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Majumdar, S. [Argonne National Lab., IL (United States)

    1997-02-01

    Available models for predicting failure of flawed and unflawed steam generator tubes under normal operating, accident, and severe accident conditions are reviewed. Tests conducted in the past, though limited, tended to show that the earlier flow-stress model for part-through-wall axial cracks overestimated the damaging influence of deep cracks. This observation was confirmed by further tests at high temperatures, as well as by finite-element analysis. A modified correlation for deep cracks can correct this shortcoming of the model. Recent tests have shown that lateral restraint can significantly increase the failure pressure of tubes with unsymmetrical circumferential cracks. This observation was confirmed by finite-element analysis. The rate-independent flow stress models that are successful at low temperatures cannot predict the rate-sensitive failure behavior of steam generator tubes at high temperatures. Therefore, a creep rupture model for predicting failure was developed and validated by tests under various temperature and pressure loadings that can occur during postulated severe accidents.

  19. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 to 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals of each model were tested. The best-fitting model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level model with a seasonal component.

  20. Prediction of hydrogen concentration in containment during severe accidents using fuzzy neural network

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Yeong; Kim, Ju Hyun; Yoo, Kwae Hwan; Na, Man Gyun [Dept. of Nuclear Engineering, Chosun University, Gwangju (Korea, Republic of)

    2015-03-15

    Recently, severe accidents in nuclear power plants (NPPs) have become a global concern. The aim of this paper is to predict the hydrogen buildup within containment resulting from severe accidents. The prediction is based on an Optimized Power Reactor 1000 NPP. The increase in hydrogen concentration during severe accidents is one of the major factors that threaten the integrity of the containment. A method using a fuzzy neural network (FNN) was applied to predict the hydrogen concentration in the containment. The FNN model was developed and verified based on simulation data acquired by running the MAAP4 code for the Optimized Power Reactor 1000. The FNN model is expected to assist operators in preventing a hydrogen explosion in severe accident situations and in managing the accident properly, because it allows them to predict changes in the trend of hydrogen concentration at the beginning of real accidents.

  1. Do Cognitive Models Help in Predicting the Severity of Posttraumatic Stress Disorder, Phobia, and Depression after Motor Vehicle Accidents? A Prospective Longitudinal Study

    Science.gov (United States)

    Ehring, Thomas; Ehlers, Anke; Glucksman, Edward

    2008-01-01

    The study investigated the power of theoretically derived cognitive variables to predict posttraumatic stress disorder (PTSD), travel phobia, and depression following injury in a motor vehicle accident (MVA). MVA survivors (N = 147) were assessed at the emergency department on the day of their accident and 2 weeks, 1 month, 3 months, and 6 months…

  2. Modeling accident frequency in Denmark for improving road safety

    DEFF Research Database (Denmark)

    Lyckegaard, Allan; Hels, Tove; Kaplan, Sigal

    Traffic accidents result in huge costs to society in terms of death, injury, lost productivity, and property damage. The main objective of the current study is the development of an accident frequency model that predicts the expected number of accidents on a given road segment, provided ... concerning police recorded accidents, link characteristics of the road network, and traffic volumes from the national transport models are merged to estimate the model. Spatial correlation between road sections is taken into account to correct for unobserved correlation between contiguous locations....

  3. Prediction of structural integrity of steam generator tubes under severe accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Majumdar, S. [Argonne National Lab., IL (United States)

    1999-11-01

    Available models for predicting failure of flawed and unflawed steam generator tubes under normal operating and design-basis accident conditions are reviewed. These rate-independent flow stress models are inadequate for predicting failure of steam generator tubes under severe accident conditions because the temperature of the tubes during such accidents can reach as high as 800 C where creep effects become important. Therefore, a creep rupture model for predicting failure was developed and validated by tests on unflawed and flawed specimens containing axial and circumferential flaws and loaded by constant as well as ramped temperature and pressure loadings. Finally, tests were conducted using pressure and temperature histories that are calculated to occur during postulated severe accidents. In all cases, the creep rupture model predicted the failure temperature and time more accurately than the flow stress models. (orig.)

  4. Prediction of vehicle traffic accidents using Bayesian networks

    Directory of Open Access Journals (Sweden)

    Seyed Shamseddin Alizadeh

    2014-06-01

    Full Text Available Every year, thousands of vehicle accidents occur in Iran, resulting in thousands of deaths, injuries, and material damage across the country. Various factors such as driver characteristics, road characteristics, vehicle characteristics, and atmospheric conditions affect the injury severity of these accidents. In order to reduce the number and severity of these accidents, their analysis and prediction are essential. Currently, accident-related data are collected and can be used to predict and prevent accidents. New technologies have enabled humans to collect large volumes of data in continuous and regular ways. One of these methods is the use of Bayesian networks. Based on a literature review, this study presents a new method for the analysis and prediction of vehicle traffic accidents. These networks can be used to classify traffic accidents, hazardous road locations, and the factors affecting accident severity. Using the results of the analysis of these networks will help reduce the number of accidents and their severity. In addition, the results of this analysis can be used to develop safety regulations.

  5. Do cognitive models help in predicting the severity of posttraumatic stress disorder, phobia and depression after motor vehicle accidents? A prospective longitudinal study

    NARCIS (Netherlands)

    T. Ehring; A. Ehlers; E. Glucksman

    2008-01-01

    The study investigated the power of theoretically derived cognitive variables to predict posttraumatic stress disorder (PTSD), travel phobia, and depression following injury in a motor vehicle accident (MVA). MVA survivors (N = 147) were assessed at the emergency department on the day of their acciden

  6. Development of a model to predict flow oscillations in low-flow sodium boiling. [Loss-of-Piping Integrity accidents

    Energy Technology Data Exchange (ETDEWEB)

    Levin, A.E.; Griffith, P.

    1980-04-01

    Tests performed in a small scale water loop showed that voiding oscillations, similar to those observed in sodium, were present in water, as well. An analytical model, appropriate for either sodium or water, was developed and used to describe the water flow behavior. The experimental results indicate that water can be successfully employed as a sodium simulant, and further, that the condensation heat transfer coefficient varies significantly during the growth and collapse of vapor slugs during oscillations. It is this variation, combined with the temperature profile of the unheated zone above the heat source, which determines the oscillatory behavior of the system. The analytical program has produced a model which qualitatively does a good job in predicting the flow behavior in the wake experiment. The amplitude discrepancies are attributable to experimental uncertainties and model inadequacies. Several parameters (heat transfer coefficient, unheated zone temperature profile, mixing between hot and cold fluids during oscillations) are set by the user. Criteria for the comparison of water and sodium experiments have been developed.

  7. The use of Grey System Theory in predicting the road traffic accident in Fars province in Iran

    Directory of Open Access Journals (Sweden)

    Ali Mohammadi

    2011-10-01

    Full Text Available Traffic accidents have become a more and more important factor that restricts the development of the economy and threatens the safety of human beings. Considering the complexity and uncertainty of the factors influencing traffic accidents, traffic accident forecasting can be regarded as a grey system with known and unknown information and can therefore be analyzed with grey system theory. Grey models require only a limited amount of data to estimate the behavior of unknown systems. In this paper, the original predicted values of road traffic accidents are first obtained separately from the GM(1,1) model, the Verhulst model, and the DGM(2,1) model. The results of these models in predicting road traffic accidents show that the forecasting accuracy of GM(1,1) is higher than that of the Verhulst and DGM(2,1) models. The GM(1,1) model is then applied to predict road traffic accidents in Fars province.

  8. Prediction of hydrogen concentration in nuclear power plant containment under severe accidents using cascaded fuzzy neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Geon Pil; Kim, Dong Yeong; Yoo, Kwae Hwan; Na, Man Gyun, E-mail: magyna@chosun.ac.kr

    2016-04-15

    Highlights: • We present a hydrogen-concentration prediction method for an NPP containment. • The cascaded fuzzy neural network (CFNN) is used in this prediction model. • The CFNN model performs much better than the existing FNN model. • This prediction can help prevent severe accidents in NPPs due to hydrogen explosion. - Abstract: Recently, severe accidents in nuclear power plants (NPPs) have attracted worldwide interest since the Fukushima accident. If the hydrogen concentration in an NPP containment rises above 4% at atmospheric pressure, hydrogen combustion will likely occur. Therefore, the hydrogen concentration must be kept below 4%. This study presents the prediction of hydrogen concentration using a cascaded fuzzy neural network (CFNN). The CFNN model repeatedly applies FNN modules that are serially connected. The CFNN model was developed using data on severe accidents in NPPs. The data were obtained by numerically simulating the accident scenarios with the MAAP4 code for the optimized power reactor 1000 (OPR1000), because real severe accident data cannot be obtained from actual NPP accidents. The root-mean-square error level predicted by the CFNN model is below approximately 5%. It was confirmed that the CFNN model can accurately predict the hydrogen concentration in the containment. If NPP operators can predict the hydrogen concentration in the containment using the CFNN model, this prediction can assist them in preventing a hydrogen explosion.

  9. Predicting cycling accident risk in Brussels: a spatial case-control approach.

    Science.gov (United States)

    Vandenbulcke, Grégory; Thomas, Isabelle; Int Panis, Luc

    2014-01-01

    This paper aims at predicting cycling accident risk for an entire network and identifying how road infrastructure influences cycling safety in the Brussels-Capital Region (Belgium). A spatial Bayesian modelling approach is proposed using a binary dependent variable (accident, no accident at location i) constructed from a case-control strategy. Control sites are sampled along the 'bikeable' road network as a function of the potential bicycle traffic transiting through each ward. Risk factors are limited to infrastructure, traffic, and environmental characteristics. Results suggest that a high risk is statistically associated with the presence of on-road tram tracks, bridges without cycling facilities, complex intersections, proximity to shopping centres or garages, and busy van and truck traffic. Cycle facilities built at intersections and parked vehicles located next to separated cycle facilities are also associated with an increased risk, whereas contraflow cycling is associated with a reduced risk. The cycling accident risk is far from negligible at points where no cycling accident has yet been reported but where accidents are nevertheless expected to occur. Hence, mapping predicted accident risks provides planners and policy makers with a useful tool for accurately locating places with a high potential risk even before accidents actually happen. This also provides comprehensible information for orienting cyclists to the safest routes in Brussels.
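    In a case-control design of this kind, accident sites (cases) and sampled control sites are contrasted with a binary-outcome model. The sketch below fits a plain (non-spatial) logistic regression as a stand-in for the paper's spatial Bayesian model, using simulated sites and illustrative infrastructure covariates; exponentiated coefficients are read as odds ratios.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical case-control data: accident sites (1) and sampled control sites (0)
# along the bikeable network, with infrastructure covariates (names illustrative).
rng = np.random.default_rng(9)
n = 1000
df = pd.DataFrame({
    "tram_tracks": rng.integers(0, 2, n),
    "complex_intersection": rng.integers(0, 2, n),
    "contraflow": rng.integers(0, 2, n),
    "van_truck_traffic": rng.uniform(0, 1, n),
})
logit_p = (-1.0 + 0.8 * df["tram_tracks"] + 0.6 * df["complex_intersection"]
           - 0.5 * df["contraflow"] + 1.2 * df["van_truck_traffic"])
df["accident"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# A non-spatial logistic regression stands in for the spatial Bayesian model;
# odds ratios indicate how each infrastructure feature shifts the accident risk.
fit = smf.logit("accident ~ tram_tracks + complex_intersection + contraflow"
                " + van_truck_traffic", data=df).fit(disp=False)
print(np.exp(fit.params))   # odds ratios
```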

  10. Key Characteristics of Combined Accident including TLOFW accident for PSA Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bo Gyung; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Yoon, Ho Joon [Khalifa University of Science, Technology and Research, Abu Dhabi (United Arab Emirates)

    2015-05-15

    Conventional PSA techniques cannot adequately evaluate all events. Conventional PSA models usually focus on single internal events such as DBAs and on external hazards such as fire and seismic events. However, the 2011 Fukushima accident in Japan revealed that very rare events need to be considered in the PSA model, both to prevent radioactive releases to the environment caused by poor accident management stemming from a lack of information and to improve emergency operating procedures. In particular, the results from PSA can be used for regulatory decision making. Moreover, designers can consider weaknesses in plant safety based on the quantified results and understand accident sequences based on human actions and system availability. This study concerns PSA modeling of combined accidents including the total loss of feedwater (TLOFW) accident. The TLOFW accident is a representative accident involving the failure of cooling through the secondary side. If heat transfer is insufficient due to the failure of the secondary side, heat accumulates in the primary side through continuous core decay heat. Transients with loss of feedwater include the total loss of feedwater accident, loss of condenser vacuum accident, and closure of all MSIVs. When residual heat removal by the secondary side is terminated, safety injection into the RCS with direct primary depressurization would provide alternative heat removal; this operation is called feed and bleed (F and B) operation. Combined accidents including the TLOFW accident are very rare events and are only partially considered in conventional PSA models. Since the necessity of F and B operation is related to plant conditions, PSA modeling of combined accidents including the TLOFW accident is necessary to identify design and operational vulnerabilities. PSA is significant for assessing the risk of NPPs and for identifying design and operational vulnerabilities. Even though a combined accident is a very rare event, the consequence of combined

  11. Relating aviation service difficulty reports to accident data for safety trend prediction

    Energy Technology Data Exchange (ETDEWEB)

    Fullwood, R.; Hall, R.; Martinez, G.; Uryasev, S.

    1996-03-13

    This work explores the hypothesis that Service Difficulty Reports (SDR - primarily inspection reports) are related to Accident Incident Data System (AIDS - reports primarily compiled from National Transportation Safety Board (NTSB) accident investigations). This work sought and found relations between equipment operability reported in the SDR and aviation safety reported in AIDS. Equipment is not the only factor in aviation accidents, but it is the factor reported in the SDR. Two approaches to risk analysis were used: (1) The conventional method, in which reporting frequencies are taken from a data base (SDR), and used with an aircraft reliability block diagram model of the critical systems to predict aircraft failure, and (2) Shape analysis that uses the magnitude and shape of the SDR distribution compared with the AIDS distribution to predict aircraft failure.

  12. Logit Model of Motorcycle Accidents in the Philippines Considering Personal and Environmental Factors

    Directory of Open Access Journals (Sweden)

    Rosemary R. Seva

    2013-06-01

    Full Text Available The study aims to determine significant personal and environmental variables in predicting motorcycle accidents in the Philippines, compare the results with findings in other countries, and propose possible government interventions. Data were gathered from 177 participants through a survey at a licensing center in the largest city in Metro Manila. Logistic regression was used to predict the likelihood of an accident from the variables considered in the model. Three variables were found to be significant predictors of motorcycle accidents: age, driving behavior, and junction type. Younger drivers are more likely to be involved in accidents. The significance of age was unexpected because similar models found it to be insignificant. Driving behavior, specifically committing violations, predicts accident likelihood. Driving at t- and y-junctions also predicts motorcycle accidents. In the Philippines, a unique set of variables was found to predict motorcycle accidents. Although previous studies have established the effect of these variables on accident likelihood, the combination was unforeseen. Government agencies can focus on interventions directed at these three variables.

  13. MELCOR modeling of Fukushima unit 2 accident

    Energy Technology Data Exchange (ETDEWEB)

    Sevon, Tuomo [VTT Technical Research Centre of Finland, Espoo (Finland)

    2014-12-15

    A MELCOR model of the Fukushima Daiichi unit 2 accident was created in order to get a better understanding of the event and to improve severe accident modeling methods. The measured pressure and water level could be reproduced relatively well with the calculation. This required adjusting the RCIC system flow rates and the containment leak area so that a good match to the measurements was achieved. Modeling of gradual flooding of the torus room with water that originated from the tsunami was necessary for a satisfactory reproduction of the measured containment pressure. The reactor lower head did not fail in this calculation, and all the fuel remained in the RPV. 13% of the fuel was relocated from the core area, and all the fuel rods lost their integrity, releasing at least some volatile radionuclides. According to the calculation, about 90% of the noble gas inventory and about 0.08% of the cesium inventory was released to the environment. The release started 78 h after the earthquake, and a second release peak came at 90 h. Uncertainties in the calculation are very large because there is scarce public data available about the Fukushima power plant and because it is not yet possible to inspect the status of the reactor and the containment. The uncertainty in the calculated cesium release is larger than a factor of ten.

  14. Study of the Severity of Accidents in Tehran Using Statistical Modeling and Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Hesamaldin Razi

    2013-01-01

    Full Text Available Abstract Background and Aims: The Tehran province was subject to the second highest incidence of fatalities due to traffic accidents in 1390 (Iranian calendar). Most studies in this field examine rural traffic accidents, but this study uses logit models and artificial neural networks to evaluate the factors that affect the severity of accidents within the city of Tehran. Materials and Methods: Among the various types of crashes, head-on collisions are specified as the most serious type and are investigated in this study using Tehran's accident data. In the modeling process, the severity of the accident is the dependent variable, defined as a binary covariate: non-injury accidents and injury accidents. The independent variables are parameters such as the characteristics of the driver, time of the accident, and traffic and environmental characteristics. In addition to comparing the prediction accuracy of the two models, the elasticity of the logit model is compared with a sensitivity analysis of the neural network. Results: The results show that the proposed model provides a good estimate of an accident's severity. The explanatory variables determined to be significant in the final models are the driver's gender, age, and education, along with negligence of the traffic rules, inappropriate acceleration, deviation to the left, type of vehicle, pavement conditions, time of the crash, and street width. Conclusion: An artificial neural network model can be useful alongside a statistical model in the analysis of factors that affect the severity of accidents. According to the results, human errors and the illiteracy of drivers increase the severity of crashes, and therefore educating drivers is the main strategy for reducing accident severity in Iran. Special attention should be given to the driver's age group, with particular care taken when drivers are very young.

  15. Study on a Microscopic Accident Prediction Model for Expressway Vertical Alignment Design

    Institute of Scientific and Technical Information of China (English)

    陈永胜; 高耀华

    2001-01-01

    This paper focuses on an accident prediction model for the vertical design of expressways. Based on an analysis of the safety mechanisms of road design elements and using statistical analysis as the research method, the model addresses the vertical alignment elements of expressways together with the related element combinations and transition styles, and explores their relationship with traffic safety. The resulting model system can provide a theoretical basis for highway safety design and for identifying accident-prone road sections.

  16. Relating aviation service difficulty reports to accident data for safety trend prediction

    Energy Technology Data Exchange (ETDEWEB)

    Fullwood, R.R.; Hall, R.E.; Martinez-Guridi, G.; Uryasev, S. [Brookhaven National Lab., Upton, NY (United States); Sampath, S.G. [Federal Aviation Administration, Atlantic City, NJ (United States)

    1996-10-01

    A synthetic model of scheduled-commercial U.S. aviation fatalities was constructed from linear combinations of the time-spectra of critical systems reporting, using 5.5 years of Service Difficulty Reports (SDR) and Accident Incident Data System (AIDS) records. This model, used to predict near-future trends in aviation accidents, was tested by using the first 36 months of data to construct the synthetic model, which was then used to predict fatalities during the following eight months. These predictions were tested by comparison with the fatality data. A reliability block diagram (RBD) and third-order extrapolations were also used as predictive models and compared with actuality. The synthetic model was the best predictor because of its use of systems data. Other results of the study are a database of service difficulties for major aviation systems and a rank ordering of systems according to their contribution to the synthesis. 4 refs., 8 figs., 3 tabs.

  17. Characterizing the Severe Turbulence Environments Associated With Commercial Aviation Accidents: A Real-Time Turbulence Model (RTTM) Designed for the Operational Prediction of Hazardous Aviation Turbulence Environments

    Science.gov (United States)

    Kaplan, Michael L.; Lux, Kevin M.; Cetola, Jeffrey D.; Huffman, Allan W.; Riordan, Allen J.; Slusser, Sarah W.; Lin, Yuh-Lang; Charney, Joseph J.; Waight, Kenneth T.

    2004-01-01

    Real-time prediction of environments predisposed to producing moderate-to-severe aviation turbulence is studied. We describe the numerical model and its postprocessing system designed for this prediction and present numerous examples of its utility. The numerical model is MASS version 5.13, which is integrated over three different grid matrices in real time on a university workstation in support of NASA Langley Research Center's B-757 turbulence research flight missions. The postprocessing system includes several turbulence-related products, including four turbulence forecasting indices, winds, streamlines, turbulence kinetic energy, and Richardson numbers. Additionally, there are convective products including precipitation, cloud height, cloud mass fluxes, lifted index, and K-index. Furthermore, soundings, sounding parameters, and Froude number plots are also provided. The horizontal cross-section plot products are provided from 16,000 to 46,000 ft in 2,000-ft intervals. Products are available every 3 hours at the 60- and 30-km grid intervals and every 1.5 hours at the 15-km grid interval. The model is initialized from the NWS ETA analyses and integrated twice a day.

  18. Grey-Markov Model for Road Accidents Forecasting

    Institute of Scientific and Technical Information of China (English)

    李相勇; 严余松; 蒋葛夫

    2003-01-01

    In order to improve the forecasting precision of road accidents, by introducing Markov chains forecasting method, a grey-Markov model for forecasting road accidents is established based on grey forecasting method. The model combines the advantages of both grey forecasting method and Markov chains forecasting method, overcomes the influence of random fluctuation data on forecasting precision and widens the application scope of the grey forecasting. An application example is conducted to evaluate the grey-Markov model, which shows that the precision of the grey-Markov model is better than that of grey model in forecasting road accidents.

  19. Applying Functional Modeling for Accident Management of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lind, Morten; Zhang Xinxin [Harbin Engineering University, Harbin (China)

    2014-08-15

    The paper investigates applications of functional modeling for accident management in complex industrial plants, with special reference to nuclear power production. The main applications for information sharing among decision makers and for decision support are identified. An overview of Multilevel Flow Modeling is given, a detailed presentation of the foundational means-end concepts is provided, and the conditions for their proper use in modelling accidents are identified. It is shown that Multilevel Flow Modeling can be used for modelling and reasoning about design basis accidents. Its possible role for information sharing and decision support in accidents beyond the design basis is also indicated. A modelling example demonstrating the application of Multilevel Flow Modelling and reasoning for a PWR LOCA is presented.

  20. Applying Functional Modeling for Accident Management of Nuclear Power Plant

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2014-01-01

    The paper investigates applications of functional modeling for accident management in complex industrial plant with special reference to nuclear power production. Main applications for information sharing among decision makers and decision support are identified. An overview of Multilevel Flow...... for information sharing and decision support in accidents beyond design basis is also indicated. A modelling example demonstrating the application of Multilevel Flow Modelling and reasoning for a PWR LOCA is presented....

  1. Modified ensemble Kalman filter for nuclear accident atmospheric dispersion: prediction improved and source estimated.

    Science.gov (United States)

    Zhang, X L; Su, G F; Yuan, H Y; Chen, J G; Huang, Q Y

    2014-09-15

    Atmospheric dispersion models play an important role in nuclear power plant accident management. A reliable estimate of the radioactive material distribution at short range (about 50 km) is urgently needed for population sheltering and evacuation planning. However, the meteorological data and the source term, which greatly influence the accuracy of atmospheric dispersion models, are usually poorly known in the early phase of the emergency. In this study, a modified ensemble Kalman filter data assimilation method, used in conjunction with a Lagrangian puff model, is proposed to simultaneously improve the model prediction and reconstruct the source term for short-range atmospheric dispersion using off-site environmental monitoring data. Four main uncertain parameters are considered: source release rate, plume rise height, wind speed, and wind direction. Twin experiments show that the method effectively improves the predicted concentration distribution, and the temporal profiles of the source release rate and plume rise height are also successfully reconstructed. Moreover, the time lag in the response of the ensemble Kalman filter is shortened. The method proposed here can be a useful tool not only in nuclear power plant accident emergency management but also in other situations where hazardous material is released into the atmosphere.
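    The core of the approach is the ensemble Kalman filter analysis step, in which the state is augmented with the unknown source parameters. The sketch below implements a generic stochastic EnKF update in NumPy and applies it to a toy problem of estimating a release rate from noisy monitor readings; the observation operator and all numbers are hypothetical stand-ins for the paper's Lagrangian puff model.

```python
import numpy as np

def enkf_update(ensemble, observations, obs_operator, obs_error_std, rng):
    """One stochastic EnKF analysis step with perturbed observations.
    `ensemble` has shape (n_members, n_state); the augmented state can include
    source-term parameters such as release rate or plume rise height."""
    n_members = ensemble.shape[0]
    Hx = np.array([obs_operator(member) for member in ensemble])   # predicted observations
    x_mean, hx_mean = ensemble.mean(axis=0), Hx.mean(axis=0)
    Xp, HXp = ensemble - x_mean, Hx - hx_mean
    P_xy = Xp.T @ HXp / (n_members - 1)                            # state-observation covariance
    P_yy = HXp.T @ HXp / (n_members - 1) + np.diag(np.full(len(observations), obs_error_std**2))
    K = P_xy @ np.linalg.inv(P_yy)                                 # Kalman gain
    perturbed = observations + rng.normal(0, obs_error_std, size=(n_members, len(observations)))
    return ensemble + (perturbed - Hx) @ K.T

# Toy usage: estimate a scalar release rate from noisy concentration readings that
# are (hypothetically) proportional to the release rate at three monitors.
rng = np.random.default_rng(10)
true_rate = 5.0
gains = np.array([0.8, 0.5, 0.3])                      # stand-in dispersion factors per monitor
obs = true_rate * gains + rng.normal(0, 0.1, size=3)
ensemble = rng.normal(3.0, 2.0, size=(100, 1))         # prior ensemble of the release rate
for _ in range(5):
    ensemble = enkf_update(ensemble, obs, lambda m: m[0] * gains, 0.1, rng)
print("estimated release rate:", ensemble.mean())
```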

  2. Applying Functional Modeling for Accident Management of Nuclear Power Plant

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2014-01-01

    The paper investigates applications of functional modeling for accident management in complex industrial plants, with special reference to nuclear power production. The main applications for information sharing among decision makers and for decision support are identified. An overview of Multilevel Flow...

  3. Predicting Posttraumatic Stress Symptoms in Children after Road Traffic Accidents

    Science.gov (United States)

    Landolt, Markus A.; Vollrath, Margarete; Timm, Karin; Gnehm, Hanspeter E.; Sennhauser, Felix H.

    2005-01-01

    Objective: To prospectively assess the prevalence, course, and predictors of posttraumatic stress symptoms (PTSSs) in children after road traffic accidents (RTAs). Method: Sixty-eight children (6.5-14.5 years old) were interviewed 4-6 weeks and 12 months after an RTA with the Child PTSD Reaction Index (response rate 58.6%). Their mothers (n = 60)…

  4. BMX bicycles: accident comparison with other models.

    OpenAIRE

    1985-01-01

    A comparison has been made between BMX bicycle accidents and those occurring when children ride other types of bicycle. The injuries sustained are compared to see if the clinical impressions that BMX bicycles are more dangerous, and produce more facial injuries, are correct. This was found not to be true, as half of the children involved rode BMX bicycles, and the injuries sustained were similar to those occurring to non-BMX riders. BMX riders had a lower proportion of serious injuries than riders of raci...

  5. Comparative modeling analyses of Cs-137 fate in the rivers impacted by Chernobyl and Fukushima accidents

    Energy Technology Data Exchange (ETDEWEB)

    Zheleznyak, M.; Kivva, S. [Institute of Environmental Radioactivity, Fukushima University (Japan)

    2014-07-01

    The consequences of the two largest nuclear accidents of recent decades - at the Chernobyl Nuclear Power Plant (ChNPP) (1986) and at the Fukushima Daiichi NPP (FDNPP) (2011) - clearly demonstrated that radioactive contamination of water bodies in the vicinity of an NPP and along the waterways leading from it, e.g., the river-reservoir system after the Chernobyl accident and the rivers and coastal marine waters after the Fukushima accident, was in both cases one of the main sources of public concern about the accident consequences. The greater weight of water contamination in public perception of accident consequences, compared with the actual fraction of doses received via aquatic pathways relative to other dose components, is a specific feature of public perception of environmental contamination. This psychological phenomenon, confirmed after both accidents, provides additional arguments that reliable simulation and prediction of radionuclide dynamics in water and sediments is an important part of post-accident radioecological research. The purpose of the research is to use the experience of the modeling activities conducted over more than 25 years within the Chernobyl-affected Pripyat River and Dnieper River watershed, together with data from new monitoring studies in Japan of the Abukuma River (the largest in the region, with a watershed area of 5400 km²), Kuchibuto River, Uta River, Niita River, Natsui River, and Same River, as well as studies on the specifics of 'water-sediment' ¹³⁷Cs exchange in this area, to refine the 1-D model RIVTOX and the 2-D model COASTOX and increase the predictive power of the modeling technologies. The results of the modeling studies are applied to more accurate prediction of the water and sediment radionuclide contamination of rivers and reservoirs in Fukushima Prefecture and to comparative analyses of the efficiency of post-accident measures to diminish the contamination of water bodies.

  6. Usefulness of high resolution coastal models for operational oil spill forecast: the Full City accident

    Science.gov (United States)

    Broström, G.; Carrasco, A.; Hole, L. R.; Dick, S.; Janssen, F.; Mattsson, J.; Berger, S.

    2011-06-01

    Oil spill modeling is considered to be an important decision support system (DeSS) useful for remedial action in case of accidents, as well as for designing the environmental monitoring system that is frequently set up after major accidents. Many accidents take place in coastal areas, implying that low resolution basin scale ocean models are of limited use for predicting the trajectories of an oil spill. In this study, we target the oil spill in connection with the Full City accident on the Norwegian south coast and compare three different oil spill models for the area. The result of the analysis is that all models do a satisfactory job. The "standard" operational model for the area is shown to have severe flaws, but when an analysis based on a higher resolution model (1.5 km resolution) for the area is included, the model system shows results that compare well with observations. The study also shows that an ensemble using three different models is useful when predicting/analyzing oil spills in coastal areas.

  7. Estimation of traffic accident costs: a prompted model.

    Science.gov (United States)

    Hejazi, Rokhshad; Shamsudin, Mad Nasir; Radam, Alias; Rahim, Khalid Abdul; Ibrahim, Zelina Zaitun; Yazdani, Saeed

    2013-01-01

    Traffic accidents are the reason for 25% of unnatural deaths in Iran. The main objective of this study is to find a simple model for the estimation of economic costs, especially in Islamic countries (like Iran), in a straightforward manner. The model can show the magnitude of traffic accident costs in monetary terms. Data were collected from different sources that included traffic police records, insurance companies and hospitals. The conceptual framework in our study was based on the method of Ayati, who used this method for the estimation of economic costs in Iran. We refined his method to use a minimum number of variables. Our final model has only three readily available variables, which can be taken from insurance companies and police records. Running the model showed that the traffic accident costs were US$2.2 million in 2007 for our case study route.

  8. Pilot study of dynamic Bayesian networks approach for fault diagnostics and accident progression prediction in HTR-PM

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Yunfei; Tong, Jiejuan; Zhang, Liguo, E-mail: lgzhang@tsinghua.edu.cn; Zhang, Qin

    2015-09-15

    Highlights: • Dynamic Bayesian networks are used to diagnose and predict accident progression in HTR-PM. • A dynamic Bayesian network model of HTR-PM is built based on detailed system analysis. • LOCA simulations validate the model even if some monitors are lost or give false readings. - Abstract: The first high-temperature-reactor pebble-bed demonstration module (HTR-PM) is currently under construction in China. At the same time, development of a system to support nuclear emergency response is in progress. The supporting system is expected to complete two tasks. The first is diagnostics of faults in the reactor based on abnormal sensor measurements. The second is prognostics of the accident progression based on sensor measurements and operator actions. Both tasks will provide valuable guidance for emergency staff to take appropriate protective actions. The traditional approach to the two tasks relies heavily on expert judgment, and has proven inappropriate in some cases, such as the Three Mile Island accident. To better perform the two tasks, dynamic Bayesian networks (DBN) are introduced in this paper and a pilot study based on the approach is carried out. DBN are advantageous in representing complex dynamic systems and taking full account of the evidence obtained to perform diagnostics and prognostics. Pearl's loopy belief propagation (LBP) algorithm is recommended for diagnostics and prognostics in DBN. The DBN model of HTR-PM is created based on detailed system analysis and accident progression analysis. A small break loss of coolant accident (SBLOCA) is selected to illustrate the application of the DBN model of HTR-PM in fault diagnostics (FD) and accident progression prognostics (APP). Several advantages of the DBN approach compared with other techniques are discussed. The pilot study lays the foundation for developing the nuclear emergency response supporting system (NERSS) for HTR-PM.
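    The paper's DBN is built from a detailed HTR-PM system analysis and solved with loopy belief propagation; the toy sketch below only illustrates the underlying filtering idea on a two-state fault variable observed through a noisy binary sensor, using exact forward inference and entirely hypothetical probabilities.

```python
import numpy as np

# Toy discrete DBN: hidden plant state S_t in {normal, fault}, noisy binary sensor O_t.
# Exact forward filtering (the paper uses loopy belief propagation on a much larger network).
P_trans = np.array([[0.99, 0.01],    # P(S_t | S_{t-1} = normal): {normal, fault}
                    [0.00, 1.00]])   # a fault persists once it occurs (hypothetical)
P_obs = np.array([[0.95, 0.05],      # P(O_t | S_t = normal): {reads-normal, reads-alarm}
                  [0.20, 0.80]])     # P(O_t | S_t = fault)

belief = np.array([0.999, 0.001])    # prior at t = 0 (hypothetical)
observations = [0, 0, 1, 1, 1]       # sensor readings: 0 = normal, 1 = alarm

for t, o in enumerate(observations, start=1):
    belief = P_trans.T @ belief      # predict: propagate one time slice
    belief = belief * P_obs[:, o]    # update: weight by the sensor likelihood
    belief /= belief.sum()           # normalise
    print(f"t={t}  P(fault | O_1..O_t) = {belief[1]:.3f}")

# Prognosis: propagate the filtered belief forward with no further evidence.
future = belief.copy()
for k in range(1, 4):
    future = P_trans.T @ future
    print(f"t+{k} predicted P(fault) = {future[1]:.3f}")
```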

  9. Usefulness of high resolution coastal models for operational oil spill forecast: the "Full City" accident

    Science.gov (United States)

    Broström, G.; Carrasco, A.; Hole, L. R.; Dick, S.; Janssen, F.; Mattsson, J.; Berger, S.

    2011-11-01

    Oil spill modeling is considered to be an important part of a decision support system (DeSS) for oil spill combatment and is useful for remedial action in case of accidents, as well as for designing the environmental monitoring system that is frequently set up after major accidents. Many accidents take place in coastal areas, implying that low resolution basin scale ocean models are of limited use for predicting the trajectories of an oil spill. In this study, we target the oil spill in connection with the "Full City" accident on the Norwegian south coast and compare operational simulations from three different oil spill models for the area. The result of the analysis is that all models do a satisfactory job. The "standard" operational model for the area is shown to have severe flaws, but by applying ocean forcing data of higher resolution (1.5 km resolution), the model system shows results that compare well with observations. The study also shows that an ensemble of results from the three different models is useful when predicting/analyzing oil spill in coastal areas.

  10. Usefulness of high resolution coastal models for operational oil spill forecast: the "Full City" accident

    Directory of Open Access Journals (Sweden)

    G. Broström

    2011-11-01

    Full Text Available Oil spill modeling is considered to be an important part of a decision support system (DeSS) for oil spill combatment and is useful for remedial action in case of accidents, as well as for designing the environmental monitoring system that is frequently set up after major accidents. Many accidents take place in coastal areas, implying that low resolution basin scale ocean models are of limited use for predicting the trajectories of an oil spill. In this study, we target the oil spill in connection with the "Full City" accident on the Norwegian south coast and compare operational simulations from three different oil spill models for the area. The result of the analysis is that all models do a satisfactory job. The "standard" operational model for the area is shown to have severe flaws, but by applying ocean forcing data of higher resolution (1.5 km resolution), the model system shows results that compare well with observations. The study also shows that an ensemble of results from the three different models is useful when predicting/analyzing oil spill in coastal areas.

  11. A MELCOR model of Fukushima Daiichi Unit 3 accident

    Energy Technology Data Exchange (ETDEWEB)

    Sevón, Tuomo, E-mail: tuomo.sevon@vtt.fi

    2015-04-01

    Highlights: • A MELCOR model of the Fukushima Unit 3 accident was developed. • The MELCOR input file is published as electronic supplementary data with this paper. • Reactor pressure vessel lower head failed about 53 h after the earthquake. • 70% of fuel was discharged from reactor to containment. • 0.95% of cesium inventory was released to the environment. - Abstract: A MELCOR model of the Fukushima Daiichi Unit 3 accident was developed. The model is based on publicly available information, and the MELCOR input file is published as electronic supplementary data with this paper. According to the calculation, the reactor pressure vessel lower head failed about 53 h after the earthquake. At the end of the calculation, 30% of the fuel was still inside the reactor and 70% had been discharged to the containment. Almost all of the radioactive noble gases and 0.95% of the cesium inventory were released to the environment during the accident.

  12. Prediction of the reactor vessel water level using fuzzy neural networks in severe accident circumstance of NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Park, Soon Ho; Kim, Dae Seop; Kim, Jae Hwan; Na, Man Gyun [Dept. of Nuclear Engineering, Chosun University, Gwangju (Korea, Republic of)]

    2014-06-15

    Safety-related parameters are very important for confirming the status of a nuclear power plant. In particular, the reactor vessel water level has a direct bearing on plant safety because it confirms reactor core cooling. In this study, the reactor vessel water level under the conditions of a severe accident, where the water level could not be measured, was predicted using a fuzzy neural network (FNN). The prediction model was developed using training data, and validated using independent test data. The data were generated from simulations of the optimized power reactor 1000 (OPR1000) using the MAAP4 code. The informative data for training the FNN model were selected using the subtractive clustering method. The prediction performance for the reactor vessel water level was quite satisfactory, but a few large errors were occasionally observed. To check the effect of instrument errors, the prediction model was verified using data containing artificially added errors. The developed FNN model was sufficiently accurate to be used to predict the reactor vessel water level in severe accident situations where the integrity of the reactor vessel water level sensor is compromised. Furthermore, if the developed FNN model can be optimized using a variety of data, it should be possible to predict the reactor vessel water level precisely.
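    The paper's FNN is trained on MAAP4 simulations of the OPR1000 with rule centres chosen by subtractive clustering; none of that data is reproduced here. The sketch below only illustrates the generic structure of such a fuzzy model: Gaussian memberships around a few assumed rule centres, normalised firing strengths, and linear consequents fitted by least squares on synthetic signals.

```python
import numpy as np

# Generic Takagi-Sugeno-type fuzzy model sketch (NOT the paper's FNN or its MAAP4 data):
# Gaussian memberships around rule centres, normalised firing strengths,
# and linear rule consequents fitted by least squares.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(500, 2))                                   # two synthetic input signals
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] ** 2 + 0.05 * rng.normal(size=500)   # synthetic target

centres = np.array([[0.2, 0.2], [0.5, 0.5], [0.8, 0.8]])   # would come from subtractive clustering
sigma = 0.3                                                # membership width (hypothetical)

def firing_strengths(X):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))             # Gaussian membership per rule
    return w / w.sum(axis=1, keepdims=True)        # normalised firing strengths

# Design matrix: for each rule, firing-strength-weighted [1, x1, x2] regressors.
W = firing_strengths(X)
ones = np.ones((len(X), 1))
Phi = np.hstack([W[:, [k]] * np.hstack([ones, X]) for k in range(len(centres))])
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # fit the linear consequents

y_hat = Phi @ theta
print("RMSE on training data:", np.sqrt(np.mean((y - y_hat) ** 2)))
```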

  13. Quantifying safety benefit of winter road maintenance: accident frequency modeling.

    Science.gov (United States)

    Usman, Taimur; Fu, Liping; Miranda-Moreno, Luis F

    2010-11-01

    This research presents a modeling approach to investigate the association of accident frequency during a snow storm event with road surface conditions, visibility and other influencing factors, controlling for traffic exposure. The results have the potential to be applied in evaluating different maintenance strategies using safety as a performance measure. As part of this approach, this research introduces a road surface condition index as a surrogate for the commonly used friction measure to capture different road surface conditions. Data from various sources, such as weather, road condition observations, traffic counts and accidents, are integrated and used to test three event-based models: the negative binomial (NB) model, the generalized NB model and the zero-inflated NB model. These models are compared for their capability to explain differences in accident frequencies between individual snow storms. It was found that the generalized NB model best fits the data and is most capable of capturing heterogeneity other than excess zeros. Among the main results, the road surface condition index had a statistically significant influence on accident occurrence. This research is the first to show the empirical relationship between safety and road surface conditions at a disaggregate (event-based) level, making it feasible to quantify the safety benefits of alternative maintenance goals and methods.
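    The exact covariates and data behind the event-based models are not given here; the snippet below is only a hedged sketch of how a negative binomial accident-frequency model with a road surface condition index, visibility and a log-exposure term could be fitted with statsmodels, using hypothetical column names and synthetic data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Sketch of an event-based negative binomial accident-frequency model
# (hypothetical column names and synthetic data; not the authors' dataset or final specification).
rng = np.random.default_rng(1)
n = 300
rsc = rng.uniform(0, 1, n)            # road surface condition index (surrogate for friction)
vis = rng.uniform(0.1, 10, n)         # visibility [km]
exposure = rng.uniform(1e3, 5e4, n)   # vehicle-km during the storm event
mu = np.exp(-8.5 + 1.5 * rsc - 0.05 * vis + np.log(exposure))
df = pd.DataFrame({
    "accidents": rng.negative_binomial(n=2, p=2 / (2 + mu)),  # over-dispersed counts with mean mu
    "rsc_index": rsc,
    "visibility": vis,
    "exposure": exposure,
})

model = smf.negativebinomial(
    "accidents ~ rsc_index + visibility + np.log(exposure)", data=df
).fit()
print(model.summary())
print("incidence rate ratio for rsc_index:", np.exp(model.params["rsc_index"]))
```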

  14. Catastrophe model for the exposure to blood-borne pathogens and other accidents in health care settings.

    Science.gov (United States)

    Guastello, S J; Gershon, R R; Murphy, L R

    1999-11-01

    Catastrophe models, which describe and predict discontinuous changes in system state variables, were used to model the exposure to blood and bodily fluids and more conventional occupational accidents among 1708 health care workers. Workers at three hospitals completed a survey measuring HIV-relevant exposures (needlesticks, cuts, splashes, contact with open wounds), the accident rate for broadly-defined injuries, and several occupationally relevant themes: safety climate, shift work, depression symptoms, work pace, verbal abuse, and professional group membership. A cusp (cubic polynomial) model predicting HIV-relevant exposures specifically was more accurate (R2 = 0.56) than a comparable linear model containing the same variables (R2 = 0.07). Some of the foregoing variables predisposed workers to greater differences in HIV-relevant and general accident exposures: shiftwork, climate, depressive symptoms, and work pace. Other variables governed how close an individual was to a critical threshold where a harmful incident would take place: verbal abuse, professional group membership. Similarly, a cusp model for accident incidents predicted from HIV-relevant exposures and occupational variables was also more accurate (R2 = 0.75) than comparison models. Two variables predisposed the worker to a greater accident risk: depression symptoms and shift work. Four other variables predisposed the worker to lesser accident risk: job satisfaction, safety climate, environmental stressors, and work pace. Compliance with the universal precautions and HIV-related training were not relevant to either of the models.

  15. Radiative heat transfer modelling in a PWR severe accident sequence

    Energy Technology Data Exchange (ETDEWEB)

    Magali Zabiego; Florian Fichot [Institut de Radioprotection et de Surete Nucleaire - BP 3 - 13115 Saint-paul-Lez-Durance (France)]; Pablo Rubiolo [Westinghouse Science and Technology - 1344 Beulah Road - Pittsburgh - PA 15235 (United States)]

    2005-07-01

    Full text of publication follows: The present study is devoted to the estimation of the radiative heat transfers during a severe accident sequence in a Pressurized Water Reactor. In such a situation, the residual nuclear power released by the fuel rods cannot be removed and heats up the core. As a result, the cylindrical rods and the structures initially composing the core undergo a degradation process: swelling, breaking or melting of the rods and structures and eventual collapse to form a heap of fragments called a debris bed. As the solid matrix loses its original shape, the core geometry continuously evolves from standing, regularly-spaced cylinders to a non-homogeneous system including deformed remaining rods and structures and debris particles. To predict this type of sequence, the ICARE/CATHARE software [1] is developed by IRSN. Since the temperatures can reach values greater than 3000 K, it was of major interest to provide the code with an accurate radiative transfer model usable whatever the geometry of the system. Considering the size of a reactor core compared to the mean penetration length of radiation, the core can be seen as an optically thick medium. This observation led us to use the diffusion approximation to treat the radiation propagation. In this approach, the radiative flux is calculated in a way similar to thermal conduction: q_r = [K_e]·∇T, where [K_e] is the equivalent conductivity tensor of the system accounting for thermal and radiative transfer. A homogenization technique is applied to estimate the equivalent conductivity. Given the temperature level, the radiative contribution to the equivalent conductivity tensor quickly becomes dominant. This model was described earlier in [2], in which it was shown that an equivalent conductivity can be continuously calculated in the system when the geometry evolves from standing regular cylinder rods to swollen or broken ones, surrounded or not by a film of liquid materials, to

  16. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger

    OpenAIRE

    Moray, Neville; Groeger, John; Stanton, Neville

    2016-01-01

    This paper shows how to combine field observations, experimental data, and mathematical modeling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example we consider a major railway accident. In 1999 a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, "black box" data, and accident and engineering reports, to construct a case history of the accident. We show how t...

  17. Catastrophe model of the accident process, safety climate, and anxiety.

    Science.gov (United States)

    Guastello, Stephen J; Lynn, Mark

    2014-04-01

    This study aimed (a) to address the evidence for situational specificity in the connection between safety climate and occupational accidents, (b) to resolve similar issues between anxiety and accidents, (c) to expand and develop the concept of safety climate to include a wider range of organizational constructs, and (d) to assess a cusp catastrophe model for occupational accidents in which safety climate and anxiety are treated as bifurcation variables and environmental hazards as asymmetry variables. Bifurcation, or trigger, variables can have a positive or negative effect on outcomes, depending on the levels of the asymmetry, or background, variables. The participants were 1262 production employees of two steel manufacturing facilities who completed a survey that measured safety management, anxiety, subjective danger, dysregulation, stressors and hazards. Nonlinear regression analyses showed, for this industry, that the accident process was explained by a cusp catastrophe model in which safety management and anxiety were bifurcation variables, and hazards, age and experience were asymmetry variables. The accuracy of the cusp model (R2 = .72) exceeded that of the next best log-linear model (R2 = .08) composed from the same survey variables. The results are thought to generalize to any industry where serious injuries could occur, although situationally specific effects should be anticipated as well.
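    Guastello-style cusp models are commonly estimated as a polynomial regression on the change in the normalised state, with bifurcation variables entering through an interaction with the state and asymmetry variables entering additively. The sketch below fits that difference-equation form by least squares on synthetic data; it is not the paper's dataset or its exact specification.

```python
import numpy as np

# Sketch of a cusp catastrophe difference equation fitted by least squares:
#   dz = b0 + b1*z1**3 + b2*z1**2 + b3*B*z1 + b4*A
# where z1 is the normalised state at time 1, B a bifurcation variable (e.g. a safety
# management score) and A an asymmetry variable (e.g. hazard exposure).
# Synthetic data and hypothetical coefficients only.
rng = np.random.default_rng(2)
n = 1000
z1 = rng.normal(size=n)            # standardised accident criterion at time 1
B = rng.normal(size=n)             # bifurcation variable
A = rng.normal(size=n)             # asymmetry variable
dz = 0.2 - 0.8 * z1**3 + 0.1 * z1**2 + 0.6 * B * z1 + 0.4 * A + 0.3 * rng.normal(size=n)

X = np.column_stack([np.ones(n), z1**3, z1**2, B * z1, A])
beta, *_ = np.linalg.lstsq(X, dz, rcond=None)
resid = dz - X @ beta
r2 = 1 - resid.var() / dz.var()
print("coefficients:", np.round(beta, 3), " R^2 =", round(r2, 3))
```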

  18. Prediction of accidents at full green and green arrow traffic lights in Switzerland with the aid of configuration-specific features.

    Science.gov (United States)

    Hubacher, Markus; Allenbach, Roland

    2004-09-01

    This study endeavored to predict full green and green arrow accidents at traffic lights using configuration-specific features, applying the statistical method known as Poisson regression. A total of 45 sets of traffic lights (criteria: in an urban area, with four approach roads) with 178 approach roads were investigated (the data from two approach roads could not be used). Configuration-specific features were surveyed on all approach roads (characteristics of traffic lanes, road signs, traffic lights, etc.), traffic was monitored and accidents (full green and green arrow) were recorded over a period of 5 consecutive years. It was demonstrated that only between 23 and 34% of the variance could be explained by the models predicting the two types of accidents. For green arrow accidents, the approach road topography was found to be the major contributory factor: if the approach road slopes downwards, the risk of a green arrow accident is approximately five and a half times greater (relative risk, RR = 5.56) than on a level or upward sloping approach road. For full green accidents, obstructed vision plays the major role: where vision can be obstructed by vehicles turning off, the accident risk is eight times greater (RR = 8.08) than where no comparable obstruction of vision is possible. From the study it emerges that technical features of traffic lights are not able to control a driver's actions in such a way as to eradicate error. Other factors, in particular the personal characteristics of the driver (age, sex, etc.) and accident circumstances (lighting, road conditions, etc.), are likely to make an important contribution to explaining how an accident occurs.
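    In a Poisson regression of this kind, the relative risk attached to a binary configuration feature is simply the exponentiated coefficient. The sketch below shows the mechanics with statsmodels on synthetic data and hypothetical variable names; it is not the Swiss dataset used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Sketch of a Poisson regression for approach-road accident counts with a traffic offset.
# Column names, data and coefficient values are hypothetical.
rng = np.random.default_rng(3)
n = 178
downhill = rng.integers(0, 2, n)           # approach road slopes downwards (0/1)
obstructed = rng.integers(0, 2, n)         # vision obstructed by turning vehicles (0/1)
traffic = rng.uniform(1e5, 1e6, n)         # vehicles over the observation period
lam_true = np.exp(-13.5 + np.log(traffic) + 1.0 * downhill + 1.5 * obstructed)
df = pd.DataFrame({"accidents": rng.poisson(lam_true),
                   "downhill": downhill,
                   "obstructed_vision": obstructed,
                   "traffic": traffic})

fit = smf.glm(
    "accidents ~ downhill + obstructed_vision",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["traffic"]),
).fit()

# Relative risk of a binary feature = exp(coefficient)
print(np.exp(fit.params[["downhill", "obstructed_vision"]]))
```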

  19. Consequences in Norway after a hypothetical accident at Sellafield - Predicted impacts on the environment.

    Energy Technology Data Exchange (ETDEWEB)

    Thoerring, H.; Liland, A.

    2010-12-15

    This report deals with the environmental consequences in Norway of a hypothetical accident at Sellafield. The investigation is limited to the terrestrial environment and focuses on animals grazing natural pastures, plus wild berries and fungi. Only 137Cs is considered. The predicted consequences are severe, in particular for mutton and goat milk production. (Author)

  20. A Bayesian ensemble of sensitivity measures for severe accident modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of)]; Di Maio, Francesco, E-mail: francesco.dimaio@polimi.it [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy)]; Vagnoli, Matteo [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy)]; Zio, Enrico [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Chair on System Science and Energetic Challenge, Fondation EDF – Electricite de France Ecole Centrale, Paris, and Supelec, Paris (France)]; Pourgol-Mohammad, Mohammad [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of)]

    2015-12-15

    Highlights: • We propose a sensitivity analysis (SA) method based on a Bayesian updating scheme. • The Bayesian updating scheme updates an ensemble of sensitivity measures. • Bootstrap replicates of a severe accident code output are fed to the Bayesian scheme. • The MELCOR code simulates the fission product release of the LOFT LP-FP-2 experiment. • Results are compared with those of traditional SA methods. - Abstract: In this work, a sensitivity analysis framework is presented to identify the relevant input variables of a severe accident code, based on an incremental Bayesian ensemble updating method. The proposed methodology entails: (i) the propagation of the uncertainty in the input variables through the severe accident code; (ii) the collection of bootstrap replicates of the input and output of a limited number of simulations for building a set of finite mixture models (FMMs) that approximate the probability density function (pdf) of the severe accident code output of the replicates; (iii) for each FMM, the calculation of an ensemble of sensitivity measures (i.e., input saliency, Hellinger distance and Kullback–Leibler divergence), updated when a new piece of evidence arrives by a Bayesian scheme based on the Bradley–Terry model, for ranking the most relevant input model variables. An application is given with respect to a limited number of simulations of a MELCOR severe accident model describing the fission product release in the LP-FP-2 experiment of the loss of fluid test (LOFT) facility, which is a scaled-down facility of a pressurized water reactor (PWR).
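    The full pipeline above (finite mixture models, Bayesian updating, Bradley-Terry ranking) is not reproduced here; the sketch below only shows how two of the named sensitivity measures, the Hellinger distance and the Kullback-Leibler divergence, can be computed between histogram estimates of a code output obtained with an input fixed at its low versus its high range, using synthetic samples.

```python
import numpy as np
from scipy.stats import entropy

# Sketch: distributional sensitivity of a code output to one input, measured by the
# Hellinger distance and the KL divergence between histogram estimates of the output pdf.
# Synthetic samples stand in for the severe accident code output.
rng = np.random.default_rng(4)
out_low = rng.normal(1.0, 0.2, 2000)     # code output with input X fixed low (synthetic)
out_high = rng.normal(1.3, 0.25, 2000)   # code output with input X fixed high (synthetic)

bins = np.histogram_bin_edges(np.concatenate([out_low, out_high]), bins=40)
p, _ = np.histogram(out_low, bins=bins, density=True)
q, _ = np.histogram(out_high, bins=bins, density=True)
w = np.diff(bins)
p, q = p * w + 1e-12, q * w + 1e-12      # discrete pmfs, small floor to avoid log(0)

hellinger = np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))
kl = entropy(p, q)                        # KL(p || q)
print(f"Hellinger = {hellinger:.3f}, KL divergence = {kl:.3f}")
```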

  1. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache – CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  2. Accident sequence precursor analysis level 2/3 model development

    Energy Technology Data Exchange (ETDEWEB)

    Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States)]; Galyean, W.J.; Brownson, D.A. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1997-02-01

    The US Nuclear Regulatory Commission's Accident Sequence Precursor (ASP) program currently uses simple Level 1 models to assess the conditional core damage probability for operational events occurring in commercial nuclear power plants (NPP). Since not all accident sequences leading to core damage will result in the same radiological consequences, it is necessary to develop simple Level 2/3 models that can be used to analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude of the resulting radioactive releases to the environment, and calculate the consequences associated with these releases. The simple Level 2/3 model development work was initiated in 1995, and several prototype models have been completed. Once developed, these simple Level 2/3 models are linked to the simple Level 1 models to provide risk perspectives for operational events. This paper describes the methods implemented for the development of these simple Level 2/3 ASP models, and the linkage process to the existing Level 1 models.

  3. Advanced accident sequence precursor analysis level 2 models

    Energy Technology Data Exchange (ETDEWEB)

    Galyean, W.J.; Brownson, D.A.; Rempe, J.L. [and others]

    1996-03-01

    The U.S. Nuclear Regulatory Commission Accident Sequence Precursor program pursues the ultimate objective of performing risk significant evaluations on operational events (precursors) occurring in commercial nuclear power plants. To achieve this objective, the Office of Nuclear Regulatory Research is supporting the development of simple probabilistic risk assessment models for all commercial nuclear power plants (NPP) in the U.S. Presently, only simple Level 1 plant models have been developed which estimate core damage frequencies. In order to provide a true risk perspective, the consequences associated with postulated core damage accidents also need to be considered. With the objective of performing risk evaluations in an integrated and consistent manner, a linked event tree approach which propagates the front end results to back end was developed. This approach utilizes simple plant models that analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude and timing of a radioactive release to the environment, and calculate the consequences for a given release. Detailed models and results from previous studies, such as the NUREG-1150 study, are used to quantify these simple models. These simple models are then linked to the existing Level 1 models, and are evaluated using the SAPHIRE code. To demonstrate the approach, prototypic models have been developed for a boiling water reactor, Peach Bottom, and a pressurized water reactor, Zion.

  4. BRAIN INJURY BIOMECHANICS IN REAL WORLD VEHICLE ACCIDENT USING MATHEMATICAL MODELS

    Institute of Scientific and Technical Information of China (English)

    YANG Jikuang; XU Wei; OTTE Dietmar

    2008-01-01

    This paper aims at investigating brain injury mechanisms and predicting head injuries in real world accidents. For this purpose, a 3D human head finite element model (HBM-head) was developed based on head-brain anatomy. The HBM-head model was validated with two experimental tests. Then the head finite element (FE) model and a multi-body system (MBS) model were used to carry out reconstructions of real world vehicle-pedestrian accidents and brain injuries. The MBS models were used for calculating the head impact conditions in vehicle impacts. The HBM-head model was used for calculating the injury related physical parameters, such as intracranial pressure, stress, and strain. The calculated intracranial pressure and strain distribution were correlated with the injury outcomes observed from accidents. It is shown that this model can predict the intracranial biomechanical response and calculate the injury related physical parameters. The head FE model has good biofidelity and will be a valuable tool for the study of injury mechanisms and the tolerance level of the brain.

  5. An application of probabilistic safety assessment methods to model aircraft systems and accidents

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.

    1998-08-01

    A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of Probabilistic Safety Assessment methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS also are discussed.
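    The sketch below is a toy illustration of the PSA building blocks the study combines: independent-event AND/OR gates giving a fault-tree top-event probability, which then feeds the branch points of a small event tree. The events, gate structure and numbers are invented and are not the thrust reverser system model of the paper.

```python
# Toy PSA sketch: a fault tree gives the frequency of the initiating event, and an
# event tree propagates it through two branch points. All events, gate structure and
# numbers are invented; this is not the thrust-reverser-system model from the paper.

def p_or(*ps):
    """Probability of the OR of independent events."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """Probability of the AND of independent events."""
    q = 1.0
    for p in ps:
        q *= p
    return q

# Fault tree for the initiating event "in-flight deploy" (hypothetical basic events):
p_deploy = p_and(p_or(1e-4, 5e-5),    # spurious deploy command (sensor OR valve fault)
                 2e-4)                # AND failure of the mechanical lock
f_initiator = p_deploy * 1.0          # per flight hour, assuming one demand per hour (hypothetical)

# Event tree branches following the initiating event (hypothetical split fractions):
p_no_restow = 0.5                     # crew/system fails to restow the reverser
p_loss_of_control = 0.2               # aircraft control is not recovered
sequences = {
    "deploy -> restowed": f_initiator * (1 - p_no_restow),
    "deploy -> not restowed -> controlled": f_initiator * p_no_restow * (1 - p_loss_of_control),
    "deploy -> not restowed -> loss of control": f_initiator * p_no_restow * p_loss_of_control,
}
for seq, freq in sequences.items():
    print(f"{seq:45s} {freq:.2e} /flight-hour")
```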

  6. System analysis with improved thermo-mechanical fuel rod models for modeling current and advanced LWR materials in accident scenarios

    Science.gov (United States)

    Porter, Ian Edward

    A nuclear reactor systems code has the ability to model the system response in an accident scenario based on known initial conditions at the onset of the transient. However, these codes have tended to lack the detailed thermo-mechanical fuel rod response models needed for accurate prediction of fuel rod failure. The proposed work will couple today's most widely used steady-state (FRAPCON) and transient (FRAPTRAN) fuel rod models with the systems code TRACE for best-estimate modeling of system response in accident scenarios such as a loss of coolant accident (LOCA). In doing so, code modifications will be made to model gamma heating in LWRs during steady-state and accident conditions and to improve fuel rod thermal/mechanical analysis by allowing axial nodalization of burnup-dependent phenomena such as swelling, cladding creep and oxidation. With the ability to model both burnup-dependent parameters and transient fuel rod response, a fuel dispersal study will be conducted using a hypothetical accident scenario under both PWR and BWR conditions to determine the amount of fuel dispersed under varying conditions. Because both the fuel fragmentation size and the internal rod pressure depend on burnup, this analysis will be conducted at the beginning, middle and end of cycle to examine the effect that cycle time can have on fuel rod failure and dispersal. Current fuel rod and system codes used by the Nuclear Regulatory Commission (NRC) are compilations of legacy codes with only commonly used light water reactor materials: uranium dioxide (UO2), mixed oxide (U/PuO2) and zirconium alloys. However, the events at Fukushima Daiichi and the Three Mile Island accident have shown the need to explore advanced materials possessing improved accident tolerance. This work looks to further modify the NRC codes to include silicon carbide (SiC), an advanced cladding material proposed by current DOE-funded research on accident tolerant fuels (ATF). Several

  7. The Application of Data Mining Technology to Build a Forecasting Model for Classification of Road Traffic Accidents

    Directory of Open Access Journals (Sweden)

    Yau-Ren Shiau

    2015-01-01

    Full Text Available With the ever-increasing number of vehicles on the road, traffic accidents have also increased, resulting in the loss of lives and property, as well as immeasurable social costs. The environment, time, and region influence the occurrence of traffic accidents. Life and property losses are expected to be reduced by improving traffic engineering, education, and the administration of law and advocacy. This study observed 2,471 traffic accidents which occurred in central Taiwan from January to December 2011 and used Recursive Feature Elimination (RFE) for feature selection to screen the important factors affecting traffic accidents. It then established models to analyze traffic accidents with various methods, such as Fuzzy Robust Principal Component Analysis (FRPCA), Backpropagation Neural Network (BPNN), and Logistic Regression (LR). The proposed model aims to probe into the environments of traffic accidents, as well as the relationships between the variables of road designs, rule-violation items, and accident types. The results showed that the accuracy rates of the classifiers FRPCA-BPNN (85.89%) and FRPCA-LR (85.14%) are higher than those of BPNN (84.37%) and LR (85.06%) by 1.52% and 0.08%, respectively. Moreover, the performance of FRPCA-BPNN and FRPCA-LR in classification prediction is better than that of BPNN and LR.
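    A hedged sketch of the RFE-plus-classifier workflow with scikit-learn follows. FRPCA is not a stock scikit-learn component, so an ordinary PCA stands in for illustration, and the data are synthetic rather than the Taiwanese accident records.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Sketch of the RFE -> dimension reduction -> classifier workflow on synthetic data.
# Ordinary PCA stands in for FRPCA; the accident dataset itself is not public.
X, y = make_classification(n_samples=2471, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Recursive Feature Elimination to keep the most important accident factors.
selector = RFE(LogisticRegression(max_iter=1000), n_features_to_select=8).fit(X_tr, y_tr)
X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

models = {
    "PCA + LR": make_pipeline(StandardScaler(), PCA(n_components=5),
                              LogisticRegression(max_iter=1000)),
    "PCA + BPNN": make_pipeline(StandardScaler(), PCA(n_components=5),
                                MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                              random_state=0)),
}
for name, clf in models.items():
    clf.fit(X_tr_sel, y_tr)
    print(f"{name}: accuracy = {clf.score(X_te_sel, y_te):.3f}")
```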

  8. WHEN MODEL MEETS REALITY – A REVIEW OF SPAR LEVEL 2 MODEL AGAINST FUKUSHIMA ACCIDENT

    Energy Technology Data Exchange (ETDEWEB)

    Zhegang Ma

    2013-09-01

    The Standardized Plant Analysis Risk (SPAR) models are a set of probabilistic risk assessment (PRA) models used by the Nuclear Regulatory Commission (NRC) to evaluate the risk of operations at U.S. nuclear power plants and provide inputs to the risk-informed regulatory process. A small number of SPAR Level 2 models have been developed, mostly for feasibility study purposes. They extend the Level 1 models to include containment systems, group plant damage states, and model containment phenomenology and accident progression in containment event trees. A severe earthquake and tsunami hit the eastern coast of Japan in March 2011 and caused significant damage to the reactors at the Fukushima Daiichi site. Station blackout (SBO), core damage, containment damage, hydrogen explosion, and intensive radioactivity release, which had previously been analyzed and assumed as postulated accident progressions in PRA models, occurred to varying degrees at the multi-unit Fukushima Daiichi site. This paper reviews and compares a typical BWR SPAR Level 2 model with the “real” accident progressions and sequences that occurred in Fukushima Daiichi Units 1, 2, and 3. It shows that the SPAR Level 2 model is a robust PRA model that can very reasonably describe the accident progression for a real and complicated nuclear accident. On the other hand, the comparison shows that the SPAR model could be enhanced by incorporating some accident characteristics for better representation of severe accident progression.

  9. Dynamic modelling of radionuclide uptake by marine biota: application to the Fukushima nuclear power plant accident.

    Science.gov (United States)

    Vives i Batlle, Jordi

    2016-01-01

    The dynamic model D-DAT was developed to study the dynamics of radionuclide uptake and turnover in biota and sediments in the immediate aftermath of the Fukushima accident. These dynamics are determined by the interplay between the residence time of radionuclides in seawater/sediments and the biological half-lives of elimination by the biota. The model calculates time-variable activity concentrations of (131)I, (134)Cs, (137)Cs and (90)Sr in seabed sediment, fish, crustaceans, molluscs and macroalgae from the surrounding activity concentrations in seawater, from which internal and external dose rates are derived. A central element of the model is the inclusion of dynamic transfer of radionuclides to/from sediments by factorising the depletion of radionuclides adsorbed onto suspended particulates, molecular diffusion, pore water mixing and bioturbation, represented by a simple set of differential equations coupled with the biological uptake/turnover processes. In this way, the model is capable of reproducing activity concentrations in sediment more realistically. The model was used to assess the radiological impact of the Fukushima accident on marine biota in the acute phase of the accident. Sediment and biota activity concentrations are within the wide range of actual monitoring data. Activity concentrations in marine biota are thus shown to be better calculated by a dynamic model than with the simpler equilibrium approach based on concentration factors, which tends to overestimate for the acute accident period. Modelled dose rates from external exposure from sediment are also significantly below equilibrium predictions. The model calculations confirm previous studies showing that radioactivity levels in marine biota have been generally below the levels necessary to cause a measurable effect on populations. The model was used in mass-balance mode to calculate total integrated releases of 103, 30 and 3 PBq for (131)I, (137)Cs and (90)Sr, reasonably in line with previous
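    Stripped of the sediment exchange terms, the biota part of such dynamic transfer models reduces to a first-order uptake/elimination equation driven by the seawater concentration. The sketch below is not D-DAT; it solves only that core equation with SciPy for a hypothetical fish compartment with hypothetical rate constants.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Core first-order biokinetic equation used by dynamic transfer models (not D-DAT itself):
#   dC_b/dt = k_u * C_w(t) - (k_e + lambda) * C_b
# with an exponentially declining seawater concentration. All parameter values hypothetical.
k_u = 5.0                              # uptake rate [L kg^-1 d^-1]
t_half_bio = 50.0                      # biological half-life of elimination [d]
k_e = np.log(2) / t_half_bio
lam = np.log(2) / (30.17 * 365.25)     # Cs-137 physical decay constant [d^-1]

def C_w(t):                            # seawater activity [Bq/L], declining with a 20-day half-time
    return 100.0 * np.exp(-np.log(2) / 20.0 * t)

def rhs(t, C_b):
    return k_u * C_w(t) - (k_e + lam) * C_b

sol = solve_ivp(rhs, (0.0, 365.0), y0=[0.0], dense_output=True, max_step=1.0)
for day in (10, 30, 90, 365):
    print(f"day {day:3d}: fish activity concentration = {sol.sol(day)[0]:7.1f} Bq/kg")

# At equilibrium this reduces to the concentration-factor approach, CF = k_u / (k_e + lambda),
# which is why the static approach overestimates during the acute phase of a declining C_w.
```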

  10. A Bayesian hierarchical model for accident and injury surveillance.

    Science.gov (United States)

    MacNab, Ying C

    2003-01-01

    This article presents a recent study which applies Bayesian hierarchical methodology to model and analyse accident and injury surveillance data. A hierarchical Poisson random effects spatio-temporal model is introduced, and an analysis of inter-regional variations and regional trends in hospitalisations due to motor vehicle accident injuries to boys aged 0-24 in the province of British Columbia, Canada, is presented. The objective of this article is to illustrate how the modelling technique can be implemented as part of an accident and injury surveillance and prevention system, where transportation and/or health authorities may routinely examine accidents, injuries, and hospitalisations to target high-risk regions for prevention programs, to evaluate prevention strategies, and to assist in health planning and resource allocation. The innovation of the methodology is its ability to uncover and highlight important underlying structure in the data. Between 1987 and 1996, the British Columbia hospital separation registry recorded 10,599 hospitalisations related to motor vehicle traffic injuries among boys aged 0-24 who resided in British Columbia, of which the majority (89%) occurred to boys aged 15-24. The injuries were aggregated by three age groups (0-4, 5-14, and 15-24), 20 health regions (based on place of residence), and 10 calendar years (1987 to 1996), and the corresponding mid-year population estimates were used as the 'at risk' population. An empirical Bayes inference technique using penalised quasi-likelihood estimation was implemented to model both rates and counts, with spline smoothing accommodating non-linear temporal effects. The results show that (a) crude rates and ratios at the health region level are unstable, (b) the models with spline smoothing enable us to explore possible shapes of injury trends at both the provincial level and the regional level, and (c) the fitted models provide a wealth of information about the patterns (both over space and time
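    As a hedged illustration of the model class (not the penalised quasi-likelihood estimation or spline smoothing used in the paper), the sketch below fits a Bayesian hierarchical Poisson regression with region random effects and a log-population offset to synthetic data with PyMC.

```python
import numpy as np
import pymc as pm

# Sketch of a Bayesian hierarchical Poisson model with region random effects and a
# log-population offset, on synthetic data (the paper instead uses penalised
# quasi-likelihood with spline-smoothed temporal effects).
rng = np.random.default_rng(5)
n_regions, n_years = 20, 10
region = np.repeat(np.arange(n_regions), n_years)
year = np.tile(np.arange(n_years), n_regions)
pop = rng.uniform(5e3, 5e4, size=n_regions)[region]
true_u = rng.normal(0, 0.3, n_regions)
counts = rng.poisson(np.exp(-6.0 + true_u[region] - 0.03 * year) * pop)

with pm.Model():
    intercept = pm.Normal("intercept", 0.0, 5.0)
    beta_year = pm.Normal("beta_year", 0.0, 1.0)
    sigma_u = pm.HalfNormal("sigma_u", 1.0)
    u = pm.Normal("u", 0.0, sigma_u, shape=n_regions)   # region random effects
    log_mu = intercept + u[region] + beta_year * year + np.log(pop)
    pm.Poisson("y", mu=pm.math.exp(log_mu), observed=counts)
    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9, random_seed=5)

print("posterior mean yearly trend:", idata.posterior["beta_year"].mean().item())
```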

  11. Thermal-hydraulic modeling of reactivity accidents in MTR reactors

    Directory of Open Access Journals (Sweden)

    Khater Hany

    2006-01-01

    Full Text Available This paper describes the development of a dynamic model for the thermal-hydraulic analysis of MTR research reactors during a reactivity insertion accident. The model is formulated by coupling reactor kinetics with feedback reactivity and reactor core thermal-hydraulics. To represent the reactor core, two types of channels are considered: average and hot channels. The developed computer program is compiled and executed on a personal computer, using the FORTRAN language. The model is validated against safety-related benchmark calculations for the IAEA 10 MW generic MTR-type reactor for both slow and fast reactivity insertion transients. Good agreement is shown between the present model and the benchmark calculations. The model is then used to simulate the uncontrolled withdrawal of a control rod of the ETRR-2 reactor in a transient with an overpower scram trip. The model results for ETRR-2 are analyzed and discussed.
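    The kinetics/thermal-hydraulics coupling described above can be illustrated, in a much reduced form, by point kinetics with one delayed-neutron group and a lumped temperature feedback. The sketch below is not the paper's two-channel model, and all parameter values are hypothetical rather than the IAEA 10 MW benchmark data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal point kinetics (one delayed-neutron group) with a lumped temperature feedback,
# illustrating the kinetics/thermal-hydraulics coupling. All parameter values hypothetical.
beta, Lambda, lam = 7.0e-3, 5.0e-5, 0.08   # delayed fraction, generation time [s], precursor decay [1/s]
alpha_T = -1.0e-4                          # temperature feedback coefficient [dk/k per K]
mC, h = 5.0e5, 2.0e4                       # lumped heat capacity [J/K], heat removal [W/K]
P0 = 1.0e6                                 # initial power [W]
T_sink = 40.0                              # heat sink temperature [degC]
T_eq = T_sink + P0 / h                     # steady-state lumped temperature
ramp = 1.0e-4                              # external reactivity ramp [dk/k per s], stops at 5 s

def rhs(t, y):
    P, C, T = y
    rho = ramp * min(t, 5.0) + alpha_T * (T - T_eq)   # external ramp + temperature feedback
    dP = (rho - beta) / Lambda * P + lam * C
    dC = beta / Lambda * P - lam * C
    dT = (P - h * (T - T_sink)) / mC
    return [dP, dC, dT]

y0 = [P0, beta * P0 / (lam * Lambda), T_eq]           # start from equilibrium
sol = solve_ivp(rhs, (0.0, 30.0), y0, max_step=0.01)
print(f"peak power = {sol.y[0].max() / P0:.2f} x nominal at t = {sol.t[sol.y[0].argmax()]:.1f} s")
```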

  12. Development of a post accident analysis model for KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Chang, W. P.; Ha, G. S.; Jeong, H. Y.; Kwon, Y. M.; Heo, S.; Lee, Y. B. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    2005-07-01

    The ultimate safety measure of the KALIMER depends on its inherent safety, which requires the core to maintain negative reactivity feedback throughout any accident. In order to secure the integrity of a fuel rod, the void reactivity feedback under sodium boiling must be analyzed. Even though the KALIMER design might not allow boiling under any circumstance, sodium boiling would be possible under HCDA (Hypothetical Core Disruptive Accident) initiating events, which are represented by UTOP (Unprotected Transient Over Power), ULOF (Unprotected Loss Of Flow), ULOHS (Unprotected Loss Of Heat Sink), or a sudden flow channel blockage, due to the power excursion caused by the reactivity feedback. The slug and annular flow regimes tend to prevail for the boiling of a liquid-metal coolant such as sodium near atmospheric pressure. In contrast, bubbly flow is typical under the high pressure in light water reactors. This phenomenological difference motivated the development of the present model, especially at the onset of boiling. A few models have been developed for sodium boiling analysis. Models such as those in the HOMSEP-2 and SAS series are classified as relatively detailed; both are usually called multiple-bubble slug ejection models. Some simpler models have also been introduced to avoid either the parameter sensitivities or the mathematical complexity associated with those rigorous models. The present model is based on the multiple-bubble slug ejection model. It allows a finite number (N) of bubbles, separated by liquid slugs, in a channel. Boiling occurs at a user-specified superheat, and the generated vapor is modeled to fill the whole cross section of the coolant channel except for a static liquid film left on the cladding and/or structure surfaces. The model also assumes a vapor with one uniform pressure. The present analysis is focused on the behavior of early sodium boiling after a ULOHS.

  13. Advanced accident sequence precursor analysis level 1 models

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O. [Idaho National Engineering Lab., Idaho National Lab., Idaho Falls, ID (United States)]

    1996-03-01

    INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for the selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns the development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.

  14. Development of two-dimensional hot pool model and analysis of the ULOHS accident in KALIMER design

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Jeong, K. S.; Hahn, H. D

    2000-10-01

    In the new version of the HP2D program, a model for the variation of the hot pool sodium level is added, so that the temperature and velocity profiles can be predicted more accurately than with the old version. To verify and validate the new version of the model, a comparison of the MONJU experimental data with the predicted results is performed and analyzed. In addition, an analysis of the ULOHS (Unprotected Loss of Heat Sink) accident in the KALIMER design is performed.

  15. Global atmospheric dispersion modelling after the Fukushima accident

    Energy Technology Data Exchange (ETDEWEB)

    Suh, K.S.; Youm, M.K.; Lee, B.G.; Min, B.I. [Korea Atomic Energy Research Institute (Korea, Republic of)]; Raul, P. [Universidad de Sevilla (Spain)]

    2014-07-01

    A large amount of radioactive material was released to the atmosphere due to the Fukushima nuclear accident in March 2011. The radioactive materials released into the atmosphere were mostly transported to the Pacific Ocean, but some were deposited on the ground surface by dry and wet deposition in the area northwest of the Fukushima nuclear site. Therefore, the northwest part of the site was seriously contaminated and was designated as a restricted zone within a radius of 20-30 km around the Fukushima nuclear site. In the early phase of the accident, from 11 March to 30 March, the radioactive materials were dispersed inland and offshore of the nuclear site by variations in the wind. After the Fukushima accident, the radionuclides were detected through air monitoring in many places over the world. The radioactive plume was transported to the east of the site by the westerly jet stream. It was detected in North America during March 17-21, in European countries during March 23-24, and in Asia from March 24 to April 6, 2011. The radioactive materials were detected across the northern hemisphere within 15-20 days after the accident. A three-dimensional numerical model was applied to evaluate the dispersion characteristics of the radionuclides released into the air. Simulated results were compared with measurements in many places over the world. The comparisons showed good agreement in some places, but small differences in other locations. The differences between the calculations and the measurements are due to the meteorological data and the relatively coarse resolution of the model. Some radioactive materials were measured in the Philippines, Taiwan, Hong Kong and South Korea during March 23-28. It is inferred that this material was transported directly from Fukushima by the northeasterly monsoon winds. This event was well represented in the numerical model. Generally, the simulations had a good

  16. Risk forecasting and evaluating model of Environmental pollution accident

    Institute of Scientific and Technical Information of China (English)

    ZENG Wei-hua; CHENG Sheng-tong

    2005-01-01

    Environmental risk (ER) factors originate from ER sources and are controlled by the primary control mechanism (PCM) of environmental risk. When the PCM can no longer work regularly, either through its own failures or through the effects of external environmental risk trigger mechanisms, the ER factors are released into the environmental space and an ER field is formed. The formation of an ER field does not mean that an environmental pollution accident (EPA) will necessarily break out; only when ER receptors are exposed in the ER field and seriously damaged does the potential ER turn into an actual EPA. Investigating the general laws governing the evolution from environmental risk to EPA, this paper brings forward a corresponding conceptual model for risk forecasting and evaluation of EPAs. This model provides scientific methods for risk evaluation, prevention and emergency response to EPAs. It not only enriches and develops the theory of environmental safety and emergency response, but also serves as guidance for public safety, enterprise safety management and accident emergency response.

  17. Proposed Method for Estimating Traffic Accident Risk Factors Based on Object Tracking and Behavior Prediction Using Particle Filtering

    Science.gov (United States)

    Natori, Youichi; Kawamoto, Kazuhiko; Takahashi, Hiroshi; Hirota, Kaoru

    A traffic accident prediction method using a priori knowledge based on accident data is proposed for safe driving support. Implementation is achieved by an algorithm using particle filtering and fuzzy inference to estimate accident risk factors. With this method, the distance between the host vehicle and a vehicle ahead and their relative velocity and relative acceleration are obtained from the results of particle filtering of driving data and are used as attributes to build the relative driving state space. The attributes are evaluated as likelihoods and then consolidated as a risk level using fuzzy inference. Experimental validation was done using videos of general driving situations obtained with an on-vehicle CCD camera and one simulated accident situation created based on the video data. The results show that high risk levels were calculated with the proposed method in the early stages of the accident situations.
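    The sketch below covers only the particle-filtering stage of such an approach: tracking the relative distance and velocity of a lead vehicle from noisy range measurements with a constant-velocity motion model, and reducing the fuzzy risk consolidation to a simple time-to-collision check. All numbers are hypothetical.

```python
import numpy as np

# Particle-filter sketch: track relative distance/velocity of a lead vehicle from noisy
# range measurements, then flag risk from the estimated time-to-collision (TTC).
# The paper additionally uses fuzzy inference to consolidate several attributes.
rng = np.random.default_rng(6)
dt, n_steps, n_particles = 0.1, 100, 500
true_d, true_v = 30.0, -2.0                  # closing at 2 m/s (hypothetical scenario)

particles = np.column_stack([rng.normal(30, 5, n_particles),    # distance [m]
                             rng.normal(0, 2, n_particles)])    # relative velocity [m/s]

for _ in range(n_steps):
    true_d += true_v * dt
    z = true_d + rng.normal(0, 0.5)                             # noisy range measurement
    # predict: constant-velocity motion with process noise
    particles[:, 0] += particles[:, 1] * dt + rng.normal(0, 0.05, n_particles)
    particles[:, 1] += rng.normal(0, 0.1, n_particles)
    # update: weight by measurement likelihood and resample
    weights = np.exp(-0.5 * ((z - particles[:, 0]) / 0.5) ** 2)
    weights /= weights.sum()
    particles = particles[rng.choice(n_particles, n_particles, p=weights)]

d_est, v_est = particles[:, 0].mean(), particles[:, 1].mean()
ttc = d_est / -v_est if v_est < 0 else np.inf                   # time to collision [s]
print(f"estimated distance {d_est:.1f} m, relative velocity {v_est:.2f} m/s, TTC {ttc:.1f} s")
print("risk level:", "HIGH" if ttc < 5 else "low")
```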

  18. ATMOSPHERIC MODELING IN SUPPORT OF A ROADWAY ACCIDENT

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, R.; Hunter, C.

    2010-10-21

    The United States Forest Service-Savannah River (USFS) routinely performs prescribed fires at the Savannah River Site (SRS), a Department of Energy (DOE) facility located in southwest South Carolina. This facility covers approximately 800 square kilometers and is mainly wooded except for scattered industrial areas containing facilities used in managing nuclear materials for national defense and waste processing. Prescribed fires of forest undergrowth are necessary to reduce the risk of inadvertent wild fires which have the potential to destroy large areas and threaten nuclear facility operations. This paper discusses meteorological observations and numerical model simulations from a period in early 2002 of an incident involving an early-morning multicar accident caused by poor visibility along a major roadway on the northern border of the SRS. At the time of the accident, it was not clear whether the limited visibility was due solely to fog or whether smoke from a prescribed burn conducted the previous day just to the northwest of the crash site had contributed to it. Through the use of available meteorological information and detailed modeling, it was determined that the primary reason for the low visibility on this night was fog induced by meteorological conditions.

  19. Analysis of traffic accident size for Korean highway using structural equation models.

    Science.gov (United States)

    Lee, Ju-Yeon; Chung, Jin-Hyuk; Son, Bongsoo

    2008-11-01

    Accident size can be expressed as the number of involved vehicles, the number of damaged vehicles, the number of deaths and/or the number of injured. Accident size is one of the important indices for measuring the level of safety of transportation facilities. Factors such as road geometric conditions, driver characteristics and vehicle type may be related to traffic accident size. However, all these factors interact in complicated ways, so that the interrelationships among the variables are not easily identified. A structural equation model is adopted to capture the complex relationships among variables, because the model can handle complex relationships among endogenous and exogenous variables simultaneously and, furthermore, can include latent variables. In this study, we use 2649 accident records from highways in Korea and estimate the relationships among exogenous factors and traffic accident size. The model suggests that road factors, driver factors and environment factors are strongly related to accident size.

  20. Development and qualification of a thermal-hydraulic nodalization for modeling station blackout accident in PSB-VVER test facility

    Energy Technology Data Exchange (ETDEWEB)

    Saghafi, Mahdi [Department of Energy Engineering, Sharif University of Technology, Azadi Avenue, Tehran (Iran, Islamic Republic of)]; Ghofrani, Mohammad Bagher, E-mail: ghofrani@sharif.edu [Department of Energy Engineering, Sharif University of Technology, Azadi Avenue, Tehran (Iran, Islamic Republic of)]; D’Auria, Francesco [San Piero a Grado Nuclear Research Group (GRNSPG), University of Pisa, Via Livornese 1291, San Piero a Grado, Pisa (Italy)]

    2016-07-15

    Highlights: • A thermal-hydraulic nodalization for the PSB-VVER test facility has been developed. • The station blackout accident is modeled with the developed nodalization in the MELCOR code. • The developed nodalization is qualified at both the steady-state and transient levels. • MELCOR predictions are qualitatively and quantitatively in the acceptable range. • The Fast Fourier Transform Based Method is used to quantify the accuracy of the code predictions. - Abstract: This paper deals with the development of a qualified thermal-hydraulic nodalization for modeling a Station Black-Out (SBO) accident in the PSB-VVER Integral Test Facility (ITF). This study has been performed in the framework of a research project aiming to develop an appropriate accident management support tool for the Bushehr nuclear power plant. In this regard, a nodalization has been developed for thermal-hydraulic modeling of the PSB-VVER ITF with the MELCOR integrated code. The nodalization is qualitatively and quantitatively qualified at both the steady-state and transient levels. The accuracy of the MELCOR predictions is quantified at the transient level using the Fast Fourier Transform Based Method (FFTBM). FFTBM provides an integral representation for quantification of the code accuracy in the frequency domain. It was observed that the MELCOR predictions are qualitatively and quantitatively in the acceptable range. In addition, the influence of different nodalizations on the MELCOR predictions was evaluated and quantified using FFTBM by developing 8 sensitivity cases with different numbers of control volumes and heat structures in the core region and the steam generator U-tubes. The most appropriate case, which provided results with minimum deviations from the experimental data, was then considered as the qualified nodalization for analysis of the SBO accident in the PSB-VVER ITF. This qualified nodalization can be used for modeling of VVER-1000 nuclear power plants when performing SBO accident analysis with the MELCOR code.
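    The FFTBM figure of merit used for the quantitative qualification is, for each variable, an average amplitude: the summed magnitude of the FFT of the calculation-experiment error normalised by the summed magnitude of the FFT of the experimental signal. The sketch below computes it on synthetic stand-in signals, not PSB-VVER data.

```python
import numpy as np

# Sketch of the FFTBM "average amplitude" figure of merit for one variable:
#   AA = sum_n |FFT(calc - exp)(f_n)| / sum_n |FFT(exp)(f_n)|
# Lower AA means the calculation tracks the experiment more closely.
# The signals below are synthetic stand-ins, not PSB-VVER data.
t = np.linspace(0.0, 2000.0, 2048)                                          # transient time [s]
exp_sig = 16.0 * np.exp(-t / 900.0) + 0.1 * np.sin(2 * np.pi * t / 150.0)   # "experiment"
calc_sig = 16.3 * np.exp(-t / 850.0) + 0.1 * np.sin(2 * np.pi * t / 160.0)  # "calculation"

def average_amplitude(calc, exp):
    err_spectrum = np.abs(np.fft.rfft(calc - exp))
    exp_spectrum = np.abs(np.fft.rfft(exp))
    return err_spectrum.sum() / exp_spectrum.sum()

print(f"AA = {average_amplitude(calc_sig, exp_sig):.3f}")
```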

  1. Development of a parametric containment event tree model for a severe BWR accident

    Energy Technology Data Exchange (ETDEWEB)

    Okkonen, T. [OTO-Consulting Ay, Helsinki (Finland)

    1995-04-01

    A containment event tree (CET) is built for analysis of severe accidents at the TVO boiling water reactor (BWR) units. Parametric models of severe accident progression and fission product behaviour are developed and integrated in order to construct a compact and self-contained Level 2 PSA model. The model can be easily updated to correspond to new research results. The analyses of the study are limited to severe accidents starting from full-power operation and leading to core melting, and are focused mainly on the use and effects of the dedicated severe accident management (SAM) systems. Severe accident progression from eight plant damage states (PDS), involving different pre-core-damage accident evolution, is examined, but the inclusion of their relative or absolute probabilities, by integration with Level 1, is deferred to integral safety assessments. (33 refs., 5 figs., 7 tabs.).

  2. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
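
    A minimal sketch of such an interim simulator, assuming a Weibull wind-speed distribution and a simple cubic power law, might look as follows; the shape, scale and turbine parameters are placeholders rather than Goldstone statistics.

```python
# Draw uncorrelated hourly wind-speed samples and convert to available power.
import numpy as np

rng = np.random.default_rng(42)
shape, scale = 2.0, 6.5                               # assumed Weibull parameters (m/s)
speeds = scale * rng.weibull(shape, size=24 * 365)    # one sample per hour of a year

rho, area, cp = 1.225, 1000.0, 0.35                   # air density, swept area, power coeff.
power = 0.5 * rho * area * cp * speeds**3             # available power per hour (W)

print(f"mean speed = {speeds.mean():.2f} m/s, mean power = {power.mean()/1e3:.1f} kW")
```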

  3. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    Energy Technology Data Exchange (ETDEWEB)

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
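
    The following sketch illustrates a generic two-parameter Weibull dose-response function of the type mentioned above, with a median-effect dose D50 and a shape parameter V; the parameter values are illustrative and are not those recommended in the report.

```python
# Two-parameter Weibull dose-response: Risk = 1 - exp(-ln(2) * (D/D50)^V),
# which equals 0.5 when the dose D equals D50.
import math

def weibull_risk(dose_gy: float, d50_gy: float, shape_v: float) -> float:
    if dose_gy <= 0.0:
        return 0.0
    hazard = math.log(2.0) * (dose_gy / d50_gy) ** shape_v
    return 1.0 - math.exp(-hazard)

# Example with hypothetical parameters (D50 = 3.8 Gy, V = 5)
for d in (1.0, 2.0, 3.8, 6.0):
    print(f"dose {d:.1f} Gy -> risk {weibull_risk(d, 3.8, 5.0):.3f}")
```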

  4. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, J.J. [Oak Ridge National Lab., TN (United States)

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  5. Simulation on Poisson and negative binomial models of count road accident modeling

    Science.gov (United States)

    Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.

    2016-11-01

    Accident count data have often been shown to exhibit overdispersion. In addition, the data may contain excess zero counts. A simulation study was conducted to create scenarios in which accidents occur at a T-junction, with the dependent variable generated from a given distribution, namely the Poisson or negative binomial distribution, for sample sizes ranging from n=30 to n=500. The study objective was accomplished by fitting Poisson regression, negative binomial regression and hurdle negative binomial models to the simulated data. Comparing the fitted models shows that, for each sample size, not every model fits the data well even when the data are generated from that model's own distribution, especially when the sample size is large. Furthermore, larger sample sizes tend to produce more zero accident counts in the dataset.
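
    In the same spirit (though not the authors' code), the snippet below simulates Poisson accident counts for a hypothetical covariate and fits Poisson and negative binomial regressions with statsmodels, comparing AIC; sample sizes and coefficients are made up.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
aadt = rng.uniform(1.0, 10.0, n)              # hypothetical exposure (1000 veh/day)
mu = np.exp(-0.5 + 0.25 * aadt)               # assumed true mean structure
y = rng.poisson(mu)                           # simulated accident counts

X = sm.add_constant(aadt)
poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()

print(f"Poisson AIC: {poisson_fit.aic:.1f}, NegBin AIC: {negbin_fit.aic:.1f}")
```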

  6. Development of MAAP5.0.3 Spent Fuel Pool Model for Severe Accident Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Mi Ro [KHNP-CRI, Daejeon (Korea, Republic of)

    2015-10-15

    An SFP model for an OPR1000-type NPP was developed using MAAP 5.0.3. As expected, the accident progression in the SFP is very slow if SFP integrity is maintained.

  7. Predictive models in urology.

    Science.gov (United States)

    Cestari, Andrea

    2013-01-01

    Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in the use of predictive modeling reflects advances on different fronts such as the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health, and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. The current state of knowledge is still quite young in understanding the likely future direction of how this so-called 'machine intelligence' will evolve and therefore how current relatively sophisticated predictive models will evolve in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also in clinical practice, with the introduction of several nomograms dealing with the main fields of onco-urology.

  8. VICTORIA: A mechanistic model of radionuclide behavior in the reactor coolant system under severe accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Heames, T.J. (Science Applications International Corp., Albuquerque, NM (USA)); Williams, D.A.; Johns, N.A.; Chown, N.M. (UKAEA Atomic Energy Establishment, Winfrith (UK)); Bixler, N.E.; Grimley, A.J. (Sandia National Labs., Albuquerque, NM (USA)); Wheatley, C.J. (UKAEA Safety and Reliability Directorate, Culcheth (UK))

    1990-10-01

    This document provides a description of a model of the radionuclide behavior in the reactor coolant system (RCS) of a light water reactor during a severe accident. This document serves as the user's manual for the computer code called VICTORIA, based upon the model. The VICTORIA code predicts fission product release from the fuel, chemical reactions between fission products and structural materials, vapor and aerosol behavior, and fission product decay heating. This document provides a detailed description of each part of the implementation of the model into VICTORIA, the numerical algorithms used, and the correlations and thermochemical data necessary for determining a solution. A description of the code structure, input and output, and a sample problem are provided. The VICTORIA code was developed upon a CRAY-XMP at Sandia National Laboratories in the USA and a CRAY-2 and various SUN workstations at the Winfrith Technology Centre in England. 60 refs.

  9. Modelling transport and deposition of caesium and iodine from the Chernobyl accident using the DREAM model

    Directory of Open Access Journals (Sweden)

    J. Brandt

    2002-06-01

    Full Text Available A tracer model, DREAM (the Danish Rimpuff and Eulerian Accidental release Model), has been developed for modelling transport, dispersion and deposition (wet and dry) of radioactive material from accidental releases such as the Chernobyl accident. The model is a combination of a Lagrangian model, which includes the near-source dispersion, and an Eulerian model describing the long-range transport. The performance of the transport model has previously been tested within the European Tracer Experiment, ETEX, which included transport and dispersion of an inert, non-depositing tracer from a controlled release. The focus of this paper is the model performance with respect to the deposition of 137Cs, 134Cs and 131I from the Chernobyl accident, using different relatively simple and comprehensive parameterizations. The performance, compared to measurements, of different combinations of parameterizations of wet and dry deposition schemes has been evaluated, using different statistical tests.

  10. Prediction of rate and severity of adverse perioperative outcomes: "normal accidents" revisited.

    Science.gov (United States)

    Saubermann, Albert J; Lagasse, Robert S

    2012-01-01

    The American Society of Anesthesiologists Physical Status classification system has been shown to predict the frequency of perioperative morbidity and mortality despite known subjectivity, inconsistent application, and exclusion of many perioperative confounding variables. The authors examined the relationship between the American Society of Anesthesiologists Physical Status and both the frequency and the severity of adverse events over a 10-year period in an academic anesthesiology practice. The American Society of Anesthesiologists Physical Status is predictive of not only the frequency of adverse perioperative events, but also the severity of adverse events. These nonlinear mathematical relationships can provide meaningful information on performance and risk. Calculated odds ratios allow discussion about individualized anesthesia risks based on the American Society of Anesthesiologists Physical Status because the added complexity of the surgical or diagnostic procedure, and other perioperative confounding variables, is indirectly factored into the Physical Status classification. The ability of the American Society of Anesthesiologists Physical Status to predict adverse outcome frequency and severity in a nonlinear relationship can be fully explained by applying the Normal Accident Theory, a well-known theory of system failure that relates the interactive complexity of system components to the frequency and the severity of system failures or adverse events.

  11. VICTORIA: A mechanistic model of radionuclide behavior in the reactor coolant system under severe accident conditions. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Heams, T J [Science Applications International Corp., Albuquerque, NM (United States); Williams, D A; Johns, N A; Mason, A [UKAEA, Winfrith, (England); Bixler, N E; Grimley, A J [Sandia National Labs., Albuquerque, NM (United States); Wheatley, C J [UKAEA, Culcheth (England); Dickson, L W [Atomic Energy of Canada Ltd., Chalk River, ON (Canada); Osborn-Lee, I [Oak Ridge National Lab., TN (United States); Domagala, P; Zawadzki, S; Rest, J [Argonne National Lab., IL (United States); Alexander, C A [Battelle, Columbus, OH (United States); Lee, R Y [Nuclear Regulatory Commission, Washington, DC (United States)

    1992-12-01

    The VICTORIA model of radionuclide behavior in the reactor coolant system (RCS) of a light water reactor during a severe accident is described. It has been developed by the USNRC to define the radionuclide phenomena and processes that must be considered in systems-level models used for integrated analyses of severe accident source terms. The VICTORIA code, based upon this model, predicts fission product release from the fuel, chemical reactions involving fission products, vapor and aerosol behavior, and fission product decay heating. Also included is a detailed description of how the model is implemented in VICTORIA, the numerical algorithms used, and the correlations and thermochemical data necessary for determining a solution. A description of the code structure, input and output, and a sample problem are provided.

  12. Development and application of a random walk model of atmospheric diffusion in the emergency response of nuclear accidents

    Institute of Scientific and Technical Information of China (English)

    CHI Bing; LI Hong; FANG Dong

    2007-01-01

    Plume concentration prediction is one of the main contents of radioactive consequence assessment for early emergency response to nuclear accidents. The random characteristics of atmospheric diffusion itself are described, and a random walk model of atmospheric diffusion (Random Walk) is introduced and compared, for verification, with the Lagrangian puff model (RIMPUFF) in the nuclear emergency decision support system (RODOS) developed by the European Community. The results show that the concentrations calculated by the two models are quite close, except that the plume area calculated by Random Walk is a little smaller than that by RIMPUFF. The random walk model for atmospheric diffusion can simulate atmospheric diffusion in the case of nuclear accidents and provide more realistic information for early emergency response and consequence assessment as one of the atmospheric diffusion modules of the nuclear emergency decision support system.
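
    A toy Lagrangian random-walk dispersion calculation of the kind described above (not the RODOS module itself) can be sketched as follows; all numerical values are placeholders.

```python
# Particles are advected by a mean wind and perturbed by Gaussian steps whose
# variance mimics turbulent diffusion (step std = sqrt(2*K*dt)).
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_steps, dt = 5000, 360, 10.0       # one hour of 10 s steps
u_wind, v_wind = 3.0, 0.5                        # mean wind components (m/s)
k_diff = 5.0                                     # eddy diffusivity (m^2/s)

pos = np.zeros((n_particles, 2))
sigma = np.sqrt(2.0 * k_diff * dt)
for _ in range(n_steps):
    pos[:, 0] += u_wind * dt + rng.normal(0.0, sigma, n_particles)
    pos[:, 1] += v_wind * dt + rng.normal(0.0, sigma, n_particles)

# Crude concentration proxy: particle counts on a grid
hist, xedges, yedges = np.histogram2d(pos[:, 0], pos[:, 1], bins=50)
print("peak cell count:", hist.max())
```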

  13. Speed Spatial Distribution Models for Traffic Accident Section of Freeway Based on Computer Simulation

    Institute of Scientific and Technical Information of China (English)

    Decai Li; Jiangwei Chu; Wenhui Zhang; Xiaojuan Wang; Guosheng Zhang

    2015-01-01

    Simulation models for an accident section on a freeway are built in a microscopic traffic flow simulation environment. In these models, involving 2-lane, 3-lane and 4-lane freeways, one detector is set every 10 m to measure the section running speed. According to the simulation results, speed spatial distribution curves for the traffic accident section are drawn, which help to determine dangerous sections upstream of the accident section. Furthermore, a speed spatial distribution model is obtained for every speed distribution curve. The results provide a theoretical basis for determining the temporal and spatial influence ranges of a traffic accident and offer a reference for the formulation of speed limit schemes and other management measures.

  14. Nominal Model Predictive Control

    OpenAIRE

    Grüne, Lars

    2014-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.); International audience; Model Predictive Control is a controller design method which synthesizes a sampled data feedback controller from the iterative solution of open loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance and the assumptions needed in order to rigorously ensure these properties in a nomina...

  15. Nominal model predictive control

    OpenAIRE

    Grüne, Lars

    2013-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.); International audience; Model Predictive Control is a controller design method which synthesizes a sampled data feedback controller from the iterative solution of open loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance and the assumptions needed in order to rigorously ensure these properties in a nomina...
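
    As a generic illustration of the receding-horizon loop described above (under an assumed double-integrator plant, not any system treated in the article), a minimal MPC sketch is:

```python
# At every sampling instant an open-loop optimal control problem over a finite
# horizon is solved and only the first input is applied.
import numpy as np
from scipy.optimize import minimize

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # assumed discrete-time plant
B = np.array([0.005, 0.1])
N = 15                                   # prediction horizon

def cost(u_seq, x0):
    """Quadratic stage cost summed along the predicted trajectory."""
    x, total = x0.copy(), 0.0
    for u in u_seq:
        x = A @ x + B * u
        total += x @ x + 0.1 * u * u
    return total

x = np.array([2.0, 0.0])                 # initial state
for _ in range(40):                      # closed-loop simulation
    res = minimize(cost, np.zeros(N), args=(x,), method="L-BFGS-B",
                   bounds=[(-1.0, 1.0)] * N)    # input constraints
    x = A @ x + B * res.x[0]             # apply only the first optimal input
print("final state:", x)
```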

  16. Modeling the early-phase redistribution of radiocesium fallouts in an evergreen coniferous forest after Chernobyl and Fukushima accidents

    Energy Technology Data Exchange (ETDEWEB)

    Calmon, P.; Gonze, M.-A.; Mourlon, Ch.

    2015-10-01

    Following the Chernobyl accident, the scientific community gained numerous data on the transfer of radiocesium in European forest ecosystems, including information regarding the short-term redistribution of atmospheric fallout onto forest canopies. In the course of international programs, the French Institute for Radiological Protection and Nuclear Safety (IRSN) developed a forest model, named TREE4 (Transfer of Radionuclides and External Exposure in FORest systems), 15 years ago. Recently published papers on a Japanese evergreen coniferous forest contaminated by Fukushima radiocesium fallout provide interesting and quantitative data on radioactive mass fluxes measured within the forest in the months following the accident. The present study determined whether the approach adopted in the TREE4 model provides satisfactory results for Japanese forests or whether it requires adjustments. This study focused on the interception of airborne radiocesium by forest canopy, and the subsequent transfer to the forest floor through processes such as litterfall, throughfall, and stemflow, in the months following the accident. We demonstrated that TREE4 quite satisfactorily predicted the interception fraction (20%) and the canopy-to-soil transfer (70% of the total deposit in 5 months) in the Tochigi forest. This dynamics was similar to that observed in the Höglwald spruce forest. However, the unexpectedly high contribution of litterfall (31% in 5 months) in the Tochigi forest could not be reproduced in our simulations (2.5%). Possible reasons for this discrepancy are discussed; and sensitivity of the results to uncertainty in deposition conditions was analyzed. - Highlights: • Transfer of radiocesium atmospheric fallout in evergreen forests was modeled. • The model was tested using observations from Chernobyl and Fukushima accidents. • Model predictions of canopy interception and depuration agree with measurements. • Unexpectedly high contribution of litterfall for the

  17. A Quasi-Poisson Approach on Modeling Accident Hazard Index for Urban Road Segments

    Directory of Open Access Journals (Sweden)

    Lu Ma

    2014-01-01

    Full Text Available In light of recently emphasized studies on the risk evaluation of crashes, accident counts for specific transportation facilities are adopted to reflect the chance of crash occurrence. The current study introduces a more comprehensive measure that supplements the expression of accident risk with information on accident harmfulness, named the Accident Hazard Index (AHI) in the following context. Before the statistical analysis, datasets from various sources are integrated under a GIS platform, and the corresponding procedures are presented as an illustrative example for similar analyses. Then, a quasi-Poisson regression model is suggested for the analysis, and the results show that the model is appropriate for dealing with overdispersed count data; several key explanatory variables were found to have a significant impact on the estimation of the AHI. In addition, the effect of the weight given to different severity levels of accidents is examined, and the selection of the weight is also discussed.
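
    A hedged sketch of a quasi-Poisson fit is shown below, using statsmodels' Pearson-chi-square scaling of a Poisson GLM to mimic quasi-Poisson inference; the variables and data are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 120
df = pd.DataFrame({
    "aadt": rng.uniform(5, 50, n),        # traffic volume (1000 veh/day), made up
    "lanes": rng.integers(2, 6, n),
})
df["ahi"] = rng.poisson(np.exp(0.2 + 0.03 * df.aadt))   # placeholder index counts

model = smf.glm("ahi ~ aadt + lanes", data=df, family=sm.families.Poisson())
quasi_fit = model.fit(scale="X2")   # quasi-Poisson: dispersion from Pearson chi^2
print(quasi_fit.summary())
```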

  18. A new CFD modeling method for flow blockage accident investigations

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Wenyuan, E-mail: fanwy@mail.ustc.edu.cn; Peng, Changhong, E-mail: pengch@ustc.edu.cn; Chen, Yangli, E-mail: chenyl@mail.ustc.edu.cn; Guo, Yun, E-mail: guoyun79@ustc.edu.cn

    2016-07-15

    Highlights: • Porous-jump treatment is applied to CFD simulation of flow blockages. • Porous-jump treatment predicts results consistent with direct CFD treatment. • Relap5 predicts abnormal flow rate profiles in the MTR SFA blockage scenario. • Relap5 fails to simulate annular heat flux in the blockage case of the annular assembly. • Porous-jump treatment provides reasonable and generalized CFD results. - Abstract: Inlet flow blockages in both flat and annular plate-type fuel assemblies are simulated by Computational Fluid Dynamics (CFD) and system analysis methods, with blockage ratios ranging from 60 to 90%. For all the blockage scenarios, the mass flow rate of the blocked channel drops dramatically as the blockage ratio increases, while mass flow rates of non-blocked channels are almost steady. As a result of over-simplifications, the system code fails to capture details of the mass flow rate profiles of non-blocked channels and the power redistribution of fuel plates. In order to acquire generalized CFD results, a new blockage modeling method is developed by using the porous-jump condition. For comparison, direct CFD simulations are conducted for postulated blockages. For the porous-jump treatment, conservative flow and heat transfer conditions are predicted for the blocked channel, while consistent predictions are obtained for non-blocked channels. In addition, flow fields in the blocked channel, asymmetric power redistributions of fuel plates, and complex heat transfer phenomena in the annular fuel assembly are obtained and discussed. The present study indicates that the porous-jump condition is a reasonable blockage modeling method, which predicts generalized CFD results for flow blockages.
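
    For orientation, the snippet below evaluates the pressure drop imposed by a porous-jump condition in the common Darcy-Forchheimer form; the coefficient values are placeholders, not those calibrated in the paper.

```python
# delta_p = (mu/alpha * v + C2 * 0.5 * rho * v^2) * dm across a thin porous layer.
def porous_jump_dp(v: float, mu: float = 2.8e-4, rho: float = 970.0,
                   alpha: float = 1.0e-9, c2: float = 5.0e4, dm: float = 0.002) -> float:
    """Pressure drop (Pa) across a porous membrane of thickness dm (m)."""
    viscous = mu / alpha * v            # Darcy (viscous) term
    inertial = c2 * 0.5 * rho * v * v   # Forchheimer (inertial) term
    return (viscous + inertial) * dm

for velocity in (0.5, 1.0, 2.0):        # m/s through the partially blocked inlet
    print(f"v = {velocity} m/s -> dp = {porous_jump_dp(velocity)/1e3:.1f} kPa")
```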

  19. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  20. Modeling of the TMI-2 (Three Mile Island Unit-2) accident with MELPROG/TRAC and calculation results for Phases 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Motley, F.E.; Jenks, R.P.

    1988-01-01

    Work has been performed to develop a Three Mile Island Unit-2 (TMI-2) simulation model for MELPROG/TRAC capable of predicting the observed plant behavior that took place during the accident of March 1979. A description of the TMI-2 plant model is presented and calculation results through 174 min of the accident are discussed. Using the ICBC boundary conditions, the calculation predicts pressurizer draining and core recovering prior to fuel-rod damage. A parametric calculation (reduced makeup flow) is currently underway and is in better agreement with the observed plant behavior. Efforts are underway to resolve current discrepancies and proceed with an accurate simulation through Phases 3 and 4 of the accident (174-227 min and 227-300 min, respectively). 13 refs., 11 figs., 2 tabs.

  1. Restructuring of an Event Tree for a Loss of Coolant Accident in a PSA model

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Ho-Gon; Han, Sang-Hoon; Park, Jin-Hee; Jang, Seong-Chul [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    A conventional risk model using PSA (Probabilistic Safety Assessment) for an NPP considers two types of accident initiators for internal events: LOCA (Loss of Coolant Accident) and transient events such as loss of electric power, loss of cooling, and so on. Traditionally, a LOCA is divided into three initiating event (IE) categories depending on the break size: small, medium, and large LOCA. In each IE group, the safety functions or systems modeled in the accident sequences are considered to be applicable regardless of the break size. However, since the safety systems or functions are not designed based on a break size, there are many mismatches between safety systems/functions and an IE, which may make the risk model conservative or, in some cases, optimistic. The present paper proposes a new methodology for accident sequence analysis for LOCA. We suggest the construction of an integrated single ET for LOCA by incorporating each safety system/function and its applicable break spectrum into the ET. An integrated accident sequence analysis in terms of an ET for LOCA is proposed in the present paper. A safety function/system can be properly assigned if its applicable range is given by a break set point. Also, using simple Boolean algebra with subsets of the break spectrum, the final accident sequences are expressed properly in terms of Boolean multiplication, the occurrence frequency and the success/failure of the safety systems. The accident sequence results show that the accident sequences are described in more detail than in the conventional results. Unfortunately, quantitative results in terms of MCS (Minimal Cut Sets) were not given because a system fault tree was not constructed for this analysis and the break set points for all 7 points were not given as specified numerical quantities. Further study may be needed to fix the break set points and to develop the system fault tree.
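
    As a schematic sketch of the break-spectrum idea (not the authors' model), the snippet below intersects a LOCA initiating-event spectrum with the applicability interval of each safety function and multiplies by a failure probability to obtain a sequence frequency; all numbers are hypothetical.

```python
def overlap(a, b):
    """Length of the intersection of two break-size intervals (inches)."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return max(0.0, hi - lo)

loca_spectrum = (0.5, 30.0)          # break sizes covered by the LOCA IE (inches)
loca_freq_per_inch = 1.0e-5          # assumed uniform frequency density (/yr/inch)

safety_injection = {"range": (2.0, 30.0), "fail_prob": 1.0e-3}
feed_and_bleed = {"range": (0.5, 2.0), "fail_prob": 5.0e-3}

for name, system in [("SI fails", safety_injection), ("F&B fails", feed_and_bleed)]:
    width = overlap(loca_spectrum, system["range"])
    seq_freq = loca_freq_per_inch * width * system["fail_prob"]
    print(f"{name}: applicable width {width:.1f} in, sequence frequency {seq_freq:.2e} /yr")
```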

  2. Quantifying the risk of extreme aviation accidents

    Science.gov (United States)

    Das, Kumer Pial; Dey, Asim Kumer

    2016-12-01

    Air travel is considered a safe means of transportation, but when aviation accidents do occur they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year in the past decade, with 111 plane crashes, and among them the worst four crashes caused 298, 239, 162 and 116 deaths. In this study, we want to assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model, we predict the maximum fatalities from an aviation accident in the future. The fitted model is compared with some of its competing models. The uncertainty in the inferences is quantified using simulated aviation accident series, generated by bootstrap resampling and Monte Carlo simulations.
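
    The peaks-over-threshold idea can be sketched as follows with scipy's generalized Pareto distribution; the fatality series is a small synthetic placeholder, not the historical record analysed in the study.

```python
# Fit a GPD to exceedances over a threshold and read off an extreme quantile.
import numpy as np
from scipy.stats import genpareto

fatalities = np.array([116, 162, 239, 298, 103, 127, 154, 111, 189, 224,
                       135, 150, 170, 200, 265])        # illustrative values only
threshold = 100
excesses = fatalities[fatalities > threshold] - threshold

shape, loc, scale = genpareto.fit(excesses, floc=0.0)   # fix location at 0
q99 = threshold + genpareto.ppf(0.99, shape, loc=0.0, scale=scale)
print(f"fitted shape={shape:.2f}, scale={scale:.1f}; 99th-percentile fatalities ~ {q99:.0f}")
```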

  3. Accidents - Chernobyl accident; Accidents - accident de Tchernobyl

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This file is devoted to the Chernobyl accident. It is divided into four parts. The first part concerns the accident itself and its technical management. The second part is relative to the radiation doses and the different contaminations. The third part reports the health effects, both deterministic and stochastic. The fourth and last part relates the consequences for the other European countries, with the case of France. Throughout these parts, attention is given to the measures taken after the accident by other countries to manage such an accident, the cooperation between countries and the research groups studying reactor safety, and the international medical cooperation, especially for children, all in relation with the Chernobyl accident. (N.C.)

  4. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO-project called "Intelligent wind power prediction systems" (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines...

  5. Development of comprehensive accident models for two-lane rural highways using exposure, geometry, consistency and context variables.

    Science.gov (United States)

    Cafiso, Salvatore; Di Graziano, Alessandro; Di Silvestro, Giacomo; La Cava, Grazia; Persaud, Bhagwant

    2010-07-01

    In Europe, approximately 60% of road accident fatalities occur on two-lane rural roads. Thus, research to develop and enhance explanatory and predictive models for this road type continues to be of interest in mitigating these accidents. To this end, this paper describes a novel and extensive data collection and modeling effort to define accident models for two-lane road sections based on a unique combination of exposure, geometry, consistency and context variables directly related to the safety performance. The first part of the paper documents how these were identified for the segmentation of highways into homogeneous sections. Next, is a description of the extensive data collection effort that utilized differential cinematic GPS surveys to define the horizontal alignment variables, and road safety inspections (RSIs) to quantify the other road characteristics related to safety. The final part of the paper focuses on the calibration of models for estimating the expected number of accidents on homogeneous sections that can be characterized by constant values of the explanatory variables. Several candidate models were considered for calibration using the Generalized Linear Modeling (GLM) approach. After considering the statistical significance of the parameters related to exposure, geometry, consistency and context factors, and goodness of fit statistics, 19 models were ranked and three were selected as the recommended models. The first of the three is a base model, with length and traffic as the only predictor variables; since these variables are the only ones likely to be available network-wide, this base model can be used in an empirical Bayesian calculation to conduct network screening for ranking "sites with promise" of safety improvement. The other two models represent the best statistical fits with different combinations of significant variables related to exposure, geometry, consistency and context factors. These multiple variable models can be used, with
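
    As a sketch of the kind of exposure-only "base model" mentioned above, the snippet below fits a negative binomial GLM with a log link and ln(length) as an offset, so that E[N] = exp(b0) * L * AADT^b1; the data and coefficients are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
sections = pd.DataFrame({
    "length_km": rng.uniform(0.5, 5.0, 150),
    "aadt": rng.uniform(1000, 12000, 150),
})
mu = 1e-4 * sections.length_km * sections.aadt**0.8     # assumed true relation
sections["crashes"] = rng.poisson(mu)

base = smf.glm("crashes ~ np.log(aadt)", data=sections,
               family=sm.families.NegativeBinomial(alpha=0.5),
               offset=np.log(sections.length_km)).fit()
print(base.params)   # intercept and AADT exponent
```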

  6. Object-Oriented Bayesian Networks (OOBN) for Aviation Accident Modeling and Technology Portfolio Impact Assessment

    Science.gov (United States)

    Shih, Ann T.; Ancel, Ersin; Jones, Sharon M.

    2012-01-01

    The concern for reducing aviation safety risk is rising as the National Airspace System in the United States transforms to the Next Generation Air Transportation System (NextGen). The NASA Aviation Safety Program is committed to developing an effective aviation safety technology portfolio to meet the challenges of this transformation and to mitigate relevant safety risks. The paper focuses on the reasoning behind selecting Object-Oriented Bayesian Networks (OOBN) as the technique and commercial software for the accident modeling and portfolio assessment. To illustrate the benefits of OOBN in a large and complex aviation accident model, the in-flight Loss-of-Control Accident Framework (LOCAF), constructed as an influence diagram, is presented. An OOBN approach not only simplifies the construction and maintenance of complex causal networks for the modelers, but also offers a well-organized hierarchical network that is easier for decision makers to exploit when examining the effectiveness of risk mitigation strategies through technology insertions.
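
    As a hedged illustration of the underlying Bayesian-network idea (a plain, non-object-oriented network rather than an OOBN, and not the LOCAF model itself), the snippet below builds a tiny causal graph with the pgmpy package; node names and probabilities are entirely hypothetical.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("IcingConditions", "LossOfControl"),
                         ("PilotFatigue", "LossOfControl")])

cpd_ice = TabularCPD("IcingConditions", 2, [[0.9], [0.1]])
cpd_fatigue = TabularCPD("PilotFatigue", 2, [[0.8], [0.2]])
cpd_loc = TabularCPD("LossOfControl", 2,
                     [[0.999, 0.95, 0.97, 0.80],    # P(no LOC | parent states)
                      [0.001, 0.05, 0.03, 0.20]],   # P(LOC | parent states)
                     evidence=["IcingConditions", "PilotFatigue"],
                     evidence_card=[2, 2])
model.add_cpds(cpd_ice, cpd_fatigue, cpd_loc)

inference = VariableElimination(model)
print(inference.query(["LossOfControl"], evidence={"IcingConditions": 1}))
```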

  7. Influence of main variables modifications on accident transient based on AP1000-like MELCOR model

    Science.gov (United States)

    Malicki, M.; Pieńkowski, L.

    2016-09-01

    Analysis of Severe Accidents (SA) is one of the most important parts of nuclear safety research. MELCOR is a validated system code for severe accident analysis and as such was used to obtain the presented results. The analysed AP1000 model is based on publicly available data only. A sensitivity analysis was done for the main variables of the primary reactor coolant system to find their influence on the accident transient. This kind of analysis helps to find weak points of the reactor design and of the model itself. The performed analysis is a basis for the creation of a generic Small Modular Reactor (SMR) model, which will be the next step of the investigation, aiming to estimate the safety level of different reactors. The results clearly help to establish a range of boundary conditions for the main variables in the future SMR model.

  8. Modeling the early-phase redistribution of radiocesium fallouts in an evergreen coniferous forest after Chernobyl and Fukushima accidents.

    Science.gov (United States)

    Calmon, P; Gonze, M-A; Mourlon, Ch

    2015-10-01

    Following the Chernobyl accident, the scientific community gained numerous data on the transfer of radiocesium in European forest ecosystems, including information regarding the short-term redistribution of atmospheric fallout onto forest canopies. In the course of international programs, the French Institute for Radiological Protection and Nuclear Safety (IRSN) developed a forest model, named TREE4 (Transfer of Radionuclides and External Exposure in FORest systems), 15 years ago. Recently published papers on a Japanese evergreen coniferous forest contaminated by Fukushima radiocesium fallout provide interesting and quantitative data on radioactive mass fluxes measured within the forest in the months following the accident. The present study determined whether the approach adopted in the TREE4 model provides satisfactory results for Japanese forests or whether it requires adjustments. This study focused on the interception of airborne radiocesium by forest canopy, and the subsequent transfer to the forest floor through processes such as litterfall, throughfall, and stemflow, in the months following the accident. We demonstrated that TREE4 quite satisfactorily predicted the interception fraction (20%) and the canopy-to-soil transfer (70% of the total deposit in 5 months) in the Tochigi forest. This dynamics was similar to that observed in the Höglwald spruce forest. However, the unexpectedly high contribution of litterfall (31% in 5 months) in the Tochigi forest could not be reproduced in our simulations (2.5%). Possible reasons for this discrepancy are discussed; and sensitivity of the results to uncertainty in deposition conditions was analyzed.

  9. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  10. Severe accident modeling of a PWR core with different cladding materials

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, S. C. [Westinghouse Electric Company LLC, 5801 Bluff Road, Columbia, SC 29209 (United States); Henry, R. E.; Paik, C. Y. [Fauske and Associates, Inc., 16W070 83rd Street, Burr Ridge, IL 60527 (United States)

    2012-07-01

    The MAAP v.4 software has been used to model two severe accident scenarios in nuclear power reactors with three different materials as fuel cladding. The TMI-2 severe accident was modeled with Zircaloy-2 and SiC as clad material, and an SBO accident in a Zion-like, 4-loop, Westinghouse PWR was modeled with Zircaloy-2, SiC, and 304 stainless steel as clad material. TMI-2 modeling results indicate that lower peak core temperatures, less H2(g) produced, and a smaller mass of molten material would result if SiC was substituted for Zircaloy-2 as cladding. SBO modeling results indicate that the calculated time to RCS rupture would increase by approximately 20 minutes if SiC was substituted for Zircaloy-2. Additionally, when an extended SBO accident (RCS creep rupture failure disabled) was modeled, significantly lower peak core temperatures, less H2(g) produced, and a smaller mass of molten material would be generated by substituting SiC for Zircaloy-2 or stainless steel cladding. Because the rate of the SiC oxidation reaction with elevated-temperature H2O(g) was set to 0 for this work, these results should be considered preliminary. However, the benefits of SiC as a more accident tolerant clad material have been shown, and additional investigation of SiC as an LWR core material is warranted, specifically investigations of the oxidation kinetics of SiC in H2O(g) over the range of temperatures and pressures relevant to severe accidents in LWRs. (authors)

  11. [Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].

    Science.gov (United States)

    de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena

    2014-12-01

    The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, duly highlighting aspects experienced in the use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and or difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in studies of accidents by providing useful conceptual support from the data collection stage until conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management involved in the accident. The main challenges lie in the grasp of concepts by users, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, as well as the intervention to change the determinants of these events.

  12. Research and application of an FTA model for a chemical accident fuzzy system

    Institute of Scientific and Technical Information of China (English)

    王陈玉书; 张园园; 张巨伟; 尚思思; 刘俊亨

    2013-01-01

    Aiming at chemical accident fuzzy systems, a quantitative accident analysis model is established based on triangular fuzzy numbers and fault trees. The model is applied to a case study to determine the probability distribution of fire and explosion accidents for an oil storage tank, indicating the fluctuation range of the probability and the average accident probability, obtaining the fuzzy importance of the basic events and ranking them accordingly, clarifying the distribution of hazard sources in the chemical system, identifying the most dangerous path leading to the accident, and giving the fuzzy probability of that path. These results provide an important reference for enterprise managers when developing safety measures. The model is of significance for accident prediction, accident cause analysis, the formulation of safety countermeasures and risk investment analysis.

  13. Radiological assessment by compartment model POSEIDON-R of radioactivity released in the ocean following Fukushima Daiichi accident

    Science.gov (United States)

    Bezhenar, Roman; Maderich, Vladimir; Heling, Rudie; Jung, Kyung Tae; Myoung, Jung-Goo

    2013-04-01

    The modified compartment model POSEIDON-R (Lepicard et al., 2004) was applied to the North-Western Pacific and adjacent seas. It is the first time that a compartment model has been used in this region, where 25 Nuclear Power Plants (NPP) are operated. The aim of this study is to perform a radiological assessment of the releases of radioactivity due to the Fukushima Daiichi accident. The model predicts the dispersion of radioactivity in the water column and in the sediments, the transfer of radionuclides throughout the marine food web, and the subsequent doses to the population due to the consumption of fishery products. A generic predictive dynamical food-chain model is used instead of the concentration factor (CF) approach. The radionuclide uptake model for fish has as its central feature the accumulation of radionuclides in the target tissue. The three-layer structure of the water column makes it possible to describe deep-water transport adequately. In total, 175 boxes cover the Northwestern Pacific, the East China Sea, the Yellow Sea and the East/Japan Sea. Water fluxes between boxes were calculated by averaging three-dimensional currents obtained by the hydrodynamic model ROMS over a 10-year period. Tidal mixing between boxes was parameterized. The model was validated against observational data on Cs-137 in water for the period 1945-2004. The source terms from nuclear weapon tests comprise a regional source term from the bomb tests at Enewetak and Bikini Atolls and global deposition from weapons tests. The correlation coefficient between predicted and observed concentrations of Cs-137 in the surface water is 0.925 and RMSE=1.43 Bq/m3. A local-scale coastal box was used, according to POSEIDON's methodology, to describe local processes of activity transport, deposition and the food web around the Fukushima Daiichi NPP. The source term to the ocean from the Fukushima accident includes a 10-days release of Cs-134 (5 PBq) and Cs-137 (4 PBq) directly into the ocean and 6 and 5 PBq of Cs-134 and
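
    A toy two-box version of the compartment approach (not the POSEIDON-R configuration) can be written as a pair of coupled ordinary differential equations; volumes, fluxes and the release are placeholders chosen only to make the example run.

```python
# Activity in a coastal box exchanges with an open-ocean box while decaying.
import numpy as np
from scipy.integrate import solve_ivp

V1, V2 = 1.0e12, 5.0e13               # box volumes (m^3), hypothetical
Q = 2.0e10                            # water exchange flux between boxes (m^3/day)
lam = np.log(2.0) / (30.1 * 365.25)   # Cs-137 decay constant (1/day)

def rhs(t, a):
    """a[i] = total activity (Bq) in box i; exchange follows the water flux."""
    c1, c2 = a[0] / V1, a[1] / V2          # concentrations (Bq/m^3)
    exchange = Q * (c1 - c2)               # net transfer from box 1 to box 2
    return [-exchange - lam * a[0], exchange - lam * a[1]]

a0 = [4.0e15, 0.0]                    # 4 PBq released into the coastal box
sol = solve_ivp(rhs, (0.0, 3650.0), a0, rtol=1e-8)
print("coastal-box concentration after 10 years:", sol.y[0, -1] / V1, "Bq/m^3")
```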

  14. Modelling transport and deposition of caesium and iodine from the Chernobyl accident using the DREAM model

    Directory of Open Access Journals (Sweden)

    J. Brandt

    2002-01-01

    Full Text Available A tracer model, DREAM (the Danish Rimpuff and Eulerian Accidental release Model), has been developed for modelling transport, dispersion and deposition (wet and dry) of radioactive material from accidental releases such as the Chernobyl accident. The model is a combination of a Lagrangian model, which includes the near-source dispersion, and an Eulerian model describing the long-range transport. The performance of the transport model has previously been tested within the European Tracer Experiment, ETEX, which included transport and dispersion of an inert, non-depositing tracer from a controlled release. The focus of this paper is the model performance with respect to the total deposition of 137Cs, 134Cs and 131I from the Chernobyl accident, using different relatively simple and comprehensive parameterizations for dry and wet deposition. The performance, compared to measurements, of different combinations of two wet deposition parameterizations and three dry deposition parameterizations has been evaluated, using different statistical tests. The best model performance, compared to measurements, is obtained when parameterizing the total deposition as a combination of a simple method for dry deposition and a subgrid-scale averaging scheme for wet deposition based on relative humidities. The same major conclusion is obtained for all three radioactive isotopes and for two different deposition measurement databases. Large differences are seen in the results obtained by using the two different parameterizations of wet deposition, based on precipitation rates and relative humidities, respectively. The parameterization based on subgrid-scale averaging performs, in all cases, better than the parameterization based on precipitation rates. This indicates that the in-cloud scavenging process is more important than the below-cloud scavenging process for the submicron particles and that the precipitation rates are

  15. A graph model for preventing railway accidents based on the maximal information coefficient

    Science.gov (United States)

    Shao, Fubo; Li, Keping

    2017-01-01

    A number of factors influence railway safety. It is important to identify the main influencing factors and to establish the relationship between railway accidents and these factors. The maximal information coefficient (MIC) is a good measure of dependence for two-variable relationships and can capture a wide range of associations. Employing MIC, a graph model is proposed for preventing railway accidents which avoids complex mathematical computation. In the graph, nodes denote influencing factors of railway accidents and edges represent the dependence between the two linked factors. As the dependence level increases, the graph changes from a globally coupled graph to isolated points. Moreover, the important influencing factors, which are the key quantities to monitor, are identified from among many factors. Then the relationship between railway accidents and the important influencing factors is obtained by employing artificial neural networks. With this relationship, a warning mechanism is built by defining a dangerous zone. If the related factors fall into the dangerous zone during railway operations, the warning level should be raised. The warning mechanism can help prevent railway accidents and promote railway safety.
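
    A rough sketch of the graph construction is given below, assuming the minepy package for the MIC computation and networkx for the graph; the factor names and data are hypothetical.

```python
# Score pairwise dependence with MIC and keep an edge only above a threshold.
import numpy as np
import networkx as nx
from minepy import MINE

rng = np.random.default_rng(11)
factors = {
    "track_defects": rng.normal(size=300),
    "train_speed": rng.normal(size=300),
    "signal_faults": rng.normal(size=300),
}
factors["human_error"] = 0.7 * factors["train_speed"] + 0.3 * rng.normal(size=300)

mine, graph, threshold = MINE(alpha=0.6, c=15), nx.Graph(), 0.3
names = list(factors)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        mine.compute_score(factors[a], factors[b])
        if mine.mic() > threshold:          # keep only sufficiently dependent pairs
            graph.add_edge(a, b, weight=mine.mic())

print(graph.edges(data=True))
```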

  16. Using meteorological ensembles for atmospheric dispersion modelling of the Fukushima nuclear accident

    Science.gov (United States)

    Périllat, Raphaël; Korsakissok, Irène; Mallet, Vivien; Mathieu, Anne; Sekiyama, Thomas; Didier, Damien; Kajino, Mizuo; Igarashi, Yasuhito; Adachi, Kouji

    2016-04-01

    Dispersion models are used in response to an accidental release of radionuclides to the atmosphere, to inform mitigation actions, and to complement field measurements for the assessment of short- and long-term environmental and sanitary impacts. However, the predictions of these models are subject to important uncertainties, especially due to input data such as meteorological fields or the source term. This is still the case more than four years after the Fukushima disaster (Korsakissok et al., 2012, Girard et al., 2014). In the framework of the SAKURA project, an MRI-IRSN collaboration, a meteorological ensemble of 20 members designed by MRI (Sekiyama et al. 2013) was used with IRSN's atmospheric dispersion models. Another ensemble, retrieved from ECMWF and comprising 50 members, was also used for comparison. The MRI ensemble is 3-hour assimilated, with a 3-kilometer resolution, designed to reduce the meteorological uncertainty in the Fukushima case. The ECMWF ensemble is a 24-hour forecast with a coarser grid, representative of the uncertainty of the data available in a crisis context. First, it was necessary to assess the quality of the ensembles for our purpose, to ensure that their spread was representative of the uncertainty of the meteorological fields. Using meteorological observations allowed characterizing the ensembles' spread, with tools such as Talagrand diagrams. Then, the uncertainty was propagated through the atmospheric dispersion models. The underlying question is whether the output spread is larger than the input spread, that is, whether small uncertainties in meteorological fields can produce large differences in atmospheric dispersion results. Here again, the use of field observations was crucial in order to characterize the spread of the ensemble of atmospheric dispersion simulations. In the case of the Fukushima accident, gamma dose rates, air activities and deposition data were available. Based on these data, selection criteria for the ensemble members were
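
    The rank-histogram (Talagrand-diagram) check mentioned above can be sketched as follows on synthetic data: for each observation, the rank of the observation among the ensemble members is histogrammed, and a roughly flat histogram suggests the ensemble spread is consistent with the observations.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
n_obs, n_members = 500, 20
ensemble = rng.normal(0.0, 1.0, size=(n_obs, n_members))   # e.g. 10 m wind speed
observations = rng.normal(0.0, 1.0, size=n_obs)            # matching observations

ranks = (ensemble < observations[:, None]).sum(axis=1)     # rank of obs among members

plt.hist(ranks, bins=np.arange(n_members + 2) - 0.5, edgecolor="black")
plt.xlabel("rank of observation among ensemble members")
plt.ylabel("count")
plt.title("Talagrand (rank) histogram")
plt.show()
```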

  17. MELCOR analysis of the TMI-2 accident

    Energy Technology Data Exchange (ETDEWEB)

    Boucheron, E.A.

    1990-01-01

    This paper describes the analysis of the Three Mile Island-2 (TMI-2) standard problem that was performed with MELCOR. The MELCOR computer code is being developed by Sandia National Laboratories for the Nuclear Regulatory Commission for the purpose of analyzing severe accidents in nuclear power plants. The primary role of MELCOR is to provide realistic predictions of severe accident phenomena and the radiological source term. The analysis of the TMI-2 standard problem allowed for comparison of the model predictions in MELCOR to plant data and to the results of more mechanistic analyses. This exercise was therefore valuable for verifying and assessing the models in the code. The major trends in the TMI-2 accident are reasonably well predicted with MELCOR, even with its simplified modeling. Comparison of the calculated and measured results is presented and, based on this comparison, conclusions can be drawn concerning the applicability of MELCOR to severe accident analysis. 5 refs., 10 figs., 3 tabs.

  18. Development of a fission product transport module predicting the behavior of radiological materials during sever accidents in a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hyung Seok; Rhee, Bo Wook; Kim, Dong Ha [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-09-15

    The Korea Atomic Energy Research Institute is developing a fission product transport module for predicting the behavior of radioactive materials in the primary cooling system of a nuclear power plant as a separate module, which will be connected to a severe accident analysis code, the Core Meltdown Progression Accident Simulation Software (COMPASS). This fission product transport (COMPASS-FP) module consists of a fission product release model, an aerosol generation model, and an aerosol transport model. The fission product release model contains three submodels based on empirical correlations, which are used to simulate the release of fission product gases from the reactor core. In the aerosol generation model, the mass conservation law and Raoult's law are applied to the mixture of vapors and droplets of the fission products in a specified control volume to find the generation of aerosol droplets. In the aerosol transport model, empirical correlations available from the open literature are used to simulate the aerosol removal processes owing to gravitational settling, inertial impaction, diffusiophoresis, and thermophoresis. The COMPASS-FP module was validated against the Aerosol Behavior Code Validation and Evaluation (ABCOVE-5) test performed by the Hanford Engineering Development Laboratory by comparing the predictions with the test data. The comparison results for the suspended aerosol mass concentration, assuming a non-spherical aerosol shape, showed good agreement, with an error range of about ±6%. It was found that the COMPASS-FP module produced reasonable results for the fission product gas release, the aerosol generation, and the gravitational settling in the aerosol removal processes for ABCOVE-5. However, more validation of the other aerosol removal models needs to be performed.

  19. Development of a parametric containment event tree model of a severe PWR accident

    Energy Technology Data Exchange (ETDEWEB)

    Okkonen, T. [OTO-Consulting Ay, Helsinki (Finland)

    1996-06-01

    The study supports the development project of STUK on `Living` PSA Level 2. The main work objective is to develop review tools for the Level 2 PSA studies underway at the utilities. The SPSA (STUK PSA) code is specifically designed for the purpose. In this work, SPSA is utilized as the Level 2 programming and calculation tool. A containment event tree (CET) model is built for analysis of severe accidents at the Loviisa pressurized water reactor (PWR) units. Parametric models of severe accident progression and fission product behaviour are developed and integrated in order to construct a compact and self-contained Level 2 PSA model. The model can be easily updated to include new research results, and so it facilitates the Living PSA concept on Level 2 as well. The analyses of the study are limited to severe accidents starting from full-power operation and leading to core melting at a low primary system pressure. Severe accident progression from five plant damage states (PDSs) is examined, however the integration with Level 1 is deferred to more definitive, integrated, safety assessments. (34 refs., 5 figs., 9 tabs.).

  20. Predictive Models for Music

    OpenAIRE

    Paiement, Jean-François; Grandvalet, Yves; Bengio, Samy

    2008-01-01

    Modeling long-term dependencies in time series has proved very difficult to achieve with traditional machine learning methods. This problem occurs when considering music data. In this paper, we introduce generative models for melodies. We decompose melodic modeling into two subtasks. We first propose a rhythm model based on the distributions of distances between subsequences. Then, we define a generative model for melodies given chords and rhythms based on modeling sequences of Narmour featur...

  1. Input-output model for MACCS nuclear accident impacts estimation

    Energy Technology Data Exchange (ETDEWEB)

    Outkin, Alexander V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bixler, Nathan E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vargas, Vanessa N [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-27

    Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
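
    A back-of-the-envelope illustration of the Input-Output logic, using the Leontief relation x = (I - A)^(-1) d, is given below; the three-sector matrix and the disruption fractions are purely illustrative, not REAcct data.

```python
# A demand reduction in affected sectors propagates to all sectors through the
# Leontief inverse; the difference in total output is a crude GDP-loss proxy.
import numpy as np

A = np.array([[0.10, 0.20, 0.05],     # inter-industry coefficients (hypothetical)
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
d_baseline = np.array([100.0, 80.0, 120.0])          # final demand ($M)
d_disrupted = d_baseline * np.array([0.6, 0.9, 1.0]) # accident hits sector 1 hardest

leontief_inverse = np.linalg.inv(np.eye(3) - A)
x_baseline = leontief_inverse @ d_baseline
x_disrupted = leontief_inverse @ d_disrupted

print(f"total output loss: {x_baseline.sum() - x_disrupted.sum():.1f} $M")
```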

  2. Potential consequences in Norway after a hypothetical accident at Leningrad nuclear power plant. Potential release, fallout and predicted impacts on the environment

    Energy Technology Data Exchange (ETDEWEB)

    Nalbandyan, A.; Ytre-Eide, M.A.; Thoerring, H.; Liland, A.; Bartnicki, J.; Balonov, M.

    2012-06-15

    The report describes different hypothetical accident scenarios at the Leningrad nuclear power plant for both RBMK and VVER-1200 reactors. The estimated release is combined with different meteorological scenarios to predict possible fallout of radioactive substances in Norway. For a hypothetical catastrophic accident at an RBMK reactor combined with a meteorological worst case scenario, the consequences in Norway could be considerable. Foodstuffs in many regions would be contaminated above the food intervention levels for radioactive cesium in Norway. (Author)

  3. Modeling Zero-Inflated Regression of Road Accidents at Johor Federal Road F001

    Directory of Open Access Journals (Sweden)

    Prasetijo Joewono

    2016-01-01

    This study focused on Poisson regression with excess zero outcomes in the response variable. Generalized linear modelling techniques such as the Poisson regression model and the Negative Binomial model were found to be inadequate for explaining and handling the overdispersion caused by the high proportion of zeros, so Zero-Inflated models were introduced to overcome the problem. The models were applied to the number of road accidents on F001 Jalan JB – Air Hitam. Road accident data were collected for the five-year period from 2010 through 2014. The results of the analysis show that the ZINB model performed best in terms of the comparative criteria, with a p-value less than 0.05.
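    A zero-inflated negative binomial fit of this kind can be sketched with statsmodels, assuming a reasonably recent version that provides ZeroInflatedNegativeBinomialP. The covariate, coefficients, and counts below are synthetic placeholders, not the F001 data.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(0)
n = 500
aadt = rng.uniform(5, 50, n)            # hypothetical traffic-volume covariate (thousands)
X = sm.add_constant(aadt)

# Synthetic zero-inflated counts: structural zeros plus a negative-binomial process
structural_zero = rng.random(n) < 0.4
mu = np.exp(-1.0 + 0.05 * aadt)
counts = np.where(structural_zero, 0, rng.negative_binomial(2, 2 / (2 + mu)))

# ZINB with the same covariates in the count part and the inflation (logit) part
model = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=X, p=2)
result = model.fit(maxiter=200, disp=False)
print(result.summary())
```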

  4. Modeling and sensitivity analysis of transport and deposition of radionuclides from the Fukushima Daiichi accident

    Directory of Open Access Journals (Sweden)

    X. Hu

    2014-01-01

    The atmospheric transport and ground deposition of the radioactive isotopes 131I and 137Cs during and after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident (March 2011) are investigated using the Weather Research and Forecasting/Chemistry (WRF/Chem) model. The aim is to assess the skill of WRF in simulating these processes and the sensitivity of the model's performance to various parameterizations of unresolved physics. The WRF/Chem model is first upgraded by implementing a radioactive decay term into the advection-diffusion solver and adding three parameterizations for dry deposition and two parameterizations for wet deposition. Different microphysics and horizontal turbulent diffusion schemes are then tested for their ability to reproduce observed meteorological conditions. Subsequently, the influence on the simulated transport and deposition of the characteristics of the emission source, including the emission rate, the gas partitioning of 131I and the size distribution of 137Cs, is examined. The results show that the model can predict the wind fields and rainfall realistically. The ground deposition of the radionuclides can also potentially be captured well, but it is very sensitive to the emission characterization. It is found that the total deposition is most influenced by the emission rate for both 131I and 137Cs, while it is less sensitive to the dry deposition parameterizations. Moreover, for 131I, the deposition is also sensitive to the microphysics schemes, the horizontal diffusion schemes, gas partitioning and wet deposition parameterizations; for 137Cs, the deposition is very sensitive to the microphysics schemes and wet deposition parameterizations, and it is also sensitive to the horizontal diffusion schemes and the size distribution.
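    The radioactive decay term added to the advection-diffusion solver can be pictured as an operator-split update applied to the tracer field after every transport step, as in the minimal sketch below. The field shape and time step are arbitrary; only the 131I half-life is a physical constant.

```python
import numpy as np

def apply_radioactive_decay(concentration, half_life_s, dt_s):
    """Operator-split decay update applied after each advection-diffusion step.

    concentration : array of tracer concentrations (any shape)
    half_life_s   : radionuclide half-life in seconds (about 8.02 days for 131I)
    dt_s          : model time step in seconds
    """
    decay_const = np.log(2.0) / half_life_s
    return concentration * np.exp(-decay_const * dt_s)

# Example: decay a 131I field over a 60 s model step
c = np.full((10, 10), 1.0)
c = apply_radioactive_decay(c, half_life_s=8.02 * 86400.0, dt_s=60.0)
print(c[0, 0])
```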

  5. Zephyr - the prediction models

    DEFF Research Database (Denmark)

    Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg

    2001-01-01

    This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Dani...

  6. Modelling of HTR Confinement Behaviour during Accidents Involving Breach of the Helium Pressure Boundary

    Directory of Open Access Journals (Sweden)

    Joan Fontanet

    2009-01-01

    Development of HTRs requires the performance of a thorough safety study, which includes accident analyses. Confinement building performance is a key element of the system, since the behaviour of aerosols and attached fission products within the building is of utmost relevance in terms of the potential source term to the environment. This paper explores the available simulation capabilities (the ASTEC and CONTAIN codes) and illustrates the performance of a postulated HTR vented confinement under prototypical accident conditions through a scoping study based on two accident sequences characterized by Helium Pressure Boundary breaches, a small and a large break. The results obtained indicate that both codes predict very similar thermal-hydraulic responses of the confinement, both in magnitude and timing. As for the aerosol behaviour, both codes predict that most of the inventory coming into the confinement is eventually depleted on the walls and only about 1% of the aerosol dust is released to the environment. The cross-comparison of the codes shows that the largest differences are in the intercompartmental flows and the in-compartment gas composition.

  7. Fault-tree Models of Accident Scenarios of RoPax Vessels

    Institute of Scientific and Technical Information of China (English)

    Pedro Antão; C. Guedes Soares

    2006-01-01

    Ro-Ro vessels for cargo and passengers (RoPax) are a relatively new concept that has proven to be popular in the Mediterranean region and is becoming more widespread in Northern Europe. Due to their design characteristics and the number of passengers carried, although fewer than on a regular passenger liner, accidents with RoPax vessels have far-reaching consequences, both economic and in terms of human life. The objective of this paper is to identify hazards related to casualties of RoPax vessels. The terminal casualty events chosen are related to accident and incident statistics for this type of vessel. This paper focuses on the identification of the basic events that can lead to an accident and the performance requirements. The hazard identification is carried out as the first step of a Formal Safety Assessment (FSA) and the modelling of the relations between the relevant events is made using Fault Tree Analysis (FTA). The conclusions of this study are recommendations to the later steps of FSA rather than for decision making (Step 5 of FSA). These recommendations are focused on the possible design shortcomings identified during the fault tree analysis through cut sets. The role of human factors is also analysed through a sensitivity analysis, which shows that their influence is higher for groundings and collisions, where an increase in the initial probability changes the accident occurrence by almost 90%.
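    The cut-set reasoning mentioned above can be illustrated with a small sketch: the top-event probability is approximated by summing the minimal cut set probabilities (rare-event approximation), and a sensitivity run simply scales one basic event. The event names, probabilities, and cut sets are hypothetical, not the RoPax fault trees of the paper.

```python
def cut_set_probability(cut_set, basic_event_prob):
    """Probability that every basic event in one minimal cut set occurs."""
    p = 1.0
    for event in cut_set:
        p *= basic_event_prob[event]
    return p

def top_event_probability(cut_sets, basic_event_prob):
    """Rare-event approximation: sum of the minimal cut set probabilities."""
    return sum(cut_set_probability(cs, basic_event_prob) for cs in cut_sets)

# Hypothetical basic events and minimal cut sets for a grounding scenario
probs = {"nav_error": 1e-2, "radar_failure": 5e-3,
         "lookout_missed": 2e-2, "steering_failure": 1e-3}
cut_sets = [("nav_error", "lookout_missed"),
            ("radar_failure", "lookout_missed"),
            ("steering_failure",)]

base = top_event_probability(cut_sets, probs)
# Simple sensitivity: scale the human-factor event and observe the change in the top event
probs_hf = dict(probs, lookout_missed=probs["lookout_missed"] * 2)
print(base, top_event_probability(cut_sets, probs_hf))
```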

  8. Preliminary modeling of the TMI-2 accident with MELPROG-TRAC

    Energy Technology Data Exchange (ETDEWEB)

    Jenks, R.P.

    1988-01-01

    In support of Nuclear Regulatory Commission and Organization for Economic Cooperation and Development (OECD)-sponsored Three Mile Island-Unit 2 (TMI-2) Analysis Exercise studies, work has been performed to develop a simulation model of the TMI-2 plant for use with the integrated MELPROG-TRAC computer code. Numerous nuclear power plant simulation studies have been performed with the TRAC computer code in the past. Some of these addressed the TMI-2 accident or other hypothetical events at the TMI plant. In addition, studies have been previously performed with the MELPROG-TRAC code using Oconee-1 and Surry plant models. This paper describes the preliminary MELPROG-TRAC input model for severe accident analysis.

  9. Development of an Ontology to Assist the Modeling of Accident Scenarii "Application on Railroad Transport "

    CERN Document Server

    Maalel, Ahmed; Mejri, Lassad; Ghezela, Henda Hajjami Ben

    2012-01-01

    In a world where communication and information sharing are at the heart of our business, terminology needs are most pressing. It has become imperative to identify the terms used and define them in a consensual and coherent way while preserving linguistic diversity. To streamline and strengthen the process of acquisition, representation and exploitation of train accident scenarios, it is necessary to harmonize and standardize the terminology used by players in the safety field. The research aims to significantly improve the analytical activities and operations of the various safety studies by tracing errors to system, hardware, software and human causes. This paper presents the contribution of ontology to the modeling of rail accident scenarios through a knowledge model based on a generic ontology and a domain ontology. After a detailed presentation of the state of the art, this article presents the first results of the developed model.

  10. Generation IV benchmarking of TRISO fuel performance models under accident conditions: Modeling input data

    Energy Technology Data Exchange (ETDEWEB)

    Collin, Blaise P. [Idaho National Laboratory (INL), Idaho Falls, ID (United States)

    2014-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from "Case 5" of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. "Case 5" of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to "effects of the numerical calculation method rather than the physical model" [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison.

  11. Confidence scores for prediction models

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; van de Wiel, MA

    2011-01-01

    modelling strategy is applied to different training sets. For each modelling strategy we estimate a confidence score based on the same repeated bootstraps. A new decomposition of the expected Brier score is obtained, as well as the estimates of population average confidence scores. The latter can be used...... to distinguish rival prediction models with similar prediction performances. Furthermore, on the subject level a confidence score may provide useful supplementary information for new patients who want to base a medical decision on predicted risk. The ideas are illustrated and discussed using data from cancer...
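    The Brier score underlying these confidence scores, together with a bootstrap spread over resampled test sets, can be computed with a few lines of numpy. This is a generic sketch of comparing rival prediction models, not the specific decomposition proposed in the paper; the outcomes and predicted risks are made up.

```python
import numpy as np

def brier_score(y_true, p_hat):
    """Mean squared difference between predicted risk and binary outcome."""
    y_true = np.asarray(y_true, dtype=float)
    p_hat = np.asarray(p_hat, dtype=float)
    return np.mean((p_hat - y_true) ** 2)

def bootstrap_brier(y_true, p_hat, n_boot=1000, seed=0):
    """Bootstrap distribution of the Brier score over resampled test sets."""
    rng = np.random.default_rng(seed)
    y_true = np.asarray(y_true)
    p_hat = np.asarray(p_hat)
    idx = rng.integers(0, len(y_true), size=(n_boot, len(y_true)))
    return np.array([brier_score(y_true[i], p_hat[i]) for i in idx])

# Hypothetical predicted risks from two rival models on the same test set
y = np.array([0, 1, 0, 1, 1, 0, 0, 1])
model_a = np.array([0.1, 0.8, 0.3, 0.7, 0.9, 0.2, 0.4, 0.6])
model_b = np.array([0.3, 0.6, 0.4, 0.5, 0.7, 0.4, 0.5, 0.5])
print(brier_score(y, model_a), brier_score(y, model_b))
print(bootstrap_brier(y, model_a).std(), bootstrap_brier(y, model_b).std())
```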

  12. Modelling, controlling, predicting blackouts

    CERN Document Server

    Wang, Chengwei; Baptista, Murilo S

    2016-01-01

    The electric power system is one of the cornerstones of modern society. One of its most serious malfunctions is the blackout, a catastrophic event that may disrupt a substantial portion of the system, playing havoc with human life and causing great economic losses. Thus, understanding the mechanisms leading to blackouts and creating a reliable and resilient power grid has been a major issue, attracting the attention of scientists, engineers and stakeholders. In this paper, we study the blackout problem in power grids by considering a practical phase-oscillator model. This model allows one to simultaneously consider different types of power sources (e.g., traditional AC power plants and renewable power sources connected by DC/AC inverters) and different types of loads (e.g., consumers connected to distribution networks and consumers directly connected to power plants). We propose two new control strategies based on our model, one for traditional power grids, and another one for smart grids. The control strategie...

  13. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Chernobyl and Fukushima nuclear accidents: what has changed in the use of atmospheric dispersion modeling?

    Science.gov (United States)

    Benamrane, Y; Wybo, J-L; Armand, P

    2013-12-01

    The threat of a major accidental or deliberate event that would lead to the emission of hazardous materials into the atmosphere is a great cause of concern to societies. This is due to the potentially large scale of casualties and damage that could result from the release of explosive, flammable or toxic gases from industrial plants or transport accidents, radioactive material from nuclear power plants (NPPs), and chemical, biological, radiological or nuclear (CBRN) terrorist attacks. In order to respond efficiently to such events, emergency services and authorities resort to appropriate planning and organizational patterns. This paper focuses on the use of atmospheric dispersion modeling (ADM) as a support tool for emergency planning and response, to assess the propagation of the hazardous cloud and thereby take adequate countermeasures. This paper intends to illustrate the noticeable evolution in the operational use of ADM tools over 25 y, especially in emergency situations. The study is based on data available in scientific publications and is exemplified using the two most severe nuclear accidents: Chernobyl (1986) and Fukushima (2011). It appears that during the Chernobyl accident, ADM was used a few days after the beginning of the accident, mainly in a diagnostic approach trying to reconstruct what happened, whereas 25 y later, ADM was also used during the first days and weeks of the Fukushima accident to anticipate the potentially threatened areas. We argue that recent developments in ADM tools play an increasing role in emergency and crisis management, by supporting stakeholders in anticipating, monitoring and assessing post-event damage. However, despite technological evolution, its prognostic and diagnostic use in emergency situations still raises many issues.

  15. Phenomenological and mechanistic modeling of melt-structure-water interactions in a light water reactor severe accident

    Energy Technology Data Exchange (ETDEWEB)

    Bui, V.A

    1998-10-01

    The objective of this work is to address the modeling of the thermal hydrodynamic phenomena and interactions occurring during the progression of reactor severe accidents. Integrated phenomenological models are developed to describe the accident scenarios, which consist of many processes, while mechanistic modeling, including direct numerical simulation, is carried out to describe separate effects and selected physical phenomena of particular importance. (88 refs, 54 figs, 7 tabs.)

  16. Impact injury prediction by FE human body model

    Directory of Open Access Journals (Sweden)

    Hynčík L.

    2008-12-01

    Biomechanical simulations are powerful instruments used in many areas such as traffic, medicine, sport and the army. The simulations are often performed with models based on the Finite Element Method. A great strength of deformable FE models of human bodies is their ability to predict injuries during accidents. Thanks to its modular implementation of thorax and abdomen FE models, the human articulated rigid body model ROBBY, previously developed at the University of West Bohemia in cooperation with ESI Group (Engineering Simulation for Industry), can be used for this purpose. The ROBBY model, representing an average adult man, is still being improved to obtain a more precise model of the human body with the capability to predict injuries during accidents. Recently, a newly generated thoracic model was embedded into the ROBBY model and subsequently validated satisfactorily. In this study, the updated ROBBY model was used to investigate head and thorax injuries during frontal crashes simulated by means of two types of sled tests with various types of restraint system (shoulder belt, lap belt and airbag). The results of the simulations were compared with the experimental ones.

  17. A Novel Exercise Thermophysiology Comfort Prediction Model with Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Nan Jia

    2016-01-01

    Participation in a regular exercise program can improve health status and contribute to an increase in life expectancy. However, exercise accidents like dehydration, exertional heatstroke, syncope, and even sudden death exist. If these accidents can be analyzed or predicted before they happen, it will help to alleviate or avoid uncomfortable or unacceptable health outcomes. Therefore, an exercise thermophysiology comfort prediction model is needed. In this paper, coupling the thermal interactions among the human body, clothing, and environment (HCE) with the physiological properties of the human body, a human thermophysiology regulatory model is designed to enhance the human thermophysiology simulation in the HCE system. Some important thermal and physiological quantities can be simulated. Based on the simulation results, a human exercise thermophysiology comfort prediction method using a fuzzy inference system is proposed. The experimental results show that the predicted thermophysiological comfort follows the same trend as the experimental results. Finally, a mobile application platform for human exercise comfort prediction is designed and implemented.

  18. Modeling and Simulation of Release of Radiation in Flow Blockage Accident for Two Loops PWR

    OpenAIRE

    Khurram Mehboob; Cao Xinrong; Majid Ali

    2012-01-01

    In this study, modeling and simulation of the release of radiation from a two-loop PWR has been carried out for a flow blockage accident. For this purpose, a MATLAB-based program, “Source Term Evaluator for Flow Blockage Accident” (STEFBA), has been developed, which uses the core inventory as its primary input. The TMI-2 reactor is considered as the reference plant for this study. For 1100 reactor operation days, the core inventory has been evaluated under the core design constraints at average reactor ...

  19. Initial VHTR accident scenario classification: models and data.

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, R. B.; Feldman, E. E.; Pointer, W. D.; Wei, T. Y. C.; Nuclear Engineering Division

    2005-09-30

    Nuclear systems codes are being prepared for use as computational tools for conducting performance/safety analyses of the Very High Temperature Reactor. The thermal-hydraulic codes are RELAP5/ATHENA for one-dimensional systems modeling and FLUENT and/or Star-CD for three-dimensional modeling. We describe a formal qualification framework, the development of Phenomena Identification and Ranking Tables (PIRTs), the initial filtering of the experiment databases, and a preliminary screening of these codes for use in the performance/safety analyses. In the second year of this project we focused on development of PIRTS. Two events that result in maximum fuel and vessel temperatures, the Pressurized Conduction Cooldown (PCC) event and the Depressurized Conduction Cooldown (DCC) event, were selected for PIRT generation. A third event that may result in significant thermal stresses, the Load Change event, is also selected for PIRT generation. Gas reactor design experience and engineering judgment were used to identify the important phenomena in the primary system for these events. Sensitivity calculations performed with the RELAP5 code were used as an aid to rank the phenomena in order of importance with respect to the approach of plant response to safety limits. The overall code qualification methodology was illustrated by focusing on the Reactor Cavity Cooling System (RCCS). The mixed convection mode of heat transfer and pressure drop is identified as an important phenomenon for Reactor Cavity Cooling System (RCCS) operation. Scaling studies showed that the mixed convection mode is likely to occur in the RCCS air duct during normal operation and during conduction cooldown events. The RELAP5/ATHENA code was found to not adequately treat the mixed convection regime. Readying the code will require adding models for the turbulent mixed convection regime while possibly performing new experiments for the laminar mixed convection regime. Candidate correlations for the turbulent

  20. Application of a predictive Bayesian model to environmental accounting.

    Science.gov (United States)

    Anex, R P; Englehardt, J D

    2001-03-30

    Environmental accounting techniques are intended to capture important environmental costs and benefits that are often overlooked in standard accounting practices. Environmental accounting methods themselves often ignore or inadequately represent large but highly uncertain environmental costs and costs conditioned by specific prior events. Use of a predictive Bayesian model is demonstrated for the assessment of such highly uncertain environmental and contingent costs. The predictive Bayesian approach presented generates probability distributions for the quantity of interest (rather than parameters thereof). A spreadsheet implementation of a previously proposed predictive Bayesian model, extended to represent contingent costs, is described and used to evaluate whether a firm should undertake an accelerated phase-out of its PCB containing transformers. Variability and uncertainty (due to lack of information) in transformer accident frequency and severity are assessed simultaneously using a combination of historical accident data, engineering model-based cost estimates, and subjective judgement. Model results are compared using several different risk measures. Use of the model for incorporation of environmental risk management into a company's overall risk management strategy is discussed.
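    The predictive Bayesian idea of generating a probability distribution for the quantity of interest, rather than only for its parameters, can be sketched with the standard Poisson-Gamma conjugate pair, whose posterior predictive for a future count is negative binomial. This is a generic illustration, not the authors' spreadsheet model; the prior hyperparameters, accident counts, and exposure below are hypothetical.

```python
from scipy import stats

# Gamma prior on the accident rate (events/year); hypothetical weak prior
prior_shape, prior_rate = 2.0, 4.0

# Hypothetical history: 3 transformer accidents observed over 20 unit-years
events, exposure = 3, 20.0
post_shape = prior_shape + events
post_rate = prior_rate + exposure

# With a Poisson likelihood and a Gamma posterior, the predictive distribution of
# the number of accidents over the next T years is negative binomial.
T = 10.0
p = post_rate / (post_rate + T)
predictive = stats.nbinom(post_shape, p)

print("P(no accidents in next 10 y) =", predictive.pmf(0))
print("P(at least 2 accidents)      =", 1 - predictive.cdf(1))
```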

  1. WSPEEDI (worldwide version of SPEEDI): A computer code system for the prediction of radiological impacts on Japanese due to a nuclear accident in foreign countries

    Energy Technology Data Exchange (ETDEWEB)

    Chino, Masamichi; Yamazawa, Hiromi; Nagai, Haruyasu; Moriuchi, Shigeru [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ishikawa, Hirohiko

    1995-09-01

    A computer code system has been developed for near real-time dose assessment during radiological emergencies. The system WSPEEDI, the worldwide version of SPEEDI (System for Prediction of Environmental Emergency Dose Information), aims at predicting the radiological impact on the Japanese population due to a nuclear accident in foreign countries. WSPEEDI consists of a mass-consistent wind model, WSYNOP, for large-scale wind fields and a particle random walk model, GEARN, for atmospheric dispersion and dry and wet deposition of radioactivity. The models are integrated into a computer code system together with system control software, a worldwide geographic database, a meteorological data processor and graphic software. The performance of the models has been evaluated using the Chernobyl case with reliable source terms, well-established meteorological data and a comprehensive monitoring database. Furthermore, the response of the system has been examined by near real-time simulations of the European Tracer Experiment (ETEX), carried out over a domain of about 2,000 km in Europe. (author).
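    The particle random walk approach used by a model such as GEARN can be pictured with a very small Lagrangian sketch: each particle is advected by the mean wind and given a random displacement whose variance is set by an eddy diffusivity. The wind components, diffusivity, time step, and particle count below are arbitrary illustration values; the real model of course also treats vertical motion, deposition, and time-varying meteorology.

```python
import numpy as np

def random_walk_step(x, y, u, v, k_h, dt, rng):
    """Advance particle positions one time step: mean-wind advection plus a
    random displacement representing horizontal turbulent diffusion.

    x, y : particle positions (m); u, v : wind components (m/s)
    k_h  : horizontal eddy diffusivity (m^2/s); dt : time step (s)
    """
    sigma = np.sqrt(2.0 * k_h * dt)
    x_new = x + u * dt + rng.normal(0.0, sigma, size=x.shape)
    y_new = y + v * dt + rng.normal(0.0, sigma, size=y.shape)
    return x_new, y_new

rng = np.random.default_rng(1)
x = np.zeros(10000)
y = np.zeros(10000)                                # release point at the origin
for _ in range(144):                               # 24 h with a 10-minute step
    x, y = random_walk_step(x, y, u=5.0, v=1.0, k_h=1.0e4, dt=600.0, rng=rng)
print("plume centre (km):", x.mean() / 1e3, y.mean() / 1e3)
```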

  2. Simulation Modeling Requirements for Loss-of-Control Accident Prevention of Turboprop Transport Aircraft

    Science.gov (United States)

    Crider, Dennis; Foster, John V.

    2012-01-01

    In-flight loss of control remains the leading contributor to aviation accident fatalities, with stall upsets being the leading causal factor. The February 12, 2009, Colgan Air, Inc., Continental Express flight 3407 accident outside Buffalo, New York, brought this issue to the forefront of public consciousness and resulted in recommendations from the National Transportation Safety Board to conduct training that incorporates stalls that are fully developed and to develop simulator standards to support such training. In 2010, Congress responded to this accident with Public Law 111-216 (Section 208), which mandates full stall training for Part 121 flight operations. Efforts are currently in progress to develop recommendations on implementation of stall training for airline pilots. The International Committee on Aviation Training in Extended Envelopes (ICATEE) is currently defining the simulator fidelity standards that will be necessary for effective stall training. These recommendations will apply to all civil transport aircraft, including straight-wing turboprop aircraft. Government-funded research over the previous decade provides a strong foundation for stall/post-stall simulation of swept-wing, conventional-tail jets to respond to this mandate, but turboprops present additional and unique modeling challenges. First among these challenges is the effect of power, which can provide enhanced flow attachment behind the propellers. Furthermore, turboprops tend to operate for longer periods in an environment more susceptible to ice. As a result, there have been a significant number of turboprop accidents as a result of early (lower angle of attack) stalls in icing. The vulnerability of turboprop configurations to icing has led to studies on ice accumulation and the resulting effects on flight behavior. Piloted simulations of these effects have highlighted the important training needs for recognition and mitigation of icing effects, including the reduction of stall margins

  3. Development of a Gravid Uterus Model for the Study of Road Accidents Involving Pregnant Women.

    Science.gov (United States)

    Auriault, F; Thollon, L; Behr, M

    2016-01-01

    Car accident simulations involving pregnant women are well documented in the literature and suggest that intra-uterine pressure could be responsible for the phenomenon of placental abruption, underlining the need for a realistic amniotic fluid model, including fluid-structure interactions (FSI). This study reports the development and validation of an amniotic fluid model using an Arbitrary Lagrangian Eulerian formulation in the LS-DYNA environment. Dedicated to the study of the mechanisms responsible for fetal injuries resulting from road accidents, the fluid model was validated using dynamic loading tests. Drop tests were performed on a deformable water-filled container at acceleration levels that would be experienced in a gravid uterus during a frontal car collision at 25 kph. During the test device braking phase, the container deformation induced by inertial effects and FSI was recorded by kinematic analysis. These tests were then simulated in the LS-DYNA environment to validate a fluid model under dynamic loading, based on the container deformations. Finally, the coupling between the amniotic fluid model and an existing finite-element full-body pregnant woman model was validated in terms of pressure. To do so, the results of experimental tests performed on four postmortem human surrogates (PMHS), in which a physical gravid uterus model was inserted, were used. The experimental intra-uterine pressure from these tests was compared to the intra-uterine pressure from a numerical simulation performed under the same loading conditions. Both the free-fall numerical and experimental responses appear strongly correlated. The coupling between the amniotic fluid model and the pregnant woman model provides intra-uterine pressure values correlated with the experimental test responses. The use of an Arbitrary Lagrangian Eulerian formulation allows the analysis of FSI between the amniotic fluid and the gravid uterus during a road accident involving pregnant women.

  4. Prediction models in complex terrain

    DEFF Research Database (Denmark)

    Marti, I.; Nielsen, Torben Skov; Madsen, Henrik

    2001-01-01

    The objective of the work is to investigate the performance of HIRLAM in complex terrain when used as input to energy production forecasting models, and to develop a statistical model to adapt HIRLAM predictions to the wind farm. The features of the terrain, especially the topography, influence...

  5. Towards the prediction of the rupture of a pressurized water reactor vessel in case of accident

    Energy Technology Data Exchange (ETDEWEB)

    Tardif, N.; Coret, M.; Combescure, A. [Lyon Univ., CNRS, INSA-Lyon, LaMCoS UMR5259, 69 (France); Tardif, N.; Nicaise, G. [Institut de Radioprotection et de Surete Nucleaire, DSR/SAGR/BPhAG, 92 - Fontenay-aux-Roses (France)

    2009-07-01

    Using a scale model of a reactor vessel subjected to thermal and mechanical loads during a severe accident, it is possible to follow the initiation and propagation of cracks in real time through tests carried out in the laboratory. (O.M.)

  6. Hybrid Model for Early Onset Prediction of Driver Fatigue with Observable Cues

    Directory of Open Access Journals (Sweden)

    Mingheng Zhang

    2014-01-01

    This paper presents a hybrid model for early-onset prediction of driver fatigue, which is a major cause of severe traffic accidents. The proposed method divides the prediction problem into three stages: an SVM-based model for predicting the early-onset driver fatigue state, a GA-based model for optimizing the parameters of the SVM, and a PCA-based model for reducing the dimensionality of the complex feature datasets. The model and algorithm are illustrated with driving experiment data, and comparison results show that the hybrid method generally provides better performance for driver fatigue state prediction.
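    A PCA-plus-SVM pipeline of the kind described can be sketched with scikit-learn. For illustration only, the genetic algorithm used in the paper for hyper-parameter optimisation is replaced here by a plain grid search, and the feature matrix and labels are synthetic stand-ins for real driving data.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for driver-state features (e.g. eyelid closure, steering entropy)
rng = np.random.default_rng(42)
X = rng.normal(size=(600, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=600) > 0).astype(int)  # 1 = fatigued

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("pca", PCA(n_components=5)),      # dimensionality-reduction stage
    ("svm", SVC(kernel="rbf")),        # fatigue-state classifier
])

# Stand-in optimiser for C and gamma (the paper uses a genetic algorithm)
search = GridSearchCV(pipe,
                      {"svm__C": [0.1, 1, 10], "svm__gamma": ["scale", 0.01, 0.1]},
                      cv=5)
search.fit(X_tr, y_tr)
print("best params:", search.best_params_, "test accuracy:", search.score(X_te, y_te))
```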

  7. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    Science.gov (United States)

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents at blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities at Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). For that, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two methods for estimating their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that these probabilistic models can be useful for describing the road accident blackspot datasets analyzed.
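    One way to picture a discrete Lomax fit of this kind is to discretise the continuous Lomax survival function and maximise the resulting log-likelihood, as in the sketch below. The crash counts are invented, and the parameterisation (shape alpha, scale sigma) is just one common convention rather than the exact notation of the paper.

```python
import numpy as np
from scipy.optimize import minimize

def discrete_lomax_logpmf(k, alpha, sigma):
    """log P(X = k) for a Lomax distribution discretised by survival differences:
    P(X = k) = S(k) - S(k + 1), with S(x) = (1 + x/sigma)^(-alpha)."""
    s_k = (1.0 + k / sigma) ** (-alpha)
    s_k1 = (1.0 + (k + 1.0) / sigma) ** (-alpha)
    return np.log(np.maximum(s_k - s_k1, 1e-300))   # guard against underflow

def fit_discrete_lomax(counts):
    """Maximum likelihood fit of (alpha, sigma) to non-negative integer data."""
    counts = np.asarray(counts, dtype=float)

    def nll(theta):
        alpha, sigma = np.exp(theta)                # optimise on the log scale for positivity
        return -np.sum(discrete_lomax_logpmf(counts, alpha, sigma))

    res = minimize(nll, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
    return np.exp(res.x)

# Hypothetical crash counts on a set of blackspots
crashes = np.array([0, 1, 1, 2, 0, 3, 5, 1, 0, 2, 8, 1, 0, 0, 4])
alpha_hat, sigma_hat = fit_discrete_lomax(crashes)
print(alpha_hat, sigma_hat)
```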

  8. Validation of advanced NSSS simulator model for loss-of-coolant accidents

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.P.; Chang, S.K.; Huang, H.C. [Nuclear Training Branch, Northeast Utilities, Waterford, CT (United States)

    1995-09-01

    The replacement of the NSSS (Nuclear Steam Supply System) model on the Millstone 2 full-scope simulator has significantly increased its fidelity in simulating adverse conditions in the RCS. The new simulator NSSS model is a real-time derivative of the Nuclear Plant Analyzer by ABB. The thermal-hydraulic model is a five-equation, non-homogeneous model for water, steam, and non-condensible gases. The neutronic model is a three-dimensional nodal diffusion model. In order to certify the new NSSS model for operator training, an extensive validation effort has been performed by benchmarking the model performance against RELAP5/MOD2. This paper presents the validation results for the cases of small- and large-break loss-of-coolant accidents (LOCA). Detailed comparisons of the phenomena of reflux condensation, phase separation, and two-phase natural circulation are discussed.

  9. ASTEC V2 severe accident integral code main features, current V2.0 modelling status, perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Chatelard, P., E-mail: patrick.chatelard@irsn.fr [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), PSN-RES, B.250, Cadarache BP3 13115, Saint-Paul-lez-Durance, Cedex (France); Reinke, N.; Arndt, S. [Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH, Schwertnergasse 1, 50677 Köln (Germany); Belon, S.; Cantrel, L.; Carenini, L.; Chevalier-Jabet, K.; Cousin, F. [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), PSN-RES, B.250, Cadarache BP3 13115, Saint-Paul-lez-Durance, Cedex (France); Eckel, J. [Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH, Schwertnergasse 1, 50677 Köln (Germany); Jacq, F.; Marchetto, C.; Mun, C.; Piar, L. [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), PSN-RES, B.250, Cadarache BP3 13115, Saint-Paul-lez-Durance, Cedex (France)

    2014-06-01

    The severe accident integral code ASTEC, jointly developed for almost 20 years by IRSN and GRS, simulates the behaviour of a whole nuclear power plant under severe accident conditions, including severe accident management by engineering systems and procedures. Since 2004, the ASTEC code has progressively become the reference European severe accident integral code, in particular through the intensification of research activities carried out in the frame of the SARNET European network of excellence. The first version of the new series, ASTEC V2, was released in 2009 to about 30 organizations worldwide and in particular to SARNET partners. With respect to the previous V1 series, this new V2 series includes advanced core degradation models (derived from the ICARE2 IRSN mechanistic code) and the extensions necessary to be applicable to Gen. III reactor designs, notably a description of the core catcher component to simulate severe accident transients applied to the EPR reactor. Besides these two key evolutions, most of the other physical modules have also been improved, and ASTEC V2 is now coupled to the SUNSET statistical tool to facilitate uncertainty and sensitivity analyses. The ASTEC models are today at the state of the art (in particular the fission product models with respect to source term evaluation), except for the quenching of a severely damaged core. Beyond the need to develop an adequate model for the reflooding of a degraded core, the other main medium-term objectives are to further progress on the ongoing extension of the scope of application to BWR and CANDU reactors, to spent fuel pool accidents, and to accidents in both the ITER fusion facility and Gen. IV reactors (with priority on sodium-cooled fast reactors), while making ASTEC evolve towards a severe accident simulator constitutes the main long-term objective. This paper presents the status of the ASTEC V2 versions, focussing on the description of V2.0 models for water-cooled nuclear plants.

  10. Analysis 320 coal mine accidents using structural equation modeling with unsafe conditions of the rules and regulations as exogenous variables.

    Science.gov (United States)

    Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun

    2016-07-01

    Mining has historically been considered a high-risk industry worldwide. In China, deaths caused by coal mine accidents exceed the sum of those from all other types of accidents. Statistics on 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations," with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, which mainly includes "operator error" and "venturing into dangerous places." A systems analysis approach was applied by using structural equation modeling (SEM) to examine the interactions between the contributory factors of coal mine accidents. The analysis of the results leads to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (with the frequency of effect relations in descending order) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors (with the frequency in descending order) of coal mine accidents are "venturing into dangerous places," "poor workplace environment," and "operator error."

  11. A model for the release, dispersion and environmental impact of a postulated reactor accident from a submerged commercial nuclear power plant

    Science.gov (United States)

    Bertch, Timothy Creston

    1998-12-01

    Nuclear power plants are inherently suitable for submerged applications and could provide power to the shore power grid or support future underwater applications. The technology exists today and the construction of a submerged commercial nuclear power plant may become desirable. A submerged reactor is safer to humans because the infinite supply of water for heat removal, particulate retention in the water column, sedimentation to the ocean floor and inherent shielding of the aquatic environment would significantly mitigate the effects of a reactor accident. A better understanding of reactor operation in this new environment is required to quantify the radioecological impact and to determine the suitability of this concept. The impact of release to the environment from a severe reactor accident is a new aspect of the field of marine radioecology. Current efforts have been centered on radioecological impacts of nuclear waste disposal, nuclear weapons testing fallout and shore nuclear plant discharges. This dissertation examines the environmental impact of a severe reactor accident in a submerged commercial nuclear power plant, modeling a postulated site on the Atlantic continental shelf adjacent to the United States. This effort models the effects of geography, decay, particle transport/dispersion, bioaccumulation and elimination with associated dose commitment. The use of a source term equivalent to the release from Chernobyl allows comparison between the impacts of that accident and the postulated submerged commercial reactor plant accident. All input parameters are evaluated using sensitivity analysis. The effect of the release on marine biota is determined. Study of the pathways to humans from gaseous radionuclides, consumption of contaminated marine biota and direct exposure as contaminated water reaches the shoreline is conducted. The model developed by this effort predicts a significant mitigation of the radioecological impact of the reactor accident release

  12. A model for the analysis of loss of decay heat removal during loss of coolant accident in MTR pool type research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Bousbia-salah, Anis [Dipartimento di Ingegneria Meccanica, Nucleari e della Produzione, Facolta di Ingegneria, Universita di Pisa, Via Diotisalvi, 2, 56126 Pisa (Italy)]. E-mail: b.salah@ing.unipi.it; Meftah, Brahim [Division Reacteur - Centre de Recherche Nucleaire Draria (CRND), BP 43 Sebala DRARIA - Algiers (Algeria); Hamidouche, Tewfik [Laboratoire des Analyses de Surete, Centre de Recherche Nucleaire d' Alger (CRNA), 02 Boulevard Frantz Fanon, B.P. 399, 16000 Algiers (Algeria)]. E-mail: thamidouche@comena-dz.org; Si-Ahmed, El Khider [Laboratoire des Ecoulements Polyhpasiques, Universite des Sciences et de la Technologie d' Alger, Algiers (Algeria)

    2006-03-15

    During a loss of coolant accident leading to complete emptying of the reactor pool, the decay heat could be removed through natural convection of air. However, with partial pool emptying the core is only partially submerged and coolant circulation inside the fuel element may no longer be possible. Under such conditions, core overheating takes place, and the thermal energy is essentially diffused from the core to its periphery by combined thermal radiation and conduction. In order to predict the fuel element temperature evolution under such conditions, a mathematical model has been developed. The model is based on a 3D geometry and takes into account a variety of core configurations including fuel elements (standard and control), reflector elements and grid plates. The homogeneous flow model is used and the fluid conservation equations are solved using a semi-implicit finite difference method. Preliminary tests of the developed model were made by considering a series of hypothetical accidents. In the current framework, a loss of decay heat removal accident in the IAEA benchmark open-pool MTR-type research reactor is considered. It is shown that in the case of a low core immersion height no water boiling is observed and the fuel surface temperature remains below the melting point of the aluminium cladding.
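    As a much-simplified picture of the conduction part of such a model, the sketch below advances a 1D explicit finite-difference temperature profile for a partially uncovered plate with a constant volumetric decay heat source, a fixed-temperature submerged end, and a radiative loss at the exposed end. All material properties, the heat source, and the boundary conditions are assumed illustration values; the paper's model is three-dimensional and uses a semi-implicit scheme.

```python
import numpy as np

# 1D transient conduction across a partially uncovered fuel plate, explicit scheme.
# Hypothetical aluminium-like properties, not the reactor-specific data.
L, n = 0.60, 60                 # plate height (m), number of nodes
dx = L / (n - 1)
k, rho, cp = 180.0, 2700.0, 900.0
alpha = k / (rho * cp)
dt = 0.4 * dx**2 / alpha        # below the explicit stability limit
q_decay = 2.0e5                 # volumetric decay heat (W/m^3), assumed constant
eps, sigma = 0.8, 5.67e-8       # emissivity, Stefan-Boltzmann constant
T_env = 300.0

T = np.full(n, 300.0)
for _ in range(20000):
    Tn = T.copy()
    # interior nodes: conduction + decay heat source
    T[1:-1] = Tn[1:-1] + dt * (alpha * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]) / dx**2
                               + q_decay / (rho * cp))
    # top node: radiative loss to the environment, lumped over one cell
    q_rad = eps * sigma * (Tn[-1]**4 - T_env**4)
    T[-1] = Tn[-1] + dt * (alpha * (Tn[-2] - Tn[-1]) / dx**2
                           + q_decay / (rho * cp) - q_rad / (rho * cp * dx))
    # bottom node held at an assumed pool water temperature (still-submerged end)
    T[0] = 320.0
print("peak plate temperature (K):", T.max())
```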

  13. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol. 2

    Energy Technology Data Exchange (ETDEWEB)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input) in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The user's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  14. Mesoscale modelling of radioactive contamination formation in Ukraine caused by the Chernobyl accident

    Energy Technology Data Exchange (ETDEWEB)

    Talerko, Nikolai [Scientific Center for Radiation Medicine, 53 Melnikov Street, Kyiv 04050 (Ukraine)]. E-mail: nick@rpi.kiev.ua

    2005-03-01

    This work is devoted to the reconstruction of the time-dependent radioactive contamination fields over the territory of Ukraine in the initial period of the Chernobyl accident using the atmospheric transport model LEDI (Lagrangian-Eulerian DIffusion model). The modelling results were compared with the available 137Cs air and ground contamination measurement data. The 137Cs atmospheric transport over the territory of Ukraine was simulated for the first 12 days after the accident (from 26 April to 7 May 1986) using real aerological information and rain measurement network data. A detailed scenario of the release from the damaged unit of the Chernobyl nuclear plant has been built (including the time-dependent radioactivity release intensity and the time-varying height of the release). The calculations make it possible to explain the main features of the spatial and temporal variations of the radioactive contamination fields over the territory of Ukraine on the regional scale, including the formation of the major large-scale spots of radioactive contamination caused by dry and wet deposition.

  15. COMPARING SAFE VS. AT-RISK BEHAVIORAL DATA TO PREDICT ACCIDENTS

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe

    2001-11-01

    The Safety Observations Achieve Results (SOAR) program at the Idaho National Laboratory (INL) encourages employees to perform in-field observations of each other’s behaviors. One purpose for performing these observations is that it gives the observers the opportunity to correct, if needed, their co-worker’s at-risk work practices and habits (i.e., behaviors). The underlying premise of doing this is that major injuries (e.g., OSHA-recordable events) are prevented from occurring because the lower level at-risk behaviors are identified and corrected before they can propagate into culturally accepted unsafe behaviors that result in injuries or fatalities. However, unlike other observation programs, SOAR also emphasizes positive reinforcement for safe behaviors observed. The underlying premise of doing this is that positive reinforcement of safe behaviors helps establish a strong positive safety culture. Since the SOAR program collects both safe and at-risk leading indicator data, this provides a unique opportunity to assess and compare the two kinds of data in terms of their ability to predict future adverse safety events. This paper describes the results of analyses performed on SOAR data to assess their relative predictive ability. Implications are discussed.

  16. Low-power and shutdown models for the accident sequence precursor (ASP) program

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others

    1997-02-01

    The US Nuclear Regulatory Commission (NRC) has been using full-power, Level 1, limited-scope risk models for the Accident Sequence Precursor (ASP) program for over fifteen years. These models have evolved and matured over the years, as have probabilistic risk assessment (PRA) and computer technologies. Significant upgrading activities have been undertaken over the past three years, with involvement from the Offices of Nuclear Reactor Regulation (NRR), Analysis and Evaluation of Operational Data (AEOD), and Nuclear Regulatory Research (RES), and several national laboratories. Part of these activities was an RES-sponsored feasibility study investigating the ability to extend the ASP models to include contributors to core damage from events initiated with the reactor at low power or shutdown (LP/SD), both internal events and external events. This paper presents only the LP/SD internal event modeling efforts.

  17. ASTEC V2 severe accident integral code: Fission product modelling and validation

    Energy Technology Data Exchange (ETDEWEB)

    Cantrel, L., E-mail: laurent.cantrel@irsn.fr; Cousin, F.; Bosland, L.; Chevalier-Jabet, K.; Marchetto, C.

    2014-06-01

    One main goal of the severe accident integral code ASTEC V2, jointly developed for more than 15 years by IRSN and GRS, is to simulate the overall behaviour of fission products (FP) in a damaged nuclear facility. ASTEC applications include source term determinations, Level 2 Probabilistic Safety Assessment (PSA2) studies including the determination of uncertainties, accident management studies and physical analyses of FP experiments to improve the understanding of the phenomenology. ASTEC is a modular code, and models of a part of the phenomenology are implemented in each module: the release of FPs and structural materials from degraded fuel in the ELSA module; the transport through the reactor coolant system, approximated as a sequence of control volumes, in the SOPHAEROS module; and the radiochemistry inside the containment building in the IODE module. Three other modules, CPA, ISODOP and DOSE, allow computing, respectively, the deposition rate of aerosols inside the containment, the activities of the isotopes as a function of time, and the gaseous dose rate, which is needed to model radiochemistry in the gaseous phase. In ELSA, release models are semi-mechanistic and have been validated against a wide range of experimental data, notably the VERCORS experiments. For SOPHAEROS, the models can be divided into two parts: vapour-phase phenomena and aerosol-phase phenomena. For IODE, iodine and ruthenium chemistry are modelled based on a semi-mechanistic approach; these FPs can form volatile species and are particularly important in terms of potential radiological consequences. The models in these three modules are based on a wide experimental database, resulting to a large part from international programmes, and they are considered to be at the state of the art of R and D knowledge. This paper illustrates some of the FP modelling capabilities of ASTEC, and computed values are compared to some experimental results, which are part of the validation matrix.

  18. DEVELOPMENT OF A CRASH RISK PROBABILITY MODEL FOR FREEWAYS BASED ON HAZARD PREDICTION INDEX

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2014-12-01

    This study presents a method for the identification of hazardous situations on freeways. The hazard identification is done using a crash risk probability model. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne (Australia) is selected as a test bed. Two categories of data, i.e. traffic data and accident record data, are used for the analysis and modelling. In developing the crash risk probability model, a Hazard Prediction Index is formulated in this study from the differences between traffic parameters and threshold values. Seven different prediction indices are examined and the best one is selected as the crash risk probability model based on prediction error minimisation.
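    A threshold-difference index of this general kind can be sketched as a weighted sum of exceedances mapped to a probability through a logistic link. The thresholds, weights, and link coefficients below are entirely hypothetical, and the paper's selection of the best of seven indices by prediction-error minimisation is not reproduced.

```python
import math

# Hypothetical thresholds and weights for a few traffic parameters
THRESHOLDS = {"speed_var": 12.0, "occupancy": 18.0, "flow_diff": 900.0}
WEIGHTS = {"speed_var": 0.08, "occupancy": 0.05, "flow_diff": 0.002}

def hazard_prediction_index(obs):
    """Weighted sum of exceedances of traffic parameters over their thresholds."""
    return sum(w * max(0.0, obs[k] - THRESHOLDS[k]) for k, w in WEIGHTS.items())

def crash_risk_probability(obs, a=-3.0, b=1.5):
    """Map the index to a probability with a logistic link (a and b are assumed)."""
    h = hazard_prediction_index(obs)
    return 1.0 / (1.0 + math.exp(-(a + b * h)))

obs = {"speed_var": 20.0, "occupancy": 25.0, "flow_diff": 1200.0}
print(hazard_prediction_index(obs), crash_risk_probability(obs))
```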

  19. Traffic accident prediction based on gray weighted Markov SCGM(1,1)c

    Institute of Scientific and Technical Information of China (English)

    赵玲; 许宏科

    2012-01-01

    The prediction of traffic accidents is the basis of transportation safety assessment, planning and decision-making. Based on grey system theory and Markov chain principles, a single-factor system cloud grey SCGM(1,1)c model is applied to fit the overall trend of the road traffic time series; its fitting index fluctuates randomly. Because the Markov chain method is suitable for forecasting stochastically fluctuating dynamic processes, a weighted Markov chain is selected to predict the fitting index. Combining the advantages of the two models, a grey weighted Markov SCGM(1,1)c model for road traffic accident frequency prediction is proposed; the new model is suitable for forecasting systems with short time series, few data and moderate random fluctuation. Finally, the new model is applied to predict the annual number of traffic accidents in Beijing from 1975 to 2010. The results show that the new model not only reveals the overall trend of traffic accidents but also overcomes the effect of randomly fluctuating data on prediction accuracy, and thus has strong engineering applicability.
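    The grey GM(1,1) construction underlying such a trend fit can be sketched in a few lines: accumulate the series, estimate the development and control coefficients by least squares, and difference the fitted accumulated series back. The annual counts below are invented, and the weighted Markov correction of the fitting index described in the abstract is not reproduced here.

```python
import numpy as np

def gm11_fit(x0):
    """Fit a GM(1,1) grey model to a short positive time series x0 and
    return the coefficients plus fitted/one-step-ahead values of the original series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # development / control coefficients

    def x1_hat(k):                                      # k = 0, 1, 2, ...
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    k = np.arange(len(x0) + 1)
    x1_fit = x1_hat(k)
    x0_fit = np.diff(x1_fit, prepend=x1_fit[0])
    x0_fit[0] = x0[0]
    return a, b, x0_fit                                 # last element is the next-step forecast

# Hypothetical annual accident counts
accidents = [620, 598, 571, 560, 542, 530]
a, b, fitted = gm11_fit(accidents)
print("development coefficient a =", a, "forecast for next year =", fitted[-1])
```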

  20. Influence of the meteorological input on the atmospheric transport modelling with FLEXPART of radionuclides from the Fukushima Daiichi nuclear accident.

    Science.gov (United States)

    Arnold, D; Maurer, C; Wotawa, G; Draxler, R; Saito, K; Seibert, P

    2015-01-01

    In the present paper, the role of precipitation as FLEXPART model input is investigated for one possible release scenario of the Fukushima Daiichi accident. Precipitation data from the European Centre for Medium-Range Weather Forecasts (ECMWF), NOAA's National Center for Environmental Prediction (NCEP), the Japan Meteorological Agency's (JMA) mesoscale analysis and a JMA radar-rain gauge precipitation analysis product were utilized. The accident at Fukushima in March 2011 and the subsequent observations enable us to assess the impact of these precipitation products, at least for this single case. As expected, the differences in the statistical scores are visible but not large. Increasing the resolution of all the ECMWF fields from 0.5° to 0.2° raises the correlation from 0.71 to 0.80 and the overall rank from 3.38 to 3.44. Substituting the JMA mesoscale precipitation analysis and the JMA radar-gauge precipitation data for the ECMWF precipitation, while the rest of the variables remain unmodified, yields the best results on a regional scale, especially when a new and more robust wet deposition scheme is introduced. The best results are obtained with a combination of ECMWF 0.2° data, precipitation from the JMA mesoscale analyses and the modified wet deposition, with a correlation of 0.83 and an overall rank of 3.58. NCEP-based results with the same source term are generally poorer, giving correlations around 0.66, comparatively large negative biases and an overall rank of 3.05 that worsens when regional precipitation data are introduced.

  1. A Study of The Relationship Between The Components of The Five-Factor Model of Personality and The Occurrence of Occupational Accidents in Industry Workers

    Directory of Open Access Journals (Sweden)

    Ehsanollah Habibi

    2016-05-01

    Accidents are among the most important problems of both developed and developing countries. Individual factors and personality traits are primary causes of human errors and contribute to accidents. The present study aims to investigate the relationship between the components of the five-factor model of personality and the occurrence of occupational accidents in industrial workers. The independent t-test indicated that there is a meaningful relationship between personality traits and accident proneness. Between the two groups of industrial workers injured in occupational accidents and industrial workers without any occupational accidents, there is a significant relationship for the personality traits of neuroticism (p=0.001), openness to experience (p=0.001), extraversion (p=0.024) and conscientiousness (p=0.021). Nonetheless, concerning the personality trait of agreeableness (p=0.09), the group of workers with accidents did not differ significantly from the workers without any accidents. The results showed that there is a direct and significant relationship between accident proneness and the personality traits of neuroticism and openness to experience. Furthermore, there is a meaningful but inverse correlation between accident proneness and the personality traits of extraversion and conscientiousness, while there was no relationship between accident proneness and the personality trait of agreeableness.

  2. A source term estimation method for a nuclear accident using atmospheric dispersion models

    DEFF Research Database (Denmark)

    Kim, Minsik; Ohba, Ryohji; Oura, Masamichi

    2015-01-01

    The objective of this study is to develop an operational source term estimation (STE) method applicable for a nuclear accident like the incident that occurred at the Fukushima Dai-ichi nuclear power station in 2011. The new STE method presented here is based on data from atmospheric dispersion models and short-range observational data around the nuclear power plants. The accuracy of this method is validated with data from a wind tunnel study that involved a tracer gas release from a scaled model experiment at Tokai Daini nuclear power station in Japan. We then use the methodology developed and validated through the effort described in this manuscript to estimate the release rate of radioactive material from the Fukushima Dai-ichi nuclear power station.
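
    The following is a hedged sketch of one common way to pose such a source term estimation problem, not the authors' algorithm: time-segmented release rates are recovered from observed concentrations given dispersion-model source-receptor coefficients, via non-negative least squares. All arrays and values are illustrative placeholders.

```python
# A minimal sketch of source term estimation by non-negative least squares.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_obs, n_segments = 30, 6
M = rng.uniform(0.0, 1e-6, size=(n_obs, n_segments))     # dilution factors (s m^-3)
q_true = np.array([1e10, 5e10, 2e11, 8e10, 1e10, 5e9])   # "true" release rates (Bq/s)
c_obs = M @ q_true * (1 + 0.2 * rng.standard_normal(n_obs))   # noisy observations

q_est, _ = nnls(M, c_obs)          # non-negativity keeps release rates physical
print(np.round(q_est / 1e10, 2))   # estimated rates, in units of 1e10 Bq/s
```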

  3. Ruthenium release modelling in air under severe accident conditions using the MAAP4 code

    Energy Technology Data Exchange (ETDEWEB)

    Beuzet, E.; Lamy, J.S. [EDF R and D, 1 avenue du General de Gaulle, F-92140 Clamart (France); Perron, H. [EDF R and D, Avenue des Renardieres, Ecuelles, F-77818 Moret sur Loing (France); Simoni, E. [Institut de Physique Nucleaire, Universite de Paris Sud XI, F-91406 Orsay (France)

    2010-07-01

    In a nuclear power plant (NPP), some low-probability severe accident situations involve air ingress into the vessel. Air is a highly oxidizing atmosphere that can lead to enhanced core degradation, affecting the release of Fission Products (FPs) to the environment (source term). Indeed, Zircaloy-4 cladding oxidation by air yields 85% more heat than oxidation by steam. Besides, UO{sub 2} can be oxidised to UO{sub 2+x} and mixed with Zr, which may lead to a decrease of the fuel melting temperature. Finally, an air atmosphere can enhance the release of FPs, noticeably that of ruthenium. Ruthenium is of particular interest for two main reasons: first, its high radiotoxicity due to its short and long half-life isotopes ({sup 103}Ru and {sup 106}Ru, respectively) and second, its ability to form highly volatile compounds such as gaseous ruthenium tetra-oxide (RuO{sub 4}). Considering that the oxygen affinity decreases from cladding to fuel to ruthenium inclusions, it is essential to understand the phenomena governing fuel oxidation by air and ruthenium release as prerequisites for the source term issues. A review of existing data on ruthenium release, which is controlled by fuel oxidation, led us to implement a new model in the EDF version of the MAAP4 severe accident code (Modular Accident Analysis Program). This model takes into account the fuel stoichiometric deviation and the evolution of the oxygen partial pressure inside the fuel to simulate its oxidation by air. Ruthenium is then oxidised, and its oxides are released by volatilisation above the fuel. All the different ruthenium oxides formed and released are taken into consideration in the model, in terms of their particular reaction constants. In this way, partial pressures of ruthenium oxides in the atmosphere are obtained, so that it is possible to know the fraction of ruthenium released to the atmosphere. This new model has been assessed against an analytical test of FP release in an air atmosphere performed at CEA (VERCORS RT8). The

  4. Validation of GAMMA+ model for Evaluating Heat Transfer of VHTR core in Accident Conditions by CFD analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Dongho; Yoon, Sujong; Park, Gooncherl; Cho, Hyoungkyu [Seoul National Univ., Seoul (Korea, Republic of)

    2013-05-15

    KAERI has established a plan to demonstrate massive production of hydrogen using a VHTR by the early 2020s. In addition, the GAMMA+ code has been developed at KAERI to analyze VHTR thermo-fluid transients. One of the candidate reactor designs for the VHTR is the prismatic modular reactor (PMR), whose reference reactor is the 600 MWth GT-MHR. This type of reactor has a passive safety system. During a High Pressure Conduction Cooling (HPCC) or Low Pressure Conduction Cooling (LPCC) accident, the core heats up by decay heat and then starts to cool down by conduction and radiation to the Reactor Cavity Cooling System (RCCS) through the prismatic core. In this mechanism, solid conduction occurs in the graphite and fuel blocks, and gas conduction and radiation occur in the coolant holes and bypass gaps. It is important to predict conduction and radiation heat transfer in the core for safety analysis. The effective thermal conductivity is derived by Maxwell's far-field methodology, and the radiation effect is expressed as an equivalent conductivity and added to the gas conductivity. In this study, the effective thermal conductivity (ETC) model used in the GAMMA+ code was validated against the commercial CFD code CFX-13. The CFD analysis was conducted for various numbers and volume fractions of coolant holes and various temperatures. Although slight disagreement was observed for cases with a small number of holes, the GAMMA+ model is sufficiently accurate for large numbers of holes. Since there are 102 coolant holes and 210 fuel holes in a fuel block, it is concluded that the GAMMA+ model is an appropriate formulation for predicting the effective thermal conductivity of the VHTR fuel block. However, in the high-temperature region above 500 °C, the GAMMA+ model underestimates the effective thermal conductivity since radiation heat transfer is not reflected precisely. Further research on this point appears necessary.

  5. Regional long-term model of radioactivity dispersion and fate in the Northwestern Pacific and adjacent seas: application to the Fukushima Dai-ichi accident.

    Science.gov (United States)

    Maderich, V; Bezhenar, R; Heling, R; de With, G; Jung, K T; Myoung, J G; Cho, Y-K; Qiao, F; Robertson, L

    2014-05-01

    The compartment model POSEIDON-R was modified and applied to the Northwestern Pacific and adjacent seas to simulate the transport and fate of radioactivity in the period 1945-2010, and to perform a radiological assessment on the releases of radioactivity due to the Fukushima Dai-ichi accident for the period 2011-2040. The model predicts the dispersion of radioactivity in the water column and in sediments, the transfer of radionuclides throughout the marine food web, and subsequent doses to humans due to the consumption of marine products. A generic predictive dynamic food-chain model is used instead of the biological concentration factor (BCF) approach. The radionuclide uptake model for fish has as a central feature the accumulation of radionuclides in the target tissue. The three layer structure of the water column makes it possible to describe the vertical structure of radioactivity in deep waters. In total 175 compartments cover the Northwestern Pacific, the East China and Yellow Seas and the East/Japan Sea. The model was validated from (137)Cs data for the period 1945-2010. Calculated concentrations of (137)Cs in water, bottom sediments and marine organisms in the coastal compartment, before and after the accident, are in close agreement with measurements from the Japanese agencies. The agreement for water is achieved when an additional continuous flux of 3.6 TBq y(-1) is used for underground leakage of contaminated water from the Fukushima Dai-ichi NPP, during the three years following the accident. The dynamic food web model predicts that due to the delay of the transfer throughout the food web, the concentration of (137)Cs for piscivorous fishes returns to background level only in 2016. For the year 2011, the calculated individual dose rate for Fukushima Prefecture due to consumption of fishery products is 3.6 μSv y(-1). Following the Fukushima Dai-ichi accident the collective dose due to ingestion of marine products for Japan increased in 2011 by a

  6. PREDICT : model for prediction of survival in localized prostate cancer

    NARCIS (Netherlands)

    Kerkmeijer, Linda G W; Monninkhof, Evelyn M.; van Oort, Inge M.; van der Poel, Henk G.; de Meerleer, Gert; van Vulpen, Marco

    2016-01-01

    Purpose: Current models for prediction of prostate cancer-specific survival do not incorporate all present-day interventions. In the present study, a pre-treatment prediction model for patients with localized prostate cancer was developed.Methods: From 1989 to 2008, 3383 patients were treated with I

  7. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    Energy Technology Data Exchange (ETDEWEB)

    Evans, J.S. (Harvard Univ., Boston, MA (USA). School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other." The category "other" cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
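
    For orientation, the generic functional forms behind these recommendations can be sketched as follows; the symbols D50, V, alpha and beta are illustrative placeholders rather than the report's tabulated parameters.

```latex
% Sketch of the generic functional forms only; D_{50}, V, \alpha and \beta are
% illustrative placeholders, not the report's tabulated parameters.
% Weibull (hazard) dose-response for an early or continuing effect:
R(D) \;=\; 1 - \exp\!\left[-\ln 2\,\left(\frac{D}{D_{50}}\right)^{V}\right]
% Linear and linear-quadratic excess-risk forms for cancer induction:
\mathrm{ER}(D) \;=\; \alpha D
\qquad\text{or}\qquad
\mathrm{ER}(D) \;=\; \alpha D + \beta D^{2}
```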

  8. Predictive Modeling of Cardiac Ischemia

    Science.gov (United States)

    Anderson, Gary T.

    1996-01-01

    The goal of the Contextual Alarms Management System (CALMS) project is to develop sophisticated models to predict the onset of clinical cardiac ischemia before it occurs. The system will continuously monitor cardiac patients and set off an alarm when they appear about to suffer an ischemic episode. The models take as inputs information from patient history and combine it with continuously updated information extracted from blood pressure, oxygen saturation and ECG lines. Expert system, statistical, neural network and rough set methodologies are then used to forecast the onset of clinical ischemia before it transpires, thus allowing early intervention aimed at preventing morbid complications from occurring. The models will differ from previous attempts by including combinations of continuous and discrete inputs. A commercial medical instrumentation and software company has invested funds in the project with a goal of commercialization of the technology. The end product will be a system that analyzes physiologic parameters and produces an alarm when myocardial ischemia is present. If proven feasible, a CALMS-based system will be added to existing heart monitoring hardware.

  9. Numerical weather prediction model tuning via ensemble prediction system

    Science.gov (United States)

    Jarvinen, H.; Laine, M.; Ollinaho, P.; Solonen, A.; Haario, H.

    2011-12-01

    This paper discusses a novel approach to tuning the predictive skill of numerical weather prediction (NWP) models. NWP models contain tunable parameters which appear in parameterization schemes of sub-grid scale physical processes. Currently, numerical values of these parameters are specified manually. In a recent dual manuscript (QJRMS, revised) we developed a new concept and method for on-line estimation of the NWP model parameters. The EPPES ("Ensemble prediction and parameter estimation system") method requires only minimal changes to the existing operational ensemble prediction infrastructure and seems very cost-effective because practically no new computations are introduced. The approach provides an algorithmic decision-making tool for model parameter optimization in operational NWP. In EPPES, statistical inference about the NWP model tunable parameters is made by (i) generating each member of the ensemble of predictions using different model parameter values, drawn from a proposal distribution, and (ii) feeding back the relative merits of the parameter values to the proposal distribution, based on evaluation of a suitable likelihood function against verifying observations. In the presentation, the method is first illustrated in low-order numerical tests using a stochastic version of the Lorenz-95 model, which effectively emulates the principal features of ensemble prediction systems. The EPPES method correctly detects the unknown and wrongly specified parameter values, and leads to an improved forecast skill. Second, results with an ensemble prediction system based on an atmospheric general circulation model show that the NWP model tuning capacity of EPPES scales up to realistic models and ensemble prediction systems. Finally, a global top-end NWP model tuning exercise with preliminary results is presented.
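
    The following toy sketch illustrates only the general feedback idea of steps (i) and (ii), not the EPPES algorithm itself: ensemble members run with parameters drawn from a Gaussian proposal, and likelihood weights against a verifying observation are fed back to update the proposal. The "model", observation error and parameter values are all assumptions.

```python
# A toy sketch of ensemble-based parameter estimation with proposal feedback.
import numpy as np

rng = np.random.default_rng(1)

def toy_forecast(theta, x0=1.0, steps=10):
    # stand-in for a forecast model whose skill depends on a tunable parameter
    x = x0
    for _ in range(steps):
        x += 0.1 * (theta - x)
    return x

obs = toy_forecast(2.0) + 0.01 * rng.standard_normal()   # verifying observation

mean, var = 0.5, 1.0                       # initial proposal distribution
for _ in range(20):                        # forecast/verification cycles
    ensemble = rng.normal(mean, np.sqrt(var), size=50)
    forecasts = np.array([toy_forecast(t) for t in ensemble])
    w = np.exp(-0.5 * ((forecasts - obs) / 0.5) ** 2) + 1e-12   # Gaussian likelihood
    w /= w.sum()
    mean = np.sum(w * ensemble)            # feed relative merits back into the proposal
    var = max(np.sum(w * (ensemble - mean) ** 2), 1e-4)

print(round(mean, 3))                      # drifts towards the best-verifying value
```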

  10. Sensitivity study of the wet deposition schemes in the modelling of the Fukushima accident.

    Science.gov (United States)

    Quérel, Arnaud; Quélo, Denis; Roustan, Yelva; Mathieu, Anne; Kajino, Mizuo; Sekiyama, Thomas; Adachi, Kouji; Didier, Damien; Igarashi, Yasuhito

    2016-04-01

    The Fukushima-Daiichi release of radioactivity is a relevant event for studying the atmospheric dispersion modelling of radionuclides. In particular, the atmospheric deposition onto the ground can be studied through the map of measured Cs-137 established after the accident. The limits of detection were low enough to make measurements possible as far as 250 km from the nuclear power plant. This large-scale deposition has been modelled with the Eulerian model ldX. However, several weeks of emissions under multiple weather conditions make it a real challenge. Moreover, these measurements are the accumulated deposition of Cs-137 over the whole period and do not indicate which deposition mechanisms were involved: in-cloud, below-cloud or dry deposition. A comprehensive sensitivity analysis is performed in order to understand the wet deposition mechanisms. It has been shown in a previous study (Quérel et al, 2016) that the choice of the wet deposition scheme has a strong impact on the assessment of the deposition patterns. Nevertheless, a "best" scheme could not be identified, as the ranking depends on the statistical indicators considered (correlation, figure of merit in space and factor 2). One possible explanation for the difficulty in discriminating between schemes is the uncertainty in the modelling, resulting for instance from the meteorological data. If the movement of the plume is not properly modelled, the deposition processes are applied to an inaccurate activity in the air. In the framework of the SAKURA project, an MRI-IRSN collaboration, new meteorological fields at higher resolution (Sekiyama et al., 2013) were provided, allowing the previous study to be reconsidered. An updated study including these new meteorological data is presented. In addition, a focus was placed on several releases causing deposition in localised areas during known periods. This helps to better understand the deposition mechanisms involved following the

  11. Bicycle accidents.

    Science.gov (United States)

    Lind, M G; Wollin, S

    1986-01-01

    Information concerning 520 bicycle accidents and their victims was obtained from medical records and the victims' replies to questionnaires. The analyzed aspects included risk of injury, completeness of accident registrations by police and in hospitals, types of injuries and influence of the cyclists' age and sex, alcohol, fatigue, hunger, haste, physical disability, purpose of cycling, wearing of protective helmet and other clothing, type and quality of road surface, site of accident (road junctions, separate cycle paths, etc.) and turning manoeuvres.

  12. RAIM-A model for iodine behavior in containment under severe accident condition

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Han Chul; Cho, Yeong Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-12-15

    Following a severe accident in a nuclear power plant, iodine is a major contributor to the potential health risks for the public. Because the amount of iodine released largely depends on its volatility, iodine behaviour in containment has been extensively studied in international programs such as the International Source Term Programme-Experimental Program on Iodine Chemistry under Radiation (EPICUR), the Organization for Economic Co-operation and Development (OECD) Behaviour of Iodine Project, and OECD Source Term Evaluation and Mitigation. The Korea Institute of Nuclear Safety (KINS) has joined these programs and is developing a simplified, stand-alone iodine chemistry model, RAIM (Radio-Active Iodine chemistry Model), based on the IMOD methodology and other previous studies. This model treats, in a simple manner, the chemical reactions associated with the formation and destruction of iodine species and the surface reactions in the containment atmosphere and the sump. RAIM was applied to the simulation of four EPICUR tests and one Radioiodine Test Facility test, carried out in the aqueous or gaseous phase. The results show a trend of underestimating organic and molecular iodine for the gas-phase experiments and the opposite for the aqueous-phase ones, whereas the total amount of volatile iodine species agrees well between the experiments and the analysis results.

  13. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...

  14. Grey Weighted Markov Chains Combination Method and Its Application in Aviation Equipment Accident Prediction

    Institute of Scientific and Technical Information of China (English)

    甘旭升; 李华平; 高建国

    2014-01-01

    To improve the accident prediction performance of the traditional grey Markov chain combination model for aviation equipment, an improved grey Markov chain combination prediction method based on weighted Markov chains is proposed. In this method, a grey model is first established to extract the trend information from the accident data series; a weighted Markov chain model is then constructed on the residual information of the grey model, so that Markov chains of various step sizes can each play a reasonable role in describing the stochastic fluctuation law. To verify its validity, a grey weighted Markov chain combination model was built on actual data for the USAF Class A flight accident rate per 10,000 flight hours. The results show that the average relative prediction error of the model for 2000-2002 is within 4.59%, a clear improvement over the grey superposed Markov chain model, and that the model can objectively reflect the future development trend of aviation safety.

  15. Prediction of vascular cerebral accidents by PET T.D.M. with {sup 18}F-F.D.G; Prediction des accidents vasculaires cerebraux par la TEP -TDM vasculaire au 18F-FDG

    Energy Technology Data Exchange (ETDEWEB)

    Grandpierre, S.; Chevalier, O.; Thomas, V.; Netter, F.; Meneroux, B.; Karcher, G.; Marie, P.Y. [Service de medecine nucleaire, CHU de Nancy, (France); Desandes, E. [departement d' informatique medical, centre Alexis-Vautrin, Nancy, (France)

    2009-05-15

    This study is the first to show a relationship between the vascular uptake of FDG on PET and the risk of a subsequent ischemic cerebrovascular accident. This relationship appears particularly strong for the region of the carotid bifurcation, so that FDG PET could be useful for evaluating the stability of atheromatous lesions in this territory. (N.C.)

  16. Predictive Model Assessment for Count Data

    Science.gov (United States)

    2007-09-05

    ... critique count regression models for patent data, and assess the predictive performance of Bayesian age-period-cohort models for larynx cancer counts in Germany, 1998-2002. We consider a recent suggestion by Baker and ... [Figure 5: boxplots of various scores for the patent data count regressions. Table 1: four predictive models for larynx cancer counts in Germany, 1998-2002.]

  17. Review of experimental data for modelling LWR fuel cladding behaviour under loss of coolant accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Massih, Ali R. [Quantum Technologies AB, Uppsala Science Park (Sweden)

    2007-02-15

    An extensive range of experiments has been conducted in the past to quantitatively identify and understand the behaviour of fuel rods under loss-of-coolant accident (LOCA) conditions in light water reactors (LWRs). The experimental data obtained provide the basis for the current emergency core cooling system acceptance criteria under LOCA conditions for LWRs. The results of recent experiments indicate that the cladding alloy composition and high-burnup effects influence the LOCA acceptance criteria margins. In this report, we review some important past and recent experimental results. We first discuss the background to the acceptance criteria for LOCA, namely clad embrittlement phenomenology, clad embrittlement criteria (limitations on maximum clad oxidation and peak clad temperature) and the experimental bases for the criteria. Two broad kinds of test have been carried out under LOCA conditions: (i) separate-effect tests to study clad oxidation, clad deformation and rupture, and the zirconium alloy allotropic phase transition during LOCA; and (ii) integral LOCA tests, in which the entire LOCA sequence is simulated on a single rod or a multi-rod array in a fuel bundle, in the laboratory or in a test reactor. The tests and results are discussed, and empirical correlations deduced from these tests, as well as quantitative models, are presented. In particular, the impact of niobium in zirconium-based cladding and of the hydrogen content of the cladding on the allotropic phase transformation during LOCA, and also on the burst stress, is discussed. We review some recent LOCA integral test results with emphasis on thermal shock tests. Finally, suggestions for modelling and further evaluation of certain experimental results are made.

  18. World Meteorological Organization's model simulations of the radionuclide dispersion and deposition from the Fukushima Daiichi nuclear power plant accident.

    Science.gov (United States)

    Draxler, Roland; Arnold, Dèlia; Chino, Masamichi; Galmarini, Stefano; Hort, Matthew; Jones, Andrew; Leadbetter, Susan; Malo, Alain; Maurer, Christian; Rolph, Glenn; Saito, Kazuo; Servranckx, René; Shimbori, Toshiki; Solazzo, Efisio; Wotawa, Gerhard

    2015-01-01

    Five different atmospheric transport and dispersion models' (ATDM) deposition and air concentration results for atmospheric releases from the Fukushima Daiichi nuclear power plant accident were evaluated over Japan using regional (137)Cs deposition measurements and (137)Cs and (131)I air concentration time series at one location about 110 km from the plant. Some of the ATDMs used the same meteorological data and others different data, consistent with their normal operating practices. There were four global meteorological analysis data sets and two regional high-resolution analyses available. Not all of the ATDMs were able to use all of the meteorological data combinations. The ATDMs were configured as identically as possible with respect to the release duration, release height, concentration grid size, and averaging time. However, each ATDM retained its unique treatment of the vertical velocity field and of the wet and dry deposition, one of the largest uncertainties in these calculations. There were 18 ATDM-meteorology combinations available for evaluation. The deposition results showed that, even when using the same meteorological analysis, each ATDM can produce quite different deposition patterns. The better calculations in terms of both deposition and air concentration were associated with the smoother ATDM deposition patterns. The best model with respect to deposition was not always the best model with respect to air concentrations. The use of high-resolution mesoscale analyses improved ATDM performance; however, high-resolution precipitation analyses did not improve ATDM predictions. Although some ATDMs could be identified as better performers for either deposition or air concentration calculations, overall, the ensemble mean of a subset of better performing members provided more consistent results for both types of calculations.

  19. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    The increasing ship traffic and maritime transport of dangerous substances make it harder to significantly reduce the environmental, economic and social risks posed by potential spills, even though the security rules are becoming more restrictive (ships with double hulls, etc.) and the surveillance systems more developed (VTS, AIS). In fact, the problems associated with spills are, and will remain, a central topic: spill events happen continuously, most of them unknown to the general public because of their small-scale impact, while a much smaller number become authentic media phenomena in this information era, due to their large dimensions, their environmental and socio-economic impacts on ecosystems and local communities, and the spectacular or shocking pictures they generate. Hence, the adverse consequences posed by this type of accident increase the concern with avoiding such events in the future, or minimizing their impacts, using not only surveillance and monitoring tools but also an increased capacity to predict the fate and behaviour of bodies, objects or substances in the hours following an accident - numerical models can now play a leading role in operational oceanography applied to safety and pollution response in the ocean because of their predictive potential. Search and rescue operations and risk analysis for oil, inert (ship debris or floating containers) and HNS (hazardous and noxious substances) spills are the main areas where models can be used. Model applications have been widely used in emergency and planning issues associated with pollution risks, and in contingency and mitigation measures. Before a spill, in the planning stage, modelling simulations are used in environmental impact studies or risk maps, using historical data, reference situations and typical scenarios. After a spill, the use of fast and simple modelling applications allows an understanding of the fate and behaviour of the spilt

  20. Curve Estimation of Number of People Killed in Traffic Accidents in Turkey

    Science.gov (United States)

    Berkhan Akalin, Kadir; Karacasu, Murat; Altin, Arzu Yavuz; Ergül, Bariş

    2016-10-01

    Events involving one or more vehicles in motion on the highway that result in death, injury or loss are called accidents. As a result of increasing population and traffic density, traffic accidents continue to increase, and this leads to both human losses and harm to the economy; in addition, it leads to social problems. As a result of traffic accidents, millions of people die over the years, and a great majority of these accidents occur in developing countries. One of the most important tasks of transportation engineers is to reduce traffic accidents by creating a specific system. For that reason, statistical information about the traffic accidents of past years should be organized by knowledgeable people. Factors affecting traffic accidents are analyzed in various ways. In this study, the number of people killed in traffic accidents in Turkey is modelled. Fatalities were modelled with the curve fitting method using the dataset of people killed in traffic accidents in Turkey between 1990 and 2014. The future number of fatalities was also predicted using various models. A linear model was found to be suitable for these estimates.
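
    A minimal sketch of the curve-estimation idea follows; the fatality numbers below are synthetic placeholders, not the Turkish dataset used in the study.

```python
# A minimal sketch: fit linear and quadratic trends to annual fatality counts
# and extrapolate to a future year.  Data are synthetic.
import numpy as np

years = np.arange(1990, 2015)
deaths = 9500 - 120 * (years - 1990) + 40 * np.random.default_rng(2).standard_normal(years.size)

for degree, label in [(1, "linear"), (2, "quadratic")]:
    coeffs = np.polyfit(years - 1990, deaths, degree)        # least-squares trend fit
    sse = np.sum((deaths - np.polyval(coeffs, years - 1990)) ** 2)
    pred_2020 = np.polyval(coeffs, 2020 - 1990)              # extrapolate to 2020
    print(f"{label}: SSE = {sse:.0f}, predicted fatalities in 2020 = {pred_2020:.0f}")
```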

  1. EFFICIENT PREDICTIVE MODELLING FOR ARCHAEOLOGICAL RESEARCH

    OpenAIRE

    Balla, A.; Pavlogeorgatos, G.; Tsiafakis, D.; Pavlidis, G.

    2014-01-01

    The study presents a general methodology for designing, developing and implementing predictive modelling for identifying areas of archaeological interest. The methodology is based on documented archaeological data and geographical factors, geospatial analysis and predictive modelling, and has been applied to the identification of possible Macedonian tombs’ locations in Northern Greece. The model was tested extensively and the results were validated using a commonly used predictive gain,...

  2. Persistence of airline accidents.

    Science.gov (United States)

    Barros, Carlos Pestana; Faria, Joao Ricardo; Gil-Alana, Luis Alberiko

    2010-10-01

    This paper expands on air travel accident research by examining the relationship between air travel accidents and airline traffic or volume in the period from 1927-2006. The theoretical model is based on a representative airline company that aims to maximise its profits, and it utilises a fractional integration approach in order to determine whether there is a persistent pattern over time with respect to air accidents and air traffic. Furthermore, the paper analyses how airline accidents are related to traffic using a fractional cointegration approach. It finds that airline accidents are persistent and that a (non-stationary) fractional cointegration relationship exists between total airline accidents and airline passengers, airline miles and airline revenues, with shocks that affect the long-run equilibrium disappearing in the very long term. Moreover, this relation is negative, which might be due to the fact that air travel is becoming safer and there is greater competition in the airline industry. Policy implications are derived for countering accident events, based on competition and regulation.

  3. Inverse estimation of source parameters of oceanic radioactivity dispersion models associated with the Fukushima accident

    Directory of Open Access Journals (Sweden)

    Y. Miyazawa

    2013-04-01

    Full Text Available With combined use of the ocean–atmosphere simulation models and field observation data, we evaluate the parameters associated with the total caesium-137 amounts of the direct release into the ocean and atmospheric deposition over the western North Pacific caused by the accident of Fukushima Daiichi nuclear power plant (FNPP) that occurred in March 2011. The Green's function approach is adopted for the estimation of two parameters determining the total emission amounts for the period from 12 March to 6 May 2011. It is confirmed that the validity of the estimation depends on the simulation skill near FNPP. The total amount of the direct release is estimated as 5.5–5.9 × 10(15) Bq, while that of the atmospheric deposition is estimated as 5.5–9.7 × 10(15) Bq, which indicates a broader range of the estimate than that of the direct release owing to uncertainty of the dispersion widely spread over the western North Pacific.

  4. Inverse estimation of source parameters of oceanic radioactivity dispersion models associated with the Fukushima accident

    Directory of Open Access Journals (Sweden)

    Y. Miyazawa

    2012-10-01

    Full Text Available With combined use of the ocean-atmosphere simulation models and field observation data, we evaluate the parameters associated with the total caesium-137 amounts of the direct release into the ocean and atmospheric deposition over the Western North Pacific caused by the accident of Fukushima Daiichi nuclear power plant (FNPP) that occurred in March 2011. The Green's function approach is adopted for the estimation of two parameters determining the total emission amounts for the period from 12 March to 6 May 2011. It is confirmed that the validity of the estimation depends on the simulation skill near FNPP. The total amount of the direct release is estimated as 5.5–5.9 × 10(15) Bq, while that of the atmospheric deposition is estimated as 5.5–9.7 × 10(15) Bq, which indicates a broader range of the estimate than that of the direct release owing to uncertainty of the dispersion widely spread over the Western North Pacific.

  5. How to Establish Clinical Prediction Models

    Directory of Open Access Journals (Sweden)

    Yong-ho Lee

    2016-03-01

    Full Text Available A clinical prediction model can be applied to several challenging clinical scenarios: screening high-risk individuals for asymptomatic disease, predicting future events such as disease or death, and assisting medical decision-making and health education. Despite the impact of clinical prediction models on practice, prediction modeling is a complex process requiring careful statistical analyses and sound clinical judgement. Although there is no definite consensus on the best methodology for model development and validation, a few recommendations and checklists have been proposed. In this review, we summarize five steps for developing and validating a clinical prediction model: preparation for establishing clinical prediction models; dataset selection; handling variables; model generation; and model evaluation and validation. We also review several studies that detail methods for developing clinical prediction models with comparable examples from real practice. After model development and vigorous validation in relevant settings, possibly with evaluation of utility/usability and fine-tuning, good models can be ready for use in practice. We anticipate that this framework will revitalize the use of predictive or prognostic research in endocrinology, leading to active applications in real clinical practice.
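
    A minimal sketch of the development/validation steps on synthetic data follows: a logistic-regression prediction model, a split-sample validation, discrimination (AUC) and a crude calibration check. The predictors, effect sizes and thresholds are assumptions for illustration only, not taken from the review.

```python
# A minimal sketch of clinical prediction model development and validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 2000
X = np.column_stack([rng.normal(55, 10, n),     # age
                     rng.normal(25, 4, n),      # BMI
                     rng.integers(0, 2, n)])    # family history (0/1)
logit = -8 + 0.08 * X[:, 0] + 0.10 * X[:, 1] + 0.8 * X[:, 2]
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))           # synthetic outcome

X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)  # model generation
p_val = model.predict_proba(X_val)[:, 1]                     # validation predictions

print("validation AUC:", round(roc_auc_score(y_val, p_val), 3))
for lo, hi in [(0, 1/3), (1/3, 2/3), (2/3, 1)]:              # calibration by risk tertile
    qlo, qhi = np.quantile(p_val, [lo, hi])
    m = (p_val >= qlo) & (p_val <= qhi)
    print(f"predicted {p_val[m].mean():.2f} vs observed {y_val[m].mean():.2f}")
```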

  6. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which ... is a realization of a continuous-discrete multivariate stochastic transfer function model. The proposed prediction-error methods are demonstrated for a SISO system parameterized by the transfer functions with time delays of a continuous-discrete-time linear stochastic system. The simulations for this case suggest ... computational resources. The identification method is suitable for predictive control.
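
    As a hedged illustration of a single-step prediction-error criterion (not the paper's multivariate continuous-discrete formulation), the sketch below evaluates a maximum-likelihood criterion for a scalar state-space model, with the innovations generated by a Kalman filter; the model, parameter values and data are assumptions.

```python
# A minimal sketch: single-step prediction-error (ML) criterion via a Kalman filter.
import numpy as np

def neg_log_likelihood(params, y):
    a, c, q, r = params                 # transition, output gain, noise variances
    x, P = 0.0, 1.0                     # prior state estimate and covariance
    nll = 0.0
    for yk in y:
        x, P = a * x, a * P * a + q     # time update (prediction)
        e, S = yk - c * x, c * P * c + r        # innovation and its variance
        nll += 0.5 * (np.log(2 * np.pi * S) + e * e / S)
        K = P * c / S                   # Kalman gain and measurement update
        x, P = x + K * e, (1 - K * c) * P
    return nll

rng = np.random.default_rng(4)
y = np.cumsum(rng.standard_normal(200)) + 0.5 * rng.standard_normal(200)
print(round(neg_log_likelihood((1.0, 1.0, 1.0, 0.25), y), 1))
# minimising this criterion over (a, c, q, r), e.g. with scipy.optimize.minimize,
# gives the single-step prediction-error estimate of the parameters
```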

  7. Case studies in archaeological predictive modelling

    NARCIS (Netherlands)

    Verhagen, Jacobus Wilhelmus Hermanus Philippus

    2007-01-01

    In this thesis, a collection of papers is put together dealing with various quantitative aspects of predictive modelling and archaeological prospection. Among the issues covered are the effects of survey bias on the archaeological data used for predictive modelling, and the complexities of testing p

  8. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  9. Models and numerical methods for the simulation of loss-of-coolant accidents in nuclear reactors

    Science.gov (United States)

    Seguin, Nicolas

    2014-05-01

    In view of the simulation of the water flows in pressurized water reactors (PWR), many models are available in the literature and their complexity deeply depends on the required accuracy, see for instance [1]. A loss-of-coolant accident (LOCA) may occur when a pipe breaks. The coolant is composed of light water in its liquid form at very high temperature and pressure (around 300 °C and 155 bar); it then flashes and instantaneously becomes vapour in the case of a LOCA. A front of liquid/vapour phase transition appears in the pipes and may propagate towards the critical parts of the PWR. It is crucial to propose models that are accurate for the whole phenomenon, but also sufficiently robust to obtain relevant numerical results. Due to the application we have in mind, a complete description of the two-phase flow (with all the bubbles, droplets, interfaces…) is out of reach and irrelevant. We investigate averaged models, based on the use of void fractions for each phase, which represent the probability of presence of a phase at a given position and at a given time. The most accurate averaged model, based on the so-called Baer-Nunziato model, describes each phase separately by its own density, velocity and pressure. The two phases are coupled by non-conservative terms due to gradients of the void fractions and by source terms for mechanical relaxation, drag force and mass transfer. With appropriate closure laws, it has been proved [2] that this model complies with all the expected physical requirements: positivity of densities and temperatures, maximum principle for the void fraction, conservation of the mixture quantities, decrease of the global entropy… On the basis of this model, it is possible to derive simpler models, which can be used where the flow is still, see [3]. From the numerical point of view, we develop new Finite Volume schemes in [4], which also satisfy the requirements mentioned above. Since they are based on a partial linearization of the physical

  10. Childhood asthma prediction models: a systematic review.

    Science.gov (United States)

    Smit, Henriette A; Pinart, Mariona; Antó, Josep M; Keil, Thomas; Bousquet, Jean; Carlsen, Kai H; Moons, Karel G M; Hooft, Lotty; Carlsen, Karin C Lødrup

    2015-12-01

    Early identification of children at risk of developing asthma at school age is crucial, but the usefulness of childhood asthma prediction models in clinical practice is still unclear. We systematically reviewed all existing prediction models to identify preschool children with asthma-like symptoms at risk of developing asthma at school age. Studies were included if they developed a new prediction model or updated an existing model in children aged 4 years or younger with asthma-like symptoms, with assessment of asthma done between 6 and 12 years of age. 12 prediction models were identified in four types of cohorts of preschool children: those with health-care visits, those with parent-reported symptoms, those at high risk of asthma, or children in the general population. Four basic models included non-invasive, easy-to-obtain predictors only, notably family history, allergic disease comorbidities or precursors of asthma, and severity of early symptoms. Eight extended models included additional clinical tests, mostly specific IgE determination. Some models could better predict asthma development and other models could better rule out asthma development, but the predictive performance of no single model stood out in both aspects simultaneously. This finding suggests that there is a large proportion of preschool children with wheeze for which prediction of asthma development is difficult.

  11. A dynamic model to estimate the dose rate of marine biota (K-BIOTADYN- M) and its application to the Fukushima accident

    Energy Technology Data Exchange (ETDEWEB)

    Keum, Dong-Kwon; Jun, In; Kim, Byeong-Ho; Lim, Kwang-Muk; Choi, Yong-ho [Nuclear Environmental Safety Research Division, Korea Atomic Energy Research Institute, 989-111 Daedeodaero, Yuseong, Daejeon, 305-353 (Korea, Republic of)

    2014-07-01

    This paper describes a dynamic compartment model, K-BIOTA-DYN-M, to assess the activity concentration and dose rate of marine biota when the seawater activity varies with time, which is likely in the early phase after an accident. The model consists of seven compartments: phytoplankton, zooplankton, prey fish, benthic fish, crustaceans, molluscs, and macro-algae. The phytoplankton compartment is assumed to be instantaneously in equilibrium with the seawater owing to the huge mass of plankton in the sea, and thus the activity of the phytoplankton is estimated using the equilibrium concentration ratio. The other compartments take in radioactivity from both water and food, and lose radioactivity by biological elimination and radioactive decay. Given the seawater activity, a set of ordinary differential equations representing the activity balance for the biota is solved to obtain the time-variant activity concentration of the biota, which is subsequently used to calculate the internal dose rate. The key parameters include the water intake rate, the daily feeding rate, the assimilation efficiency of radionuclides from food, the occupancy factor, and so on. The model has been applied to predict the activity concentration and dose rate of marine biota as a result of the Fukushima nuclear accident on March 11, 2011. Using the seawater activities measured at three locations near the Fukushima NPPs, the time-variant activity concentration and dose rate of the seven model biota during the first few months after the accident have been estimated. The preliminary results showed that the activity concentration of {sup 137}Cs in fish inhabiting the sea close to the Fukushima Daiichi NPP increased up to tens of thousands of Bq/kg around the peak time of the seawater activity. This level is much higher than the food consumption restriction level for human protection; however, the estimated total dose rates (internal + external) for biota during the entire simulation time were all much less
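
    A minimal sketch of an activity-balance equation of this kind follows: a single fish compartment takes up 137Cs from water and food and loses it by biological elimination and radioactive decay. The parameter values and forcing functions are illustrative assumptions, not those of K-BIOTA-DYN-M.

```python
# A minimal sketch of a dynamic activity balance for one biota compartment.
import numpy as np
from scipy.integrate import solve_ivp

decay = np.log(2) / (30.17 * 365.0)   # 137Cs physical decay constant (d^-1)
k_water = 2e-3                        # uptake rate from water (L kg^-1 d^-1), assumed
feeding = 0.02                        # daily feeding rate (kg food per kg fish), assumed
assim = 0.5                           # assimilation efficiency from food, assumed
k_bio = 0.01                          # biological elimination rate (d^-1), assumed

def water_activity(t):                # Bq/L, decaying pulse after the accident
    return 100.0 * np.exp(-0.05 * t)

def food_activity(t):                 # Bq/kg in prey, lagging behind the water
    return 500.0 * np.exp(-0.03 * t)

def dCdt(t, C):                       # activity balance for the fish compartment
    uptake = k_water * water_activity(t) + assim * feeding * food_activity(t)
    return uptake - (k_bio + decay) * C

sol = solve_ivp(dCdt, (0.0, 365.0), [0.0], dense_output=True)
print(round(float(sol.sol(90.0)[0]), 1))   # 137Cs in fish (Bq/kg) at day 90
```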

  12. Computational fluid dynamics analysis of the initial stages of a VHTR air-ingress accident using a scaled-down model

    Energy Technology Data Exchange (ETDEWEB)

    Ham, Tae K., E-mail: taekyu8@gmail.com [Nuclear Engineering Program, The Ohio State University, Columbus, OH 43210 (United States); Arcilesi, David J., E-mail: arcilesi.1@osu.edu [Nuclear Engineering Program, The Ohio State University, Columbus, OH 43210 (United States); Kim, In H., E-mail: ihkim0730@gmail.com [Nuclear Engineering Program, The Ohio State University, Columbus, OH 43210 (United States); Sun, Xiaodong, E-mail: sun.200@osu.edu [Nuclear Engineering Program, The Ohio State University, Columbus, OH 43210 (United States); Christensen, Richard N., E-mail: rchristensen@uidaho.edu [Nuclear Engineering Program, The Ohio State University, Columbus, OH 43210 (United States); Oh, Chang H. [Idaho National Laboratory, Idaho Falls, ID 83402 (United States); Kim, Eung S., E-mail: kes7741@snu.ac.kr [Idaho National Laboratory, Idaho Falls, ID 83402 (United States)

    2016-04-15

    shows that flow reversal could occur due to Taylor wave expansion near the end of the depressurization, which could affect subsequent stages of the air ingress accident scenario. Therefore, to properly understand and evaluate the depressurization effects, numerical simulations are performed for the double-ended guillotine break of the Gas Turbine-Modular Helium Reactor (GT-MHR) cross vessel with a computational fluid dynamics (CFD) tool, ANSYS FLUENT. A benchmark and error quantification study of the depressurization shows that the ANSYS FLUENT model can predict the depressurization problem with relatively low uncertainty. In addition, the computational results show that the depressurization of a double-ended guillotine break behaves as an isentropic process. The observed flow oscillations near the end of the depressurization promote mixing of helium gas and air near the break. The results of the CFD analyses also show that the density-driven stratified flow, which is postulated to be the next stage of the air-ingress accident scenario, is strongly dependent on the density difference between the air–helium mixture in the containment and the helium in the reactor vessel. Therefore, the flow oscillations near the end of the depressurization stage may have a minor, yet notable, effect to slow down the air ingress due to density-driven stratified flow by decreasing the bulk density of the gas mixture in the containment through the addition of helium and increasing the bulk density in the reactor vessel through the addition of air.

  13. Sports Accidents

    CERN Multimedia

    Kiebel

    1972-01-01

    Doctor Kiebel, a surgeon in Geneva, is also a great friend of sport; from time to time he serves as physician to the Geneva ski classes and is also the physician of the Genève Servette ice hockey team. He is well qualified to talk to us about sports accidents, and above all about skiing accidents.

  14. Risk factors associated with bus accident severity in the United States: A generalized ordered logit model

    DEFF Research Database (Denmark)

    Kaplan, Sigal; Prato, Carlo Giacomo

    2012-01-01

    Introduction: Recent years have witnessed a growing interest in improving bus safety operations worldwide. While in the United States buses are considered relatively safe, the number of bus accidents is far from being negligible, triggering the introduction of the Motor-coach Enhanced Safety Act...... that accident severity increases: (i) for young bus drivers under the age of 25; (ii) for drivers beyond the age of 55, and most prominently for drivers over 65 years old; (iii) for female drivers; (iv) for very high (over 65 mph) and very low (under 20 mph) speed limits; (v) at intersections; (vi) because...
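
    As a simplified sketch of this kind of severity analysis, the example below fits a standard (proportional-odds) ordered logit to synthetic crash records; it stands in for the generalized ordered logit used in the study, and the variables, coefficients and cut-points are assumptions.

```python
# A simplified sketch: ordered logit for crash severity on synthetic data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "driver_age": rng.integers(18, 80, n),
    "speed_limit": rng.choice([20, 35, 50, 65], n),
    "at_intersection": rng.integers(0, 2, n),
})
latent = (0.02 * df["driver_age"] + 0.03 * df["speed_limit"]
          + 0.4 * df["at_intersection"] + rng.logistic(size=n))
severity = pd.cut(latent, bins=[-np.inf, 2.0, 3.5, np.inf],
                  labels=["no injury", "injury", "fatality"], ordered=True)

res = OrderedModel(severity, df, distr="logit").fit(method="bfgs", disp=False)
print(res.summary())    # positive coefficients indicate higher expected severity
```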

  15. A two-stage optimization model for emergency material reserve layout planning under uncertainty in response to environmental accidents.

    Science.gov (United States)

    Liu, Jie; Guo, Liang; Jiang, Jiping; Jiang, Dexun; Liu, Rentao; Wang, Peng

    2016-06-05

    In emergency management for pollution accidents, the efficiency of emergency rescue can be strongly influenced by a reasonable assignment of the available emergency materials to the related risk sources. In this study, a two-stage optimization framework is developed for emergency material reserve layout planning under uncertainty, to identify material warehouse locations and emergency material reserve schemes in the pre-accident phase for coping with potential environmental accidents. The framework is based on an integration of a hierarchical clustering analysis - improved center of gravity (HCA-ICG) model and a material warehouse location - emergency material allocation (MWL-EMA) model. First, decision alternatives are generated using HCA-ICG to identify newly built emergency material warehouses for risk sources which cannot be served by existing ones in a time-effective manner. Second, an emergency material reserve plan is obtained using MWL-EMA so that emergency materials are prepared in advance in a cost-effective manner. The optimization framework is then applied to emergency management system planning in Jiangsu province, China. The results demonstrate that the developed framework not only facilitates material warehouse selection but also effectively provides emergency materials for a quick emergency response.
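
    The sketch below illustrates only the two-stage idea, not the HCA-ICG / MWL-EMA formulation itself: hierarchical clustering groups risk sources that are too far from existing warehouses, a demand-weighted centre of gravity is proposed per cluster, and material is then allocated to the nearest warehouse. All coordinates, demands and thresholds are illustrative assumptions.

```python
# A minimal sketch of a two-stage cluster-then-allocate reserve layout.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import cdist

rng = np.random.default_rng(6)
risk_xy = rng.uniform(0, 100, size=(20, 2))        # risk-source coordinates (km)
demand = rng.uniform(1, 10, size=20)               # required material (t)
existing = np.array([[20.0, 30.0], [70.0, 80.0]])  # existing warehouses

# Stage 1: cluster risk sources farther than 25 km from any existing warehouse
# and place a demand-weighted centre-of-gravity warehouse in each cluster.
uncovered = cdist(risk_xy, existing).min(axis=1) > 25.0
labels = fcluster(linkage(risk_xy[uncovered], method="ward"), t=3, criterion="maxclust")
new_sites = np.array([
    np.average(risk_xy[uncovered][labels == k], axis=0,
               weights=demand[uncovered][labels == k])
    for k in np.unique(labels)
])

# Stage 2: allocate each risk source to its nearest warehouse and sum the
# material that should be pre-positioned there.
warehouses = np.vstack([existing, new_sites])
assignment = cdist(risk_xy, warehouses).argmin(axis=1)
stock = np.bincount(assignment, weights=demand, minlength=len(warehouses))
print(np.round(stock, 1))
```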

  16. Predicting the long-term (137)Cs distribution in Fukushima after the Fukushima Dai-ichi nuclear power plant accident: a parameter sensitivity analysis.

    Science.gov (United States)

    Yamaguchi, Masaaki; Kitamura, Akihiro; Oda, Yoshihiro; Onishi, Yasuo

    2014-09-01

    than those of the other rivers. Annual sediment outflows from the Abukuma River and the total from the other 13 river basins were calculated as 3.2 × 10(4)-3.1 × 10(5) and 3.4 × 10(4)-2.1 × 10(5)ty(-1), respectively. The values vary between calculation cases because of the critical shear stress, the rainfall factor, and other differences. On the other hand, contributions of those parameters were relatively small for (137)Cs concentration within transported soil. This indicates that the total amount of (137)Cs outflow into the ocean would mainly be controlled by the amount of soil erosion and transport and the total amount of (137)Cs concentration remaining within the basin. Outflows of (137)Cs from the Abukuma River and the total from the other 13 river basins during the first year after the accident were calculated to be 2.3 × 10(11)-3.7 × 10(12) and 4.6 × 10(11)-6.5 × 10(12)Bqy(-1), respectively. The former results were compared with the field investigation results, and the order of magnitude was matched between the two, but the value of the investigation result was beyond the upper limit of model prediction.

  17. Model predictive control classical, robust and stochastic

    CERN Document Server

    Kouvaritakis, Basil

    2016-01-01

    For the first time, a textbook that brings together classical predictive control with treatment of up-to-date robust and stochastic techniques. Model Predictive Control describes the development of tractable algorithms for uncertain, stochastic, constrained systems. The starting point is classical predictive control and the appropriate formulation of performance objectives and constraints to provide guarantees of closed-loop stability and performance. Moving on to robust predictive control, the text explains how similar guarantees may be obtained for cases in which the model describing the system dynamics is subject to additive disturbances and parametric uncertainties. Open- and closed-loop optimization are considered and the state of the art in computationally tractable methods based on uncertainty tubes presented for systems with additive model uncertainty. Finally, the tube framework is also applied to model predictive control problems involving hard or probabilistic constraints for the cases of multiplic...
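
    For context, the sketch below shows only the classical starting point the description mentions: a deterministic, constrained receding-horizon controller for a double integrator, not the robust or stochastic tube-based methods treated in the book. The dynamics, weights and input bound are illustrative assumptions.

```python
# A minimal sketch of classical (deterministic, constrained) MPC with cvxpy.
import cvxpy as cp
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.5], [1.0]])
Q, R, N = np.eye(2), 0.1 * np.eye(1), 12

def mpc_step(x0):
    x, u = cp.Variable((2, N + 1)), cp.Variable((1, N))
    cost, cons = cp.quad_form(x[:, N], Q), [x[:, 0] == x0]     # terminal cost
    for k in range(N):
        cost += cp.quad_form(x[:, k], Q) + cp.quad_form(u[:, k], R)
        cons += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],     # model constraint
                 cp.abs(u[:, k]) <= 1.0]                       # input constraint
    cp.Problem(cp.Minimize(cost), cons).solve()
    return u.value[:, 0]                                       # apply first input only

state = np.array([5.0, 0.0])
for _ in range(15):                 # receding-horizon loop
    state = A @ state + B @ mpc_step(state)
print(np.round(state, 3))           # regulated towards the origin
```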

  18. Development and test results of the Realtime Severe Accident Model 5 (RSAM5) based on the MAAP5 For the Kori 1 simulator

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Jin Hyuk; Lee, Myeong Soo [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2012-10-15

    The Real Time Severe Accident Model (RSAM) in the Kori simulator employs the standard MAAP 5.01.1101 code (which is defined as MAAP 5.01) plus several statically linked libraries that interface with the simulator environment. The physical phenomena that can be envisioned inside the reactor vessel, the reactor coolant system (RCS), and the containment during severe accidents are comprehensively modeled by the MAAP5 code. The MAAP5 code has been known to be a reliable tool for understanding the sequence of events that occur during severe LWR accidents, evaluating the consequences of the failure of emergency systems, assessing the effects of operator interventions, and investigating the influence of design features of the RCS, containment, and safety systems on the accident consequences. The purpose of this paper is to describe the modeling of the Kori Unit 1 nuclear plant with the MAAP5 code and major outputs in the event of the SBO, SBO + SGTR, SBO + LBLOCA.

  19. Reactor physics modelling of accident tolerant fuel for LWRs using ANSWERS codes

    Directory of Open Access Journals (Sweden)

    Lindley Benjamin A.

    2016-01-01

    adopts an integral configuration and a fully passive decay heat removal system to provide indefinite cooling capability for a class of accidents. This paper presents the equilibrium cycle core design and reactor physics behaviour of the I2S-LWR with U3Si2 and the advanced steel cladding. The results were obtained using the traditional two-stage approach, in which homogenized macroscopic cross-section sets were generated by WIMS and applied in a full 3D core solution with PANTHER. The results obtained with WIMS/PANTHER were compared against the Monte Carlo Serpent code developed by VTT and against previously reported results for the I2S-LWR. The results were found to be in good agreement (e.g. <200 pcm in reactivity) among the compared codes, giving confidence that the WIMS/PANTHER reactor physics package can be reliably used in modelling advanced LWR systems.

  20. A GIS-based prediction and assessment system of off-site accident consequence for Guangdong nuclear power plant.

    Science.gov (United States)

    Wang, X Y; Qu, J Y; Shi, Z Q; Ling, Y S

    2003-01-01

    GNARD (Guangdong Nuclear Accident Real-time Decision support system) is a decision support system for off-site emergency management in the event of an accidental release from the nuclear power plants located in Guangdong province, China. The system is capable of calculating wind field, concentrations of radionuclide in environmental media and radiation doses. It can also estimate the size of the area where protective actions should be taken and provide other information about population distribution and emergency facilities available in the area. Furthermore, the system can simulate and evaluate the effectiveness of countermeasures assumed and calculate averted doses by protective actions. All of the results can be shown and analysed on the platform of a geographical information system (GIS).

  1. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy-based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborate principles such as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy-based prediction models are discussed and critically reviewed. Special attention is placed
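
    A minimal sketch of the simplest energy-flow (SEA-type) balance follows: two coupled subsystems with damping and coupling loss factors, solved for the subsystem energies. All numerical values are assumptions for illustration.

```python
# A minimal sketch: two-subsystem SEA power balance.
import numpy as np

omega = 2 * np.pi * 1000.0          # angular centre frequency of the band (rad/s)
eta1, eta2 = 0.01, 0.02             # damping loss factors (assumed)
eta12, eta21 = 0.002, 0.001         # coupling loss factors (assumed)
P_in = np.array([1.0, 0.0])         # injected power (W): only subsystem 1 is driven

# power balance: omega * L * E = P_in, with L the loss-factor matrix
L = np.array([[eta1 + eta12, -eta21],
              [-eta12,       eta2 + eta21]])
E = np.linalg.solve(omega * L, P_in)
print(np.round(E, 6))               # subsystem energies (J)
```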

  2. Massive Predictive Modeling using Oracle R Enterprise

    CERN Document Server

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...

  3. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident.

    Science.gov (United States)

    Walsh, Linda; Zhang, Wei

    2016-03-01

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and thyroid cancers are factored out. Some other notable model
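
    As a reminder of the generic structure behind such fits (a sketch only; the symbols and modifier forms below are illustrative, not the parameterisation fitted in the paper), LSS-based risk models are commonly written in excess relative risk (ERR) or excess absolute risk (EAR) form:

```latex
% Sketch of the generic model structure; \beta, \gamma, \eta and the modifier
% forms are illustrative, not the fitted parameterisation of the paper.
% d: dose, s: sex, a: attained age, e: age at exposure, \lambda_0: baseline rate.
% Excess relative risk (ERR) formulation:
\lambda(d,s,a,e) \;=\; \lambda_0(s,a,e)\,\bigl[1 + \mathrm{ERR}(d,s,a,e)\bigr],
\qquad
\mathrm{ERR} \;=\; \beta\, d\, \exp(\gamma e)\left(\frac{a}{70}\right)^{\eta}
% Excess absolute risk (EAR) formulation:
\lambda(d,s,a,e) \;=\; \lambda_0(s,a,e) + \mathrm{EAR}(d,s,a,e),
\qquad
\mathrm{EAR} \;=\; \beta\, d\, \exp(\gamma e)\left(\frac{a}{70}\right)^{\eta}
```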

  4. Cervical Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Breast Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Liver Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Ovarian Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Prostate Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Pancreatic Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Colorectal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Bladder Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Esophageal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Lung Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Testicular Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Development of the severe accident risk information database management system SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce the essential features and functions of the severe accident risk information management system SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed at the Korea Atomic Energy Research Institute, together with the database management and data retrieval procedures of the system. The database management system can automatically store and systematically manage plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and intelligently search the related severe accident risk information. For that purpose, the database mainly covers the plant-specific severe accident sequences obtained from Level 2 Probabilistic Safety Assessments (PSAs), base-case analysis results for various severe accident sequences (such as code responses and summaries of key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the database system can be effectively applied to support the Level 2 PSA of similar plants, to quickly predict and intelligently retrieve the required severe accident risk information for a specific plant whose information was previously stored in the system, and to develop plant-specific severe accident management strategies.

  16. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teollisuuden Voima Oy (Finland)

    1999-11-01

    rate of 2000 kg/s. In most cases, however, the predicted energy deposition was much smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated quasi-steady-state power following the initial power excursion was in most cases about 20 % of the nominal reactor power, according to SIMULATE-3K and APROS. RECRIT predictions generally differed in this respect, with either oscillating power or a power increase approaching 50 % of nominal power, which in both cases resulted in fuel temperatures above the melting point as a result of insufficient cooling. Long-term containment response to recriticality was assessed through MELCOR calculations for the Olkiluoto 1 plant. At a stabilised reactor power of 19 % of nominal power, containment failure due to overpressurization was predicted to occur 1.3 h after recriticality if the accident is not mitigated. The SARA studies have clearly shown the sensitivity of recriticality phenomena to thermal-hydraulic modelling and to the specifics of the accident scenario, such as the distribution of boron carbide, and the importance of multi-dimensional kinetics for the determination of the local power distribution in the core. The results of the project have pointed out the importance of adequate accident management procedures to be used by reactor operators and emergency staff during recovery actions. Recommendations in this area are given in the report.

  17. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  18. A Course in... Model Predictive Control.

    Science.gov (United States)

    Arkun, Yaman; And Others

    1988-01-01

    Describes a graduate engineering course which specializes in model predictive control. Lists course outline and scope. Discusses some specific topics and teaching methods. Suggests final projects for the students. (MVL)

  19. Development of Parameter Network for Accident Management Applications

    Energy Technology Data Exchange (ETDEWEB)

    Pak, Sukyoung; Ahemd, Rizwan; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Kim, Jung Taek; Park, Soo Yong; Ahn, Kwang Il [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    When a severe accident happens, it is hard to obtain the information necessary to understand the internal plant status because of the failure or damage of instrumentation and control systems. A lesson learned from the Fukushima accident is that the internal instrumentation system should be secured and must be able to operate under severe conditions. While there may be a number of ways to reinforce the integrity of instrumentation systems, we focused on exploiting the redundant behavior of plant parameters without installing additional hardware. Specifically, the objective of this study, a continuation of the paper submitted at the last KNS meeting, is to estimate a replacement value that can identify the internal status from the set of available signals when instrumentation information cannot be used during a severe accident. The concept of the VPN was suggested to improve, through a software-based approach, the quality of parameters to be logged during severe accidents in NPPs, and to quantify the importance of each parameter for further maintenance. In the future, we will apply the same analysis to other accident scenarios and extend the spectrum of initial conditions so that we obtain more sets of VPNs and ANN models to predict the behavior of accident scenarios. The suggested method carries the uncertainty inherent in the severe accident analysis code. However, in case of failure of the safety-critical instrumentation, the information from the VPN would be available to carry out safety management operations.

  20. Equivalency and unbiasedness of grey prediction models

    Institute of Scientific and Technical Information of China (English)

    Bo Zeng; Chuan Li; Guo Chen; Xianjun Long

    2015-01-01

    In order to investigate in depth the structural discrepancies and modeling mechanisms among different grey prediction models, the equivalence and unbiasedness of grey prediction models are analyzed and verified. The results show that all the grey prediction models that are strictly derived from x(0)(k) + a z(1)(k) = b have identical model structure and simulation precision. Moreover, unbiased simulation of a homogeneous exponential sequence can be accomplished. However, the models derived from dx(1)/dt + a x(1) = b are only close to those derived from x(0)(k) + a z(1)(k) = b provided that |a| < 0.1, and they cannot achieve unbiased simulation of a homogeneous exponential sequence. These conclusions are proved and verified through several theorems and examples.
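
    As a concrete illustration of the whitened grey model x(0)(k) + a z(1)(k) = b discussed above, the following Python sketch implements a basic GM(1,1) fit and forecast (the standard textbook construction, not the authors' code):

      # Basic GM(1,1) grey prediction sketch (textbook construction).
      import numpy as np

      def gm11_forecast(x0, n_ahead=1):
          """Fit GM(1,1) to the positive series x0 and forecast n_ahead steps."""
          x0 = np.asarray(x0, dtype=float)
          x1 = np.cumsum(x0)                                 # accumulated series x(1)
          z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values z(1)(k)
          B = np.column_stack([-z1, np.ones_like(z1)])       # design matrix for x(0)(k) = -a z(1)(k) + b
          a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # development and grey input coefficients
          k = np.arange(len(x0) + n_ahead)
          x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
          x0_hat = np.empty_like(x1_hat)
          x0_hat[0] = x0[0]
          x0_hat[1:] = np.diff(x1_hat)                       # restore the original-series scale
          return x0_hat[len(x0):]

      print(gm11_forecast([2.87, 3.28, 3.34, 3.52, 3.67], n_ahead=2))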

  1. Risk terrain modeling predicts child maltreatment.

    Science.gov (United States)

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children.

  2. Property predictions using microstructural modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wang, K.G. [Department of Materials Science and Engineering, Rensselaer Polytechnic Institute, CII 9219, 110 8th Street, Troy, NY 12180-3590 (United States)]. E-mail: wangk2@rpi.edu; Guo, Z. [Sente Software Ltd., Surrey Technology Centre, 40 Occam Road, Guildford GU2 7YG (United Kingdom); Sha, W. [Metals Research Group, School of Civil Engineering, Architecture and Planning, The Queen' s University of Belfast, Belfast BT7 1NN (United Kingdom); Glicksman, M.E. [Department of Materials Science and Engineering, Rensselaer Polytechnic Institute, CII 9219, 110 8th Street, Troy, NY 12180-3590 (United States); Rajan, K. [Department of Materials Science and Engineering, Rensselaer Polytechnic Institute, CII 9219, 110 8th Street, Troy, NY 12180-3590 (United States)

    2005-07-15

    Precipitation hardening in an Fe-12Ni-6Mn maraging steel during overaging is quantified. First, applying our recent kinetic model of coarsening [Phys. Rev. E, 69 (2004) 061507], and incorporating the Ashby-Orowan relationship, we link quantifiable aspects of the microstructures of these steels to their mechanical properties, including especially the hardness. Specifically, hardness measurements allow calculation of the precipitate size as a function of time and temperature through the Ashby-Orowan relationship. Second, calculated precipitate sizes and thermodynamic data determined with Thermo-Calc© are used with our recent kinetic coarsening model to extract diffusion coefficients during overaging from hardness measurements. Finally, employing more accurate diffusion parameters, we determined the hardness of these alloys independently from theory, and found agreement with experimental hardness data. Diffusion coefficients determined during overaging of these steels are notably higher than those found during the aging - an observation suggesting that precipitate growth during aging and precipitate coarsening during overaging are not controlled by the same diffusion mechanism.

  3. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Evans, J.S. [Harvard School of Public Health, Boston, MA (United States); Abrahmson, S. [Wisconsin Univ., Madison, WI (United States); Bender, M.A. [Brookhaven National Lab., Upton, NY (United States); Boecker, B.B.; Scott, B.R. [Inhalation Toxicology Research Inst., Albuquerque, NM (United States); Gilbert, E.S. [Battelle Pacific Northwest Lab., Richland, WA (United States)

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the Health Effects Models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults: leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, X-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
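
    The functional forms mentioned above can be sketched generically in LaTeX (illustrative shapes only, not the report's fitted parameters): a two-parameter Weibull (hazard-function) dose-response for early and continuing effects and a linear-quadratic excess risk for cancers.

      R_{\mathrm{early}}(D) = 1 - \exp\!\left[-\ln 2 \left(\frac{D}{D_{50}}\right)^{V}\right],
      \qquad
      \mathrm{ERR}_{\mathrm{cancer}}(D) = \alpha D + \beta D^{2}

    Here D_{50} is the dose at which half of the exposed population experiences the effect, V is the Weibull shape parameter, and the purely linear cancer model is the special case beta = 0.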

  4. Spatial Economics Model Predicting Transport Volume

    Directory of Open Access Journals (Sweden)

    Lu Bo

    2016-10-01

    Full Text Available It is extremely important to predict logistics requirements in a scientific and rational way. In recent years, however, improvements to prediction methods have not been very significant: traditional statistical prediction methods suffer from low precision and poor interpretability, and can neither theoretically guarantee the generalization ability of the prediction model nor explain the model effectively. Therefore, combining theories from spatial economics, industrial economics, and neo-classical economics, and taking the city of Zhuanghe as the research object, the study identifies the leading industries that generate large volumes of cargo and further predicts the static logistics generation of Zhuanghe and its hinterland. By integrating the various factors that affect regional logistics requirements, this study establishes a logistics requirements potential model based on spatial economic principles, and expands logistics requirements prediction from purely statistical principles to the new area of spatial and regional economics.

  5. Status report of advanced cladding modeling work to assess cladding performance under accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    B.J. Merrill; Shannon M. Bragg-Sitton

    2013-09-01

    Scoping simulations performed using a severe accident code can be applied to investigate the influence of advanced materials on beyond design basis accident progression and to identify any existing code limitations. In 2012 an effort was initiated to develop a numerical capability for understanding the potential safety advantages that might be realized during severe accident conditions by replacing Zircaloy components in light water reactors (LWRs) with silicon carbide (SiC) components. To this end, a version of the MELCOR code, under development at the Sandia National Laboratories in New Mexico (SNL/NM), was modified by substituting SiC for Zircaloy in the MELCOR reactor core oxidation and material properties routines. The modified version of MELCOR was benchmarked against available experimental data to ensure that present SiC oxidation theory in air and steam was correctly implemented in the code. Additional modifications were implemented in the code in 2013 to improve the specificity in defining components fabricated from non-standard materials. An overview of these modifications and the status of their implementation is summarized below.

  6. Precision Plate Plan View Pattern Predictive Model

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yang; YANG Quan; HE An-rui; WANG Xiao-chen; ZHANG Yun

    2011-01-01

    According to the rolling features of a plate mill, a 3D elastic-plastic FEM (finite element model) based on the full restart method of ANSYS/LS-DYNA was established to study the inhomogeneous plastic deformation of multipass plate rolling. By analyzing the simulation results, the differences between the head-end and tail-end predictive models were identified and corrected. Based on the numerical simulation results for 120 different conditions, a precision plate plan view pattern predictive model was established. Based on these models, the sizing MAS (Mizushima automatic plan view pattern control system) method was designed and applied on a 2 800 mm plate mill. Comparison of plates rolled with and without the PVPP (plan view pattern predictive) model shows a reduced width deviation, indicating that the plate plan view pattern predictive model is precise.

  7. Modeling and Prediction Using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Juhl, Rune; Møller, Jan Kloppenborg; Jørgensen, John Bagterp

    2016-01-01

    Pharmacokinetic/pharmacodynamic (PK/PD) modeling for a single subject is most often performed using nonlinear models based on deterministic ordinary differential equations (ODEs), and the variation between subjects in a population is described using a population (mixed effects) setup... that describes the variation between subjects. The ODE setup implies that the variation for a single subject is described by a single parameter (or vector), namely the variance (covariance) of the residuals. Furthermore, the prediction of the states is given as the solution to the ODEs and is hence assumed... deterministic, able to predict the future perfectly. A more realistic approach is to allow for randomness in the model due to, e.g., the model being too simple or errors in the input. We describe a modeling and prediction setup which better reflects reality and suggests stochastic differential equations (SDEs...
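
    The contrast drawn above between a deterministic ODE prediction and an SDE that allows for randomness in the state dynamics can be sketched as follows (a minimal Python example with an arbitrary one-compartment elimination model; not the authors' implementation):

      # ODE vs SDE state prediction for a one-compartment elimination model:
      # dx = -k*x dt for the ODE, dx = -k*x dt + sigma*x dW for the SDE. Illustrative only.
      import numpy as np

      rng = np.random.default_rng(0)
      k, sigma, dt, n_steps, x0 = 0.5, 0.15, 0.01, 1000, 10.0

      t = np.arange(n_steps + 1) * dt
      x_ode = x0 * np.exp(-k * t)                  # deterministic prediction

      x_sde = np.empty(n_steps + 1)
      x_sde[0] = x0
      for i in range(n_steps):                     # Euler-Maruyama scheme
          dW = rng.normal(0.0, np.sqrt(dt))
          x_sde[i + 1] = x_sde[i] - k * x_sde[i] * dt + sigma * x_sde[i] * dW

      print(x_ode[-1], x_sde[-1])                  # the SDE path fluctuates around the ODE solution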

  8. Analysis and prediction of inducement combination modes of maritime accidents induced by human errors

    Institute of Scientific and Technical Information of China (English)

    张丽丽; 吕靖; 艾云飞

    2014-01-01

    To better understand the mechanism of maritime accidents induced by human errors, the inducement combination modes are analyzed and predicted based on historical accident data. An inducement classification system for maritime accidents induced by human errors is developed based on the core concepts of the Swiss Cheese Model and the Human Factors Analysis and Classification System (HFACS). The inducement factors are quantified in matrix form. By matrix transformation and clustering analysis, the main inducement combination modes are obtained. The modes are then predicted with the Bootstrap method. The results may help decision-makers implement targeted and practicable preventive measures and improve maritime transportation safety.

  9. Modelling Chemical Reasoning to Predict Reactions

    CERN Document Server

    Segler, Marwin H S

    2016-01-01

    The ability to reason beyond established knowledge allows Organic Chemists to solve synthetic problems and to invent novel transformations. Here, we propose a model which mimics chemical reasoning and formalises reaction prediction as finding missing links in a knowledge graph. We have constructed a knowledge graph containing 14.4 million molecules and 8.2 million binary reactions, which represents the bulk of all chemical reactions ever published in the scientific literature. Our model outperforms a rule-based expert system in the reaction prediction task for 180,000 randomly selected binary reactions. We show that our data-driven model generalises even beyond known reaction types, and is thus capable of effectively (re-) discovering novel transformations (even including transition-metal catalysed reactions). Our model enables computers to infer hypotheses about reactivity and reactions by only considering the intrinsic local structure of the graph, and because each single reaction prediction is typically ac...

  10. Cs-137 fallout in Iceland, model predictions and measurements

    Energy Technology Data Exchange (ETDEWEB)

    Palsson, S.E.; Sigurgeirsson, M.A.; Gudnason, K. [Icelandic Radiation Protection Inst. (Iceland); Arnalds, O.; Karlsdottir, I.A. [Agricultural Research Inst. (Iceland); Palsdottir, P. [Icelandic Meteorological Office (Iceland)

    2002-04-01

    Basically all the fallout Cs-137 in Iceland came from the atmospheric nuclear weapons tests in the late fifties and early sixties; the contribution from the accident at the Chernobyl Nuclear Power Plant was comparatively small. Measurements of fallout from nuclear weapons tests started in Iceland over 40 years ago, and samples of soil, vegetation and agricultural products have been collected from various places and measured during this period. Considerable variability has been seen in the results, even between places close to each other. This is understandable due to the mountainous terrain, changing strong winds and high levels of precipitation. This variability has been especially noticeable in the case of soil samples. The important role of uncultivated rangelands in Icelandic agriculture (e.g. for sheep farming) makes it necessary to estimate deposition for many remote areas. It has thus proven difficult to get a good overview of the distribution of the deposition and its subsequent transfer into agricultural products. Over a year ago an attempt was made to assess the distribution of Cs-137 fallout in Iceland. The approach is based on a model predicting deposition using precipitation data, in a similar manner to that used previously within the Arctic Monitoring and Assessment Programme (AMAP, 1999). One station close to Reykjavik has a time series of Cs-137 deposition data and precipitation data from 1960 onwards. The AMAP deposition model was calibrated for Iceland by using deposition and precipitation data from this station. (au)
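
    A minimal sketch of this kind of precipitation-based deposition calibration is given below (ordinary least squares on made-up station values; the actual AMAP model form and the Icelandic data are not reproduced here):

      # Calibrate a simple linear deposition-precipitation relation on station data
      # (illustrative numbers only), then apply it to estimate deposition elsewhere.
      import numpy as np

      precip_mm   = np.array([600.0, 900.0, 1200.0, 2000.0, 3000.0])  # annual precipitation
      cs137_kbqm2 = np.array([1.1, 1.6, 2.1, 3.4, 5.0])               # measured Cs-137 deposition

      slope, intercept = np.polyfit(precip_mm, cs137_kbqm2, 1)

      def predict_deposition(p_mm):
          return intercept + slope * p_mm

      print(predict_deposition(1500.0))   # predicted deposition for a 1500 mm/yr site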

  11. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  12. Genetic models of homosexuality: generating testable predictions

    OpenAIRE

    Gavrilets, Sergey; Rice, William R.

    2006-01-01

    Homosexuality is a common occurrence in humans and other species, yet its genetic and evolutionary basis is poorly understood. Here, we formulate and study a series of simple mathematical models for the purpose of predicting empirical patterns that can be used to determine the form of selection that leads to polymorphism of genes influencing homosexuality. Specifically, we develop theory to make contrasting predictions about the genetic characteristics of genes influencing homosexuality inclu...

  13. Use of detailed thermochemical databases to model chemical interactions in the Severe Accident codes

    Energy Technology Data Exchange (ETDEWEB)

    Barrachin, M. [IPSN/DRS, CEA Cadarache (France)

    2001-07-01

    For the prevention, mitigation and management of severe accidents, many problems related to core melt have to be solved: fuel degradation, melting and relocation, convection in the core melt(s), coolability of the core melt(s), fission product release, hydrogen production, behavior of the materials of the protective layers, and ex-vessel spreading of the core melt(s). To solve these problems, properties such as thermal conductivity, heat capacity, density, viscosity, evaporation or sublimation of melts, the solidification behavior (solid/liquid fraction), the tendency to trap or release the fission products, and the stratification of melts, notably metallic and oxide, must be known. However, most of these properties are difficult to measure directly at high temperature and/or in the radioactive environment produced by the fission products. Therefore, some of them must be derived by calculation from the physical-chemical description of the melt: number of phases, phase compositions, proportions of solids and liquids and their respective oxidation states, miscibility of the liquids, solubility of one phase in another, etc. This information is given by the phase diagrams of the materials present. For more than ten years, IPSN has developed, in collaboration with THERMODATA (Grenoble, France), a very detailed thermochemical database for the complex system U-O-Zr-Fe-Ni-La-Ba-Ru-Sr-Si-Mg-Ca-Al-(H-Ar). Direct coupling between the severe accident (SA) codes and a thermochemical code with its database is currently not possible because of the computing time required and the size of the database. For this reason, most severe accident codes have a very simplified description of the phase diagrams which is not in agreement with the state of the art. In this presentation, alternative methodologies are detailed with their respective difficulties, the goal being to build an interface between a thermochemical database and a SA code and to get a fast, accurate and

  14. Anthropometric dependence of the response of a thorax FE model under high speed loading: validation and real world accident replication.

    Science.gov (United States)

    Roth, Sébastien; Torres, Fabien; Feuerstein, Philippe; Thoral-Pierre, Karine

    2013-05-01

    Finite element analysis is frequently used in several fields such as automotive simulations or biomechanics. It helps researchers and engineers to understand the mechanical behaviour of complex structures. The development of computer science has made it possible to develop realistic computational models which behave like physical ones, avoiding the difficulties and costs of experimental tests. In the framework of biomechanics, many FE models have been developed in the last few decades, enabling the investigation of the behaviour of the human body subjected to severe loading such as road traffic accidents or ballistic impacts. In both cases, the thorax/abdomen/pelvis system is frequently injured. Understanding the behaviour of this complex system is of extreme importance. In order to explore the dynamic response of this system to impact loading, a finite element model of the human thorax/abdomen/pelvis system has, therefore, been developed including the main organs: heart, lungs, kidneys, liver, spleen, the skeleton (with vertebrae, intervertebral discs, ribs), stomach, intestines, muscles, and skin. The FE model is based on a 3D reconstruction, which has been made from medical records of anonymous patients, who have had medical scans with no relation to the present study. Several scans have been analyzed, and specific attention has been paid to the anthropometry of the reconstructed model, which can be considered as a 50th percentile male model. The biometric parameters and laws have been implemented in the dynamic FE code (Radioss, Altair Hyperworks 11©) used for dynamic simulations. Then the 50th percentile model was validated against experimental data available in the literature in terms of deflection and force, whose curves must lie within the experimental corridors. However, for other anthropometries (small male or large male models), questions can be raised about the validation and the results of numerical accident replications.

  15. Partial least square method for modelling ergonomic risks factors on express bus accidents in the east coast of peninsular west Malaysia

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, Yusof bin [Faculty of Industrial Sciences and Technology, Universiti Malaysia Pahang, Gambang 26300 Kuantan, Pahang (Malaysia); Taha, Zahari bin [Faculty of Manufacturing Engineering, Malaysia Pahang, 26600 Pekan, Pahang (Malaysia)

    2015-02-03

    The public, stakeholders and authorities in the Malaysian government show great concern about the high numbers of passenger injuries and fatalities in express bus accidents. This paper studies the ergonomic risk factors underlying human error as a cause of express bus accidents in order to develop an integrated analytical framework. Reliable information about drivers should lead to the design of strategies intended to make the public feel safe in public transport services. In addition, the ergonomic risk factors are analyzed to determine which of them most strongly lead to accidents. The research was performed on the east coast of peninsular Malaysia using variance-based structural equation modeling, namely the Partial Least Squares (PLS) regression technique. A questionnaire survey was carried out at random among 65 express bus drivers operating from the city of Kuantan in Pahang and 49 express bus drivers operating from the city of Kuala Terengganu in Terengganu to all towns on the east coast of peninsular west Malaysia. The ergonomic risk factors questionnaire covers demographic information, occupational information, organizational safety climate, ergonomic workplace, physiological factors, stress at the workplace, physical fatigue and near-miss accidents. The correlations and significance values between the latent constructs (near-miss accident) were analyzed using the SEM software SmartPLS 3M. The findings show that the correlated ergonomic risk factors (occupational information, t = 2.04; stress at the workplace, t = 2.81; physiological factors, t = 2.08) are significant for physical fatigue, which in turn mediates near-miss accidents at t = 2.14, with p < 0.05 and T-statistics t > 1.96. The results show that physical fatigue due to ergonomic risk factors influences human error as a cause of express bus accidents.

  16. Partial least square method for modelling ergonomic risks factors on express bus accidents in the east coast of peninsular west Malaysia

    Science.gov (United States)

    Hashim, Yusof bin; Taha, Zahari bin

    2015-02-01

    The public, stakeholders and authorities in the Malaysian government show great concern about the high numbers of passenger injuries and fatalities in express bus accidents. This paper studies the ergonomic risk factors underlying human error as a cause of express bus accidents in order to develop an integrated analytical framework. Reliable information about drivers should lead to the design of strategies intended to make the public feel safe in public transport services. In addition, the ergonomic risk factors are analyzed to determine which of them most strongly lead to accidents. The research was performed on the east coast of peninsular Malaysia using variance-based structural equation modeling, namely the Partial Least Squares (PLS) regression technique. A questionnaire survey was carried out at random among 65 express bus drivers operating from the city of Kuantan in Pahang and 49 express bus drivers operating from the city of Kuala Terengganu in Terengganu to all towns on the east coast of peninsular west Malaysia. The ergonomic risk factors questionnaire covers demographic information, occupational information, organizational safety climate, ergonomic workplace, physiological factors, stress at the workplace, physical fatigue and near-miss accidents. The correlations and significance values between the latent constructs (near-miss accident) were analyzed using the SEM software SmartPLS 3M. The findings show that the correlated ergonomic risk factors (occupational information, t = 2.04; stress at the workplace, t = 2.81; physiological factors, t = 2.08) are significant for physical fatigue, which in turn mediates near-miss accidents at t = 2.14, with p < 0.05 and T-statistics t > 1.96. The results show that physical fatigue due to ergonomic risk factors influences human error as a cause of express bus accidents.
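
    The analyses in the two records above were run as variance-based PLS-SEM in SmartPLS. As a rough, simplified stand-in (plain PLS regression rather than full PLS path modelling, with synthetic data and invented indicator names), a Python sketch might look like:

      # Simplified stand-in for the PLS-SEM analysis: PLS regression from ergonomic
      # risk indicators to a near-miss score (synthetic data, invented variable names).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      n = 114                                  # roughly the 65 + 49 surveyed drivers
      X = rng.normal(size=(n, 4))              # occupational info, stress, physiology, fatigue
      y = 0.6 * X[:, 3] + 0.3 * X[:, 1] + rng.normal(scale=0.5, size=n)  # near-miss score

      pls = PLSRegression(n_components=2)
      pls.fit(X, y)
      print(pls.score(X, y))                   # share of outcome variance explained by the components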

  17. Predictive model for segmented poly(urea)

    Directory of Open Access Journals (Sweden)

    Frankl P.

    2012-08-01

    Full Text Available Segmented poly(urea) has been shown to be of significant benefit in protecting vehicles from blast and impact, and there have been several experimental studies to determine the mechanisms by which this protective function might occur. One suggested route is by mechanical activation of the glass transition. In order to enable the design of protective structures using this material, a constitutive model and equation of state are needed for numerical simulation hydrocodes. Determination of such a predictive model may also help elucidate the beneficial mechanisms that occur in polyurea during high rate loading. The tool deployed to do this has been Group Interaction Modelling (GIM), a mean field technique that has been shown to predict the mechanical and physical properties of polymers from their structure alone. The structure of polyurea has been used to characterise the parameters in the GIM scheme without recourse to experimental data, and the equation of state and constitutive model predict response over a wide range of temperatures and strain rates. The shock Hugoniot has been predicted and validated against existing data. Mechanical response in tensile tests has also been predicted and validated.

  18. Desktop Severe Accident Graphic Simulator Module for CANDU6 : PSAIS

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. Y.; Song, Y. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The ISAAC ((Integrated Severe Accident Analysis Code for CANDU Plant) code is a system level computer code capable of performing integral analyses of potential severe accident progressions in nuclear power plants, whose main purpose is to support a Level 2 probabilistic safety assessment or severe accident management strategy developments. The code has the capability to predict a severe accident progression by modeling the CANDU6- specific systems and the expected physical phenomena based on the current understanding of the unique accident progressions. The code models the sequence of accident progressions from a core heatup, pressure tube/calandria tube rupture after an uncovery from inside and outside, a relocation of the damaged fuel to the bottom of the calandria, debris behavior in the calandria, corium quenching after a debris relocation from the calandria to the calandria vault and an erosion of the calandria vault concrete floor, a hydrogen burn, and a reactor building failure. Along with the thermal hydraulics, the fission product behavior is also considered in the primary system as well as in the reactor building.

  19. Accident: Reminder

    CERN Multimedia

    2003-01-01

    There is no left turn to Point 1 from the customs, direction CERN. A terrible accident happened last week on the Route de Meyrin just outside Entrance B because traffic regulations were not respected. You are reminded that when travelling from the customs, direction CERN, turning left to Point 1 is forbidden. Access to Point 1 from the customs is only via entering CERN, going down to the roundabout and coming back up to the traffic lights at Entrance B

  20. PREDICTIVE CAPACITY OF ARCH FAMILY MODELS

    Directory of Open Access Journals (Sweden)

    Raphael Silveira Amaro

    2016-03-01

    Full Text Available In the last decades, a remarkable number of models, variants of the Autoregressive Conditional Heteroscedastic family, have been developed and empirically tested, making the process of choosing a particular model extremely complex. This research aims to compare, using the Model Confidence Set procedure, the predictive capacity of five conditional heteroskedasticity models, considering eight different statistical probability distributions. The financial series used are the log-return series of the Bovespa index and the Dow Jones Industrial Index in the period between 27 October 2008 and 30 December 2014. The empirical evidence shows that, in general, the competing models are highly homogeneous in their predictions, whether for the stock market of a developed country or for that of a developing country. An equivalent result can be inferred for the statistical probability distributions that were used.
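
    As a minimal illustration of the ARCH-family recursions compared in the paper, the sketch below runs a plain GARCH(1,1) conditional-variance filter with assumed (not estimated) parameters on simulated returns; the Model Confidence Set procedure itself is not reproduced.

      # GARCH(1,1) conditional-variance recursion and one-step-ahead forecast
      # with assumed parameters (omega, alpha, beta); illustrative only.
      import numpy as np

      rng = np.random.default_rng(42)
      returns = rng.normal(scale=0.01, size=500)       # stand-in for index log-returns

      omega, alpha, beta = 1e-6, 0.08, 0.90            # assumed GARCH(1,1) parameters
      sigma2 = np.empty(len(returns))
      sigma2[0] = returns.var()
      for t in range(1, len(returns)):
          sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]

      forecast_var = omega + alpha * returns[-1] ** 2 + beta * sigma2[-1]
      print(np.sqrt(forecast_var))                     # one-step-ahead volatility forecast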

  1. Ship waste quantities prediction model for the port of Belgrade

    Directory of Open Access Journals (Sweden)

    VLADANKA PRESBURGER ULNIKOVIĆ

    2011-06-01

    Full Text Available This study focuses on the issues related to the waste management in river ports in general and especially in the port of Belgrade. Data on solid waste, waste oils, oily waters, gray water and black water have been collected for a period of five years. The methodology of data collection is presented. Trends of data were analyzed and the regression model was used to predict the waste quantities in the Belgrade port. This data could be utilized as a basis for the calculation of the equipment capacity for waste selective collection, treatment and storage. The results presented in this study establish the need for an organized management system for this type of waste which can be achieved either by constructing and providing new specialized terminal or by providing mobile floating facilities and other plants in the Port of Belgrade for these kinds of services. In addition to the above, the legislative and organizational strategy of waste management has been explored to complete the study because the impact of good waste management on environment and prevention of environmental accidents would be highly beneficial. This study demonstrated that addressing these issues should be considered at international as well as national level.

  2. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-02-01

    Full Text Available Orientation: The article discussed the importance of rigour in credit risk assessment. Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan. Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities. Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems. Main findings: A goodness-of-fit test demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI), micro- and also macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk. Practical/managerial implications: The originality of the study is that three models were developed to predict corporate firms' defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product. Contribution/value-add: The study utilises different goodness-of-fit tests and receiver operating characteristic (ROC) analyses to examine the robustness of the predictive power of these factors.
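
    A minimal sketch of the logistic-regression approach described above is given below, using synthetic firm data with invented covariates standing in for the TCRI and the macroeconomic variables; it is not the paper's dataset or specification.

      # Logistic default-prediction sketch: fit on synthetic firm-year data and
      # evaluate with ROC AUC, mirroring the goodness-of-fit/ROC assessment above.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(7)
      n = 2000
      X = np.column_stack([
          rng.integers(1, 10, n),          # stand-in for a credit-risk index (e.g. TCRI)
          rng.normal(0.05, 0.1, n),        # asset growth rate
          rng.normal(0.0, 1.0, n),         # macro factor (e.g. standardised GDP growth)
      ])
      logit = -3.0 + 0.35 * X[:, 0] - 2.0 * X[:, 1] - 0.5 * X[:, 2]
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      model = LogisticRegression(max_iter=1000).fit(X, y)
      print(roc_auc_score(y, model.predict_proba(X)[:, 1]))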

  3. Calibrated predictions for multivariate competing risks models.

    Science.gov (United States)

    Gorfine, Malka; Hsu, Li; Zucker, David M; Parmigiani, Giovanni

    2014-04-01

    Prediction models for time-to-event data play a prominent role in assessing the individual risk of a disease, such as cancer. Accurate disease prediction models provide an efficient tool for identifying individuals at high risk, and provide the groundwork for estimating the population burden and cost of disease and for developing patient care guidelines. We focus on risk prediction of a disease in which family history is an important risk factor that reflects inherited genetic susceptibility, shared environment, and common behavior patterns. In this work family history is accommodated using frailty models, with the main novel feature being allowing for competing risks, such as other diseases or mortality. We show through a simulation study that naively treating competing risks as independent right censoring events results in non-calibrated predictions, with the expected number of events overestimated. Discrimination performance is not affected by ignoring competing risks. Our proposed prediction methodologies correctly account for competing events, are very well calibrated, and easy to implement.
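
    The miscalibration caused by treating a competing event as independent censoring can be seen even in a toy constant-hazard example (an analytic sketch with assumed hazards, unrelated to the frailty models of the paper): the naive one-minus-Kaplan-Meier risk exceeds the true cumulative incidence.

      # Two competing constant hazards: cause 1 (disease of interest) and cause 2
      # (e.g. death from other causes). Naively treating cause 2 as censoring
      # overestimates the cause-1 risk, as the abstract above describes.
      import numpy as np

      lam1, lam2, t = 0.02, 0.05, 30.0     # yearly hazards and horizon (assumed values)

      true_cif1 = lam1 / (lam1 + lam2) * (1.0 - np.exp(-(lam1 + lam2) * t))
      naive_risk1 = 1.0 - np.exp(-lam1 * t)        # ignores the competing event

      print(round(true_cif1, 3), round(naive_risk1, 3))   # 0.251 vs 0.451: naive is larger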

  4. Modeling in fast dynamics of accidents in the primary circuit of PWR type reactors; Modelisation en dynamique rapide d'accidents dans le circuit primaire des reacteurs a eau pressurisee

    Energy Technology Data Exchange (ETDEWEB)

    Robbe, M.F

    2003-12-01

    Two kinds of accidents, liable to occur in the primary circuit of a Pressurized Water Reactor and involving fast dynamic phenomena, are analyzed. The Loss Of Coolant Accident (LOCA) is the design-basis accident used to define current PWRs. It consists of a large break in a pipe of the primary circuit. A blowdown wave propagates through the circuit. The pressure differences between the different zones of the reactor induce high stresses in the structures of the lower head and may degrade the reactor core. The primary circuit starts emptying from the break opening. Pressure decreases very quickly, causing extensive flashing to steam. Two thermal-hydraulic simulations of the blowdown phase of a LOCA are computed with the Europlexus code. The primary circuit is represented by a pipe model including the hydraulic peculiarities of the circuit. The main differences between the two computations concern the kind of reactor, the break location and model, and the initialization of the accident transient. Steam explosion is a hypothetical severe accident liable to happen after core melting. The molten part of the core (called corium) falls into the lower part of the reactor vessel. The interaction between the hot corium and the cold water remaining at the bottom of the vessel induces a massive and violent vaporization of water, similar to an explosive phenomenon. A shock wave propagates in the vessel, which can seriously damage the neighbouring structures or pierce the vessel. This work presents a synthesis of in-vessel parametric studies carried out with the Europlexus code, the linkage of the thermal-hydraulic code Mc3d, dedicated to the pre-mixing phase, with the Europlexus code dealing with the explosion, and finally a benchmark between the Cigalon and Europlexus codes relative to the Vulcano mock-up. (author)

  5. Modelling language evolution: Examples and predictions

    Science.gov (United States)

    Gong, Tao; Shuai, Lan; Zhang, Menghan

    2014-06-01

    We survey recent computer modelling research of language evolution, focusing on a rule-based model simulating the lexicon-syntax coevolution and an equation-based model quantifying the language competition dynamics. We discuss four predictions of these models: (a) correlation between domain-general abilities (e.g. sequential learning) and language-specific mechanisms (e.g. word order processing); (b) coevolution of language and relevant competences (e.g. joint attention); (c) effects of cultural transmission and social structure on linguistic understandability; and (d) commonalities between linguistic, biological, and physical phenomena. All these contribute significantly to our understanding of the evolutions of language structures, individual learning mechanisms, and relevant biological and socio-cultural factors. We conclude the survey by highlighting three future directions of modelling studies of language evolution: (a) adopting experimental approaches for model evaluation; (b) consolidating empirical foundations of models; and (c) multi-disciplinary collaboration among modelling, linguistics, and other relevant disciplines.

  6. Modelling language evolution: Examples and predictions.

    Science.gov (United States)

    Gong, Tao; Shuai, Lan; Zhang, Menghan

    2014-06-01

    We survey recent computer modelling research of language evolution, focusing on a rule-based model simulating the lexicon-syntax coevolution and an equation-based model quantifying the language competition dynamics. We discuss four predictions of these models: (a) correlation between domain-general abilities (e.g. sequential learning) and language-specific mechanisms (e.g. word order processing); (b) coevolution of language and relevant competences (e.g. joint attention); (c) effects of cultural transmission and social structure on linguistic understandability; and (d) commonalities between linguistic, biological, and physical phenomena. All these contribute significantly to our understanding of the evolutions of language structures, individual learning mechanisms, and relevant biological and socio-cultural factors. We conclude the survey by highlighting three future directions of modelling studies of language evolution: (a) adopting experimental approaches for model evaluation; (b) consolidating empirical foundations of models; and (c) multi-disciplinary collaboration among modelling, linguistics, and other relevant disciplines.

  7. Global Solar Dynamo Models: Simulations and Predictions

    Indian Academy of Sciences (India)

    Mausumi Dikpati; Peter A. Gilman

    2008-03-01

    Flux-transport type solar dynamos have achieved considerable success in correctly simulating many solar cycle features, and are now being used for prediction of solar cycle timing and amplitude. We first define flux-transport dynamos and demonstrate how they work. The essential added ingredient in this class of models is meridional circulation, which governs the dynamo period and also plays a crucial role in determining the Sun’s memory about its past magnetic fields. We show that flux-transport dynamo models can explain many key features of solar cycles. Then we show that a predictive tool can be built from this class of dynamo that can be used to predict mean solar cycle features by assimilating magnetic field data from previous cycles.

  8. Model Predictive Control of Sewer Networks

    Science.gov (United States)

    Pedersen, Einar B.; Herbertsson, Hannes R.; Niemann, Henrik; Poulsen, Niels K.; Falk, Anne K. V.

    2017-01-01

    Developments in solutions for the management of urban drainage are of vital importance, as the amount of sewer water from urban areas continues to increase due to the growth of the world’s population and changing climate conditions. How a sewer network is structured, monitored and controlled has thus become an essential factor for the efficient performance of waste water treatment plants. This paper examines methods for simplified modelling and control of a sewer network. A practical approach to the problem is taken by analysing a simplified design model, which is based on the Barcelona benchmark model. Due to the inherent constraints, the applied approach is based on Model Predictive Control.
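
    A minimal sketch of the kind of constrained MPC formulation used in such studies is given below: a single storage-volume model with invented parameters, solved with cvxpy. It is not the Barcelona benchmark model itself.

      # Toy MPC for one sewer storage tank: choose pumped outflow u to keep the stored
      # volume v near a reference despite a forecast inflow d (all values invented).
      import cvxpy as cp
      import numpy as np

      N, dt = 12, 600.0                      # horizon steps and step length [s]
      d = 0.3 + 0.2 * np.sin(np.linspace(0, np.pi, N))   # forecast inflow [m3/s]
      v0, v_ref, v_max, u_max = 500.0, 400.0, 1500.0, 0.6

      v = cp.Variable(N + 1)
      u = cp.Variable(N)
      constraints = [v[0] == v0]
      for k in range(N):
          constraints += [v[k + 1] == v[k] + dt * (d[k] - u[k]),   # mass balance
                          0 <= u[k], u[k] <= u_max,
                          0 <= v[k + 1], v[k + 1] <= v_max]

      cost = cp.sum_squares(v[1:] - v_ref) + 10.0 * cp.sum_squares(u)
      cp.Problem(cp.Minimize(cost), constraints).solve()
      print(u.value[0])                      # only the first move is applied, then re-optimise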

  9. Model based on hidden Markov model for predicting terrorism incidents

    Institute of Scientific and Technical Information of China (English)

    战兵; 韩锐

    2015-01-01

    To provide early warning of terrorism incidents, the network structure of terrorism incidents was analyzed and a prediction model constructed. A hidden Markov model and Bayesian networks were used to build the prediction model. Possible future terrorist activities are forecast by analyzing previous events, so that relevant intelligence can be obtained in advance and potential incidents prevented. The prediction algorithms are analyzed for both complete and incomplete data. The experimental results show that the predictions of the proposed method are very close to the actual monitoring results, which validates the rationality of the hidden Markov model and the effectiveness of the Bayesian network method. A limitation is that relatively little intelligence was acquired during monitoring, which to some degree affects the accuracy of the model results.
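
    A minimal numerical sketch of the hidden-Markov component of such a model is shown below (a forward-algorithm filter with small, made-up transition and emission matrices; it does not use the authors' event data or network structure):

      # Forward algorithm: filter the hidden threat level from a sequence of observed
      # precursor event types, then predict the next hidden-state distribution.
      # All matrices and labels are illustrative assumptions.
      import numpy as np

      A = np.array([[0.85, 0.15],       # transitions: calm     -> {calm, elevated}
                    [0.30, 0.70]])      #              elevated -> {calm, elevated}
      B = np.array([[0.7, 0.2, 0.1],    # emission probabilities of 3 precursor types per state
                    [0.2, 0.3, 0.5]])
      pi = np.array([0.9, 0.1])         # initial hidden-state distribution

      obs = [0, 2, 2, 1]                # observed precursor sequence (column indices of B)

      alpha = pi * B[:, obs[0]]
      for o in obs[1:]:                 # forward recursion (unnormalised filtering)
          alpha = (alpha @ A) * B[:, o]

      filtered = alpha / alpha.sum()    # P(current hidden state | observations)
      print(filtered @ A)               # predicted distribution of the next hidden state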

  10. DKIST Polarization Modeling and Performance Predictions

    Science.gov (United States)

    Harrington, David

    2016-05-01

    Calibrating the Mueller matrices of large aperture telescopes and associated coude instrumentation requires astronomical sources and several modeling assumptions to predict the behavior of the system polarization with field of view, altitude, azimuth and wavelength. The Daniel K Inouye Solar Telescope (DKIST) polarimetric instrumentation requires very high accuracy calibration of a complex coude path with an off-axis f/2 primary mirror, time dependent optical configurations and substantial field of view. Polarization predictions across a diversity of optical configurations, tracking scenarios, slit geometries and vendor coating formulations are critical to both construction and continued operations efforts. Recent daytime sky based polarization calibrations of the 4m AEOS telescope and HiVIS spectropolarimeter on Haleakala have provided system Mueller matrices over full telescope articulation for a 15-reflection coude system. AEOS and HiVIS are a DKIST analog with a many-fold coude optical feed and similar mirror coatings creating 100% polarization cross-talk with altitude, azimuth and wavelength. Polarization modeling predictions using Zemax have successfully matched the altitude-azimuth-wavelength dependence on HiVIS with the few percent amplitude limitations of several instrument artifacts. Polarization predictions for coude beam paths depend greatly on modeling the angle-of-incidence dependences in powered optics and the mirror coating formulations. A 6 month HiVIS daytime sky calibration plan has been analyzed for accuracy under a wide range of sky conditions and data analysis algorithms. Predictions of polarimetric performance for the DKIST first-light instrumentation suite have been created under a range of configurations. These new modeling tools and polarization predictions have substantial impact for the design, fabrication and calibration process in the presence of manufacturing issues, science use-case requirements and ultimate system calibration

  11. Raman Model Predicting Hardness of Covalent Crystals

    OpenAIRE

    Zhou, Xiang-Feng; Qian, Quang-Rui; Sun, Jian; Tian, Yongjun; Wang, Hui-Tian

    2009-01-01

    Based on the fact that both hardness and vibrational Raman spectrum depend on the intrinsic property of chemical bonds, we propose a new theoretical model for predicting hardness of a covalent crystal. The quantitative relationship between hardness and vibrational Raman frequencies deduced from the typical zincblende covalent crystals is validated to be also applicable for the complex multicomponent crystals. This model enables us to nondestructively and indirectly characterize the hardness o...

  12. Modelling Chemical Reasoning to Predict Reactions

    OpenAIRE

    Segler, Marwin H. S.; Waller, Mark P.

    2016-01-01

    The ability to reason beyond established knowledge allows Organic Chemists to solve synthetic problems and to invent novel transformations. Here, we propose a model which mimics chemical reasoning and formalises reaction prediction as finding missing links in a knowledge graph. We have constructed a knowledge graph containing 14.4 million molecules and 8.2 million binary reactions, which represents the bulk of all chemical reactions ever published in the scientific literature. Our model outpe...

  13. Predictive Modeling of the CDRA 4BMS

    Science.gov (United States)

    Coker, Robert; Knox, James

    2016-01-01

    Fully predictive models of the Four Bed Molecular Sieve of the Carbon Dioxide Removal Assembly on the International Space Station are being developed. This virtual laboratory will be used to help reduce mass, power, and volume requirements for future missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future crewed Mars missions as well as the resolution of anomalies observed in the ISS CDRA.

  14. Predictability of extreme values in geophysical models

    NARCIS (Netherlands)

    Sterk, A.E.; Holland, M.P.; Rabassa, P.; Broer, H.W.; Vitolo, R.

    2012-01-01

    Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical model

  15. A Predictive Model for MSSW Student Success

    Science.gov (United States)

    Napier, Angela Michele

    2011-01-01

    This study tested a hypothetical model for predicting both graduate GPA and graduation of University of Louisville Kent School of Social Work Master of Science in Social Work (MSSW) students entering the program during the 2001-2005 school years. The preexisting characteristics of demographics, academic preparedness and culture shock along with…

  16. Distributed Model Predictive Control via Dual Decomposition

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Stoustrup, Jakob; Andersen, Palle

    2014-01-01

    This chapter presents dual decomposition as a means to coordinate a number of subsystems coupled by state and input constraints. Each subsystem is equipped with a local model predictive controller while a centralized entity manages the subsystems via prices associated with the coupling constraints...

  17. Predictive Modelling of Mycotoxins in Cereals

    NARCIS (Netherlands)

    Fels, van der H.J.; Liu, C.

    2015-01-01

    This article presents the summaries of the presentations given at the 30th meeting of the Werkgroep Fusarium. The topics are: Predictive Modelling of Mycotoxins in Cereals; Microbial degradation of DON; Exposure to green leaf volatiles primes wheat against FHB but boosts produ

  18. Unreachable Setpoints in Model Predictive Control

    DEFF Research Database (Denmark)

    Rawlings, James B.; Bonné, Dennis; Jørgensen, John Bagterp

    2008-01-01

    steady state is established for terminal constraint model predictive control (MPC). The region of attraction is the steerable set. Existing analysis methods for closed-loop properties of MPC are not applicable to this new formulation, and a new analysis method is developed. It is shown how to extend...

  19. Leptogenesis in minimal predictive seesaw models

    CERN Document Server

    Björkeroth, Fredrik; Varzielas, Ivo de Medeiros; King, Stephen F

    2015-01-01

    We estimate the Baryon Asymmetry of the Universe (BAU) arising from leptogenesis within a class of minimal predictive seesaw models involving two right-handed neutrinos and simple Yukawa structures with one texture zero. The two right-handed neutrinos are dominantly responsible for the "atmospheric" and "solar" neutrino masses with Yukawa couplings to $(\

  20. Reconstruction of {sup 131}I radioactive contamination in Ukraine caused by the Chernobyl accident using atmospheric transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Talerko, Nikolai [Scientific Center for Radiation Medicine, 53 Melnikov Street, Kyiv 04050 (Ukraine)]. E-mail: ntalerko@mail.ru

    2005-07-01

    The evaluation of {sup 131}I air and ground contamination field formation in the territory of Ukraine was made using the model of atmospheric transport LEDI (Lagrangian-Eulerian DIffusion model). The {sup 131}I atmospheric transport over the territory of Ukraine was simulated during the first 12 days after the accident (from 26 April to 7 May 1986) using real aerological information and rain measurement network data. The airborne {sup 131}I concentration and ground deposition fields were calculated as the database for subsequent thyroid dose reconstruction for inhabitants of radioactive contaminated regions. The small-scale deposition field variability is assessed using data of {sup 137}Cs detailed measurements in the territory of Ukraine. The obtained results are compared with available data of radioiodine daily deposition measurements made at the network of meteorological stations in Ukraine and data of the assessments of {sup 131}I soil contamination obtained from the {sup 129}I measurements.

  1. The SAM software system for modeling severe accidents at nuclear power plants equipped with VVER reactors on full-scale and analytic training simulators

    Science.gov (United States)

    Osadchaya, D. Yu.; Fuks, R. L.

    2014-04-01

    The architecture of the SAM software package intended for modeling beyond-design-basis accidents at nuclear power plants equipped with VVER reactors evolving into a severe stage with core melting and failure of the reactor pressure vessel is presented. By using the SAM software package it is possible to perform comprehensive modeling of the entire emergency process from the failure initiating event to the stage of severe accident involving meltdown of nuclear fuel, failure of the reactor pressure vessel, and escape of corium onto the concrete basement or into the corium catcher with retention of molten products in it.

  2. Disease prediction models and operational readiness.

    Directory of Open Access Journals (Sweden)

    Courtney D Corley

    Full Text Available The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. We define a disease event to be a biological event with focus on the One Health paradigm. These events are characterized by evidence of infection and/or disease condition. We reviewed models that attempted to predict a disease event, not merely its transmission dynamics, and we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). We searched commercial and government databases and harvested Google search results for eligible models, using terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. After removal of duplications and extraneous material, a core collection of 6,524 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. As a result, we systematically reviewed 44 papers, and the results are presented in this analysis. We identified 44 models, classified as one or more of the following: event prediction (4), spatial (26), ecological niche (28), diagnostic or clinical (6), spread or response (9), and reviews (3). The model parameters (e.g., etiology, climatic, spatial, cultural) and data sources (e.g., remote sensing, non-governmental organizations, expert opinion, epidemiological) were recorded and reviewed. A component of this review is the identification of verification and validation (V&V) methods applied to each model, if any V&V method was reported. All models were classified as either having undergone Some Verification or Validation method, or No Verification or Validation. We close by outlining an initial set of operational readiness level guidelines for disease prediction models based upon established Technology

  3. Caries risk assessment models in caries prediction

    Directory of Open Access Journals (Sweden)

    Amila Zukanović

    2013-11-01

    Full Text Available Objective. The aim of this research was to assess the efficiency of different multifactor models in caries prediction. Material and methods. Data from the questionnaire and objective examination of 109 examinees was entered into the Cariogram, Previser and Caries-Risk Assessment Tool (CAT) multifactor risk assessment models. Caries risk was assessed with the help of all three models for each patient, classifying them as low, medium or high-risk patients. The development of new caries lesions over a period of three years [Decay Missing Filled Tooth (DMFT) increment = difference between Decay Missing Filled Tooth Surface (DMFTS) index at baseline and follow up] provided for examination of the predictive capacity concerning different multifactor models. Results. The data gathered showed that different multifactor risk assessment models give significantly different results (Friedman test: Chi square = 100.073, p=0.000). Cariogram is the model which identified the majority of examinees as medium risk patients (70%). The other two models were more radical in risk assessment, giving more unfavorable risk profiles for patients. In only 12% of the patients did the three multifactor models assess the risk in the same way. Previser and CAT gave the same results in 63% of cases – the Wilcoxon test showed that there is no statistically significant difference in caries risk assessment between these two models (Z = -1.805, p=0.071). Conclusions. Evaluation of three different multifactor caries risk assessment models (Cariogram, PreViser and CAT) showed that only the Cariogram can successfully predict new caries development in 12-year-old Bosnian children.
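
    For readers wanting to reproduce this kind of comparison on their own data, the statistics quoted above (a Friedman test across three paired risk assessments and a Wilcoxon signed-rank test between two of them) are available in scipy; the risk scores below are synthetic placeholders, not the study data.

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
n = 109  # number of examinees in the study; the scores below are synthetic
cariogram = rng.integers(0, 3, n)                            # 0 = low, 1 = medium, 2 = high risk
previser  = np.clip(cariogram + rng.integers(0, 2, n), 0, 2)
cat       = np.clip(cariogram + rng.integers(0, 2, n), 0, 2)

# Friedman test: do the three related risk assessments differ systematically?
chi2, p = friedmanchisquare(cariogram, previser, cat)
print(f"Friedman chi-square = {chi2:.3f}, p = {p:.3f}")

# Wilcoxon signed-rank test between two of the models (paired ordinal scores, zeros split)
stat, p2 = wilcoxon(previser, cat, zero_method="zsplit")
print(f"Wilcoxon statistic = {stat:.1f}, p = {p2:.3f}")
```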

  4. Analysis of ex-vessel melt jet breakup and coolability. Part 1: Sensitivity on model parameters and accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Moriyama, Kiyofumi; Park, Hyun Sun, E-mail: hejsunny@postech.ac.kr; Hwang, Byoungcheol; Jung, Woo Hyun

    2016-06-15

    Highlights: • Application of JASMINE code to melt jet breakup and coolability in APR1400 condition. • Coolability indexes for quasi steady state breakup and cooling process. • Typical case in complete breakup/solidification, film boiling quench not reached. • Significant impact of water depth and melt jet size; weak impact of model parameters. - Abstract: The breakup of a melt jet falling in a water pool and the coolability of the melt particles produced by such jet breakup are important phenomena in terms of the mitigation of severe accident consequences in light water reactors, because the molten and relocated core material is the primary heat source that governs the accident progression. We applied a modified version of the fuel–coolant interaction simulation code, JASMINE, developed at Japan Atomic Energy Agency (JAEA) to a plant scale simulation of melt jet breakup and cooling assuming an ex-vessel condition in the APR1400, a Korean advanced pressurized water reactor. Also, we examined the sensitivity on seven model parameters and five initial/boundary condition variables. The results showed that the melt cooling performance of a 6 m deep water pool in the reactor cavity is enough for removing the initial melt enthalpy for solidification, for a melt jet of 0.2 m initial diameter. The impacts of the model parameters were relatively weak and that of some of the initial/boundary condition variables, namely the water depth and melt jet diameter, were very strong. The present model indicated that a significant fraction of the melt jet is not broken up and forms a continuous melt pool on the containment floor in cases with a large melt jet diameter, 0.5 m, or a shallow water pool depth, ≤3 m.

  5. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations...
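
    As a rough sketch of the underlying idea rather than the authors' exact formulation, an FIR-based predictive controller stacks the impulse-response coefficients into a lower-triangular prediction matrix and solves a regularized least-squares problem for the future inputs; the horizon, weights and impulse response below are assumed for illustration, and input/input-rate constraints are omitted.

```python
import numpy as np

# FIR model: y(k) = sum_i h[i] * u(k-1-i). Illustrative impulse response of a stable SISO plant.
h = 0.5 * 0.8 ** np.arange(20)          # FIR coefficients (assumed, not from the paper)
N = 15                                   # prediction horizon

# Lower-triangular prediction matrix: y_pred = Gamma @ u_future (zero initial condition)
Gamma = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        if i - j < len(h):
            Gamma[i, j] = h[i - j]

r = np.ones(N)                           # output reference over the horizon
lam = 0.1                                # regularization weight on the inputs

# Regularized l2 problem: min ||Gamma u - r||^2 + lam ||u||^2
u = np.linalg.solve(Gamma.T @ Gamma + lam * np.eye(N), Gamma.T @ r)
print("first input move to apply (receding horizon):", u[0])
```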

  6. Gas explosion prediction using CFD models

    Energy Technology Data Exchange (ETDEWEB)

    Niemann-Delius, C.; Okafor, E. [RWTH Aachen Univ. (Germany); Buhrow, C. [TU Bergakademie Freiberg Univ. (Germany)

    2006-07-15

    A number of CFD models are currently available to model gaseous explosions in complex geometries. Some of these tools allow the representation of complex environments within hydrocarbon production plants. In certain explosion scenarios, a correction is usually made for the presence of buildings and other complexities by using crude approximations to obtain realistic estimates of explosion behaviour as can be found when predicting the strength of blast waves resulting from initial explosions. With the advance of computational technology, and greater availability of computing power, computational fluid dynamics (CFD) tools are becoming increasingly available for solving such a wide range of explosion problems. A CFD-based explosion code - FLACS can, for instance, be confidently used to understand the impact of blast overpressures in a plant environment consisting of obstacles such as buildings, structures, and pipes. With its porosity concept representing geometry details smaller than the grid, FLACS can represent geometry well, even when using coarse grid resolutions. The performance of FLACS has been evaluated using a wide range of field data. In the present paper, the concept of computational fluid dynamics (CFD) and its application to gas explosion prediction is presented. Furthermore, the predictive capabilities of CFD-based gaseous explosion simulators are demonstrated using FLACS. Details about the FLACS-code, some extensions made to FLACS, model validation exercises, application, and some results from blast load prediction within an industrial facility are presented. (orig.)

  7. A Study On Distributed Model Predictive Consensus

    CERN Document Server

    Keviczky, Tamas

    2008-01-01

    We investigate convergence properties of a proposed distributed model predictive control (DMPC) scheme, where agents negotiate to compute an optimal consensus point using an incremental subgradient method based on primal decomposition as described in Johansson et al. [2006, 2007]. The objective of the distributed control strategy is to agree upon and achieve an optimal common output value for a group of agents in the presence of constraints on the agent dynamics using local predictive controllers. Stability analysis using a receding horizon implementation of the distributed optimal consensus scheme is performed. Conditions are given under which convergence can be obtained even if the negotiations do not reach full consensus.

  8. Electrostatic ion thrusters - towards predictive modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kalentev, O.; Matyash, K.; Duras, J.; Lueskow, K.F.; Schneider, R. [Ernst-Moritz-Arndt Universitaet Greifswald, D-17489 (Germany); Koch, N. [Technische Hochschule Nuernberg Georg Simon Ohm, Kesslerplatz 12, D-90489 Nuernberg (Germany); Schirra, M. [Thales Electronic Systems GmbH, Soeflinger Strasse 100, D-89077 Ulm (Germany)

    2014-02-15

    The development of electrostatic ion thrusters so far has mainly been based on empirical and qualitative know-how, and on evolutionary iteration steps. This resulted in considerable effort regarding prototype design, construction and testing and therefore in significant development and qualification costs and high time demands. For future developments it is anticipated to implement simulation tools which allow for quantitative prediction of ion thruster performance, long-term behavior and space craft interaction prior to hardware design and construction. Based on integrated numerical models combining self-consistent kinetic plasma models with plasma-wall interaction modules a new quality in the description of electrostatic thrusters can be reached. These open the perspective for predictive modeling in this field. This paper reviews the application of a set of predictive numerical modeling tools on an ion thruster model of the HEMP-T (High Efficiency Multi-stage Plasma Thruster) type patented by Thales Electron Devices GmbH. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  9. Genetic models of homosexuality: generating testable predictions.

    Science.gov (United States)

    Gavrilets, Sergey; Rice, William R

    2006-12-22

    Homosexuality is a common occurrence in humans and other species, yet its genetic and evolutionary basis is poorly understood. Here, we formulate and study a series of simple mathematical models for the purpose of predicting empirical patterns that can be used to determine the form of selection that leads to polymorphism of genes influencing homosexuality. Specifically, we develop theory to make contrasting predictions about the genetic characteristics of genes influencing homosexuality including: (i) chromosomal location, (ii) dominance among segregating alleles and (iii) effect sizes that distinguish between the two major models for their polymorphism: the overdominance and sexual antagonism models. We conclude that the measurement of the genetic characteristics of quantitative trait loci (QTLs) found in genomic screens for genes influencing homosexuality can be highly informative in resolving the form of natural selection maintaining their polymorphism.

  10. ENSO Prediction using Vector Autoregressive Models

    Science.gov (United States)

    Chapman, D. R.; Cane, M. A.; Henderson, N.; Lee, D.; Chen, C.

    2013-12-01

    A recent comparison (Barnston et al, 2012 BAMS) shows the ENSO forecasting skill of dynamical models now exceeds that of statistical models, but the best statistical models are comparable to all but the very best dynamical models. In this comparison the leading statistical model is the one based on the Empirical Model Reduction (EMR) method. Here we report on experiments with multilevel Vector Autoregressive models using only sea surface temperatures (SSTs) as predictors. VAR(L) models generalize Linear Inverse Models (LIM), which are a VAR(1) method, as well as multilevel univariate autoregressive models. Optimal forecast skill is achieved using 12 to 14 months of prior state information (i.e., 12-14 levels), which allows SSTs alone to capture the effects of other variables such as heat content as well as seasonality. The use of multiple levels allows the model advancing one month at a time to perform at least as well for a 6 month forecast as a model constructed to explicitly forecast 6 months ahead. We infer that the multilevel model has fully captured the linear dynamics (cf. Penland and Magorian, 1993 J. Climate). Finally, while VAR(L) is equivalent to L-level EMR, we show in a 150 year cross-validated assessment that we can increase forecast skill by improving on the EMR initialization procedure. The greatest benefit of this change is in allowing the prediction to make effective use of information over many more months.
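
    A bare-bones version of the VAR(L) construction described above can be written in a few lines of numpy: stack the L most recent state vectors as predictors, fit by least squares, and advance the model one month at a time. The data below are synthetic stand-ins for SST anomaly indices, and the setup ignores the EMR-style refinements discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
T, d, L = 600, 3, 12          # months, number of SST indices, number of lags (levels)
x = rng.standard_normal((T, d)).cumsum(axis=0) * 0.05  # synthetic stand-in for SST anomalies

# Design matrix of the L most recent states (lag 1 first) and target of the next state
X = np.hstack([x[L - k - 1:T - k - 1] for k in range(L)])  # shape (T-L, d*L)
Y = x[L:]                                                  # shape (T-L, d)
A, *_ = np.linalg.lstsq(X, Y, rcond=None)                  # stacked VAR coefficient matrices

def forecast(history, months_ahead):
    """Advance the fitted VAR one month at a time, as described in the abstract."""
    hist = list(history[-L:])                 # last L states, oldest first
    for _ in range(months_ahead):
        z = np.hstack(hist[::-1])             # most recent lag first, matching X construction
        hist.append(z @ A)
        hist.pop(0)
    return hist[-1]

print("6-month-ahead forecast of the indices:", forecast(x, 6))
```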

  11. NONLINEAR MODEL PREDICTIVE CONTROL OF CHEMICAL PROCESSES

    Directory of Open Access Journals (Sweden)

    R. G. SILVA

    1999-03-01

    Full Text Available A new algorithm for model predictive control is presented. The algorithm utilizes a simultaneous solution and optimization strategy to solve the model's differential equations. The equations are discretized by equidistant collocation and, along with the algebraic model equations, are included as constraints in a nonlinear programming (NLP) problem. This algorithm is compared with the algorithm that uses orthogonal collocation on finite elements. The equidistant collocation algorithm results in simpler equations, providing a decrease in computation time for the control moves. Simulation results are presented and show a satisfactory performance of this algorithm.

  12. Performance model to predict overall defect density

    Directory of Open Access Journals (Sweden)

    J Venkatesh

    2012-08-01

    Full Text Available Management by metrics is the expectation from IT service providers to stay as a differentiator. Given a project, its associated parameters and dynamics, the behaviour and outcome need to be predicted. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. In most cases, the actions taken are re-active. It is too late in the life cycle. Root cause analysis and corrective actions can be implemented only to the benefit of the next project. The focus has to shift left, towards the execution phase, rather than waiting for lessons to be learnt after the implementation. How do we pro-actively predict defect metrics and have a preventive action plan in place? This paper illustrates the process performance model to predict overall defect density based on data from projects in an organization.

  13. Neuro-fuzzy modeling in bankruptcy prediction

    Directory of Open Access Journals (Sweden)

    Vlachos D.

    2003-01-01

    Full Text Available For the past 30 years the problem of bankruptcy prediction has been thoroughly studied. From the paper of Altman in 1968 to the recent papers in the '90s, the progress of prediction accuracy was not satisfactory. This paper investigates an alternative modeling of the system (firm), combining neural networks and fuzzy controllers, i.e. using neuro-fuzzy models. Classical modeling is based on mathematical models that describe the behavior of the firm under consideration. The main idea of fuzzy control, on the other hand, is to build a model of a human control expert who is capable of controlling the process without thinking in terms of a mathematical model. This control expert specifies his control action in the form of linguistic rules. These control rules are translated into the framework of fuzzy set theory, providing a calculus which can simulate the behavior of the control expert and enhance its performance. The accuracy of the model is studied using datasets from previous research papers.

  14. Pressure prediction model for compression garment design.

    Science.gov (United States)

    Leung, W Y; Yuen, D W; Ng, Sun Pui; Shi, S Q

    2010-01-01

    Based on the application of Laplace's law to compression garments, an equation for predicting garment pressure, incorporating the body circumference, the cross-sectional area of fabric, applied strain (as a function of reduction factor), and its corresponding Young's modulus, is developed. Design procedures are presented to predict garment pressure using the aforementioned parameters for clinical applications. Compression garments have been widely used in treating burning scars. Fabricating a compression garment with a required pressure is important in the healing process. A systematic and scientific design method can enable the occupational therapist and compression garments' manufacturer to custom-make a compression garment with a specific pressure. The objectives of this study are 1) to develop a pressure prediction model incorporating different design factors to estimate the pressure exerted by the compression garments before fabrication; and 2) to propose more design procedures in clinical applications. Three kinds of fabrics cut at different bias angles were tested under uniaxial tension, as were samples made in a double-layered structure. Sets of nonlinear force-extension data were obtained for calculating the predicted pressure. Using the value at 0° bias angle as reference, the Young's modulus can vary by as much as 29% for fabric type P11117, 43% for fabric type PN2170, and even 360% for fabric type AP85120 at a reduction factor of 20%. When comparing the predicted pressure calculated from the single-layered and double-layered fabrics, the double-layered construction provides a larger range of target pressure at a particular strain. The anisotropic and nonlinear behaviors of the fabrics have thus been determined. Compression garments can be methodically designed by the proposed analytical pressure prediction model.
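
    The abstract does not reproduce the paper's final equation, but the basic Laplace-law step it builds on can be sketched as follows: interface pressure equals fabric tension per unit length divided by the local radius of curvature, with the tension taken from the measured force-extension data at the applied strain. The function below is a hedged illustration with invented numbers, not the authors' calibrated model.

```python
import math

def garment_pressure(force_at_strain_N, sample_width_m, body_circumference_m, layers=1):
    """Estimate interface pressure (Pa) from Laplace's law P = T / r.

    force_at_strain_N    : tensile force measured in the fabric sample at the applied strain
    sample_width_m       : width of the tested fabric strip (tension per unit length = F / width)
    body_circumference_m : circumference of the limb at the measurement site
    layers               : 1 for single-layered, 2 for double-layered construction
    """
    tension_per_length = layers * force_at_strain_N / sample_width_m   # N/m
    radius = body_circumference_m / (2.0 * math.pi)                    # m
    return tension_per_length / radius                                 # Pa

# Illustrative values (not from the study): 6 N measured in a 50 mm strip at a 20% reduction
# factor, applied to a limb of 0.25 m circumference.
p_single = garment_pressure(6.0, 0.05, 0.25)
print(f"single layer: {p_single:.0f} Pa  (~{p_single / 133.322:.1f} mmHg)")
print(f"double layer: {garment_pressure(6.0, 0.05, 0.25, layers=2):.0f} Pa")
```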

  15. Seasonal Predictability in a Model Atmosphere.

    Science.gov (United States)

    Lin, Hai

    2001-07-01

    The predictability of atmospheric mean-seasonal conditions in the absence of externally varying forcing is examined. A perfect-model approach is adopted, in which a global T21 three-level quasigeostrophic atmospheric model is integrated over 21 000 days to obtain a reference atmospheric orbit. The model is driven by a time-independent forcing, so that the only source of time variability is the internal dynamics. The forcing is set to perpetual winter conditions in the Northern Hemisphere (NH) and perpetual summer in the Southern Hemisphere. A significant temporal variability in the NH 90-day mean states is observed. The component of that variability associated with the higher-frequency motions, or climate noise, is estimated using a method developed by Madden. In the polar region, and to a lesser extent in the midlatitudes, the temporal variance of the winter means is significantly greater than the climate noise, suggesting some potential predictability in those regions. Forecast experiments are performed to see whether the presence of variance in the 90-day mean states that is in excess of the climate noise leads to some skill in the prediction of these states. Ensemble forecast experiments with nine members starting from slightly different initial conditions are performed for 200 different 90-day means along the reference atmospheric orbit. The serial correlation between the ensemble means and the reference orbit shows that there is skill in the 90-day mean predictions. The skill is concentrated in those regions of the NH that have the largest variance in excess of the climate noise. An EOF analysis shows that nearly all the predictive skill in the seasonal means is associated with one mode of variability with a strong axisymmetric component.

  16. Development of an in vitro porcine aorta model to study the stability of stent grafts in motor vehicle accidents.

    Science.gov (United States)

    Darvish, Kurosh; Shafieian, Mehdi; Romanov, Vasily; Rotella, Vittorio; Salvatore, Michael D; Blebea, John

    2009-04-01

    Endovascular stent grafts for the treatment of thoracic aortic aneurysms have become increasingly utilized and yet their locational stability in moderate chest trauma is unknown. A high speed impact system was developed to study the stability of aortic endovascular stent grafts in vitro. A straight segment of porcine descending aorta with stent graft was constrained in a custom-made transparent urethane casing. The specimen was tested in a novel impact system at an anterior inclination of 45 deg and an average deceleration of 55 G, which represented a frontal automobile crash. Due to the shock of the impact, which was shown to be below the threshold of aortic injury, the stent graft moved 0.6 mm longitudinally. This result was repeatable. The presented experimental model may be helpful in developing future grafts to withstand moderate shocks experienced in motor vehicle accidents or other dynamic loadings of the chest.

  17. A kinetic model for predicting biodegradation.

    Science.gov (United States)

    Dimitrov, S; Pavlov, T; Nedelcheva, D; Reuschenbach, P; Silvani, M; Bias, R; Comber, M; Low, L; Lee, C; Parkerton, T; Mekenyan, O

    2007-01-01

    Biodegradation plays a key role in the environmental risk assessment of organic chemicals. The need to assess biodegradability of a chemical for regulatory purposes supports the development of a model for predicting the extent of biodegradation at different time frames, in particular the extent of ultimate biodegradation within a '10 day window' criterion as well as estimating biodegradation half-lives. Conceptually this implies expressing the rate of catabolic transformations as a function of time. An attempt to correlate the kinetics of biodegradation with molecular structure of chemicals is presented. A simplified biodegradation kinetic model was formulated by combining the probabilistic approach of the original formulation of the CATABOL model with the assumption of first order kinetics of catabolic transformations. Nonlinear regression analysis was used to fit the model parameters to OECD 301F biodegradation kinetic data for a set of 208 chemicals. The new model allows the prediction of biodegradation multi-pathways, primary and ultimate half-lives and simulation of related kinetic biodegradation parameters such as biological oxygen demand (BOD), carbon dioxide production, and the nature and amount of metabolites as a function of time. The model may also be used for evaluating the OECD ready biodegradability potential of a chemical within the '10-day window' criterion.
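
    Under the first-order kinetics assumption stated above, the extent of ultimate biodegradation after a given time and the corresponding half-life follow directly from the rate constant. The short sketch below illustrates the '10-day window' check; the rate constant is an assumed example value, not one fitted to the OECD 301F data.

```python
import math

def fraction_degraded(k_per_day, days):
    """First-order kinetics: fraction transformed after a given time."""
    return 1.0 - math.exp(-k_per_day * days)

def half_life(k_per_day):
    return math.log(2.0) / k_per_day

# Illustrative rate constant (not from the paper's training set)
k = 0.12  # 1/day
print(f"half-life: {half_life(k):.1f} days")
print(f"extent of degradation in a 10-day window: {100 * fraction_degraded(k, 10):.1f} %")

# Rate constant needed to reach, say, a 60% ultimate-degradation threshold within 10 days
k_required = -math.log(1 - 0.60) / 10
print(f"rate constant needed to reach 60% in 10 days: {k_required:.3f} 1/day")
```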

  18. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars

    2017-01-01

    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...

  19. Disease Prediction Models and Operational Readiness

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D.; Pullum, Laura L.; Hartley, David M.; Benedum, Corey M.; Noonan, Christine F.; Rabinowitz, Peter M.; Lancaster, Mary J.

    2014-03-19

    INTRODUCTION: The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. One of the primary goals of this research was to characterize the viability of biosurveillance models to provide operationally relevant information for decision makers and to identify areas for future research. Two critical characteristics differentiate this work from other infectious disease modeling reviews. First, we reviewed models that attempted to predict the disease event, not merely its transmission dynamics. Second, we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). Methods: We searched dozens of commercial and government databases and harvested Google search results for eligible models utilizing terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. The publication dates of the search results returned are bounded by the dates of coverage of each database and the date on which the search was performed; all searching was completed by December 31, 2010. This returned 13,767 webpages and 12,152 citations. After de-duplication and removal of extraneous material, a core collection of 6,503 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. Next, PNNL’s IN-SPIRE visual analytics software was used to cross-correlate these publications with the definition of a biosurveillance model, resulting in the selection of 54 documents that matched the criteria. Ten of these documents, however, dealt purely with disease spread models, inactivation of bacteria, or the modeling of human immune system responses to pathogens rather than predicting disease events. As a result, we systematically reviewed 44 papers and the

  20. Using Structural Equation Modeling and the Behavioral Sciences Theories in Predicting Helmet Use

    Directory of Open Access Journals (Sweden)

    Kamarudin Ambak

    2011-01-01

    Full Text Available In Malaysia, according to road accident statistics, motorcycle users contribute more than 50% of fatalities in traffic accidents, and the major cause is head injuries. One strategy that can be used to reduce the severity of head injuries is the proper use of a helmet. Although the safety helmet is the best protective equipment for preventing head injury, the majority of motorcycle users do not use one or do not fasten it properly. In understanding this problem, behavioural science theories and engineering aspects are needed to provide better explanations and comprehensive insights into solutions. The Theory of Planned Behavior (TPB) and the Health Belief Model (HBM) were used in predicting the behavioral intention toward proper helmet usage among motorcyclists, while a new intervention approach was used in the Technology Acceptance Model (TAM), based on the perception of a conceptual system called the Safety Helmet Reminder System (SHR). Results show that the construct variables are reliable and statistically significant with respect to the exogenous and endogenous variables. The full structured models were proposed and tested, and the significant predictors were identified. A multivariate analysis technique, known as Structural Equation Modeling (SEM), was used in the modeling exercise. Finally, the goodness-of-fit of the models was used in interpreting the implications of the intervention strategy for a motorcyclist injury prevention program.

  1. Predictive Modeling in Actinide Chemistry and Catalysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ping [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-16

    These are slides from a presentation on predictive modeling in actinide chemistry and catalysis. The following topics are covered in these slides: Structures, bonding, and reactivity (bonding can be quantified by optical probes and theory, and electronic structures and reaction mechanisms of actinide complexes); Magnetic resonance properties (transition metal catalysts with multi-nuclear centers, and NMR/EPR parameters); Moving to more complex systems (surface chemistry of nanomaterials, and interactions of ligands with nanoparticles); Path forward and conclusions.

  2. Multi-physics modelling in the frame of the DRACCAR code development and its application to spent-fuel pool draining accidents

    Energy Technology Data Exchange (ETDEWEB)

    Jacq, F.; Luze, O. de; Guillard, G.; Bascou, S. [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-Roses (France)

    2013-07-01

    To meet the simulation needs of its LOCA R and D program, the IRSN is developing a multi-pin computational tool named DRACCAR. In order to realistically describe the behavior of the reactor core during a Loss Of Coolant Accident (LOCA), modeling has to take into account many coupled phenomena such as thermics (heat generation, radiation, convection and conduction), hydraulics (multi dimensional 1-3 phase flow, shrinkage), mechanics (thermal dilatation, creep, embrittlement) and chemistry (oxidation, oxygen diffusion, hydriding,..). This paper presents several aspects of the DRACCAR code abilities: investigation of the bundle rods strain during a LOCA transient, checking of the thermalhydraulics during reflooding of a partially ballooned bundle, and application to spent-fuel-pool draining accidents in the case of a propagation of the burn front in a typical non axis-symmetrical situation for the thermal heat exchanges which are driving the accident. (orig.)

  3. Self-reported accidents

    DEFF Research Database (Denmark)

    Møller, Katrine Meltofte; Andersen, Camilla Sloth

    2016-01-01

    The main idea behind the self-reporting of accidents is to ask people about their traffic accidents and gain knowledge on these accidents without relying on the official records kept by police and/or hospitals.

  4. Probabilistic prediction models for aggregate quarry siting

    Science.gov (United States)

    Robinson, G.R.; Larkins, P.M.

    2007-01-01

    Weights-of-evidence (WofE) and logistic regression techniques were used in a GIS framework to predict the spatial likelihood (prospectivity) of crushed-stone aggregate quarry development. The joint conditional probability models, based on geology, transportation network, and population density variables, were defined using quarry location and time of development data for the New England States, North Carolina, and South Carolina, USA. The Quarry Operation models describe the distribution of active aggregate quarries, independent of the date of opening. The New Quarry models describe the distribution of aggregate quarries when they open. Because of the small number of new quarries developed in the study areas during the last decade, independent New Quarry models have low parameter estimate reliability. The performance of parameter estimates derived for Quarry Operation models, defined by a larger number of active quarries in the study areas, was tested and evaluated to predict the spatial likelihood of new quarry development. Population density conditions at the time of new quarry development were used to modify the population density variable in the Quarry Operation models to apply to new quarry development sites. The Quarry Operation parameters derived for the New England study area, Carolina study area, and the combined New England and Carolina study areas were all similar in magnitude and relative strength. The Quarry Operation model parameters, using the modified population density variables, were found to be a good predictor of new quarry locations. Both the aggregate industry and the land management community can use the model approach to target areas for more detailed site evaluation for quarry location. The models can be revised easily to reflect actual or anticipated changes in transportation and population features. © International Association for Mathematical Geology 2007.
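
    The logistic-regression half of this approach can be illustrated with a small prospectivity model on the three classes of evidence named above (geology, transportation network, population density). Everything below — the feature coding, the synthetic cells and the coefficients — is invented for the example and is not drawn from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 2000
# Synthetic evidence layers per map cell: favourable-geology flag, distance to road (km), population density
geology = rng.integers(0, 2, n)
dist_road = rng.exponential(5.0, n)
pop_density = rng.lognormal(4.0, 1.0, n)

# Synthetic "quarry present" labels, generated so that favourable geology and road access matter
logit = -3.0 + 2.0 * geology - 0.3 * dist_road + 0.0005 * pop_density
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([geology, dist_road, np.log(pop_density)])
model = LogisticRegression().fit(X, y)

# Prospectivity (posterior probability of quarry development) for a candidate cell
candidate = np.array([[1, 2.0, np.log(500.0)]])
print("predicted prospectivity:", model.predict_proba(candidate)[0, 1])
print("coefficients (geology, distance, log pop density):", model.coef_[0])
```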

  5. Predicting Footbridge Response using Stochastic Load Models

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2013-01-01

    Walking parameters such as step frequency, pedestrian mass, dynamic load factor, etc. are basically stochastic, although it is quite common to adapt deterministic models for these parameters. The present paper considers a stochastic approach to modeling the action of pedestrians, but when doing s...... as it pinpoints which decisions to be concerned about when the goal is to predict footbridge response. The studies involve estimating footbridge responses using Monte-Carlo simulations and focus is on estimating vertical structural response to single person loading....

  6. Nonconvex Model Predictive Control for Commercial Refrigeration

    DEFF Research Database (Denmark)

    Hovgaard, Tobias Gybel; Larsen, Lars F.S.; Jørgensen, John Bagterp

    2013-01-01

    the iterations, which is more than fast enough to run in real-time. We demonstrate our method on a realistic model, with a full year simulation and 15 minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost...... is to minimize the total energy cost, using real-time electricity prices, while obeying temperature constraints on the zones. We propose a variation on model predictive control to achieve this goal. When the right variables are used, the dynamics of the system are linear, and the constraints are convex. The cost...

  7. Statistical Seasonal Sea Surface based Prediction Model

    Science.gov (United States)

    Suarez, Roberto; Rodriguez-Fonseca, Belen; Diouf, Ibrahima

    2014-05-01

    The interannual variability of the sea surface temperature (SST) plays a key role in the strongly seasonal rainfall regime of the West African region. The predictability of the seasonal cycle of rainfall is a field widely discussed by the scientific community, with results that fail to be satisfactory due to the difficulty of dynamical models in reproducing the behavior of the Inter Tropical Convergence Zone (ITCZ). To tackle this problem, a statistical model based on oceanic predictors has been developed at the Universidad Complutense de Madrid (UCM) with the aim to complement and enhance the predictability of the West African Monsoon (WAM) as an alternative to the coupled models. The model, called S4CAST (SST-based Statistical Seasonal Forecast), is based on discriminant analysis techniques, specifically the Maximum Covariance Analysis (MCA) and Canonical Correlation Analysis (CCA). Beyond the application of the model to the prediction of rainfall in West Africa, its use extends to a range of different oceanic, atmospheric and health-related parameters influenced by the temperature of the sea surface as a defining factor of variability.

  8. Predictive In Vivo Models for Oncology.

    Science.gov (United States)

    Behrens, Diana; Rolff, Jana; Hoffmann, Jens

    2016-01-01

    Experimental oncology research and preclinical drug development both substantially require specific, clinically relevant in vitro and in vivo tumor models. The increasing knowledge about the heterogeneity of cancer has required a substantial restructuring of the test systems for the different stages of development. To be able to cope with the complexity of the disease, larger panels of patient-derived tumor models have to be implemented and extensively characterized. Together with individual genetically engineered tumor models and supported by core functions for expression profiling and data analysis, an integrated discovery process has been generated for predictive and personalized drug development. Improved “humanized” mouse models should help to overcome the current limitations imposed by the xenogeneic barrier between humans and mice. Establishment of a functional human immune system and a corresponding human microenvironment in laboratory animals will strongly support further research. Drug discovery, systems biology, and translational research are moving closer together to address all the new hallmarks of cancer, increase the success rate of drug development, and increase the predictive value of preclinical models.

  9. Economic models of compensation for damages caused by nuclear accidents: some lessons for the revision of the Paris and Vienna Conventions

    Energy Technology Data Exchange (ETDEWEB)

    Faure, Michael G. [Limburg Univ., Maastricht (Netherlands). Faculty of Law

    1995-12-31

    Alternative systems of compensation for damages caused by nuclear accidents have been proposed. In this respect, the question of whether these alternative models of compensation discussed in the economic literature could be implemented merits attention when discussing the revision of the Paris and Vienna Conventions. 55 refs., 1 tab.

  10. PRYMA-TO: A model of radionuclide transfer from air into food stuff. Test with data from the Chernobyl accident

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Olivares, A.; Carrasco, E.; Suanez, A.; Josep, L.

    1994-07-01

    This report describes a dynamical model developed at the Environmental Institute of CIEMAT. Its aim is the calculation of the integrated as well as time-dependent concentrations of {sup 131}I and {sup 137}Cs in soils, in forage pasture (or other vegetation species), and in milk and meat. The source contamination is assumed to come from a radioactive cloud confined in the atmospheric mixing layer. Data monitored in different locations in the days following the Chernobyl accident have been used. The model was tested against post-Chernobyl data from 13 locations around the world, in the framework of the A4 exercise of the BIOMOVS program (Biospheric Models Validation Studies). The performance of the model is illustrated for 9 of these 13 scenarios, chosen because more information is available for them or they are better described. Default Probability Density Functions for the main parameters used by the model have been obtained by statistical processing of some post-Chernobyl evidence. (Author) 30 refs.

  11. Modelling of the transfer of Cs-137 from air to crops, milk, beef and human body following the Chernobyl accident in a location in Central Bohemia. Test of the model PRYMA T1

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, E.; Garcia-Olivares, A.; Suanez, A.; Robles, B.; Simon, I.; Cancio, D.

    1994-07-01

    This work was made in the frame of the research programme on the validation of models for the transfer of radionuclides in terrestrial, urban and aquatic environments. The acronym of this programme is VAMP (Validation of Model Predictions) and it is co-ordinated by the International Atomic Energy Agency (IAEA) and the Commission of the European Communities (CEC). The scenario was named CB and was presented by the Multiple Pathway Working Group. The scenario description was at the beginning a blind test, that is, without knowing the location or the measured concentrations and doses. The input information included data on Cs-137 contamination from the Chernobyl accident in Central Europe, in air and soils, together with a description of the scenario (data about crops, cattle, demography, human diet, etc.). The aim of the exercise was the comparison between the results of different models and between observed data and model predictions. In this work the results obtained by the CIEMAT-IMA group of modelers are shown and discussed. (Author) 24 refs.

  12. Predictive modeling by the cerebellum improves proprioception.

    Science.gov (United States)

    Bhanpuri, Nasir H; Okamura, Allison M; Bastian, Amy J

    2013-09-04

    Because sensation is delayed, real-time movement control requires not just sensing, but also predicting limb position, a function hypothesized for the cerebellum. Such cerebellar predictions could contribute to perception of limb position (i.e., proprioception), particularly when a person actively moves the limb. Here we show that human cerebellar patients have proprioceptive deficits compared with controls during active movement, but not when the arm is moved passively. Furthermore, when healthy subjects move in a force field with unpredictable dynamics, they have active proprioceptive deficits similar to cerebellar patients. Therefore, muscle activity alone is likely insufficient to enhance proprioception and predictability (i.e., an internal model of the body and environment) is important for active movement to benefit proprioception. We conclude that cerebellar patients have an active proprioceptive deficit consistent with disrupted movement prediction rather than an inability to generally enhance peripheral proprioceptive signals during action and suggest that active proprioceptive deficits should be considered a fundamental cerebellar impairment of clinical importance.

  13. Estimating the continuous risk of accidents occurring in the mining industry in South Africa

    Directory of Open Access Journals (Sweden)

    Van den Honert, Andrew Francis

    2015-11-01

    Full Text Available This study contributes to the on-going efforts to improve occupational safety in the mining industry by creating a model capable of predicting the continuous risk of occupational accidents occurring. Contributing factors were identified and their sensitivity quantified. The approach included using an Artificial Neural Network (ANN) to identify patterns between the input attributes and to predict the continuous risk of accidents occurring. The predictive Artificial Neural Network (ANN) model used in this research was created, trained, and validated in the form of a case study with data from a platinum mine near Rustenburg in South Africa. This resulted in a meaningful correlation between the predicted continuous risk and actual accidents.
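
    A minimal sketch of this kind of model, using scikit-learn's multilayer perceptron on synthetic shift-level attributes rather than the authors' network and mine data, is given below; the input attributes and the formula generating the "true" risk are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1500
# Synthetic shift-level attributes: hours worked, crew experience (years), depth (m), temperature (deg C)
X = np.column_stack([
    rng.uniform(6, 12, n),
    rng.uniform(0, 20, n),
    rng.uniform(500, 3000, n),
    rng.uniform(20, 40, n),
])
# Synthetic continuous risk in [0, 1]: rises with hours, depth and temperature, falls with experience
risk = 1 / (1 + np.exp(-(0.4 * X[:, 0] - 0.15 * X[:, 1] + 0.001 * X[:, 2] + 0.05 * X[:, 3] - 7)))
risk += rng.normal(0, 0.02, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, risk, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out shifts:", round(model.score(X_te, y_te), 3))
print("predicted risk for a 10 h shift, 2 years experience, 2500 m, 35 deg C:",
      round(float(model.predict([[10, 2, 2500, 35]])[0]), 3))
```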

  14. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd' s Register Consulting AB, Sundbyberg (Sweden)

    2013-10-15

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method by which experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. The proposed method includes expert judgement in a systematic way when defining the CPTs of a BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
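
    The core inference step of such a tool — updating the probability of each pre-calculated source term as plant observations arrive — can be written down directly with Bayes' rule. The sketch below uses three hypothetical source-term classes, two invented observables and made-up conditional probability tables; a real BBN would of course carry many more nodes and dependencies.

```python
import numpy as np

# Three hypothetical source-term classes (end states of the BBN), with prior probabilities.
source_terms = ["intact containment", "filtered release", "large early release"]
prior = np.array([0.90, 0.08, 0.02])

# Invented conditional probability tables: P(observation | source-term class)
p_high_pressure = np.array([0.10, 0.70, 0.90])  # containment pressure reads "high"
p_rad_alarm     = np.array([0.02, 0.60, 0.95])  # containment radiation monitor alarms

def update(belief, likelihood):
    """Bayes' rule: posterior proportional to prior times likelihood of the new evidence."""
    post = belief * likelihood
    return post / post.sum()

belief = update(prior, p_high_pressure)   # after observing high containment pressure
belief = update(belief, p_rad_alarm)      # after the radiation alarm (assumed conditionally independent)
for name, p in zip(source_terms, belief):
    print(f"P({name}) = {p:.3f}")
```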

  15. Detailed source term estimation of the atmospheric release for the Fukushima Daiichi Nuclear Power Station accident by coupling simulations of an atmospheric dispersion model with an improved deposition scheme and oceanic dispersion model

    Science.gov (United States)

    Katata, G.; Chino, M.; Kobayashi, T.; Terada, H.; Ota, M.; Nagai, H.; Kajino, M.; Draxler, R.; Hort, M. C.; Malo, A.; Torii, T.; Sanada, Y.

    2015-01-01

    Temporal variations in the amount of radionuclides released into the atmosphere during the Fukushima Daiichi Nuclear Power Station (FNPS1) accident and their atmospheric and marine dispersion are essential to evaluate the environmental impacts and resultant radiological doses to the public. In this paper, we estimate the detailed atmospheric releases during the accident using a reverse estimation method which calculates the release rates of radionuclides by comparing measurements of air concentration of a radionuclide or its dose rate in the environment with the ones calculated by atmospheric and oceanic transport, dispersion and deposition models. The atmospheric and oceanic models used are WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information) and SEA-GEARN-FDM (Finite difference oceanic dispersion model), both developed by the authors. A sophisticated deposition scheme, which deals with dry and fog-water depositions, cloud condensation nuclei (CCN) activation, and subsequent wet scavenging due to mixed-phase cloud microphysics (in-cloud scavenging) for radioactive iodine gas (I2 and CH3I) and other particles (CsI, Cs, and Te), was incorporated into WSPEEDI-II to improve the surface deposition calculations. The results revealed that the major releases of radionuclides due to the FNPS1 accident occurred in the following periods during March 2011: the afternoon of 12 March due to the wet venting and hydrogen explosion at Unit 1, midnight of 14 March when the SRV (safety relief valve) was opened three times at Unit 2, the morning and night of 15 March, and the morning of 16 March. According to the simulation results, the highest radioactive contamination areas around FNPS1 were created from 15 to 16 March by complicated interactions among rainfall, plume movements, and the temporal variation of release rates. The simulation by WSPEEDI-II using the new source term reproduced the local and regional patterns of cumulative
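
    In its simplest linear form, the reverse estimation idea described above reduces to choosing non-negative release rates so that dispersion-model predictions best match the measurements. The sketch below illustrates this with a non-negative least-squares fit; the transfer matrix stands in for pre-computed unit-release runs of the dispersion model, and all numbers are placeholders.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_obs, n_segments = 40, 8   # air-concentration/dose-rate measurements vs. release-time segments

# M[i, j]: modelled response at measurement i per unit release in time segment j
# (in practice this comes from runs of the atmospheric/oceanic dispersion model)
M = rng.uniform(0.0, 1.0, (n_obs, n_segments))

true_release = np.array([0.1, 5.0, 0.2, 8.0, 0.3, 0.2, 3.0, 0.1])   # illustrative release profile
measured = M @ true_release * rng.lognormal(0.0, 0.1, n_obs)         # noisy "observations"

# Non-negative least squares keeps the estimated release rates physical (>= 0)
estimated, residual = nnls(M, measured)
print("estimated release per segment:", np.round(estimated, 2))
print("residual norm:", round(residual, 3))
```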

  16. Ground Motion Prediction Models for Caucasus Region

    Science.gov (United States)

    Jorjiashvili, Nato; Godoladze, Tea; Tvaradze, Nino; Tumanova, Nino

    2016-04-01

    Ground motion prediction models (GMPMs) relate ground motion intensity measures to variables describing earthquake source, path, and site effects. Estimation of expected ground motion is fundamental to earthquake hazard assessment. The most commonly used parameter in attenuation relations is peak ground acceleration or spectral acceleration, because this parameter gives useful information for seismic hazard assessment. Development of the Georgian Digital Seismic Network started in 2003. In this study new GMP models are obtained based on new data from the Georgian seismic network and also from neighboring countries. The models are estimated in the classical statistical way, by regression analysis. In this study site ground conditions are additionally considered, because the same earthquake recorded at the same distance may cause different damage depending on ground conditions. Empirical ground-motion prediction models (GMPMs) require adjustment to make them appropriate for site-specific scenarios. However, the process of making such adjustments remains a challenge. This work presents a holistic framework for the development of a peak ground acceleration (PGA) or spectral acceleration (SA) GMPE that is easily adjustable to different seismological conditions and does not suffer from the practical problems associated with adjustments in the response spectral domain.
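
    A minimal GMPE of the common textbook form ln(PGA) = a + b·M + c·ln(R) + d·(site term) can be fitted by ordinary least squares, as sketched below on synthetic records; the functional form and the simple site classification are assumptions for illustration, not the study's final model.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 400
M = rng.uniform(4.0, 7.0, n)                 # magnitudes
R = rng.uniform(5.0, 200.0, n)               # distances, km
soft_site = rng.integers(0, 2, n)            # 0 = rock, 1 = soft soil (assumed site classification)

# Synthetic "observed" ln(PGA) generated from an assumed attenuation relation plus scatter
ln_pga = -3.5 + 1.1 * M - 1.6 * np.log(R) + 0.4 * soft_site + rng.normal(0, 0.5, n)

# Design matrix for ln(PGA) = a + b*M + c*ln(R) + d*site
X = np.column_stack([np.ones(n), M, np.log(R), soft_site])
coef, *_ = np.linalg.lstsq(X, ln_pga, rcond=None)
a, b, c, d = coef
print(f"a={a:.2f}, b={b:.2f}, c={c:.2f}, d={d:.2f}")

# Median prediction for a M 6.5 event at 30 km on soft soil
print("predicted PGA:", np.exp(a + b * 6.5 + c * np.log(30.0) + d * 1))
```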

  17. Gamma-Ray Pulsars Models and Predictions

    CERN Document Server

    Harding, A K

    2001-01-01

    Pulsed emission from gamma-ray pulsars originates inside the magnetosphere, from radiation by charged particles accelerated near the magnetic poles or in the outer gaps. In polar cap models, the high energy spectrum is cut off by magnetic pair production above an energy that is dependent on the local magnetic field strength. While most young pulsars with surface fields in the range B = 10^{12} - 10^{13} G are expected to have high energy cutoffs around several GeV, the gamma-ray spectra of old pulsars having lower surface fields may extend to 50 GeV. Although the gamma-ray emission of older pulsars is weaker, detecting pulsed emission at high energies from nearby sources would be an important confirmation of polar cap models. Outer gap models predict more gradual high-energy turnovers at around 10 GeV, but also predict an inverse Compton component extending to TeV energies. Detection of pulsed TeV emission, which would not survive attenuation at the polar caps, is thus an important test of outer gap models. N...

  18. Modeling and Prediction of Krueger Device Noise

    Science.gov (United States)

    Guo, Yueping; Burley, Casey L.; Thomas, Russell H.

    2016-01-01

    This paper presents the development of a noise prediction model for aircraft Krueger flap devices that are considered as alternatives to leading edge slotted slats. The prediction model decomposes the total Krueger noise into four components, generated by the unsteady flows, respectively, in the cove under the pressure side surface of the Krueger, in the gap between the Krueger trailing edge and the main wing, around the brackets supporting the Krueger device, and around the cavity on the lower side of the main wing. For each noise component, the modeling follows a physics-based approach that aims at capturing the dominant noise-generating features in the flow and developing correlations between the noise and the flow parameters that control the noise generation processes. The far field noise is modeled using each of the four noise components' respective spectral functions, far field directivities, Mach number dependencies, component amplitudes, and other parametric trends. Preliminary validations are carried out by using small scale experimental data, and two applications are discussed; one for conventional aircraft and the other for advanced configurations. The former focuses on the parametric trends of Krueger noise on design parameters, while the latter reveals its importance in relation to other airframe noise components.

  19. Artificial Neural Network Model for Predicting Compressive

    Directory of Open Access Journals (Sweden)

    Salim T. Yousif

    2013-05-01

    Full Text Available Compressive strength of concrete is a commonly used criterion in evaluating concrete. Although testing of the compressive strength of concrete specimens is done routinely, it is performed on the 28th day after concrete placement; therefore, estimating concrete strength at an earlier time is highly desirable. This study presents an effort to apply neural network-based system identification techniques to predict the compressive strength of concrete from the concrete mix proportions, the maximum aggregate size (MAS), and the slump of the fresh concrete. A back-propagation neural network model is successively developed, trained, and tested using actual data sets of concrete mix proportions gathered from the literature. Testing the model with unused data within the range of the input parameters shows that the maximum absolute error of the model is about 20% and that 88% of the output results have absolute errors of less than 10%. A parametric study shows that the water/cement ratio (w/c) is the most significant factor affecting the output of the model. The results show that neural networks have strong potential as a feasible tool for predicting the compressive strength of concrete.
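
    A minimal sketch of the kind of feed-forward model the record describes is given below, using scikit-learn's MLPRegressor in place of a hand-written back-propagation network. The feature names and the synthetic data are assumptions for illustration, not the study's data set.

```python
# Sketch: back-propagation network for 28-day compressive strength (assumed features)
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 300
# Assumed inputs: cement, water, total aggregate (kg/m3), max aggregate size (mm), slump (mm)
X = np.column_stack([
    rng.uniform(250, 500, n),    # cement
    rng.uniform(140, 220, n),    # water
    rng.uniform(1600, 2000, n),  # total aggregate
    rng.choice([10, 20, 40], n), # MAS
    rng.uniform(25, 200, n),     # slump
])
w_c = X[:, 1] / X[:, 0]
y = 80 - 60 * w_c + 0.02 * X[:, 0] + rng.normal(0, 3, n)  # synthetic strength, MPa

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X[:250], y[:250])
pred = model.predict(X[250:])
print("max absolute error on held-out data (MPa):", np.max(np.abs(pred - y[250:])))
```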

  20. A prediction model for Clostridium difficile recurrence

    Directory of Open Access Journals (Sweden)

    Francis D. LaBarbera

    2015-02-01

    Full Text Available Background: Clostridium difficile infection (CDI) is a growing problem in the community and hospital setting. Its incidence has been on the rise over the past two decades, and it is quickly becoming a major concern for the health care system. A high rate of recurrence is one of the major hurdles in the successful treatment of C. difficile infection. Few studies have examined patterns of recurrence. The studies currently available have identified a number of risk factors associated with C. difficile recurrence (CDR); however, there is little consensus on the impact of most of the identified risk factors. Methods: Our study was a retrospective chart review of 198 patients diagnosed with CDI via polymerase chain reaction (PCR) from February 2009 to June 2013. We used a machine learning algorithm, the Random Forest (RF), to analyze all of the factors proposed to be associated with CDR. This model is capable of making predictions based on a large number of variables and has outperformed numerous other models and statistical methods. Results: The resulting model was able to predict CDR with a sensitivity of 83.3%, a specificity of 63.1%, and an area under the curve of 82.6%. Like other similar studies that have used the RF model, we obtained strong results. Conclusions: We hope that in the future, machine learning algorithms such as the RF will see wider application.
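
    A minimal sketch of the modelling step, using scikit-learn's RandomForestClassifier on synthetic data; the predictor names and the data are assumptions for illustration only and do not reproduce the study's chart-review variables.

```python
# Sketch: Random Forest recurrence model with sensitivity, specificity, and AUC
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(2)
n = 198
# Assumed predictors: age, prior antibiotic exposure, proton pump inhibitor use, albumin
X = np.column_stack([rng.uniform(20, 90, n),
                     rng.integers(0, 2, n),
                     rng.integers(0, 2, n),
                     rng.uniform(2.0, 5.0, n)])
p = 1 / (1 + np.exp(-(0.04 * (X[:, 0] - 60) + 0.8 * X[:, 1] - 0.9 * (X[:, 3] - 3.5))))
y = rng.binomial(1, p)  # 1 = recurrence

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

prob = rf.predict_proba(X_te)[:, 1]
pred = (prob >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("AUC:", roc_auc_score(y_te, prob))
```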

  1. Optimal feedback scheduling of model predictive controllers

    Institute of Scientific and Technical Information of China (English)

    Pingfang ZHOU; Jianying XIE; Xiaolong DENG

    2006-01-01

    Model predictive control (MPC) cannot be reliably applied to real-time control systems because its computation time is not well defined. Implemented as an anytime algorithm, an MPC task allows computation time to be traded for control performance, thus making its timing predictable. Optimal feedback scheduling (FS-CBS) of a set of MPC tasks is presented to maximize the global control performance subject to limited processor time. Each MPC task is assigned a constant bandwidth server (CBS), whose reserved processor time is adjusted dynamically. The constraints in the FS-CBS guarantee schedulability of the total task set and stability of each component. The FS-CBS is shown to be robust against variation in the execution time of MPC tasks at runtime. Simulation results illustrate its effectiveness.

  2. An Anisotropic Hardening Model for Springback Prediction

    Science.gov (United States)

    Zeng, Danielle; Xia, Z. Cedric

    2005-08-01

    As more Advanced High-Strength Steels (AHSS) are heavily used for automotive body structures and closure panels, accurate springback prediction for these components becomes more challenging because of their rapid hardening characteristics and ability to sustain even higher stresses. In this paper, a modified Mroz hardening model is proposed to capture the realistic Bauschinger effect at reverse loading, such as when material passes through die radii or a drawbead during the sheet metal forming process. This model accounts for the material's anisotropic yield surface and nonlinear isotropic/kinematic hardening behavior. Material tension/compression test data are used to accurately represent the Bauschinger effect. The effectiveness of the model is demonstrated by comparison of numerical and experimental springback results for a DP600 straight U-channel test.

  3. Prediction models from CAD models of 3D objects

    Science.gov (United States)

    Camps, Octavia I.

    1992-11-01

    In this paper we present a probabilistic prediction based approach for CAD-based object recognition. Given a CAD model of an object, the PREMIO system combines techniques of analytic graphics and physical models of lights and sensors to predict how features of the object will appear in images. In nearly 4,000 experiments on analytically-generated and real images, we show that in a semi-controlled environment, predicting the detectability of features of the image can successfully guide a search procedure to make informed choices of model and image features in its search for correspondences that can be used to hypothesize the pose of the object. Furthermore, we provide a rigorous experimental protocol that can be used to determine the optimal number of correspondences to seek so that the probability of failing to find a pose and of finding an inaccurate pose are minimized.

  4. Modeling of leachable 137Cs in throughfall and stemflow for Japanese forest canopies after Fukushima Daiichi Nuclear Power Plant accident.

    Science.gov (United States)

    Loffredo, Nicolas; Onda, Yuichi; Kawamori, Ayumi; Kato, Hiroaki

    2014-09-15

    The Fukushima accident dispersed significant amounts of radioactive cesium (Cs) in the landscape. Our research investigated, from June 2011 to November 2013, the mobility of leachable Cs in forest canopies. In particular, (137)Cs and (134)Cs activity concentrations were measured in rainfall, throughfall, and stemflow in broad-leaf and cedar forests in an area located 40 km from the power plant. Leachable (137)Cs loss was modeled by a double exponential (DE) model. This model could not reproduce the variation in activity concentration observed. In order to refine the DE model, the main measurable physical parameters (rainfall intensity, wind velocity, and snowfall occurrence) were assessed, and rainfall was identified as the dominant factor controlling the observed variation. A corrective factor was then developed to incorporate rainfall intensity in an improved DE model. With the original DE model, we estimated the total (137)Cs loss by leaching from canopies to be 72 ± 4%, 67 ± 4%, and 48 ± 2% of the total plume deposition under mature cedar, young cedar, and broad-leaf forests, respectively. In contrast, with the improved DE model, the total (137)Cs loss by leaching was estimated to be 34 ± 2%, 34 ± 2%, and 16 ± 1% of the total plume deposition under mature cedar, young cedar, and broad-leaf forests, respectively. The improved DE model corresponds better to observed data in the literature. Understanding (137)Cs and (134)Cs forest dynamics is important for forecasting future contamination of forest soils around the FDNPP. It also provides a basis for understanding forest transfers in future potential nuclear disasters.
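
    The double exponential (DE) leaching model mentioned above can be sketched as follows. The exact parameterization of the rainfall corrective factor used by the authors is not given in the record, so the multiplicative form below, and the synthetic data, are assumptions for illustration only.

```python
# Sketch: double exponential (DE) leaching model with an assumed rainfall correction
import numpy as np
from scipy.optimize import curve_fit

def de_model(t, a1, k1, k2):
    """Leachable 137Cs flux as a sum of a fast and a slow exponential pool."""
    return a1 * np.exp(-k1 * t) + (1.0 - a1) * np.exp(-k2 * t)

def de_model_rain(t, rain, a1, k1, k2, b):
    """Assumed refinement: flux scaled by a rainfall-intensity factor (1 + b*rain)."""
    return de_model(t, a1, k1, k2) * (1.0 + b * rain)

# Synthetic observations (relative 137Cs concentration in throughfall)
rng = np.random.default_rng(3)
t = np.linspace(0, 900, 60)                 # days since deposition
rain = rng.gamma(2.0, 5.0, t.size)          # event rainfall, mm (synthetic)
obs = de_model_rain(t, rain, 0.6, 0.02, 0.001, 0.01) + rng.normal(0, 0.02, t.size)

popt, _ = curve_fit(lambda tt, a1, k1, k2, b: de_model_rain(tt, rain, a1, k1, k2, b),
                    t, obs, p0=[0.5, 0.01, 0.001, 0.0])
print("fitted a1, k1, k2, b:", popt)
```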

  5. Internal Flow Thermal/Fluid Modeling of STS-107 Port Wing in Support of the Columbia Accident Investigation Board

    Science.gov (United States)

    Sharp, John R.; Kittredge, Ken; Schunk, Richard G.

    2003-01-01

    As part of the aero-thermodynamics team supporting the Columbia Accident Investigation Board (CAB), the Marshall Space Flight Center was asked to perform engineering analyses of internal flows in the port wing. The aero-thermodynamics team was split into internal flow and external flow teams, with the support divided between shorter-timeframe engineering methods and more complex computational fluid dynamics. In order to gain a rough order-of-magnitude knowledge of the internal flow in the port wing for various breach locations and sizes (as theorized by the CAB to have caused the Columbia re-entry failure), a bulk venting model was required to input boundary flow rates and pressures to the computational fluid dynamics (CFD) analyses. This paper summarizes the modeling that was done by MSFC in Thermal Desktop. A venting model of the entire Orbiter was constructed in FloCAD based on Rockwell International's flight substantiation analyses and the STS-107 re-entry trajectory. Chemical equilibrium air thermodynamic properties were generated for SINDA/FLUINT's fluid property routines from a code provided by Langley Research Center. In parallel, a simplified thermal mathematical model of the port wing, including the Thermal Protection System (TPS), was based on more detailed Shuttle re-entry modeling previously done by the Dryden Flight Research Center. Once the venting model was coupled with the thermal model of the wing structure with chemical equilibrium air properties, various breach scenarios were assessed in support of the aero-thermodynamics team. The construction of the coupled model and results are presented herein.

  6. Predictive modelling of ferroelectric tunnel junctions

    Science.gov (United States)

    Velev, Julian P.; Burton, John D.; Zhuravlev, Mikhail Ye; Tsymbal, Evgeny Y.

    2016-05-01

    Ferroelectric tunnel junctions combine the phenomena of quantum-mechanical tunnelling and switchable spontaneous polarisation of a nanometre-thick ferroelectric film into novel device functionality. Switching the ferroelectric barrier polarisation direction produces a sizable change in resistance of the junction—a phenomenon known as the tunnelling electroresistance effect. From a fundamental perspective, ferroelectric tunnel junctions and their version with ferromagnetic electrodes, i.e., multiferroic tunnel junctions, are testbeds for studying the underlying mechanisms of tunnelling electroresistance as well as the interplay between electric and magnetic degrees of freedom and their effect on transport. From a practical perspective, ferroelectric tunnel junctions hold promise for disruptive device applications. In a very short time, they have traversed the path from basic model predictions to prototypes for novel non-volatile ferroelectric random access memories with non-destructive readout. This remarkable progress is to a large extent driven by a productive cycle of predictive modelling and innovative experimental effort. In this review article, we outline the development of the ferroelectric tunnel junction concept and the role of theoretical modelling in guiding experimental work. We discuss a wide range of physical phenomena that control the functional properties of ferroelectric tunnel junctions and summarise the state-of-the-art achievements in the field.

  7. Simple predictions from multifield inflationary models.

    Science.gov (United States)

    Easther, Richard; Frazer, Jonathan; Peiris, Hiranya V; Price, Layne C

    2014-04-25

    We explore whether multifield inflationary models make unambiguous predictions for fundamental cosmological observables. Focusing on N-quadratic inflation, we numerically evaluate the full perturbation equations for models with 2, 3, and O(100) fields, using several distinct methods for specifying the initial values of the background fields. All scenarios are highly predictive, with the probability distribution functions of the cosmological observables becoming more sharply peaked as N increases. For N=100 fields, 95% of our Monte Carlo samples fall in the ranges n_s ∈ (0.9455, 0.9534), α ∈ (-9.741, -7.047)×10^{-4}, r ∈ (0.1445, 0.1449), and r_iso ∈ (0.02137, 3.510)×10^{-3} for the spectral index, running, tensor-to-scalar ratio, and isocurvature-to-adiabatic ratio, respectively. The expected amplitude of isocurvature perturbations grows with N, raising the possibility that many-field models may be sensitive to postinflationary physics and suggesting new avenues for testing these scenarios.

  8. Modelling the global atmospheric transport and deposition of radionuclides from the Fukushima Dai-ichi nuclear accident

    Science.gov (United States)

    Christoudias, T.; Lelieveld, J.

    2013-02-01

    We modeled the global atmospheric dispersion and deposition of radionuclides released from the Fukushima Dai-ichi nuclear power plant accident. The EMAC atmospheric chemistry - general circulation model was used, with circulation dynamics nudged towards ERA-Interim reanalysis data. We applied a resolution of approximately 0.5 degrees in latitude and longitude (T255). The model accounts for emissions and transport of the radioactive isotopes 131I and 137Cs, and removal processes through precipitation, particle sedimentation and dry deposition. In addition, we simulated the release of 133Xe, a noble gas that can be regarded as a passive transport tracer of contaminated air. The source terms are based on Chino et al. (2011) and Stohl et al. (2012); especially the emission estimates of 131I are associated with a high degree of uncertainty. The calculated concentrations have been compared to station observations by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO). We calculated that about 80% of the radioactivity from Fukushima which was released to the atmosphere deposited into the Pacific Ocean. In Japan a large inhabited land area was contaminated by more than 40 kBq m^{-2}. We also estimated the inhalation and 50-year dose by 137Cs, 134Cs and 131I to which the people in Japan are exposed.

  9. Predictions of models for environmental radiological assessment

    Energy Technology Data Exchange (ETDEWEB)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa, E-mail: suelip@ird.gov.br, E-mail: dejanira@irg.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Servico de Avaliacao de Impacto Ambiental, Rio de Janeiro, RJ (Brazil); Mahler, Claudio Fernando [Coppe. Instituto Alberto Luiz Coimbra de Pos-Graduacao e Pesquisa de Engenharia, Universidade Federal do Rio de Janeiro (UFRJ) - Programa de Engenharia Civil, RJ (Brazil)

    2011-07-01

    In the field of environmental impact assessment, models are used for estimating the source term, the environmental dispersion and transfer of radionuclides, exposure pathways, radiation doses, and the risk to human beings. Although it is recognized that specific local data are important for improving the quality of dose assessment results, obtaining such data can in practice be very difficult and expensive. Sources of uncertainty are numerous; among them are the subjectivity of modelers, the exposure scenarios and pathways, the codes used, and general parameters. The various models available use different mathematical approaches of different complexities, which can result in different predictions. Thus, for the same inputs, different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in assessing environmental radiological impact. A model intercomparison exercise produced incompatible results for {sup 137}Cs and {sup 60}Co, reinforcing the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be confronted on a common comparison basis. The results of the intercomparison exercise are presented briefly. (author)

  10. Explicit model predictive control accuracy analysis

    OpenAIRE

    Knyazev, Andrew; Zhu, Peizhen; Di Cairano, Stefano

    2015-01-01

    Model Predictive Control (MPC) can efficiently control constrained systems in real-time applications. MPC feedback law for a linear system with linear inequality constraints can be explicitly computed off-line, which results in an off-line partition of the state space into non-overlapped convex regions, with affine control laws associated to each region of the partition. An actual implementation of this explicit MPC in low cost micro-controllers requires the data to be "quantized", i.e. repre...

  11. Hierarchical Model Predictive Control for Resource Distribution

    DEFF Research Database (Denmark)

    Bendtsen, Jan Dimon; Trangbæk, K; Stoustrup, Jakob

    2010-01-01

    This paper deals with hierarchical model predictive control (MPC) of distributed systems. A three-level hierarchical approach is proposed, consisting of a high-level MPC controller, a second level of so-called aggregators, controlled by an online MPC-like algorithm, and a lower level of autonomous...... facilitates plug-and-play addition of subsystems without redesign of any controllers. The method is supported by a number of simulations featuring a three-level smart-grid power control system for a small isolated power grid....

  12. A Modified Model Predictive Control Scheme

    Institute of Scientific and Technical Information of China (English)

    Xiao-Bing Hu; Wen-Hua Chen

    2005-01-01

    In implementations of MPC (Model Predictive Control) schemes, two issues need to be addressed. One is how to enlarge the stability region as much as possible. The other is how to guarantee stability when a computational time limitation exists. In this paper, a modified MPC scheme for constrained linear systems is described. An offline LMI-based iteration process is introduced to expand the stability region. At the same time, a database of feasible control sequences is generated offline so that stability can still be guaranteed in the case of computational time limitations. Simulation results illustrate the effectiveness of this new approach.
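
    For readers unfamiliar with the basic receding-horizon loop that this scheme modifies, a minimal sketch is given below. It is a generic constrained MPC for a toy double-integrator system; it does not implement the paper's offline LMI-based iteration or the database of feasible control sequences, and all system matrices, weights, and limits are assumptions for illustration.

```python
# Sketch: generic receding-horizon MPC for a constrained double integrator
import numpy as np
from scipy.optimize import minimize

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete double integrator, dt = 0.1 s
B = np.array([[0.005], [0.1]])
Q = np.diag([10.0, 1.0])                  # state weight
r = 0.1                                   # input weight
N = 15                                    # prediction horizon
u_max = 1.0                               # input constraint |u| <= u_max

def cost(u_seq, x0):
    """Quadratic cost of a candidate input sequence over the prediction horizon."""
    x = x0.copy()
    J = 0.0
    for u in u_seq:
        J += x @ Q @ x + r * u**2
        x = A @ x + B[:, 0] * u
    return J + x @ Q @ x                  # terminal cost

x = np.array([2.0, 0.0])
for k in range(50):                       # closed-loop simulation
    res = minimize(cost, np.zeros(N), args=(x,),
                   bounds=[(-u_max, u_max)] * N, method="L-BFGS-B")
    u0 = res.x[0]                         # apply only the first move (receding horizon)
    x = A @ x + B[:, 0] * u0
print("final state:", x)
```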

  13. Predicting Protein Secondary Structure with Markov Models

    DEFF Research Database (Denmark)

    Fischer, Paul; Larsen, Simon; Thomsen, Claus

    2004-01-01

    The primary structure of a protein is the sequence of its amino acids. The secondary structure describes structural properties of the molecule, such as which parts of it form sheets, helices or coils. Spatial and other properties are described by the higher order structures. The classification task we are considering here is to predict the secondary structure from the primary one. To this end we train a Markov model on training data and then use it to classify parts of unknown protein sequences as sheets, helices or coils. We show how to exploit the directional information contained......
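
    A minimal sketch of the idea: one first-order Markov model of residue transitions per structural class, with an unknown fragment assigned to the class whose model gives the highest likelihood. The training fragments, the smoothing, and the windowing are assumptions for illustration and do not reproduce the paper's directional model.

```python
# Sketch: per-class first-order Markov models for secondary-structure labelling
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def train_markov(sequences):
    """Estimate P(next residue | current residue) with add-one (Laplace) smoothing."""
    idx = {a: i for i, a in enumerate(AMINO)}
    counts = np.ones((len(AMINO), len(AMINO)))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[idx[a], idx[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def log_likelihood(seq, trans):
    idx = {a: i for i, a in enumerate(AMINO)}
    return sum(np.log(trans[idx[a], idx[b]]) for a, b in zip(seq, seq[1:]))

# Toy labelled fragments (synthetic stand-ins for real training data)
train = {"helix": ["AEELLKKLEEA", "LKKLLEEAKKL"],
         "sheet": ["VTVTVIVSVY", "IYVVTSVTIV"],
         "coil":  ["GPSGNPGDSG", "SGGPNGSPGG"]}
models = {c: train_markov(seqs) for c, seqs in train.items()}

fragment = "LKELEEKLKA"
scores = {c: log_likelihood(fragment, m) for c, m in models.items()}
print("predicted class:", max(scores, key=scores.get))
```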

  14. A Simplified Approach to Estimate the Urban Expressway Capacity after Traffic Accidents Using a Micro-Simulation Model

    Directory of Open Access Journals (Sweden)

    Hong Chen

    2013-01-01

    Full Text Available Based on a decomposition of the evolution of urban expressway capacity after traffic accidents and an analysis of the influencing factors, an approach for estimating that capacity is proposed. Firstly, the approach introduces the decision tree ID algorithm, derives the accident delay time for different accident types from the information gain value, and determines the congestion dissipation time using traffic flow wave theory. Secondly, taking the accident delay time as the observation cycle, the maximum number of vehicles passing the accident section per unit time is taken as its capacity. Finally, the capacity attenuation for the different accident types is simulated with the VISSIM software. The simulation results suggest that the capacity attenuation for a broken-down (anchored) vehicle is the smallest, at 30.074%; vehicle fire, rear-end collision, and roll-over follow, at 38.389%, 40.204%, and 43.130%, respectively; and the capacity attenuation for a vehicle collision is the largest, at 50.037%. Further analysis shows that the accident delay time is proportional to the congestion dissipation time, the time difference, and the ratio between them, but inversely related to the residual capacity of the urban expressway.
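
    The congestion-dissipation step based on traffic flow wave (shockwave) theory can be illustrated with a minimal calculation; the flow and density values below are assumptions for illustration, not values from the study.

```python
# Sketch: congestion dissipation time from traffic flow wave (shockwave) theory.
# Wave speed between two traffic states: w = (q2 - q1) / (k2 - k1).
def wave_speed(q1, k1, q2, k2):
    """Shockwave speed (km/h) between states given flow (veh/h) and density (veh/km)."""
    return (q2 - q1) / (k2 - k1)

# Assumed states during a 0.5 h lane-blocking incident on one expressway direction
q_arrival, k_arrival = 3600.0, 45.0     # demand arriving upstream
q_blocked, k_blocked = 1800.0, 120.0    # reduced capacity past the incident (queued state)
q_release, k_release = 4200.0, 60.0     # discharge flow after the incident is cleared

w_queue   = wave_speed(q_arrival, k_arrival, q_blocked, k_blocked)   # queue growth wave
w_release = wave_speed(q_blocked, k_blocked, q_release, k_release)   # recovery wave
t_incident = 0.5                                                     # h, accident delay time

# Queue length at clearance, then time for the faster recovery wave to catch the
# still-growing queue tail (simple deterministic approximation)
queue_length = abs(w_queue) * t_incident
dissipation_time = queue_length / (abs(w_release) - abs(w_queue))
print(f"queue growth wave: {w_queue:.1f} km/h, recovery wave: {w_release:.1f} km/h")
print(f"congestion dissipation time after clearance: {dissipation_time:.2f} h")
```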

  15. Critical conceptualism in environmental modeling and prediction.

    Science.gov (United States)

    Christakos, G

    2003-10-15

    Many important problems in environmental science and engineering are of a conceptual nature. Research and development, however, often becomes so preoccupied with technical issues, which are themselves fascinating, that it neglects essential methodological elements of conceptual reasoning and theoretical inquiry. This work suggests that valuable insight into environmental modeling can be gained by means of critical conceptualism which focuses on the software of human reason and, in practical terms, leads to a powerful methodological framework of space-time modeling and prediction. A knowledge synthesis system develops the rational means for the epistemic integration of various physical knowledge bases relevant to the natural system of interest in order to obtain a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, generate meaningful predictions of environmental processes in space-time, and produce science-based decisions. No restriction is imposed on the shape of the distribution model or the form of the predictor (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated). The scientific reasoning structure underlying knowledge synthesis involves teleologic criteria and stochastic logic principles which have important advantages over the reasoning method of conventional space-time techniques. Insight is gained in terms of real world applications, including the following: the study of global ozone patterns in the atmosphere using data sets generated by instruments on board the Nimbus 7 satellite and secondary information in terms of total ozone-tropopause pressure models; the mapping of arsenic concentrations in the Bangladesh drinking water by assimilating hard and soft data from an extensive network of monitoring wells; and the dynamic imaging of probability distributions of pollutants across the Kalamazoo river.

  16. Continuous-Discrete Time Prediction-Error Identification Relevant for Linear Model Predictive Control

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    A prediction-error method tailored for model-based predictive control is presented. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model. The linear discrete-time stochastic state space model is realized from a continuous-discrete-time linear stochastic system specified using transfer functions with time-delays. It is argued that the prediction-error criterion should be selected such that it is compatible with the objective function of the predictive controller in which the model......

  17. A Predictive Maintenance Model for Railway Tracks

    DEFF Research Database (Denmark)

    Li, Rui; Wen, Min; Salling, Kim Bang

    2015-01-01

    For modern railways, maintenance is critical for ensuring safety, train punctuality and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 – 100,000 Euro per km per year [1]. Aiming to reduce such maintenance expenditure, this paper...... presents a mathematical model based on Mixed Integer Programming (MIP) which is designed to optimize the predictive railway tamping activities for ballasted track over a time horizon of up to four years. The objective function is set up to minimize the actual costs for the tamping machine (measured by time...... recovery of the track quality after the tamping operation and (5) tamping machine operation factors. A 57.2 km Danish railway track between Odense and Fredericia is used in the proposed maintenance model for a time period of two to four years. The total cost can be reduced by up to 50...

  18. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
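
    A minimal sketch of the Bayesian-network idea behind this kind of tool is given below: plant observations update the probabilities of a small set of accident states, each mapped to a pre-calculated source term. The states, observations, and probabilities are invented for illustration and are not taken from RASTEP.

```python
# Sketch: tiny Bayesian update over accident states, in the spirit of a BBN-based
# source term predictor (states, observations and probabilities are illustrative).
import numpy as np

states = ["intact containment", "early containment failure", "containment bypass"]
prior  = np.array([0.90, 0.07, 0.03])          # prior over accident progression states

# P(observation | state) for two plant observations (assumed likelihoods)
p_high_drywell_pressure = np.array([0.10, 0.80, 0.30])
p_low_rcs_pressure      = np.array([0.20, 0.60, 0.90])

def update(prior, likelihood):
    """Bayes rule: posterior proportional to prior times likelihood."""
    post = prior * likelihood
    return post / post.sum()

posterior = update(prior, p_high_drywell_pressure)   # observe high drywell pressure
posterior = update(posterior, p_low_rcs_pressure)    # then observe low RCS pressure

# Each state would map to a pre-calculated source term; report the weighted picture
for s, p in zip(states, posterior):
    print(f"P({s}) = {p:.3f}")
```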

  19. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  20. Predictive Model of Radiative Neutrino Masses

    CERN Document Server

    Babu, K S

    2013-01-01

    We present a simple and predictive model of radiative neutrino masses. It is a special case of the Zee model which introduces two Higgs doublets and a charged singlet. We impose a family-dependent Z_4 symmetry acting on the leptons, which reduces the number of parameters describing neutrino oscillations to four. A variety of predictions follow: The hierarchy of neutrino masses must be inverted; the lightest neutrino mass is extremely small and calculable; one of the neutrino mixing angles is determined in terms of the other two; the phase parameters take CP-conserving values with δ_CP = π; and the effective mass in neutrinoless double beta decay lies in a narrow range, m_ββ = (17.6 - 18.5) meV. The ratio of vacuum expectation values of the two Higgs doublets, tan β, is determined to be either 1.9 or 0.19 from neutrino oscillation data. Flavor-conserving and flavor-changing couplings of the Higgs doublets are also determined from neutrino data. The non-standard neutral Higgs bosons, if t...

  1. A predictive model for dimensional errors in fused deposition modeling

    DEFF Research Database (Denmark)

    Stolfi, A.

    2015-01-01

    This work concerns the effect of deposition angle (a) and layer thickness (L) on the dimensional performance of FDM parts using a predictive model based on the geometrical description of the FDM filament profile. An experimental validation over the whole a range from 0° to 177° at 3° steps and two...

  2. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference making as few assumptions about restrictive parametric models as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to the following two aspects. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models by one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  3. Detailed source term estimation of atmospheric release during the Fukushima Dai-ichi nuclear power plant accident by coupling atmospheric and oceanic dispersion models

    Science.gov (United States)

    Katata, Genki; Chino, Masamichi; Terada, Hiroaki; Kobayashi, Takuya; Ota, Masakazu; Nagai, Haruyasu; Kajino, Mizuo

    2014-05-01

    Temporal variations of release amounts of radionuclides during the Fukushima Dai-ichi Nuclear Power Plant (FNPP1) accident and their dispersion process are essential to evaluate the environmental impacts and resultant radiological doses to the public. Here, we estimated a detailed time trend of atmospheric releases during the accident by combining environmental monitoring data and coupling atmospheric and oceanic dispersion simulations by WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information) and SEA-GEARN developed by the authors. New schemes for wet, dry, and fog depositions of radioactive iodine gas (I2 and CH3I) and other particles (I-131, Te-132, Cs-137, and Cs-134) were incorporated into WSPEEDI-II. The deposition calculated by WSPEEDI-II was used as input data for the ocean dispersion calculations by SEA-GEARN. The reverse estimation method, based on simulations by both models assuming a unit release rate (1 Bq h^{-1}), was adopted to estimate the source term at the FNPP1 using air dose rates and air and sea surface concentrations. The results suggested that the major release of radionuclides from the FNPP1 occurred in the following periods during March 2011: afternoon on the 12th when the venting and hydrogen explosion occurred at Unit 1, morning on the 13th after the venting event at Unit 3, midnight on the 14th when several openings of SRV (steam relief valve) were conducted at Unit 2, morning and night on the 15th, and morning on the 16th. The modified WSPEEDI-II using the newly estimated source term well reproduced local and regional patterns of air dose rate and surface deposition of I-131 and Cs-137 obtained by airborne observations. Our dispersion simulations also revealed that the highest radioactive contamination areas around FNPP1 were created from 15th to 16th March by complicated interactions among rainfall (wet deposition), plume movements, and phase properties (gas or particle) of I-131 and release rates

  4. Severe accident analysis using dynamic accident progression event trees

    Science.gov (United States)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) is performed in a manner that is computationally time-consuming, difficult to reproduce, and potentially phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APETs developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a

  5. Two criteria for evaluating risk prediction models.

    Science.gov (United States)

    Pfeiffer, R M; Gail, M H

    2011-09-01

    We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed, PCF(q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow up, PNF(p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF(q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF(p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of those two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data.
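
    A minimal sketch of the two criteria, computed empirically from a vector of predicted risks and observed outcomes; the risk scores below are synthetic and the simple plug-in estimators shown here omit the influence-function-based inference described in the record.

```python
# Sketch: empirical PCF(q) and PNF(p) from predicted risks and observed case status
import numpy as np

def pcf(risk, case, q):
    """Proportion of cases found among the fraction q of the population at highest risk."""
    order = np.argsort(-risk)                  # sort by descending predicted risk
    n_top = int(np.ceil(q * len(risk)))
    return case[order][:n_top].sum() / case.sum()

def pnf(risk, case, p):
    """Smallest population fraction (highest risk first) needed to cover a fraction p of cases."""
    order = np.argsort(-risk)
    cum_cases = np.cumsum(case[order]) / case.sum()
    n_needed = np.searchsorted(cum_cases, p) + 1
    return n_needed / len(risk)

rng = np.random.default_rng(4)
n = 10000
risk = rng.beta(1, 20, n)                      # synthetic predicted risks
case = rng.binomial(1, risk)                   # outcomes consistent with those risks

print("PCF(0.10) =", round(pcf(risk, case, 0.10), 3))   # cases captured in top 10%
print("PNF(0.80) =", round(pnf(risk, case, 0.80), 3))   # population needed for 80% of cases
```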

  6. Methods for Handling Missing Variables in Risk Prediction Models

    NARCIS (Netherlands)

    Held, Ulrike; Kessels, Alfons; Aymerich, Judith Garcia; Basagana, Xavier; ter Riet, Gerben; Moons, Karel G. M.; Puhan, Milo A.

    2016-01-01

    Prediction models should be externally validated before being used in clinical practice. Many published prediction models have never been validated. Uncollected predictor variables in otherwise suitable validation cohorts are the main factor precluding external validation.We used individual patient

  7. Test Data for USEPR Severe Accident Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Rempe

    2007-05-01

    This document identifies data that can be used for assessing various models embodied in severe accident analysis codes. Phenomena considered in this document, limited to those anticipated to be of interest in assessing severe accidents in the USEPR developed by AREVA, include:
    • Fuel Heatup and Melt Progression
    • Reactor Coolant System (RCS) Thermal Hydraulics
    • In-Vessel Molten Pool Formation and Heat Transfer
    • Fuel/Coolant Interactions during Relocation
    • Debris Heat Loads to the Vessel
    • Vessel Failure
    • Molten Core Concrete Interaction (MCCI) and Reactor Cavity Plug Failure
    • Melt Spreading and Coolability
    • Hydrogen Control
    Each section of this report discusses one phenomenon of interest to the USEPR. Within each section, an effort is made to describe the phenomenon and identify what data are available for modeling it. As noted in this document, the models in US accident analysis codes (MAAP, MELCOR, and SCDAP/RELAP5) differ. Where possible, this report identifies previous assessments that illustrate the impact of modeling differences on predicting various phenomena. Finally, recommendations regarding the status of data available for modeling USEPR severe accident phenomena are summarized.

  8. Radiation protection issues on preparedness and response for a severe nuclear accident: experiences of the Fukushima accident.

    Science.gov (United States)

    Homma, T; Takahara, S; Kimura, M; Kinase, S

    2015-06-01

    Radiation protection issues on preparedness and response for a severe nuclear accident are discussed in this paper based on the experiences following the accident at the Fukushima Daiichi nuclear power plant. The criteria for use in nuclear emergencies in the Japanese emergency preparedness guide were based on the recommendations of International Commission on Radiological Protection (ICRP) Publications 60 and 63. Although the decision-making process for implementing protective actions relied heavily on computer-based predictive models prior to the accident, urgent protective actions, such as evacuation and sheltering, were implemented effectively based on the plant conditions. As there were no recommendations and criteria for long-term protective actions in the emergency preparedness guide, the recommendations of ICRP Publications 103, 109, and 111 were taken into consideration in determining the temporary relocation of inhabitants of heavily contaminated areas. These recommendations were very useful in deciding the emergency protective actions to take in the early stages of the Fukushima accident. However, some suggestions have been made for improving emergency preparedness and response in the early stages of a severe nuclear accident.

  9. Estimating the magnitude of prediction uncertainties for the APLE model

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  10. Prediction of Catastrophes: an experimental model

    CERN Document Server

    Peters, Randall D; Pomeau, Yves

    2012-01-01

    Catastrophes of all kinds can be roughly defined as short duration-large amplitude events following and followed by long periods of "ripening". Major earthquakes surely belong to the class of 'catastrophic' events. Because of the space-time scales involved, an experimental approach is often difficult, not to say impossible, however desirable it could be. Described in this article is a "laboratory" setup that yields data of a type that is amenable to theoretical methods of prediction. Observations are made of a critical slowing down in the noisy signal of a solder wire creeping under constant stress. This effect is shown to be a fair signal of the forthcoming catastrophe in both of two dynamical models. The first is an "abstract" model in which a time dependent quantity drifts slowly but makes quick jumps from time to time. The second is a realistic physical model for the collective motion of dislocations (the Ananthakrishna set of equations for creep). Hope thus exists that similar changes in the response to ...

  11. Predictive modeling of low solubility semiconductor alloys

    Science.gov (United States)

    Rodriguez, Garrett V.; Millunchick, Joanna M.

    2016-09-01

    GaAsBi is of great interest for applications in high efficiency optoelectronic devices due to its highly tunable bandgap. However, the experimental growth of high Bi content films has proven difficult. Here, we model GaAsBi film growth using a kinetic Monte Carlo simulation that explicitly takes cation and anion reactions into account. The unique behavior of Bi droplets is explored, and a sharp decrease in Bi content upon Bi droplet formation is demonstrated. The high mobility of simulated Bi droplets on GaAsBi surfaces is shown to produce phase separated Ga-Bi droplets as well as depressions on the film surface. A phase diagram for a range of growth rates that predicts both Bi content and droplet formation is presented to guide the experimental growth of high Bi content GaAsBi films.

  12. Distributed model predictive control made easy

    CERN Document Server

    Negenborn, Rudy

    2014-01-01

    The rapid evolution of computer science, communication, and information technology has enabled the application of control techniques to systems beyond the possibilities of control theory just a decade ago. Critical infrastructures such as electricity, water, traffic and intermodal transport networks are now in the scope of control engineers. The sheer size of such large-scale systems requires the adoption of advanced distributed control approaches. Distributed model predictive control (MPC) is one of the promising control methodologies for control of such systems.   This book provides a state-of-the-art overview of distributed MPC approaches, while at the same time making clear directions of research that deserve more attention. The core and rationale of 35 approaches are carefully explained. Moreover, detailed step-by-step algorithmic descriptions of each approach are provided. These features make the book a comprehensive guide both for those seeking an introduction to distributed MPC as well as for those ...

  13. Leptogenesis in minimal predictive seesaw models

    Science.gov (United States)

    Björkeroth, Fredrik; de Anda, Francisco J.; de Medeiros Varzielas, Ivo; King, Stephen F.

    2015-10-01

    We estimate the Baryon Asymmetry of the Universe (BAU) arising from leptogenesis within a class of minimal predictive seesaw models involving two right-handed neutrinos and simple Yukawa structures with one texture zero. The two right-handed neutrinos are dominantly responsible for the "atmospheric" and "solar" neutrino masses with Yukawa couplings to (ν_e, ν_μ, ν_τ) proportional to (0, 1, 1) and (1, n, n - 2), respectively, where n is a positive integer. The neutrino Yukawa matrix is therefore characterised by two proportionality constants with their relative phase providing a leptogenesis-PMNS link, enabling the lightest right-handed neutrino mass to be determined from neutrino data and the observed BAU. We discuss an SU(5) SUSY GUT example, where A_4 vacuum alignment provides the required Yukawa structures with n = 3, while a Z_9 symmetry fixes the relative phase to be a ninth root of unity.

  14. A Review of Accident Modelling Approaches for Complex Critical Sociotechnical Systems

    Science.gov (United States)

    2008-01-01

    integrated model has been demonstrated by analysing the failure in the Therac-25 sociotechnical system. Therac-25 was an X-ray treatment machine...Baxter (2003) developed a three-layer model for the Therac-25 system: the regulation authorities, the company who developed the system, and the...
    [Figure 7: Integrating... — fault/error/failure chains across the programmer, company, and regulation-authority layers of the Therac-25 failure; graphic not reproduced]

  15. Accident knowledge and emergency management

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, B.; Groenberg, C.D.

    1997-03-01

    The report contains an overall frame for the transformation of knowledge and experience from risk analysis to emergency education. An accident model has been developed to describe the emergency situation. A key concept of this model is uncontrolled flow of energy (UFOE); its essential elements are the state, location and movement of the energy (and mass). A UFOE can be considered as the driving force of an accident, e.g., an explosion, a fire, a release of heavy gases. As long as the energy is confined, i.e. the location and movement of the energy are under control, the situation is safe, but loss of confinement will create a hazardous situation that may develop into an accident. A domain model has been developed for representing accident and emergency scenarios occurring in society. The domain model uses three main categories: status, context and objectives. A domain is a group of activities with allied goals and elements, and ten specific domains have been investigated: process plant, storage, nuclear power plant, energy distribution, marine transport of goods, marine transport of people, aviation, transport by road, transport by rail and natural disasters. In total, 25 accident cases were consulted and information was extracted to fill in the schematic representations, with two to four cases per specific domain. (au) 41 tabs., 8 ills.; 79 refs.

  16. Emulation and Sobol' sensitivity analysis of an atmospheric dispersion model applied to the Fukushima nuclear accident

    Science.gov (United States)

    Girard, Sylvain; Mallet, Vivien; Korsakissok, Irène; Mathieu, Anne

    2016-04-01

    Simulations of the atmospheric dispersion of radionuclides involve large uncertainties originating from the limited knowledge of meteorological input data, composition, amount and timing of emissions, and some model parameters. The estimation of these uncertainties is an essential complement to modeling for decision making in case of an accidental release. We have studied the relative influence of a set of uncertain inputs on several outputs from the Eulerian model Polyphemus/Polair3D on the Fukushima case. We chose to use the variance-based sensitivity analysis method of Sobol'. This method requires a large number of model evaluations which was not achievable directly due to the high computational cost of Polyphemus/Polair3D. To circumvent this issue, we built a mathematical approximation of the model using Gaussian process emulation. We observed that aggregated outputs are mainly driven by the amount of emitted radionuclides, while local outputs are mostly sensitive to wind perturbations. The release height is notably influential, but only in the vicinity of the source. Finally, averaging either spatially or temporally tends to cancel out interactions between uncertain inputs.
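
    A minimal sketch of the emulation-plus-Sobol' workflow on a cheap stand-in function: a Gaussian process is trained on a few "expensive" runs, and first-order Sobol' indices are then estimated cheaply on the emulator with a pick-freeze estimator. The toy model, input names, and sample sizes are assumptions and do not correspond to Polyphemus/Polair3D.

```python
# Sketch: Gaussian-process emulation of an expensive model, then first-order Sobol'
# indices via the Saltelli pick-freeze estimator (toy stand-in model only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    """Toy stand-in for a dispersion-model output (e.g. total deposition)."""
    emission, wind_pert, release_height = x.T
    return emission * np.exp(-0.5 * wind_pert**2) + 0.1 * release_height

rng = np.random.default_rng(5)
d = 3

# 1) Train the emulator on a small design (the only "expensive" runs)
X_train = rng.uniform(0.0, 1.0, (60, d))
y_train = expensive_model(X_train)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(X_train, y_train)

# 2) Cheap Monte Carlo on the emulator: pick-freeze estimate of first-order indices
N = 5000
A = rng.uniform(0.0, 1.0, (N, d))
B = rng.uniform(0.0, 1.0, (N, d))
fA, fB = gp.predict(A), gp.predict(B)
var = np.var(np.concatenate([fA, fB]))
for i, name in enumerate(["emission", "wind_pert", "release_height"]):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # replace one column from the second sample
    S1 = np.mean(fB * (gp.predict(ABi) - fA)) / var
    print(f"first-order Sobol' index for {name}: {S1:.2f}")
```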

  17. A RISK MEASUREMENT AND MANAGEMENT MODEL FOR PREVENTING UNMANNED AIR VEHICLE ACCIDENTS

    Directory of Open Access Journals (Sweden)

    Hüdayim BAŞAK

    2008-01-01

    Full Text Available This study aims to identify operationally risky areas by analyzing hazards that can arise during the maintenance and flight activities of Unmanned Air Vehicles (UAVs). For this purpose, a risk analysis methodology is introduced and, within the framework of the application, a sample risk management model is developed. In developing the model, personal experience in the area of UAVs was drawn upon, and a five-step risk management technique used in flight safety by pioneering international aviation companies was utilized.

  18. A Predictive Model of Geosynchronous Magnetopause Crossings

    CERN Document Server

    Dmitriev, A; Chao, J -K

    2013-01-01

    We have developed a model predicting whether or not the magnetopause crosses geosynchronous orbit at a given location for a given solar wind pressure Psw, Bz component of the interplanetary magnetic field (IMF), and geomagnetic conditions characterized by the 1-min SYM-H index. The model is based on more than 300 geosynchronous magnetopause crossings (GMCs) and about 6000 minutes when geosynchronous satellites of the GOES and LANL series were located in the magnetosheath (so-called MSh intervals) from 1994 to 2001. Minimizing the Psw required for GMCs and MSh intervals at various locations, Bz, and SYM-H allows us to describe both the effect of magnetopause dawn-dusk asymmetry and the saturation of the Bz influence for very large southward IMF. The asymmetry is strong for large negative Bz and almost disappears when Bz is positive. We found that the larger the amplitude of negative SYM-H, the lower the solar wind pressure required for GMCs. We attribute this effect to a depletion of the dayside magnetic field by a storm-time intensification of t...

  19. Remaining Useful Lifetime (RUL) - Probabilistic Predictive Model

    Directory of Open Access Journals (Sweden)

    Ephraim Suhir

    2011-01-01

    Full Text Available Reliability evaluations and assurances cannot be delayed until the device (system) is fabricated and put into operation. The reliability of an electronic product should be conceived at the early stages of its design; implemented during manufacturing; evaluated (considering customer requirements and the existing specifications) by electrical, optical and mechanical measurements and testing; checked (screened) during manufacturing (fabrication); and, if necessary and appropriate, maintained in the field during the product's operation. A simple and physically meaningful probabilistic predictive model is suggested for the evaluation of the remaining useful lifetime (RUL) of an electronic device (system) after an appreciable deviation from its normal operating conditions has been detected, and the increase in the failure rate and the change in the configuration of the wear-out portion of the bathtub curve have been assessed. The general concepts are illustrated by numerical examples. The model can be employed, along with other PHM forecasting and interfering tools and means, to evaluate and maintain a high level of reliability (probability of non-failure) of a device (system) at the operation stage of its lifetime.

  20. Research on an Impact Model of Meteorological Factors on Power Accidents

    Institute of Scientific and Technical Information of China (English)

    李彦斌; 韩颖; 张嵘; 李彦国

    2013-01-01

    Taking a certain region in South China as an example, power accident data for 48 consecutive months in this region and the corresponding data for 15 meteorological factors were collected. Firstly, the multicollinearity among the 15 meteorological factors was eliminated by factor analysis, and four main factors were extracted: a temperature factor, a precipitation factor, a humidity factor, and a wind factor. Secondly, an impact model of meteorological factors on power system accidents was established by logistic regression. The model explores the internal relations between meteorological conditions and power system accidents, and its fitting accuracy was validated with test samples from 2010. The proposed impact model provides a reference for the establishment of an early warning mechanism for power system accidents.
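
    A minimal sketch of the factor-analysis-plus-logistic-regression pipeline described above, on synthetic monthly data; the factor count follows the record, but the data and the library choices (scikit-learn FactorAnalysis and LogisticRegression) are assumptions.

```python
# Sketch: factor analysis to de-collinearize meteorological variables, then logistic
# regression of monthly accident occurrence on the extracted factors (synthetic data).
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n_months, n_vars = 48, 15
met = rng.normal(size=(n_months, n_vars))                   # 15 meteorological variables
met[:, 1] = met[:, 0] + 0.1 * rng.normal(size=n_months)     # inject collinearity

# Extract 4 factors (temperature, precipitation, humidity, wind in the record)
scores = FactorAnalysis(n_components=4, random_state=0).fit_transform(
    StandardScaler().fit_transform(met))

# Synthetic accident indicator driven mainly by the first factor
accident = rng.binomial(1, 1 / (1 + np.exp(-1.5 * scores[:, 0])))

logit = LogisticRegression().fit(scores, accident)
print("factor coefficients:", logit.coef_.round(2))
print("in-sample accuracy:", logit.score(scores, accident).round(2))
```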

  1. Predicting expressway crash frequency using a random effect negative binomial model: A case study in China.

    Science.gov (United States)

    Ma, Zhuanglin; Zhang, Honglu; Chien, Steven I-Jy; Wang, Jin; Dong, Chunjiao

    2017-01-01

    To investigate the relationship between crash frequency and potential influence factors, the accident data for events occurring on a 50 km long expressway in China, including 567 crash records (2006-2008), were collected and analyzed. Both the fixed-length and the homogeneous longitudinal grade methods were applied to divide the study expressway section into segments. A negative binomial (NB) model and a random effect negative binomial (RENB) model were developed to predict crash frequency. The parameters of both models were determined using the maximum likelihood (ML) method, and the mixed stepwise procedure was applied to examine the significance of explanatory variables. Three explanatory variables, including longitudinal grade, road width, and the ratio of longitudinal grade and curve radius (RGR), were found to significantly affect crash frequency. The marginal effects of the significant explanatory variables on crash frequency were analyzed. The model performance was determined by the relative prediction error and the cumulative standardized residual. The results show that the RENB model outperforms the NB model. It was also found that the model performance with the fixed-length segment method is superior to that with the homogeneous longitudinal grade segment method.
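
    A minimal sketch of fitting a negative binomial crash-frequency model with statsmodels; the segment data below are synthetic and the variable names follow the record, while the random-effect extension (RENB) is not shown.

```python
# Sketch: negative binomial regression of segment crash counts (synthetic data)
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_seg = 120
grade = rng.uniform(0.0, 4.0, n_seg)             # longitudinal grade, %
width = rng.uniform(10.0, 16.0, n_seg)           # road width, m
rgr = grade / rng.uniform(400.0, 2000.0, n_seg)  # grade / curve radius

# Synthetic over-dispersed counts (gamma-mixed Poisson)
mu = np.exp(-1.0 + 0.35 * grade - 0.05 * width + 150.0 * rgr)
crashes = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n_seg))

X = sm.add_constant(np.column_stack([grade, width, rgr]))
nb = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(nb.summary())
```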

  2. Radioactive materials transport accident analysis

    Energy Technology Data Exchange (ETDEWEB)

    McSweeney, T.I.; Maheras, S.J.; Ross, S.B. [Battelle Memorial Inst. (United States)

    2004-07-01

    Over the last 25 years, one of the major issues raised regarding radioactive material transportation has been the risk of severe accidents. While numerous studies have shown that traffic fatalities dominate the risk, modeling the risk of severe accidents has remained one of the most difficult analysis problems. This paper will show how models that were developed for spent nuclear fuel transport accident analysis can be adapted to obtain estimates of release fractions for other types of radioactive material, such as vitrified high-level radioactive waste. The paper will also show how some experimental results from fire experiments involving low-level waste packaging can be used in modeling transport accident analysis with this waste form. The results of the analysis enable an analyst to clearly show the differences in the release fractions as a function of accident severity. The paper will also show that by placing the data in a database such as ACCESS™, it is possible to obtain risk measures for transporting the waste forms along proposed routes from the generator site to potential final disposal sites.

  3. Risk-based modeling of early warning systems for pollution accidents.

    Science.gov (United States)

    Grayman, W M; Males, R M

    2002-01-01

    An early warning system is a mechanism for detecting, characterizing and providing notification of a source water contamination event (spill event) in order to mitigate the impact of contamination. Spill events are highly probabilistic occurrences with major spills, which can have very significant impacts on raw water sources of drinking water, being relatively rare. A systematic method for designing and operating early warning systems that considers the highly variable, probabilistic nature of many aspects of the system is described. The methodology accounts for the probability of spills, behavior of monitoring equipment, variable hydrology, and the probability of obtaining information about spills independent of a monitoring system. Spill Risk, a risk-based model using Monte Carlo simulation techniques has been developed and its utility has been demonstrated as part of an AWWA Research Foundation sponsored project. The model has been applied to several hypothetical river situations and to an actual section of the Ohio River. Additionally, the model has been systematically applied to a wide range of conditions in order to develop general guidance on design of early warning systems.
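
    A minimal Monte Carlo sketch in the spirit of the risk-based approach described above: simulated spills move past a monitoring point and the simulation estimates the probability of a timely warning at a downstream intake. All distributions, travel times, and detection characteristics are invented for illustration and are unrelated to the Spill Risk model itself.

```python
# Sketch: Monte Carlo estimate of early-warning detection probability for river spills
import numpy as np

rng = np.random.default_rng(8)
n_trials = 100_000

spill_size = rng.lognormal(mean=2.0, sigma=1.0, size=n_trials)      # tonnes (synthetic)
river_velocity = rng.uniform(0.3, 1.5, n_trials)                    # m/s (synthetic)
distance_to_intake = 40_000.0                                       # m upstream of intake

travel_time_h = distance_to_intake / river_velocity / 3600.0
detect_prob = np.clip(0.2 + 0.1 * np.log10(spill_size + 1.0), 0, 0.95)  # monitor sensitivity
detected = rng.random(n_trials) < detect_prob
independent_report = rng.random(n_trials) < 0.3                     # e.g. spiller self-reports

# Warning lead time: travel time minus an assumed 2 h detection/notification delay
warning_lead_h = np.where(detected | independent_report, travel_time_h - 2.0, 0.0)
timely = warning_lead_h >= 6.0                                      # need 6 h to close intake

print("P(timely warning) =", round(timely.mean(), 3))
print("mean travel time to intake (h):", round(travel_time_h.mean(), 1))
```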

  4. A cross-scale numerical modeling system for management support of oil spill accidents.

    Science.gov (United States)

    Azevedo, Alberto; Oliveira, Anabela; Fortunato, André B; Zhang, Joseph; Baptista, António M

    2014-03-15

    A flexible 2D/3D oil spill modeling system addressing the distinct nature of the surface and water column fluids, major oil weathering and improved retention/reposition processes in coastal zones is presented. The system integrates hydrodynamic, transport and oil weathering modules, which can be combined to offer different-complexity descriptions as required by applications across the river-to-ocean continuum. Features include accounting for different composition and rheology in the surface and water column mixtures, as well as spreading, evaporation, water-in-oil emulsification, shoreline retention, dispersion and dissolution. The use of unstructured grids provides flexibility and efficiency in handling spills in complex geometries and across scales. The use of high-order Eulerian-Lagrangian methods allows for computational efficiency and for handling key processes in ways consistent with their distinct mathematical nature and time scales. The modeling system is tested through a suite of synthetic, laboratory and realistic-domain benchmarks, which demonstrate robust handling of key processes and of 2D/3D couplings. The application of the modeling system to a spill scenario at the entrance of a port in a coastal lagoon illustrates the power of the approach to represent spills that occur in coastal regions with complex boundaries and bathymetry.

  5. Neural Fuzzy Inference System-Based Weather Prediction Model and Its Precipitation Predicting Experiment

    Directory of Open Access Journals (Sweden)

    Jing Lu

    2014-11-01

    Full Text Available We propose a weather prediction model in this article based on a neural network and fuzzy inference system (NFIS-WPM), and then apply it to predict daily fuzzy precipitation given meteorological premises for testing. The model consists of two parts: the first part is the “fuzzy rule-based neural network”, which simulates sequential relations among fuzzy sets using an artificial neural network; and the second part is the “neural fuzzy inference system”, which is based on the first part, but can learn new fuzzy rules from the previous ones according to the algorithm we proposed. NFIS-WPM (High Pro) and NFIS-WPM (Ave) are improved versions of this model. The need for accurate weather prediction is apparent when considering its benefits. However, the excessive pursuit of accuracy in weather prediction makes some of the “accurate” prediction results meaningless, and numerical prediction models are often complex and time-consuming. By adapting this novel model to a precipitation prediction problem, we make the predicted outcomes of precipitation more accurate and the prediction methods simpler than those of complex numerical forecasting models, which occupy large computational resources, are time-consuming and have a low predictive accuracy rate. Accordingly, we achieve more accurate predictive precipitation results than by using traditional artificial neural networks that have low predictive accuracy.

  6. Systematics of Reconstructed Process Facility Criticality Accidents

    Energy Technology Data Exchange (ETDEWEB)

    Pruvost, N.L.; McLaughlin, T.P.; Monahan, S.P.

    1999-09-19

    The systematics of the characteristics of twenty-one criticality accidents occurring in nuclear processing facilities of the Russian Federation, the United States, and the United Kingdom are examined. By systematics the authors mean the degree of consistency or agreement between the factual parameters reported for the accidents and the experimentally known conditions for criticality. The twenty-one reported process criticality accidents are not sufficiently well described to justify attempting detailed neutronic modeling. However, results of classic hand calculations confirm the credibility of the reported accident conditions.

  7. Accident reduction factors and causal inference in traffic safety studies: a review.

    Science.gov (United States)

    Davis, G A

    2000-01-01

    Accident reduction factors are used to predict the change in accident occurrence which a countermeasure can be expected to cause. Since ethical and legal obstacles preclude the use of randomized experiments when evaluating traffic safety improvements, empirical support for the causal effectiveness of accident countermeasures comes entirely from observational studies. Drawing on developments in causal inference initiated by Donald Rubin, it is argued here that the mechanism by which sites are selected for application of a countermeasure should be included as part of a study's data model, and that when important features of the selection mechanism are neglected, existing methods for estimating accident reduction factors become inconsistent. A promising, but neglected, way out of these difficulties lies in developing rational countermeasure selection methods which also support valid causal inference of countermeasure effects.

  8. RFI modeling and prediction approach for SATOP applications: RFI prediction models

    Science.gov (United States)

    Nguyen, Tien M.; Tran, Hien T.; Wang, Zhonghai; Coons, Amanda; Nguyen, Charles C.; Lane, Steven A.; Pham, Khanh D.; Chen, Genshe; Wang, Gang

    2016-05-01

    This paper describes a technical approach for the development of RFI prediction models using carrier synchronization loop when calculating Bit or Carrier SNR degradation due to interferences for (i) detecting narrow-band and wideband RFI signals, and (ii) estimating and predicting the behavior of the RFI signals. The paper presents analytical and simulation models and provides both analytical and simulation results on the performance of USB (Unified S-Band) waveforms in the presence of narrow-band and wideband RFI signals. The models presented in this paper will allow the future USB command systems to detect the RFI presence, estimate the RFI characteristics and predict the RFI behavior in real-time for accurate assessment of the impacts of RFI on the command Bit Error Rate (BER) performance. The command BER degradation model presented in this paper also allows the ground system operator to estimate the optimum transmitted SNR to maintain a required command BER level in the presence of both friendly and un-friendly RFI sources.
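    As a generic illustration of the SNR-to-BER relationship that the degradation models above feed into, the sketch below uses the textbook coherent BPSK expression BER = 0.5 * erfc(sqrt(Eb/N0)). The nominal Eb/N0 and the RFI-induced degradation are hypothetical placeholders; the paper's USB command BER model additionally accounts for the carrier synchronization loop and the RFI signal statistics.

```python
# Textbook coherent BPSK bit error rate as a function of Eb/N0, before and after
# an assumed RFI-induced SNR degradation.
from math import erfc, sqrt

def bpsk_ber(ebn0_db: float) -> float:
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * erfc(sqrt(ebn0))

nominal_ebn0_db = 9.6        # hypothetical nominal link Eb/N0 (dB)
rfi_degradation_db = 2.0     # hypothetical degradation caused by RFI (dB)
print("BER without RFI:", bpsk_ber(nominal_ebn0_db))
print("BER with RFI   :", bpsk_ber(nominal_ebn0_db - rfi_degradation_db))
```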

  9. Global and local cancer risks after the Fukushima Nuclear Power Plant accident as seen from Chernobyl: a modeling study for radiocaesium ((134)Cs &(137)Cs).

    Science.gov (United States)

    Evangeliou, Nikolaos; Balkanski, Yves; Cozic, Anne; Møller, Anders Pape

    2014-03-01

    The accident at the Fukushima Daiichi Nuclear Power Plant (NPP) in Japan resulted in the release of a large number of fission products that were transported worldwide. We study the effects of two of the most dangerous radionuclides emitted, (137)Cs (half-life: 30.2 years) and (134)Cs (half-life: 2.06 years), which were transported across the world constituting the global fallout (together with iodine isotopes and noble gases) after nuclear releases. The main purpose is to provide preliminary cancer risk estimates after the Fukushima NPP accident, in terms of excess lifetime incidence and death risks, prior to epidemiology, and compare them with those that occurred after the Chernobyl accident. Moreover, cancer risks are presented for the local population in the form of high-resolution risk maps for 3 population classes and for both sexes. The atmospheric transport model LMDZORINCA was used to simulate the global dispersion of radiocaesium after the accident. Air and ground activity concentrations have been incorporated with monitoring data as input to the LNT-model (Linear Non-Threshold) frequently used in risk assessments of all solid cancers. Cancer risks were estimated to be small for the global population in regions outside Japan. Women are more sensitive to radiation than men, although the largest risks were recorded for infants; the risk does not depend on sex at the age at exposure. Radiation risks from Fukushima were highest near the plant, while the evacuation measures were crucial for their reduction. According to our estimations, 730-1700 excess cancer incidents are expected, of which around 65% may be fatal, very close to previously published estimates (see references therein). Finally, we applied the same calculations using the DDREF (Dose and Dose Rate Effectiveness Factor), which is recommended by the ICRP, UNSCEAR and EPA as an alternative reduction factor instead of using a threshold value (which is still unknown). Excess lifetime cancer
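    A minimal sketch of the LNT-style risk arithmetic referred to above: excess lifetime cancer risk is taken as dose multiplied by a risk coefficient, optionally reduced by a DDREF at low doses and dose rates. The dose value is a hypothetical placeholder, and the nominal coefficient of about 5.5e-2 per Sv (ICRP 103, whole population, all solid cancers) is only a stand-in for the paper's age-, sex- and organ-specific coefficients.

```python
# Linear no-threshold (LNT) excess lifetime risk sketch with an optional DDREF.
RISK_PER_SV = 5.5e-2   # nominal cancer risk coefficient (per Sv), ICRP 103 whole population
DDREF = 2.0            # dose and dose rate effectiveness factor

def excess_risk(dose_sv: float, apply_ddref: bool = True) -> float:
    coeff = RISK_PER_SV / DDREF if apply_ddref else RISK_PER_SV
    return dose_sv * coeff

lifetime_dose_sv = 0.010  # hypothetical individual lifetime dose from fallout (Sv)
print("Excess lifetime risk (LNT, no DDREF):", excess_risk(lifetime_dose_sv, False))
print("Excess lifetime risk (with DDREF=2) :", excess_risk(lifetime_dose_sv, True))
```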

  10. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to unders

  11. Predictability of the Indian Ocean Dipole in the coupled models

    Science.gov (United States)

    Liu, Huafeng; Tang, Youmin; Chen, Dake; Lian, Tao

    2017-03-01

    In this study, the Indian Ocean Dipole (IOD) predictability, measured by the Indian Dipole Mode Index (DMI), is comprehensively examined at the seasonal time scale, including its actual prediction skill and potential predictability, using the ENSEMBLES multiple model ensembles and the recently developed information-based theoretical framework of predictability. It was found that all model predictions have useful skill, normally defined as an anomaly correlation coefficient larger than 0.5, only at around 2-3 month leads. This is mainly because there are more false alarms in predictions as lead time increases. The DMI predictability has significant seasonal variation, and the predictions whose target seasons are boreal summer (JJA) and autumn (SON) are more reliable than those for other seasons. All of the models fail to predict the IOD onset before May and suffer from the winter (DJF) predictability barrier. The potential predictability study indicates that, with improvements in model development and initialization, the prediction of IOD onset is likely to improve, but the winter barrier cannot be overcome. The IOD predictability also has decadal variation, with a high skill during the 1960s and the early 1990s, and a low skill during the early 1970s and early 1980s, which is very consistent with the potential predictability. The main factors controlling the IOD predictability, including its seasonal and decadal variations, are also analyzed in this study.

  12. Associating Crash Avoidance Maneuvers with Driver Attributes and Accident Characteristics: A Mixed Logit Model Approach

    DEFF Research Database (Denmark)

    Kaplan, Sigal; Prato, Carlo Giacomo

    2012-01-01

    This study focuses on the propensity of drivers to engage in crash avoidance maneuvers in relation to driver attributes, critical events, crash characteristics, vehicles involved, road characteristics and environmental attributes. Five alternative actions involving emergency lateral and speed...... as from the key role of the ability of drivers to perform effective corrective maneuvers for the success of automated in-vehicle warning and driver assistance systems. The analysis is conducted by means of a mixed logit model that accommodates correlations across alternatives and heteroscedasticity. Data...... for the analysis are retrieved from the General Estimates System (GES) crash database for the year 2009. Results show that (i) the nature of the critical event that made the crash imminent influences the choice of crash avoidance maneuvers, (ii) women and elderly have a lower propensity to conduct crash avoidance...

  13. Foundation Settlement Prediction Based on a Novel NGM Model

    Directory of Open Access Journals (Sweden)

    Peng-Yu Chen

    2014-01-01

    Full Text Available Prediction of foundation or subgrade settlement is very important during engineering construction. Given that many settlement-time sequences show a nonhomogeneous index trend, a novel grey forecasting model called the NGM(1,1,k,c) model is proposed in this paper. With an optimized whitenization differential equation, the proposed NGM(1,1,k,c) model has the property of white exponential law coincidence and can predict a pure nonhomogeneous index sequence precisely. We used two case studies to verify the predictive effect of the NGM(1,1,k,c) model for settlement prediction. The results show that this model can achieve excellent prediction accuracy; thus, the model is quite suitable for simulation and prediction of approximately nonhomogeneous index sequences and has excellent application value in settlement prediction.
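    The NGM(1,1,k,c) model extends the classic GM(1,1) grey model. The sketch below implements plain GM(1,1) (accumulated generating operation, least-squares estimation of the development coefficient, exponential prediction, inverse accumulation) to show the general mechanics; the non-homogeneous terms of NGM(1,1,k,c) are not reproduced here, and the "settlement" series is a synthetic placeholder.

```python
# Classic GM(1,1) grey forecasting sketch on a synthetic settlement-like series.
import numpy as np

def gm11_forecast(x0: np.ndarray, steps_ahead: int = 3) -> np.ndarray:
    n = len(x0)
    x1 = np.cumsum(x0)                                  # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]         # development coefficient, grey input
    k = np.arange(n + steps_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # fitted accumulated series
    return np.diff(x1_hat, prepend=0.0)                 # inverse AGO -> fitted/forecast x0

x0 = np.array([2.1, 3.0, 3.9, 4.7, 5.4, 6.0])           # synthetic settlement values (mm)
print(gm11_forecast(x0, steps_ahead=3))
```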

  14. Severe accident simulation at Olkiluoto

    Energy Technology Data Exchange (ETDEWEB)

    Tirkkonen, H.; Saarenpaeae, T. [Teollisuuden Voima Oy (TVO), Olkiluoto (Finland); Cliff Po, L.C. [Micro-Simulation Technology, Montville, NJ (United States)

    1995-09-01

    A personal computer-based simulator was developed for the Olkiluoto nuclear plant in Finland for training in severe accident management. The generic software PCTRAN was expanded to model the plant-specific features of the ABB Atom-designed BWR, including its containment over-pressure protection and filtered vent systems. Scenarios including core heat-up, hydrogen generation, core melt and vessel penetration were developed in this work. Radiation leakage paths and dose rate distribution are presented graphically for operator use in diagnosis and mitigation of accidents. Operating on a 486 DX2-66, PCTRAN-TVO achieves a speed about 15 times faster than real time. A convenient and user-friendly graphic interface allows full interactive control. In this paper a review of the component models and verification runs are presented.

  15. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation: Part 2, Scientific bases for health effects models

    Energy Technology Data Exchange (ETDEWEB)

    Abrahamson, S.; Bender, M.; Book, S.; Buncher, C.; Denniston, C.; Gilbert, E.; Hahn, F.; Hertzberg, V.; Maxon, H.; Scott, B.

    1989-05-01

    This report provides dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Two-parameter Weibull hazard functions are recommended for estimating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid and "other". The category, "other" cancers, is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also provided. For most cancers, both incidence and mortality are addressed. Linear and linear-quadratic models are also recommended for assessing genetic risks. Five classes of genetic disease -- dominant, x-linked, aneuploidy, unbalanced translocation and multifactorial diseases -- are considered. In addition, the impact of radiation-induced genetic damage on the incidence of peri-implantation embryo losses is discussed. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of all model parameters. Data are provided which should enable analysts to consider the timing and severity of each type of health risk. 22 refs., 14 figs., 51 tabs.
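    The two-parameter Weibull hazard function recommended above for early and continuing effects is commonly written as H(D) = ln(2) * (D/D50)^V, with risk = 1 - exp(-H), so that the risk is 50% at D = D50. The sketch below evaluates this form; the D50 and shape values are illustrative placeholders rather than the report's recommended parameters.

```python
# Two-parameter Weibull hazard function for early (deterministic) radiation effects.
from math import exp, log

def early_effect_risk(dose_gy: float, d50_gy: float, shape_v: float) -> float:
    hazard = log(2.0) * (dose_gy / d50_gy) ** shape_v
    return 1.0 - exp(-hazard)

# Hypothetical parameters for a hematopoietic-syndrome-like dose response:
for dose in (1.0, 3.0, 5.0):
    print(dose, "Gy ->", round(early_effect_risk(dose, d50_gy=3.8, shape_v=5.0), 3))
```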

  16. Nonconvex model predictive control for commercial refrigeration

    Science.gov (United States)

    Gybel Hovgaard, Tobias; Boyd, Stephen; Larsen, Lars F. S.; Bagterp Jørgensen, John

    2013-08-01

    We consider the control of a commercial multi-zone refrigeration system, consisting of several cooling units that share a common compressor, and is used to cool multiple areas or rooms. In each time period we choose cooling capacity to each unit and a common evaporation temperature. The goal is to minimise the total energy cost, using real-time electricity prices, while obeying temperature constraints on the zones. We propose a variation on model predictive control to achieve this goal. When the right variables are used, the dynamics of the system are linear, and the constraints are convex. The cost function, however, is nonconvex due to the temperature dependence of thermodynamic efficiency. To handle this nonconvexity we propose a sequential convex optimisation method, which typically converges in fewer than 5 or so iterations. We employ a fast convex quadratic programming solver to carry out the iterations, which is more than fast enough to run in real time. We demonstrate our method on a realistic model, with a full year simulation and 15-minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost savings, on the order of 30%, compared to a standard thermostat-based control system. Perhaps more important, we see that the method exhibits sophisticated response to real-time variations in electricity prices. This demand response is critical to help balance real-time uncertainties in generation capacity associated with large penetration of intermittent renewable energy sources in a future smart grid.

  17. QSPR Models for Octane Number Prediction

    Directory of Open Access Journals (Sweden)

    Jabir H. Al-Fahemi

    2014-01-01

    Full Text Available Quantitative structure-property relationship (QSPR) is performed as a means to predict the octane number of hydrocarbons via correlating properties to parameters calculated from molecular structure; such parameters are molecular mass M, hydration energy EH, boiling point BP, octanol/water distribution coefficient logP, molar refractivity MR, critical pressure CP, critical volume CV, and critical temperature CT. Principal component analysis (PCA) and the multiple linear regression technique (MLR) were performed to examine the relationship between multiple variables of the above parameters and the octane number of hydrocarbons. The results of PCA explain the interrelationships between octane number and different variables. Correlation coefficients were calculated using M.S. Excel to examine the relationship between multiple variables of the above parameters and the octane number of hydrocarbons. The data set was split into a training set of 40 hydrocarbons and a validation set of 25 hydrocarbons. The linear relationship between the selected descriptors and the octane number has a coefficient of determination (R2 = 0.932), statistical significance (F = 53.21), and standard error (s = 7.7). The obtained QSPR model was applied to the validation set of octane numbers for hydrocarbons, giving R2CV = 0.942 and s = 6.328.
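    A minimal sketch of the MLR step in a QSPR workflow like the one above: fit octane number against molecular descriptors on a training set and report R2 on a validation set. The descriptor matrix and responses are synthetic placeholders; the paper's descriptors (M, EH, BP, logP, MR, CP, CV, CT) are listed by name only and their values are not reproduced here.

```python
# Multiple linear regression for a QSPR-style descriptor-to-property fit.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_compounds, n_descriptors = 65, 8          # 40 training + 25 validation, as in the abstract
X = rng.normal(size=(n_compounds, n_descriptors))       # synthetic descriptors
true_coeffs = rng.normal(size=n_descriptors)
y = X @ true_coeffs + rng.normal(scale=0.5, size=n_compounds)  # synthetic octane numbers

X_train, X_val, y_train, y_val = train_test_split(X, y, train_size=40, random_state=0)
mlr = LinearRegression().fit(X_train, y_train)
print("Validation R^2:", r2_score(y_val, mlr.predict(X_val)))
```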

  18. Iodine behaviour in severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, L.M.C.; Grindon, E.; Handy, B.J.; Sutherland, L. [NNC Ltd., Knutsford (United Kingdom); Bruns, W.G.; Sims, H.E. [AEA Technology, Harwell (United Kingdom); Dickinson, S. [AEA Technology, Winfrith (United Kingdom); Hueber, C.; Jacquemain, D. [IPSN/CEA, Cadarache, Saint Paul-Lez-Durance (France)

    1996-12-01

    A description is given of analyses which identify which aspects of the modelling and data are most important in evaluating the release of radioactive iodine to the environment following a potential severe accident at a PWR and which identify the major uncertainties which affect that release. Three iodine codes are used, namely INSPECT, IODE and IMPAIR, and their predictions are compared with those of the PSA code MAAP. INSPECT is a mechanistic code which models iodine behaviour in the aqueous aerosol, spray water and sump water, and the partitioning of volatile species between the aqueous phases and containment gas space. Organic iodine is not modelled. IODE and IMPAIR are semi-empirical codes which do not model iodine behaviour in the aqueous aerosol, but model organic iodine. The fault sequences addressed are based on analyses for the Sizewell 'B' design. Two types of sequence have been analysed: (a) those in which a major release of fission products from the primary circuit to the containment occurs, e.g. a large LOCA, and (b) those where the release bypasses the containment, e.g. a leak into the auxiliary building. In the analysis of the LOCA sequences where the pH of the sump is controlled at a value of 8 or greater, all three codes predict that the oxidation of iodine to produce gas phase species does not make a significant contribution to the source term due to leakage from the reactor building and that the latter is dominated by iodide in the aerosol. In the case where the pH of the sump is not controlled, it is found that the proportion of gas phase iodine increases significantly, although the cumulative leakage predicted by all three codes is not significantly different from that predicted by MAAP. The radiolytic production of nitric acid could be a major factor in determining the pH, and if the pH were reduced, the codes predict an increase in gas phase iodine species leaked from the containment. (author) 4 figs., 7 tabs., 13 refs.

  19. Review of current severe accident management approaches in Europe and identification of related modelling requirements for the computer code ASTEC V2.1

    Energy Technology Data Exchange (ETDEWEB)

    Hermsmeyer, S. [European Commission JRC, Petten (Netherlands). Inst. for Energy and Transport; Herranz, L.E.; Iglesias, R. [CIEMAT, Madrid (Spain); and others

    2015-07-15

    The severe accident at the Fukushima-Daiichi nuclear power plant (NPP) has led to a worldwide review of nuclear safety approaches and is bringing a refocussing of R and D in the field. To support these efforts, several new Euratom FP7 projects have been launched. The CESAM project focuses on the improvement of the ASTEC computer code. ASTEC is jointly developed by IRSN and GRS and is considered the European reference code for Severe Accident Analyses since it capitalizes knowledge from the extensive European R and D in the field. The project aims at the code's enhancement and extension for use in Severe Accident Management (SAM) analysis of the NPPs of Generation II-III presently under operation or foreseen in the near future in Europe, spent fuel pools included. The work reported here is concerned with the importance, for the further development of the code, of SAM strategies to be simulated. To this end, SAM strategies applied in the EU have been compiled. This compilation is mainly based on the public information made available in the frame of the EU "stress tests" for NPPs and has been complemented by information provided by the different CESAM partners. The context of SAM is explained and the strategies are presented. The modelling capabilities for the simulation of these strategies in the current production version 2.0 of ASTEC are discussed. Furthermore, the requirements for the next version of ASTEC V2.1 that is supported in the CESAM project are highlighted. They are a necessary complement to the list of code improvements that is drawn from consolidating new fields of application, like SFP and BWR model enhancements, and from new experimental results on severe accident phenomena.

  20. Validation of a loss of vacuum accident (LOVA) Computational Fluid Dynamics (CFD) model

    Energy Technology Data Exchange (ETDEWEB)

    Bellecci, C.; Gaudio, P. [EURATOM-Faculty of Engineering, University of Rome ' Tor Vergata' Via del Politecnico 1, 00133 Rome (Italy); Lupelli, I., E-mail: ivan.lupelli@uniroma2.it [EURATOM-Faculty of Engineering, University of Rome ' Tor Vergata' Via del Politecnico 1, 00133 Rome (Italy); Malizia, A. [EURATOM-Faculty of Engineering, University of Rome ' Tor Vergata' Via del Politecnico 1, 00133 Rome (Italy); Porfiri, M.T. [ENEA Nuclear Fusion Tecnologies, Via Enrico Fermi, 45 I-00044 Frascati (Italy); Quaranta, R.; Richetta, M. [EURATOM-Faculty of Engineering, University of Rome ' Tor Vergata' Via del Politecnico 1, 00133 Rome (Italy)

    2011-10-15

    Intense thermal loads in fusion devices occur during plasma disruptions, Edge Localized Modes (ELM) and Vertical Displacement Events (VDE). They will result in macroscopic erosion of the plasma facing materials and consequent accumulation of activated dust into the ITER Vacuum Vessel (VV). A recognized safety issue for future fusion reactors fueled with deuterium and tritium is the generation of sizeable quantities of dust. In the case of a LOVA, air ingress occurs due to the pressure difference between atmospheric and internal conditions. It causes mobilization of the dust that can exit the VV, threatening public safety, because it may contain tritium, may be radioactive from activation products, and may be chemically reactive and/or toxic (Sharpe et al.; Sharpe and Humrickhouse). Several experiments have been conducted with the STARDUST facility in order to reproduce a low pressurization rate (300 Pa/s) LOVA event in ITER due to a small air leakage for two different positions of the leak, at the equatorial port level and at the divertor port level, in order to evaluate the velocity magnitude in case of a LOVA, which is strictly connected with dust mobilization phenomena. A two-dimensional (2D) model of STARDUST was built with the commercial CFD code FLUENT. The results of these simulations were compared against the experimental data for CFD code validation. For validation purposes, the CFD simulation data were extracted at the same locations as the experimental data were collected. In this paper, the authors present and discuss the computer-simulation data and compare them with data collected during the laboratory studies at the University of Rome 'Tor Vergata' Quantum Electronics and Plasmas lab.

  1. Outline of the Desktop Severe Accident Graphic Simulator Module for OPR-1000

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. Y.; Ahn, K. I. [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    This paper introduces the desktop severe accident graphic simulator module (VMAAP), which is a window-based severe accident simulator using MAAP as its engine. VMAAP is one of the submodules in the SAMEX system (Severe Accident Management Support Expert System), which is a decision support system for use in severe accident management following an incident at a nuclear power plant. The SAMEX system consists of four major modules as sub-systems: (a) Severe accident risk data base module (SARDB): stores the data of integrated severe accident analysis code results like MAAP and MELCOR for hundreds of high-frequency scenarios for the reference plant; (b) Risk-informed severe accident risk data base management module (RI-SARD): provides a platform to identify the initiating event, determine plant status and equipment availability, diagnose the status of the reactor core, reactor vessel and containment building, and predict the plant behavior; (c) Severe accident management simulator module (VMAAP): runs the MAAP4 code with a user-friendly graphic interface for input deck and output display; (d) On-line severe accident management guidance module (On-line SAMG): provides available accident management strategies in an electronic format. The role of VMAAP in SAMEX can be described as follows. SARDB contains most of the high-frequency scenarios based on a level 2 probabilistic safety analysis. Therefore, there is a good chance that a real accident sequence is similar to one of the data base cases. In such a case, RI-SARD can predict the accident progression by a scenario-based or symptom-based search, depending on the available plant parameter information. Nevertheless, there still may be deviations or variations between the actual scenario and the data base scenario. The deviations can be decreased by using a real-time graphic accident simulator, VMAAP. VMAAP is a MAAP4-based severe accident simulation model for the OPR-1000 plant. It can simulate a spectrum of physical processes

  2. Predictability in models of the atmospheric circulation.

    NARCIS (Netherlands)

    Houtekamer, P.L.

    1992-01-01

    It will be clear from the above discussions that skill forecasts are still in their infancy. Operational skill predictions do not exist. One is still struggling to prove that skill predictions, at any range, have any quality at all. It is not clear what the statistics of the analysis error are. The

  3. Simulation of containment pressurization in a large break-loss of coolant accident using single-cell and multicell models and CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Kalkahoran, Omid Noori; Ahangari, Rohollah [Reactor Research School, Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of); Shirani, Amir Saied [Faculty of Engineering, Shahid Beheshti University, Tehran (Iran, Islamic Republic of)

    2016-10-15

    Since the inception of nuclear power as a commercial energy source, safety has been recognized as a prime consideration in the design, construction, operation, maintenance, and decommissioning of nuclear power plants. The release of radioactivity to the environment requires the failure of multiple safety systems and the breach of three physical barriers: fuel cladding, the reactor cooling system, and containment. In this study, nuclear reactor containment pressurization has been modeled in a large break-loss of coolant accident (LB-LOCA) by programming single-cell and multicell models in MATLAB. First, containment has been considered as a control volume (single-cell model). In addition, spray operation has been added to this model. In the second step, the single-cell model has been developed into a multicell model to consider the effects of the nodalization and spatial location of cells in the containment pressurization in comparison with the single-cell model. In the third step, the accident has been simulated using the CONTAIN 2.0 code. Finally, Bushehr nuclear power plant (BNPP) containment has been considered as a case study. The results of BNPP containment pressurization due to LB-LOCA have been compared between models, final safety analysis report, and CONTAIN code's results.

  4. Allostasis: a model of predictive regulation.

    Science.gov (United States)

    Sterling, Peter

    2012-04-12

    The premise of the standard regulatory model, "homeostasis", is flawed: the goal of regulation is not to preserve constancy of the internal milieu. Rather, it is to continually adjust the milieu to promote survival and reproduction. Regulatory mechanisms need to be efficient, but homeostasis (error-correction by feedback) is inherently inefficient. Thus, although feedbacks are certainly ubiquitous, they could not possibly serve as the primary regulatory mechanism. A newer model, "allostasis", proposes that efficient regulation requires anticipating needs and preparing to satisfy them before they arise. The advantages: (i) errors are reduced in magnitude and frequency; (ii) response capacities of different components are matched -- to prevent bottlenecks and reduce safety factors; (iii) resources are shared between systems to minimize reserve capacities; (iv) errors are remembered and used to reduce future errors. This regulatory strategy requires a dedicated organ, the brain. The brain tracks multitudinous variables and integrates their values with prior knowledge to predict needs and set priorities. The brain coordinates effectors to mobilize resources from modest bodily stores and enforces a system of flexible trade-offs: from each organ according to its ability, to each organ according to its need. The brain also helps regulate the internal milieu by governing anticipatory behavior. Thus, an animal conserves energy by moving to a warmer place - before it cools, and it conserves salt and water by moving to a cooler one before it sweats. The behavioral strategy requires continuously updating a set of specific "shopping lists" that document the growing need for each key component (warmth, food, salt, water). These appetites funnel into a common pathway that employs a "stick" to drive the organism toward filling the need, plus a "carrot" to relax the organism when the need is satisfied. The stick corresponds broadly to the sense of anxiety, and the carrot broadly to

  5. Graphite Oxidation Simulation in HTR Accident Conditions

    Energy Technology Data Exchange (ETDEWEB)

    El-Genk, Mohamed

    2012-10-19

    Massive air and water ingress, following a pipe break or leak in steam-generator tubes, is a design-basis accident for high-temperature reactors (HTRs). Analysis of these accidents in both prismatic and pebble bed HTRs requires state-of-the-art capability for predictions of: 1) oxidation kinetics, 2) air helium gas mixture stratification and diffusion into the core following the depressurization, 3) transport of multi-species gas mixture, and 4) graphite corrosion. This project will develop a multi-dimensional, comprehensive oxidation kinetics model of graphite in HTRs, with diverse capabilities for handling different flow regimes. The chemical kinetics/multi-species transport model for graphite burning and oxidation will account for temperature-related changes in the properties of graphite, oxidants (O2, H2O, CO), reaction products (CO, CO2, H2, CH4) and other gases in the mixture (He and N2). The model will treat the oxidation and corrosion of graphite in geometries representative of HTR core component at temperatures of 900°C or higher. The developed chemical reaction kinetics model will be user-friendly for coupling to full core analysis codes such as MELCOR and RELAP, as well as computational fluid dynamics (CFD) codes such as CD-adapco. The research team will solve governing equations for the multi-dimensional flow and the chemical reactions and kinetics using Simulink, an extension of the MATLAB solver, and will validate and benchmark the model's predictions using reported experimental data. Researchers will develop an interface to couple the validated model to a commercially available CFD fluid flow and thermal-hydraulic model of the reactor , and will perform a simulation of a pipe break in a prismatic core HTR, with the potential for future application to a pebble-bed type HTR.
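    As a pointer to the kind of rate expression that oxidation-kinetics models of this type are built on, the sketch below evaluates an Arrhenius-type rate k(T) = A * exp(-Ea/(R*T)) with a simple oxygen partial-pressure dependence. The pre-exponential factor, activation energy and reaction order are hypothetical placeholders, not the project's fitted parameters.

```python
# Arrhenius-type graphite oxidation rate illustration (all parameters hypothetical).
from math import exp

R_GAS = 8.314  # J/(mol*K)

def oxidation_rate(temp_k: float, p_o2_pa: float,
                   pre_exp: float = 1.0e5,
                   activation_j_mol: float = 2.0e5,
                   order: float = 0.75) -> float:
    """Reaction rate (arbitrary units) for graphite + O2 at temperature temp_k."""
    return pre_exp * exp(-activation_j_mol / (R_GAS * temp_k)) * p_o2_pa ** order

for t_celsius in (900.0, 1100.0, 1300.0):
    t_k = t_celsius + 273.15
    print(f"T = {t_k:.0f} K  rate = {oxidation_rate(t_k, p_o2_pa=21000.0):.3e}")
```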

  6. Comparing model predictions for ecosystem-based management

    DEFF Research Database (Denmark)

    Jacobsen, Nis Sand; Essington, Timothy E.; Andersen, Ken Haste

    2016-01-01

    Two models, Ecopath with Ecosim (EwE) and a size-structured fish community model, were considered. The models were compared with respect to predicted ecological consequences of fishing to identify commonalities and differences in model predictions for the California Current fish community. We compared the models regarding direct and indirect responses to fishing...... on one or more species. The size-based model predicted a higher fishing mortality needed to reach maximum sustainable yield than EwE for most species. The size-based model also predicted stronger top-down effects of predator removals than EwE. In contrast, EwE predicted stronger bottom-up effects...... of forage fisheries removal. In both cases, the differences are due to the presumed degree of trophic overlap between juveniles of large-bodied fish and adult stages of forage fish. These differences highlight how each model’s emphasis on distinct details of ecological processes affects its predictions...

  7. The role of personality traits and driving experience in self-reported risky driving behaviors and accident risk among Chinese drivers.

    Science.gov (United States)

    Tao, Da; Zhang, Rui; Qu, Xingda

    2017-02-01

    The purpose of this study was to explore the role of personality traits and driving experience in the prediction of risky driving behaviors and accident risk among Chinese population. A convenience sample of drivers (n=511; mean (SD) age=34.2 (8.8) years) completed a self-report questionnaire that was designed based on validated scales for measuring personality traits, risky driving behaviors and self-reported accident risk. Results from structural equation modeling analysis demonstrated that the data fit well with our theoretical model. While showing no direct effects on accident risk, personality traits had direct effects on risky driving behaviors, and yielded indirect effects on accident risk mediated by risky driving behaviors. Both driving experience and risky driving behaviors directly predicted accident risk and accounted for 15% of its variance. There was little gender difference in personality traits, risky driving behaviors and accident risk. The findings emphasized the importance of personality traits and driving experience in the understanding of risky driving behaviors and accident risk among Chinese drivers and provided new insight into the design of evidence-based driving education and accident prevention interventions.

  8. Loss of vacuum accident (LOVA): Comparison of computational fluid dynamics (CFD) flow velocities against experimental data for the model validation

    Energy Technology Data Exchange (ETDEWEB)

    Bellecci, C.; Gaudio, P.; Lupelli, I. [Faculty of Engineering, University of Rome ' Tor Vergata' , Via del Politecnico 1, 00133 Rome (Italy); Malizia, A., E-mail: malizia@ing.uniroma2.it [Faculty of Engineering, University of Rome ' Tor Vergata' , Via del Politecnico 1, 00133 Rome (Italy); Porfiri, M.T. [ENEA Nuclear Fusion Technologies, Via Enrico Fermi 45 I, 00044, Frascati (Italy); Quaranta, R.; Richetta, M. [Faculty of Engineering, University of Rome ' Tor Vergata' , Via del Politecnico 1, 00133 Rome (Italy)

    2011-06-15

    A recognized safety issue for future fusion reactors fueled with deuterium and tritium is the generation of sizeable quantities of dust. Several mechanisms resulting from material response to plasma bombardment in normal and off-normal conditions are responsible for generating dust of micron and sub-micron length scales inside the VV (Vacuum Vessel) of experimental fusion facilities. The loss of coolant accidents (LOCA), loss of coolant flow accidents (LOFA) and loss of vacuum accidents (LOVA) are types of accidents, expected in experimental fusion reactors like ITER, that may jeopardize components and plasma vessel integrity and cause dust mobilization that is risky for workers and the public. The air velocity is the driving parameter for dust resuspension, and its characterization in the very first phase of the accident is critical for the dust release. To study the air velocity trend, a small facility, Small Tank for Aerosol Removal and Dust (STARDUST), was set up at the University of Rome 'Tor Vergata', in collaboration with the ENEA Frascati laboratories. It simulates a low pressurization rate (300 Pa/s) LOVA event in ITER due to a small air inlet from two different positions of the leak: at the equatorial port level and at the divertor port level. The velocity magnitude in STARDUST was investigated in order to map the velocity field by means of a point capacitive transducer placed inside STARDUST without obstacles. FLUENT was used to simulate the flow behavior for the same LOVA scenarios used during the experimental tests. The results of these simulations were compared against the experimental data for CFD code validation. For validation purposes, the CFD simulation data were extracted at the same locations as the experimental data were collected, for the first four seconds, because at the beginning of the experiments the maximum velocity values (which could cause almost complete dust mobilization) were measured. In this paper the authors present and

  9. A prediction model for assessing residential radon concentration in Switzerland

    NARCIS (Netherlands)

    Hauri, D.D.; Huss, A.; Zimmermann, F.; Kuehni, C.E.; Roosli, M.

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the

  10. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  11. Distributional Analysis for Model Predictive Deferrable Load Control

    OpenAIRE

    Chen, Niangjun; Gan, Lingwen; Low, Steven H.; Wierman, Adam

    2014-01-01

    Deferrable load control is essential for handling the uncertainties associated with the increasing penetration of renewable generation. Model predictive control has emerged as an effective approach for deferrable load control, and has received considerable attention. In particular, previous work has analyzed the average-case performance of model predictive deferrable load control. However, to this point, distributional analysis of model predictive deferrable load control has been elusive. In ...

  12. Predicting the Yield Stress of SCC using Materials Modelling

    DEFF Research Database (Denmark)

    Thrane, Lars Nyholm; Hasholt, Marianne Tange; Pade, Claus

    2005-01-01

    A conceptual model for predicting the Bingham rheological parameter yield stress of SCC has been established. The model used here is inspired by previous work of Oh et al. (1), predicting that the yield stress of concrete relative to the yield stress of paste is a function of the relative thickness...... and distribution were varied between SCC types. The results indicate that yield stress of SCC may be predicted using the model....

  13. Predictive modeling of dental pain using neural network.

    Science.gov (United States)

    Kim, Eun Yeob; Lim, Kun Ok; Rhee, Hyun Sill

    2009-01-01

    The mouth, which is used for ingesting food, is one of the most basic and important parts of the body. Dental pain was predicted using a neural network model. The resulting predictive model of dental pain factors achieved a fit of 80.0%. For people predicted by the neural network model to be likely to experience dental pain, preventive measures including proper eating habits, education on oral hygiene, and stress relief must precede any dental treatment.

  14. Simulation of the Lower Head Boiling Water Reactor Vessel in a Severe Accident

    Directory of Open Access Journals (Sweden)

    Alejandro Nuñez-Carrera

    2012-01-01

    Full Text Available The objective of this paper is the simulation and analysis of the Boiling Water Reactor (BWR) lower head during a severe accident. The COUPLE computer code was used in this work to model the heatup of the reactor core material that slumps in the lower head of the reactor pressure vessel. The prediction of the lower head failure is an important issue in the severe accidents field, due to the accident progression and the radiological consequences that are completely different with or without the failure of the Reactor Pressure Vessel (RPV). The release of molten material to the primary containment and the possibility of steam explosion may produce the failure of the primary containment with high radiological consequences. It is therefore important to have a detailed model in order to predict the behavior of the reactor vessel lower head in a severe accident. In this paper, a hypothetical simulation of a Loss of Coolant Accident (LOCA) with simultaneous loss of off-site power and without injection of cooling water is presented in order to evaluate the temperature distribution and heatup of the lower part of the RPV. The SCDAPSIM/RELAP5 3.2 code was used to build the BWR model and conduct the numerical simulation.

  15. Prediction of peptide bonding affinity: kernel methods for nonlinear modeling

    CERN Document Server

    Bergeron, Charles; Sundling, C Matthew; Krein, Michael; Katt, Bill; Sukumar, Nagamani; Breneman, Curt M; Bennett, Kristin P

    2011-01-01

    This paper presents regression models obtained from a process of blind prediction of peptide binding affinity from provided descriptors for several distinct datasets as part of the 2006 Comparative Evaluation of Prediction Algorithms (COEPRA) contest. This paper finds that kernel partial least squares, a nonlinear partial least squares (PLS) algorithm, outperforms PLS, and that the incorporation of transferable atom equivalent features improves predictive capability.
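    The sketch below fits the linear PLS baseline that kernel PLS is compared against in this abstract, using scikit-learn; it is not the authors' kernel PLS implementation. The peptide descriptors and binding affinities are synthetic placeholders, and a kernel variant would replace the linear inner products between descriptor vectors with a kernel matrix (e.g. an RBF kernel).

```python
# Linear PLS regression baseline for descriptor-to-affinity prediction.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 50))                            # synthetic peptide descriptors
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=150)     # synthetic binding affinity

pls = PLSRegression(n_components=5)
scores = cross_val_score(pls, X, y, cv=5, scoring="r2")
print("Cross-validated R^2:", scores.mean())
```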

  16. Ecosystem-based approach for the numerical modelling of {sup 137}Cs transfer to the western North Pacific plankton populations after the Fukushima nuclear power plant accident - New approach for the modelling of radiocesium in pelagic food chain in the Northwestern Pacific after the Fukushima nuclear power plant accident

    Energy Technology Data Exchange (ETDEWEB)

    Belharet, Mokrane [Institut de Radioprotection et de Surete Nucleaire, SESURE/ LERCM, 83507 La Seyne-sur-mer (France); Laboratoire d' Aerologie, 14 av Eduard Belin, 31400 Toulouse (France); Estournel, Claude [Laboratoire d' Aerologie, 14 av Eduard Belin, 31400 Toulouse (France); Charmasson, Sabine [Institut de Radioprotection et de Surete Nucleaire, SESURE/ LERCM, 83507 La Seyne-sur-mer (France)

    2014-07-01

    Huge quantities of radiocesium ({sup 134}Cs and {sup 137}Cs) were released to the coastal northwestern Pacific ocean after the Fukushima nuclear power plant accident, which occurred on 11 March 2011. The resultant radiocesium contamination was quickly transferred to marine biota, resulting in elevated cesium levels measured in various organisms sampled over this area. This study aims to understand the mechanism of radiocesium behavior in the marine food chain under this non-steady-state situation (accident) and to estimate contamination levels in these organisms, by using a new modelling approach based on the coupling of three different models: a model of cesium dispersion in the ocean, an ecosystem model and a radioecological model. The model was run for 2 years (2011 and 2012) and was first applied to study the contamination of marine plankton and planktivorous fishes. Results of spatio-temporal radiocesium activities in these different organisms, and calculated parameters like concentration ratios, biological half-lives, percentages of contamination coming from each compartment (food and water), and trophic transfer factors, were compared to the observed values acquired before and after the accident (steady and non-steady states). (authors)

  17. Prediction using patient comparison vs. modeling: a case study for mortality prediction.

    Science.gov (United States)

    Hoogendoorn, Mark; El Hassouni, Ali; Mok, Kwongyen; Ghassemi, Marzyeh; Szolovits, Peter

    2016-08-01

    Information in Electronic Medical Records (EMRs) can be used to generate accurate predictions for the occurrence of a variety of health states, which can contribute to more pro-active interventions. The very nature of EMRs does make the application of off-the-shelf machine learning techniques difficult. In this paper, we study two approaches to making predictions that have hardly been compared in the past: (1) extracting high-level (temporal) features from EMRs and building a predictive model, and (2) defining a patient similarity metric and predicting based on the outcome observed for similar patients. We analyze and compare both approaches on the MIMIC-II ICU dataset to predict patient mortality and find that the patient similarity approach does not scale well and results in a less accurate model (AUC of 0.68) compared to the modeling approach (0.84). We also show that mortality can be predicted within a median of 72 hours.
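    A minimal illustration of the two approaches compared above: a global predictive model (logistic regression on extracted features) versus a patient-similarity approach (k-nearest neighbours), both scored with AUC. The feature matrix and outcome labels are synthetic placeholders; MIMIC-II itself requires credentialed access and far more involved feature extraction.

```python
# Global model vs. patient-similarity prediction, compared by AUC on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))                        # synthetic patient features
p = 1.0 / (1.0 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))   # synthetic mortality risk
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)   # "modeling" approach
knn = KNeighborsClassifier(n_neighbors=25).fit(X_tr, y_tr)  # "similar patients" approach
print("Modeling approach AUC  :", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
print("Similarity approach AUC:", roc_auc_score(y_te, knn.predict_proba(X_te)[:, 1]))
```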

  18. Internal dose assessment due to large area contamination: Main lessons drawn from the Chernobyl accident

    Energy Technology Data Exchange (ETDEWEB)

    Andrasi, A. [KFKI Atomic Energy Research Inst., Budapest (Hungary)

    1997-03-01

    The reactor accident at Chernobyl in 1986, besides its serious and tragic consequences, also provided an excellent opportunity to check, test and validate all kinds of environmental models and calculation tools available in the emergency preparedness systems of different countries. Assessment of internal and external doses due to the accident has been carried out for the population all over Europe using different methods. Dose predictions based on environmental model calculations considering various pathways have been compared with those obtained by more direct monitoring methods. One study from Hungary and one from the TAEA are presented briefly. (orig./DG)

  19. Fuzzy predictive filtering in nonlinear economic model predictive control for demand response

    DEFF Research Database (Denmark)

    Santos, Rui Mirra; Zong, Yi; Sousa, Joao M. C.;

    2016-01-01

    The performance of a model predictive controller (MPC) is highly correlated with the model's accuracy. This paper introduces an economic model predictive control (EMPC) scheme based on a nonlinear model, which uses a branch-and-bound tree search for solving the inherent non-convex optimization...... problem. Moreover, to reduce the computation time and improve the controller's performance, a fuzzy predictive filter is introduced. With the purpose of testing the developed EMPC, a simulation controlling the temperature levels of an intelligent office building (PowerFlexHouse), with and without fuzzy...

  20. Predictive modeling and reducing cyclic variability in autoignition engines

    Energy Technology Data Exchange (ETDEWEB)

    Hellstrom, Erik; Stefanopoulou, Anna; Jiang, Li; Larimore, Jacob

    2016-08-30

    Methods and systems are provided for controlling a vehicle engine to reduce cycle-to-cycle combustion variation. A predictive model is applied to predict cycle-to-cycle combustion behavior of an engine based on observed engine performance variables. Conditions are identified, based on the predicted cycle-to-cycle combustion behavior, that indicate high cycle-to-cycle combustion variation. Corrective measures are then applied to prevent the predicted high cycle-to-cycle combustion variation.

  1. Estimating Model Prediction Error: Should You Treat Predictions as Fixed or Random?

    Science.gov (United States)

    Wallach, Daniel; Thorburn, Peter; Asseng, Senthold; Challinor, Andrew J.; Ewert, Frank; Jones, James W.; Rotter, Reimund; Ruane, Alexander

    2016-01-01

    Crop models are important tools for impact assessment of climate change, as well as for exploring management options under current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error: MSEP_fixed, which evaluates mean squared error of prediction for a model with fixed structure, parameters and inputs, and MSEP_uncertain(X), which evaluates mean squared error averaged over the distributions of model structure, inputs and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEP_uncertain(X) can be estimated using a random effects ANOVA. It is argued that MSEP_uncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.
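    The sketch below illustrates, on synthetic data, the kind of decomposition referred to above: mean squared error of prediction averaged over an ensemble of model variants splits exactly into the squared error of the ensemble-mean prediction plus the ensemble variance. It is a simplified stand-in for the MSEP_uncertain(X) estimation described in the abstract, which uses hindcasts and a random-effects ANOVA rather than this toy identity.

```python
# Bias/variance-style split of ensemble prediction error on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_variants = 50, 30
truth = rng.normal(size=n_cases)
# Each model variant = truth + shared bias + variant-specific perturbation
preds = truth + 0.4 + rng.normal(scale=0.8, size=(n_variants, n_cases))

msep_total = np.mean((preds - truth) ** 2)                # averaged over variants and cases
err_of_mean = np.mean((preds.mean(axis=0) - truth) ** 2)  # squared error of ensemble mean
spread = np.mean(preds.var(axis=0))                       # model variance term
print(msep_total, err_of_mean + spread)                   # the two quantities agree exactly
```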

  2. Detailed source term estimation of the atmospheric release for the Fukushima Daiichi Nuclear Power Station accident by coupling simulations of atmospheric dispersion model with improved deposition scheme and oceanic dispersion model

    Directory of Open Access Journals (Sweden)

    G. Katata

    2014-06-01

    Full Text Available Temporal variations in the amount of radionuclides released into the atmosphere during the Fukushima Dai-ichi Nuclear Power Station (FNPS1) accident and their atmospheric and marine dispersion are essential to evaluate the environmental impacts and resultant radiological doses to the public. In this paper, we estimate a detailed time trend of atmospheric releases during the accident by combining environmental monitoring data with atmospheric model simulations from WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information), and simulations from the oceanic dispersion model SEA-GEARN-FDM, both developed by the authors. A sophisticated deposition scheme, which deals with dry and fogwater depositions, cloud condensation nuclei (CCN) activation and subsequent wet scavenging due to mixed-phase cloud microphysics (in-cloud scavenging) for radioactive iodine gas (I2 and CH3I) and other particles (CsI, Cs, and Te), was incorporated into WSPEEDI-II to improve the surface deposition calculations. The fallout to the ocean surface calculated by WSPEEDI-II was used as input data for the SEA-GEARN-FDM calculations. Reverse and inverse source-term estimation methods based on coupling the simulations from both models were adopted using air dose rates and concentrations, and sea surface concentrations. The results revealed that the major releases of radionuclides due to the FNPS1 accident occurred in the following periods during March 2011: the afternoon of 12 March due to the wet venting and hydrogen explosion at Unit 1, the morning of 13 March after the venting event at Unit 3, midnight of 14 March when the SRV (Safety Relief Valve) at Unit 2 was opened three times, the morning and night of 15 March, and the morning of 16 March. According to the simulation results, the highest radioactive contamination areas around FNPS1 were created from 15 to 16 March by complicated interactions among rainfall, plume movements, and the temporal

  3. Detailed source term estimation of the atmospheric release for the Fukushima Daiichi Nuclear Power Station accident by coupling simulations of atmospheric dispersion model with improved deposition scheme and oceanic dispersion model

    Science.gov (United States)

    Katata, G.; Chino, M.; Kobayashi, T.; Terada, H.; Ota, M.; Nagai, H.; Kajino, M.; Draxler, R.; Hort, M. C.; Malo, A.; Torii, T.; Sanada, Y.

    2014-06-01

    Temporal variations in the amount of radionuclides released into the atmosphere during the Fukushima Dai-ichi Nuclear Power Station (FNPS1) accident and their atmospheric and marine dispersion are essential to evaluate the environmental impacts and resultant radiological doses to the public. In this paper, we estimate a detailed time trend of atmospheric releases during the accident by combining environmental monitoring data with atmospheric model simulations from WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information), and simulations from the oceanic dispersion model SEA-GEARN-FDM, both developed by the authors. A sophisticated deposition scheme, which deals with dry and fogwater depositions, cloud condensation nuclei (CCN) activation and subsequent wet scavenging due to mixed-phase cloud microphysics (in-cloud scavenging) for radioactive iodine gas (I2 and CH3I) and other particles (CsI, Cs, and Te), was incorporated into WSPEEDI-II to improve the surface deposition calculations. The fallout to the ocean surface calculated by WSPEEDI-II was used as input data for the SEA-GEARN-FDM calculations. Reverse and inverse source-term estimation methods based on coupling the simulations from both models were adopted using air dose rates and concentrations, and sea surface concentrations. The results revealed that the major releases of radionuclides due to the FNPS1 accident occurred in the following periods during March 2011: the afternoon of 12 March due to the wet venting and hydrogen explosion at Unit 1, the morning of 13 March after the venting event at Unit 3, midnight of 14 March when the SRV (Safety Relief Valve) at Unit 2 was opened three times, the morning and night of 15 March, and the morning of 16 March. According to the simulation results, the highest radioactive contamination areas around FNPS1 were created from 15 to 16 March by complicated interactions among rainfall, plume movements, and the temporal variation of

  4. Atmospheric removal times of the aerosol-bound radionuclides 137Cs and 131I during the months after the Fukushima Dai-ichi nuclear power plant accident – a constraint for air quality and climate models

    Directory of Open Access Journals (Sweden)

    G. Wotawa

    2012-05-01

    Full Text Available Caesium-137 (137Cs) and iodine-131 (131I) are radionuclides of particular concern during nuclear accidents, because they are emitted in large amounts and are of significant health impact. 137Cs and 131I attach to the ambient accumulation-mode (AM) aerosols and share their fate as the aerosols are removed from the atmosphere by scavenging within clouds, precipitation and dry deposition. Here, we estimate their removal times from the atmosphere using a unique high-precision global measurement data set collected over several months after the accident at the Fukushima Dai-ichi nuclear power plant in March 2011. The noble gas xenon-133 (133Xe), also released during the accident, served as a passive tracer of air mass transport for determining the removal times of 137Cs and 131I via the decrease in the measured ratios 137Cs/133Xe and 131I/133Xe over time. After correction for radioactive decay, the 137Cs/133Xe ratios reflect the removal of aerosols by wet and dry deposition, whereas the 131I/133Xe ratios are also influenced by aerosol production from gaseous 131I. We find removal times for 137Cs of 10.0–13.9 days and for 131I of 17.1–24.2 days during April and May 2011. We discuss possible caveats (e.g. late emissions, resuspension) that can affect the results, and compare the 137Cs removal times with observation-based and modeled aerosol lifetimes. Our 137Cs removal time of 10.0–13.9 days should be representative of a "background" AM aerosol well mixed in the extratropical Northern Hemisphere troposphere. It is expected that the lifetime of this vertically mixed background aerosol is longer than the lifetime of AM aerosols originating from surface sources. However, the substantial difference to the mean lifetimes of AM aerosols obtained from aerosol models, typically in the range of 3–7 days, warrants further research on the cause of this discrepancy. Too short modeled AM aerosol lifetimes would have serious implications for air quality and

  5. Atmospheric removal times of the aerosol-bound radionuclides 137Cs and 131I measured after the Fukushima Dai-ichi nuclear accident – a constraint for air quality and climate models

    Directory of Open Access Journals (Sweden)

    G. Wotawa

    2012-11-01

    Full Text Available Caesium-137 (137Cs) and iodine-131 (131I) are radionuclides of particular concern during nuclear accidents, because they are emitted in large amounts and are of significant health impact. 137Cs and 131I attach to the ambient accumulation-mode (AM) aerosols and share their fate as the aerosols are removed from the atmosphere by scavenging within clouds, precipitation and dry deposition. Here, we estimate their removal times from the atmosphere using a unique high-precision global measurement data set collected over several months after the accident at the Fukushima Dai-ichi nuclear power plant in March 2011. The noble gas xenon-133 (133Xe), also released during the accident, served as a passive tracer of air mass transport for determining the removal times of 137Cs and 131I via the decrease in the measured ratios 137Cs/133Xe and 131I/133Xe over time. After correction for radioactive decay, the 137Cs/133Xe ratios reflect the removal of aerosols by wet and dry deposition, whereas the 131I/133Xe ratios are also influenced by aerosol production from gaseous 131I. We find removal times for 137Cs of 10.0–13.9 days and for 131I of 17.1–24.2 days during April and May 2011. The removal time of 131I is longer due to the aerosol production from gaseous 131I, thus the removal time for 137Cs serves as a better estimate for aerosol lifetime. The removal time of 131I is of interest for semi-volatile species. We discuss possible caveats (e.g. late emissions, resuspension) that can affect the results, and compare the 137Cs removal times with observation-based and modeled aerosol lifetimes. Our 137Cs removal time of 10.0–13.9 days should be representative of a "background" AM aerosol well mixed in the extratropical Northern Hemisphere troposphere. It is expected that the lifetime of this vertically mixed background aerosol is longer than the lifetime of fresh AM aerosols directly emitted from surface sources. However, the substantial difference to the mean

  6. Atmospheric removal times of the aerosol-bound radionuclides 137Cs and 131I measured after the Fukushima Dai-ichi nuclear accident - a constraint for air quality and climate models

    Science.gov (United States)

    Kristiansen, N. I.; Stohl, A.; Wotawa, G.

    2012-11-01

    Caesium-137 (137Cs) and iodine-131 (131I) are radionuclides of particular concern during nuclear accidents, because they are emitted in large amounts and are of significant health impact. 137Cs and 131I attach to the ambient accumulation-mode (AM) aerosols and share their fate as the aerosols are removed from the atmosphere by scavenging within clouds, precipitation and dry deposition. Here, we estimate their removal times from the atmosphere using a unique high-precision global measurement data set collected over several months after the accident at the Fukushima Dai-ichi nuclear power plant in March 2011. The noble gas xenon-133 (133Xe), also released during the accident, served as a passive tracer of air mass transport for determining the removal times of 137Cs and 131I via the decrease in the measured ratios 137Cs/133Xe and 131I/133Xe over time. After correction for radioactive decay, the 137Cs/133Xe ratios reflect the removal of aerosols by wet and dry deposition, whereas the 131I/133Xe ratios are also influenced by aerosol production from gaseous 131I. We find removal times for 137Cs of 10.0-13.9 days and for 131I of 17.1-24.2 days during April and May 2011. The removal time of 131I is longer due to the aerosol production from gaseous 131I, thus the removal time for 137Cs serves as a better estimate for aerosol lifetime. The removal time of 131I is of interest for semi-volatile species. We discuss possible caveats (e.g. late emissions, resuspension) that can affect the results, and compare the 137Cs removal times with observation-based and modeled aerosol lifetimes. Our 137Cs removal time of 10.0-13.9 days should be representative of a "background" AM aerosol well mixed in the extratropical Northern Hemisphere troposphere. It is expected that the lifetime of this vertically mixed background aerosol is longer than the lifetime of fresh AM aerosols directly emitted from surface sources. However, the substantial difference to the mean lifetimes of AM aerosols

  7. Atmospheric removal times of the aerosol-bound radionuclides 137Cs and 131I during the months after the Fukushima Dai-ichi nuclear power plant accident - a constraint for air quality and climate models

    Science.gov (United States)

    Kristiansen, N. I.; Stohl, A.; Wotawa, G.

    2012-05-01

    Caesium-137 (137Cs) and iodine-131 (131I) are radionuclides of particular concern during nuclear accidents, because they are emitted in large amounts and are of significant health impact. 137Cs and 131I attach to the ambient accumulation-mode (AM) aerosols and share their fate as the aerosols are removed from the atmosphere by scavenging within clouds, precipitation and dry deposition. Here, we estimate their removal times from the atmosphere using a unique high-precision global measurement data set collected over several months after the accident at the Fukushima Dai-ichi nuclear power plant in March 2011. The noble gas xenon-133 (133Xe), also released during the accident, served as a passive tracer of air mass transport for determining the removal times of 137Cs and 131I via the decrease in the measured ratios 137Cs/133Xe and 131I/133Xe over time. After correction for radioactive decay, the 137Cs/133Xe ratios reflect the removal of aerosols by wet and dry deposition, whereas the 131I/133Xe ratios are also influenced by aerosol production from gaseous 131I. We find removal times for 137Cs of 10.0-13.9 days and for 131I of 17.1-24.2 days during April and May 2011. We discuss possible caveats (e.g. late emissions, resuspension) that can affect the results, and compare the 137Cs removal times with observation-based and modeled aerosol lifetimes. Our 137Cs removal time of 10.0-13.9 days should be representative of a "background" AM aerosol well mixed in the extratropical Northern Hemisphere troposphere. It is expected that the lifetime of this vertically mixed background aerosol is longer than the lifetime of AM aerosols originating from surface sources. However, the substantial difference to the mean lifetimes of AM aerosols obtained from aerosol models, typically in the range of 3-7 days, warrants further research on the cause of this discrepancy. Too short modeled AM aerosol lifetimes would have serious implications for air quality and climate model predictions.
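
    The removal-time estimate amounts to fitting an exponential decline to the decay-corrected activity ratio; a minimal sketch with synthetic ratio data (not the actual measurements) is shown below.

```python
import numpy as np

# Hypothetical decay-corrected 137Cs/133Xe activity ratios versus time.
# A removal (e-folding) time tau implies ratio(t) = ratio(0) * exp(-t / tau).
days = np.array([0, 5, 10, 15, 20, 25, 30, 40, 50], dtype=float)
ratio = np.exp(-days / 12.0) * np.exp(np.random.default_rng(1).normal(0, 0.05, days.size))

# Linear fit in log space: ln(ratio) = ln(ratio0) - t / tau
slope, intercept = np.polyfit(days, np.log(ratio), 1)
tau = -1.0 / slope
print(f"estimated removal time: {tau:.1f} days")   # ~12 days for this synthetic series
```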

  8. Organizational safety climate and supervisor safety enforcement: Multilevel explorations of the causes of accident underreporting.

    Science.gov (United States)

    Probst, Tahira M

    2015-11-01

    According to national surveillance statistics, over 3 million employees are injured each year; yet, research indicates that these may be substantial underestimates of the true prevalence. The purpose of the current project was to empirically test the hypothesis that organizational safety climate and transactional supervisor safety leadership would predict the extent to which accidents go unreported by employees. Using hierarchical linear modeling and survey data collected from 1,238 employees in 33 organizations, employee-level supervisor safety enforcement behaviors (and to a less consistent extent, organizational-level safety climate) predicted employee accident underreporting. There was also a significant cross-level interaction, such that the effect of supervisor enforcement on underreporting was attenuated in organizations with a positive safety climate. These results may benefit human resources and safety professionals by pinpointing methods of increasing the accuracy of accident reporting, reducing actual safety incidents, and reducing the costs to individuals and organizations that result from underreporting.
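
    A hierarchical analysis of this general form can be sketched with a mixed-effects model: employees nested within organizations, a cross-level interaction between organization-level safety climate and employee-level supervisor enforcement, and a random intercept per organization. The simulated data and variable names below are hypothetical stand-ins, not the study's instrument or coding.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a toy multilevel data set: employees nested within organizations.
rng = np.random.default_rng(42)
n_orgs, n_per_org = 33, 40
org_climate = rng.normal(0, 1, n_orgs)                 # organization-level safety climate
rows = []
for org in range(n_orgs):
    enforcement = rng.normal(0, 1, n_per_org)          # supervisor safety enforcement
    # Underreporting decreases with enforcement; a positive climate attenuates the slope.
    underreport = (2.0 - 0.5 * enforcement - 0.3 * org_climate[org]
                   + 0.2 * enforcement * org_climate[org]
                   + rng.normal(0, 1, n_per_org))
    for e, u in zip(enforcement, underreport):
        rows.append({"org": org, "climate": org_climate[org],
                     "enforcement": e, "underreport": u})
df = pd.DataFrame(rows)

# Random-intercept model with a cross-level interaction (enforcement x climate).
model = smf.mixedlm("underreport ~ enforcement * climate", df, groups=df["org"])
result = model.fit()
print(result.summary())
```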

  9. Intelligent predictive model of ventilating capacity of imperial smelt furnace

    Institute of Scientific and Technical Information of China (English)

    唐朝晖; 胡燕瑜; 桂卫华; 吴敏

    2003-01-01

    In order to know the ventilating capacity of the imperial smelt furnace (ISF) and increase the output of lead, an intelligent modeling method based on gray theory and artificial neural networks (ANN) is proposed, in which the weight values in the integrated model can be adjusted automatically. An intelligent predictive model of the ventilating capacity of the ISF is established and analyzed with this method. The simulation results and industrial applications demonstrate that the predictive model is close to the real plant; the relative predictive error is 0.72%, which is 50% less than that of the single model, leading to a notable increase in the output of lead.
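
    One generic way to combine a grey model with a neural network, in the spirit of the integrated model described above, is to let a GM(1,1) model and a small ANN each produce a forecast and to choose the combination weights by least squares on the in-sample fit. The series and settings below are made up; this illustrates the combination idea, not the paper's model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def gm11_fit_predict(x, n_ahead=1):
    """Grey GM(1,1): fit on series x, return fitted values plus n_ahead forecasts."""
    x = np.asarray(x, dtype=float)
    n = x.size
    x1 = np.cumsum(x)                                   # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]     # developing coefficient, grey input
    k = np.arange(n + n_ahead)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a    # accumulated prediction
    return np.concatenate([[x[0]], np.diff(x1_hat)])    # back to the original series

# Hypothetical ventilating-capacity-like series (arbitrary units).
series = np.array([112, 118, 121, 127, 131, 138, 142, 149, 155, 160], dtype=float)
gm_pred = gm11_fit_predict(series, n_ahead=1)

# Small ANN on one-step lags (toy configuration).
X, y = series[:-1].reshape(-1, 1), series[1:]
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)
ann_pred = np.concatenate([[series[0]], ann.predict(X), ann.predict(series[-1:].reshape(1, 1))])

# Combination weights chosen by least squares on the in-sample fits.
P = np.column_stack([gm_pred[1:len(series)], ann_pred[1:len(series)]])
w, *_ = np.linalg.lstsq(P, series[1:], rcond=None)
combined_next = w @ np.array([gm_pred[len(series)], ann_pred[len(series)]])
print("weights:", w, "combined one-step forecast:", round(float(combined_next), 1))
```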

  10. Adaptation of Predictive Models to PDA Hand-Held Devices

    Directory of Open Access Journals (Sweden)

    Lin, Edward J

    2008-01-01

    Full Text Available Prediction models using multiple logistic regression are appearing with increasing frequency in the medical literature. Problems associated with these models include the complexity of computations when applied in their pure form, and lack of availability at the bedside. Personal digital assistant (PDA) hand-held devices equipped with spreadsheet software offer the clinician a readily available and easily applied means of applying predictive models at the bedside. The purposes of this article are to briefly review regression as a means of creating predictive models and to describe a method of choosing and adapting logistic regression models to emergency department (ED) clinical practice.
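
    In practice the spreadsheet adaptation reduces to evaluating the logistic equation p = 1 / (1 + exp(-(b0 + sum of bi*xi))) from published coefficients; a tiny sketch with hypothetical coefficients (not from any validated clinical model) follows.

```python
import math

# Hypothetical logistic regression coefficients (intercept and per-predictor betas).
intercept = -3.2
coefficients = {"age_over_65": 0.9, "systolic_bp_low": 1.4, "abnormal_ecg": 1.1}

def predicted_probability(patient):
    """Evaluate p = 1 / (1 + exp(-(b0 + sum(bi * xi)))) as a spreadsheet would."""
    linear_predictor = intercept + sum(coefficients[k] * patient[k] for k in coefficients)
    return 1.0 / (1.0 + math.exp(-linear_predictor))

example_patient = {"age_over_65": 1, "systolic_bp_low": 0, "abnormal_ecg": 1}
print(f"predicted risk: {predicted_probability(example_patient):.1%}")
```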

  11. A Prediction Model of the Capillary Pressure J-Function

    Science.gov (United States)

    Xu, W. S.; Luo, P. Y.; Sun, L.; Lin, N.

    2016-01-01

    The capillary pressure J-function is a dimensionless measure of the capillary pressure of a fluid in a porous medium. The function was derived based on a capillary bundle model. However, the dependence of the J-function on the saturation Sw is not well understood. A prediction model for it is presented based on a capillary pressure model, and the J-function prediction model is a power function instead of an exponential or polynomial function. Relative permeability is calculated with the J-function prediction model, resulting in an easier calculation and results that are more representative. PMID:27603701
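
    A power-function form J(Sw) = a*Sw^b, as proposed above, can be fitted to data by linear regression in log-log space; the synthetic data and coefficients below are illustrative only.

```python
import numpy as np

# Synthetic J-function values versus water saturation Sw, following J = a * Sw**b plus noise.
Sw = np.linspace(0.2, 0.9, 15)
J = 0.35 * Sw ** (-1.4) * np.exp(np.random.default_rng(3).normal(0, 0.02, Sw.size))

# Fit ln J = ln a + b * ln Sw.
b, ln_a = np.polyfit(np.log(Sw), np.log(J), 1)
a = np.exp(ln_a)
print(f"fitted power-law model: J(Sw) = {a:.3f} * Sw^{b:.3f}")
```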

  12. A model to predict the power output from wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L. [Riso National Lab., Roskilde (Denmark)

    1997-12-31

    This paper will describe a model that can predict the power output from wind farms. To give examples of input, the model is applied to a wind farm in Texas. The predictions are generated from forecasts from the NGM model of NCEP. These predictions are made valid at individual sites (wind farms) by applying a matrix calculated by the sub-models of WAsP (Wind Atlas Analysis and Application Program). The actual wind farm production is calculated using the Riso PARK model. Because of the preliminary nature of the results, they will not be given. However, similar results from Europe will be given.

  13. Modelling microbial interactions and food structure in predictive microbiology

    NARCIS (Netherlands)

    Malakar, P.K.

    2002-01-01

    Keywords: modelling, dynamic models, microbial interactions, diffusion, microgradients, colony growth, predictive microbiology.    Growth response of microorganisms in foods is a complex process. Innovations in food production and preservation techniques have resulted in adoption of new technologies

  14. Bow-tie model for offshore drilling blowout accident

    Institute of Scientific and Technical Information of China (English)

    薛鲁宁; 樊建春; 张来斌

    2013-01-01

    Offshore drilling is a complex and dynamic system. Meanwhile, like any other process industry, it can also be divided into several independent operating steps and procedures. Therefore, the process model of safety barriers is well suited to analyzing the safety of offshore drilling operations. Blowout is an important threat to offshore drilling, and a blowout accident model is very meaningful for guiding offshore drilling safety. Based on safety barrier theory, a bow-tie model for offshore drilling blowouts was established by combining fault tree and event tree methods. The causes of an offshore drilling blowout event were analyzed with the fault tree, while the development of a fire and explosion accident after a blowout was analyzed with the event tree. The causes of a blowout event and the consequences of the subsequent fire and explosion accident were combined into a single model, so that operators can intuitively understand the whole generation and development process of an offshore drilling blowout accident and find corresponding prevention and control measures. Finally, the Deepwater Horizon accident was applied to the model, which verified its effectiveness for analyzing offshore drilling blowout accidents.
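
    Numerically, a bow-tie combines a fault tree on the cause side (probability of the top event, here a blowout) with an event tree on the consequence side (branching on whether each post-event barrier works). The sketch below uses made-up probabilities and generic barrier names purely to show the mechanics, not the paper's actual trees.

```python
from itertools import product

# --- Fault tree side (illustrative basic-event probabilities) ---
p_kick = 0.05            # formation kick occurs
p_detection_fail = 0.10  # kick not detected in time
p_bop_fail = 0.02        # blowout preventer fails to seal
# Blowout = kick AND detection failure AND BOP failure (independence assumed).
p_blowout = p_kick * p_detection_fail * p_bop_fail

# --- Event tree side (illustrative barriers after the blowout) ---
p_ignition = 0.30        # released hydrocarbons ignite
p_esd_fail = 0.20        # emergency shutdown / isolation fails

outcomes = {}
for ignites, esd_fails in product([True, False], repeat=2):
    p = p_blowout
    p *= p_ignition if ignites else (1 - p_ignition)
    p *= p_esd_fail if esd_fails else (1 - p_esd_fail)
    if not ignites:
        label = "release only"
    elif esd_fails:
        label = "escalating fire/explosion"
    else:
        label = "controlled fire"
    outcomes[label] = outcomes.get(label, 0.0) + p

print(f"P(blowout) = {p_blowout:.2e}")
for label, p in outcomes.items():
    print(f"  {label:28s} {p:.2e}")
```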

  15. Adding propensity scores to pure prediction models fails to improve predictive performance

    Directory of Open Access Journals (Sweden)

    Amy S. Nowacki

    2013-08-01

    Full Text Available Background. Propensity score usage seems to be growing in popularity, leading researchers to question the possible role of propensity scores in prediction modeling, despite the lack of a theoretical rationale. It is suspected that such requests are due to the lack of differentiation regarding the goals of predictive modeling versus causal inference modeling. Therefore, the purpose of this study is to formally examine the effect of propensity scores on predictive performance. Our hypothesis is that a multivariable regression model that adjusts for all covariates will perform as well as or better than those models utilizing propensity scores with respect to model discrimination and calibration. Methods. The most commonly encountered statistical scenarios for medical prediction (logistic and proportional hazards regression) were used to investigate this research question. Random cross-validation was performed 500 times to correct for optimism. The multivariable regression models adjusting for all covariates were compared with models that included adjustment for or weighting with the propensity scores. The methods were compared based on three predictive performance measures: (1) concordance indices; (2) Brier scores; and (3) calibration curves. Results. Multivariable models adjusting for all covariates had the highest average concordance index, the lowest average Brier score, and the best calibration. Propensity score adjustment and inverse probability weighting models without adjustment for all covariates performed worse than full models and failed to improve predictive performance with full covariate adjustment. Conclusion. Propensity score techniques did not improve prediction performance measures beyond multivariable adjustment. Propensity scores are not recommended if the analytical goal is pure prediction modeling.

  16. Predicting Career Advancement with Structural Equation Modelling

    Science.gov (United States)

    Heimler, Ronald; Rosenberg, Stuart; Morote, Elsa-Sofia

    2012-01-01

    Purpose: The purpose of this paper is to use the authors' prior findings concerning basic employability skills in order to determine which skills best predict career advancement potential. Design/methodology/approach: Utilizing survey responses of human resource managers, the employability skills showing the largest relationships to career…

  17. Modeling and prediction of surgical procedure times

    NARCIS (Netherlands)

    P.S. Stepaniak (Pieter); C. Heij (Christiaan); G. de Vries (Guus)

    2009-01-01

    textabstractAccurate prediction of medical operation times is of crucial importance for cost efficient operation room planning in hospitals. This paper investigates the possible dependence of procedure times on surgeon factors like age, experience, gender, and team composition. The effect of these f

  18. Active diagnosis of hybrid systems - A model predictive approach

    OpenAIRE

    2009-01-01

    A method for active diagnosis of hybrid systems is proposed. The main idea is to predict the future output of both normal and faulty model of the system; then at each time step an optimization problem is solved with the objective of maximizing the difference between the predicted normal and faulty outputs constrained by tolerable performance requirements. As in standard model predictive control, the first element of the optimal input is applied to the system and the whole procedure is repeate...
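
    The procedure can be sketched for two scalar linear candidate models: at each step, search for the input sequence (within tolerable bounds) that maximizes the predicted separation between the normal and faulty outputs, then apply only the first input, as in standard MPC. The models, bounds and horizon below are a hypothetical toy setup, and brute-force search stands in for a proper optimizer.

```python
import numpy as np
from itertools import product

# Two candidate scalar models: x(k+1) = a*x(k) + b*u(k), y = x.
a_normal, b_normal = 0.9, 1.0
a_faulty, b_faulty = 0.9, 0.6            # fault reduces actuator effectiveness

horizon = 3
u_candidates = np.linspace(-1.0, 1.0, 9)  # tolerable input range (constraint)

def predict(a, b, x0, inputs):
    x, outputs = x0, []
    for u in inputs:
        x = a * x + b * u
        outputs.append(x)
    return np.array(outputs)

def active_diagnosis_input(x_normal, x_faulty):
    """Pick the input sequence maximizing output separation; return its first element."""
    best_u_seq, best_sep = None, -np.inf
    for u_seq in product(u_candidates, repeat=horizon):
        sep = np.sum((predict(a_normal, b_normal, x_normal, u_seq)
                      - predict(a_faulty, b_faulty, x_faulty, u_seq)) ** 2)
        if sep > best_sep:
            best_sep, best_u_seq = sep, u_seq
    return best_u_seq[0]

# Receding-horizon style: apply only the first input, then repeat at the next step.
u0 = active_diagnosis_input(x_normal=0.0, x_faulty=0.0)
print("first excitation input to apply:", u0)
```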

  19. Evaluation of Fast-Time Wake Vortex Prediction Models

    Science.gov (United States)

    Proctor, Fred H.; Hamilton, David W.

    2009-01-01

    Current fast-time wake models are reviewed and three basic types are defined. Predictions from several of the fast-time models are compared. Previous statistical evaluations of the APA-Sarpkaya and D2P fast-time models are discussed. Root Mean Square errors between fast-time model predictions and Lidar wake measurements are examined for a 24 hr period at Denver International Airport. Shortcomings in current methodology for evaluating wake errors are also discussed.

  20. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of models

  1. Refining the committee approach and uncertainty prediction in hydrological modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of models

  2. Comparison of Simple Versus Performance-Based Fall Prediction Models

    Directory of Open Access Journals (Sweden)

    Shekhar K. Gadkaree BS

    2015-05-01

    Full Text Available Objective: To compare the predictive ability of standard falls prediction models based on physical performance assessments with more parsimonious prediction models based on self-reported data. Design: We developed a series of fall prediction models progressing in complexity and compared area under the receiver operating characteristic curve (AUC) across models. Setting: National Health and Aging Trends Study (NHATS), which surveyed a nationally representative sample of Medicare enrollees (age ≥65) at baseline (Round 1: 2011-2012) and 1-year follow-up (Round 2: 2012-2013). Participants: In all, 6,056 community-dwelling individuals participated in Rounds 1 and 2 of NHATS. Measurements: Primary outcomes were 1-year incidence of “any fall” and “recurrent falls.” Prediction models were compared and validated in development and validation sets, respectively. Results: A prediction model that included demographic information, self-reported problems with balance and coordination, and previous fall history was the most parsimonious model that optimized AUC for both any fall (AUC = 0.69, 95% confidence interval [CI] = [0.67, 0.71]) and recurrent falls (AUC = 0.77, 95% CI = [0.74, 0.79]) in the development set. Physical performance testing provided a marginal additional predictive value. Conclusion: A simple clinical prediction model that does not include physical performance testing could facilitate routine, widespread falls risk screening in the ambulatory care setting.

  3. Review of methodology for accident consequence assessment

    Energy Technology Data Exchange (ETDEWEB)

    Strenge, D.L.; Soldat, J.K.; Watson, E.C.

    1978-09-01

    This report reviews current methodologies for reactor accident consequence analysis and describes areas where modifications are warranted. Methodologies reviewed are: (1) Models in Regulatory Guides 1.109, 1.111 and 1.113 used for evaluation of compliance with 10 CFR 50 Appendix I; (2) Models in Regulatory Guides used for evaluation of consequences from accidents of Classes 3-8; (3) Models for evaluation of Class 9 accidents presented in the Reactor Safety Study; and (4) Models in the Liquid Pathway Generic Study. The review is designed to aid in the ultimate goal of selection of a comprehensive set of models to extend the Class 9 methodology of the Reactor Safety Study to the analysis of Classes 3-8 accidents.

  4. Econometric models for predicting confusion crop ratios

    Science.gov (United States)

    Umberger, D. E.; Proctor, M. H.; Clark, J. E.; Eisgruber, L. M.; Braschler, C. B. (Principal Investigator)

    1979-01-01

    Results for both the United States and Canada show that econometric models can provide estimates of confusion crop ratios that are more accurate than historical ratios. Whether these models can support the LACIE 90/90 accuracy criterion is uncertain. In the United States, experimenting with additional model formulations could provide improved models in some CRDs, particularly in winter wheat. Improved models may also be possible for the Canadian CDs. The more aggregated province/state models outperformed individual CD/CRD models. This result was expected partly because acreage statistics are based on sampling procedures, and the sampling precision declines from the province/state to the CD/CRD level. Declining sampling precision and the need to substitute province/state data for the CD/CRD data introduced measurement error into the CD/CRD models.

  5. Impact of modellers' decisions on hydrological a priori predictions

    Science.gov (United States)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the needed records for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their prediction in three steps based on adding information prior to each following step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) to simulate discharge for an ungauged catchment and (ii) using models that were developed for catchments, which are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, nor groundwater response and had therefore to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modeller's assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of

  6. Fukushima nuclear power plant accident was preventable

    Science.gov (United States)

    Kanoglu, Utku; Synolakis, Costas

    2015-04-01

    , insufficient attention was paid to evidence of large tsunamis inundating the region, i.e., the AD 869 Jogan and 1677 Empo Boso-oki tsunamis, and the 1896 Sanriku tsunami in eastern Japan, whose maximum runup was 38 m. Two, the design safety conditions were different for the Onagawa, Fukushima and Tokai NPPs. It is inconceivable to have had different earthquake scenarios for NPPs at such close distance from each other. Three, studying the sub-standard TEPCO analysis performed only months before the accident shows that it is not the accuracy of numerical computations or the veracity of the computational model that doomed the NPP, but the lack of familiarity with the context of numerical predictions. Inundation projections, even if correct for one particular scenario, need to always be put in the context of similar studies and events elsewhere. To put it in colloquial terms, following a recipe from a great cookbook and having great cookware does not always result in great food, if the cook is an amateur. The Fukushima accident was preventable. Had the plant's owner TEPCO and NISA followed international best practices and standards, they would have predicted the possibility of the plant being struck by a tsunami of the size that materialized in 2011. If the EDGs had been relocated inland or higher, there would have been no loss of power. A clear chance to have reduced the impact of the tsunami at Fukushima was lost after the 2010 Chilean tsunami. Standards are not only needed for evaluating the vulnerability of NPPs against tsunami attack, but also for evaluating the competence of modelers and evaluators. Acknowledgment: This work is partially supported by the project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe) FP7-ENV2013 6.4-3, Grant 603839 to the Technical University of Crete and the Middle East Technical University.

  7. Laser accidents: Being Prepared

    Energy Technology Data Exchange (ETDEWEB)

    Barat, K

    2003-01-24

    The goal of the Laser Safety Officer and any laser safety program is to prevent a laser accident from occurring, in particular an injury to a person's eyes. Most laser safety courses talk about laser accidents, their causes, and types of injury. The purpose of this presentation is to present a plan for safety officers and users to follow in case of an accident or injury from laser radiation.

  8. MULTI MODEL DATA MINING APPROACH FOR HEART FAILURE PREDICTION

    Directory of Open Access Journals (Sweden)

    Priyanka H U

    2016-09-01

    Full Text Available Developing predictive modelling solutions for risk estimation is extremely challenging in health-care informatics. Risk estimation involves the integration of heterogeneous clinical sources having different representations from different health-care providers, making the task increasingly complex. Such sources are typically voluminous and diverse, and change significantly over time. Therefore, distributed and parallel computing tools, collectively termed big data tools, are needed to synthesize these sources and assist the physician in making the right clinical decisions. In this work we propose a multi-model predictive architecture, a novel approach for combining the predictive ability of multiple models for better prediction accuracy. We demonstrate the effectiveness and efficiency of the proposed work on data from the Framingham Heart Study. Results show that the proposed multi-model predictive architecture is able to provide better accuracy than the best single-model approach. By modelling the error of the predictive models we are able to choose a subset of models that yields accurate results. More information was modelled into the system by multi-level mining, which has resulted in enhanced predictive accuracy.
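
    The "model the errors, keep a subset, combine" idea can be sketched with scikit-learn: estimate each base model's cross-validated AUC, retain the stronger models, and average their predicted probabilities. The synthetic dataset and the particular base learners below are placeholders, not the Framingham setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.metrics import roc_auc_score

# Placeholder data standing in for a heart-failure risk table.
X, y = make_classification(n_samples=1500, n_features=12, n_informative=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# Model the error of each predictor via cross-validated AUC, then keep the stronger subset.
cv_auc = {name: cross_val_score(m, X_train, y_train, cv=5, scoring="roc_auc").mean()
          for name, m in models.items()}
selected = [name for name, auc in cv_auc.items() if auc >= np.median(list(cv_auc.values()))]

# Simple multi-model prediction: average the probabilities of the selected models.
probas = []
for name in selected:
    models[name].fit(X_train, y_train)
    probas.append(models[name].predict_proba(X_test)[:, 1])
ensemble_proba = np.mean(probas, axis=0)

print("cross-validated AUC per model:", {k: round(v, 3) for k, v in cv_auc.items()})
print("selected subset:", selected)
print("ensemble test AUC:", round(roc_auc_score(y_test, ensemble_proba), 3))
```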

  9. The regional prediction model of PM10 concentrations for Turkey

    Science.gov (United States)

    Güler, Nevin; Güneri İşçi, Öznur

    2016-11-01

    This study aims to develop a regional model for weekly PM10 concentrations measured at air pollution monitoring stations in Turkey. There are seven geographical regions in Turkey and numerous monitoring stations in each region. Building a model conventionally for each monitoring station requires a lot of labor and time, and it may lead to degraded prediction quality when the number of measurements obtained from any monitoring station is small. Besides, prediction models obtained in this way only reflect the air pollutant behavior of a small area. This study uses the Fuzzy C-Auto Regressive Model (FCARM) in order to find a prediction model that reflects the regional behavior of weekly PM10 concentrations. The superiority of FCARM is its ability to simultaneously consider the PM10 concentrations measured at the monitoring stations in the specified region. Besides, it also works even if the number of measurements obtained from the monitoring stations is different or small. In order to evaluate the performance of FCARM, it is executed for all regions in Turkey and the prediction results are compared to statistical autoregressive (AR) models fitted for each station separately. According to the Mean Absolute Percentage Error (MAPE) criterion, it is observed that FCARM provides better predictions with fewer models.
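
    The evaluation described above, a regional (pooled) model versus per-station AR models judged by MAPE, can be sketched as follows; an AR model fitted to the station-average series stands in for the regional model (FCARM itself is not implemented here), and the weekly series are synthetic.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def mape(actual, predicted):
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Synthetic weekly PM10-like series for a few stations in one region (arbitrary units).
rng = np.random.default_rng(7)
weeks = 120
seasonal = 60 + 20 * np.sin(2 * np.pi * np.arange(weeks) / 52)
stations = {f"station_{i}": seasonal + rng.normal(0, 8, weeks) + rng.uniform(-10, 10)
            for i in range(4)}
train_end = 100

# "Regional" stand-in: one AR model fitted on the station-average series.
pooled = np.mean(np.column_stack(list(stations.values())), axis=1)
pooled_pred = AutoReg(pooled[:train_end], lags=4).fit().predict(start=train_end, end=weeks - 1)

for name, series in stations.items():
    station_pred = AutoReg(series[:train_end], lags=4).fit().predict(start=train_end, end=weeks - 1)
    print(f"{name}: per-station MAPE {mape(series[train_end:], station_pred):.1f}%, "
          f"regional MAPE {mape(series[train_end:], pooled_pred):.1f}%")
```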

  10. Gaussian mixture models as flux prediction method for central receivers

    Science.gov (United States)

    Grobler, Annemarie; Gauché, Paul; Smit, Willie

    2016-05-01

    Flux prediction methods are crucial to the design and operation of central receiver systems. Current methods such as the circular and elliptical (bivariate) Gaussian prediction methods are often used in field layout design and aiming strategies. For experimental or small central receiver systems, the flux profile of a single heliostat often deviates significantly from the circular and elliptical Gaussian models. Therefore a novel method of flux prediction was developed by incorporating the fitting of Gaussian mixture models onto flux profiles produced by flux measurement or ray tracing. A method was also developed to predict the Gaussian mixture model parameters of a single heliostat for a given time using image processing. Recording the predicted parameters in a database ensures that more accurate predictions are made in a shorter time frame.
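
    The fitting step can be sketched with scikit-learn by treating the flux map as a density: sample points in proportion to the local flux and fit a GaussianMixture whose parameters then summarize the profile. The two-lobe synthetic map, grid size and component count below are arbitrary choices, not the method's prescribed settings.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic heliostat flux map on a receiver plane: two overlapping lobes (arbitrary units).
x, y = np.meshgrid(np.linspace(-1, 1, 60), np.linspace(-1, 1, 60))
flux = (np.exp(-(((x - 0.2) ** 2 + y ** 2) / 0.05))
        + 0.6 * np.exp(-(((x + 0.3) ** 2 + (y - 0.1) ** 2) / 0.08)))

# Sample points with probability proportional to flux, then fit a Gaussian mixture.
rng = np.random.default_rng(0)
points = np.column_stack([x.ravel(), y.ravel()])
weights = flux.ravel() / flux.sum()
sample = points[rng.choice(len(points), size=20000, p=weights)]

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(sample)
print("component weights:", np.round(gmm.weights_, 3))
print("component means:\n", np.round(gmm.means_, 3))

# The fitted mixture density (rescaled by total power) can serve as the flux prediction.
predicted_density = np.exp(gmm.score_samples(points)).reshape(flux.shape)
```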

  11. [Accidents with the "paraglider"].

    Science.gov (United States)

    Lang, T H; Dengg, C; Gabl, M

    1988-09-01

    In a series of 46 patients, we describe the details and types of accidents caused by paragliding. The basis for the accident case analysis was a questionnaire answered by most of the injured persons. The questions covered theoretical and practical training, the course of the flight during its different phases, and the injured person's subjective view of how the accident occurred. The injury patterns showed a high incidence of spinal column injuries and a high risk of ankle injuries. Finally, we give some advice on how to prevent these accidents.

  12. Application of Nonlinear Predictive Control Based on RBF Network Predictive Model in MCFC Plant

    Institute of Scientific and Technical Information of China (English)

    CHEN Yue-hua; CAO Guang-yi; ZHU Xin-jian

    2007-01-01

    This paper describes a nonlinear model predictive controller for regulating a molten carbonate fuel cell (MCFC). A detailed mechanistic model of the output voltage of an MCFC is presented first. However, this model is too complicated to be used in a control system. Consequently, an off-line radial basis function (RBF) network was introduced to build a nonlinear predictive model. The optimal control sequences were then obtained by applying the golden mean method. The models and controller have been realized in the MATLAB environment. Simulation results indicate that the proposed algorithm exhibits a satisfactory control effect even when the current densities vary widely.
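
    An off-line RBF network predictor of this general kind can be sketched with Gaussian basis functions and a ridge-regularized linear readout. The simulated current-density/voltage data below are purely structural stand-ins for plant data, not an MCFC model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated plant-like data: next voltage as a nonlinear function of current density and voltage.
n = 400
current = rng.uniform(0.5, 2.0, n)
voltage = rng.uniform(0.6, 0.9, n)
next_voltage = 0.85 - 0.08 * current + 0.3 * (voltage - 0.75) ** 2 + rng.normal(0, 0.005, n)

X = np.column_stack([current, voltage])
y = next_voltage

# Off-line RBF network: fixed Gaussian centres + ridge-regression output weights.
n_centres, width, ridge = 25, 0.4, 1e-3
centres = X[rng.choice(n, n_centres, replace=False)]

def rbf_features(X):
    dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    return np.exp(-(dists ** 2) / (2 * width ** 2))

Phi = rbf_features(X)
weights = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_centres), Phi.T @ y)

def predict(X_new):
    return rbf_features(np.atleast_2d(X_new)) @ weights

print("one-step voltage prediction:", predict([1.2, 0.78])[0])
```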

  13. Development of Methodology for Spent Fuel Pool Severe Accident Analysis Using MELCOR Program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won-Tae; Shin, Jae-Uk [RETech. Co. LTD., Yongin (Korea, Republic of); Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The general reason why SFP severe accident analysis has to be considered is that there is a potentially great risk due to the huge number of fuel assemblies and the absence of a containment around the SFP building. In most cases, the SFP building is vulnerable to external damage or attack. In contrast, the low decay heat of the fuel assemblies and the great deal of water may make the accident processes slow compared to an accident in the reactor core. In short, the potential severity of the consequences does not allow SFP risk management to be excluded from consideration. The U.S. Nuclear Regulatory Commission has performed consequence studies of postulated spent fuel pool accidents. The Fukushima-Daiichi accident has accelerated the need for consequence studies of postulated spent fuel pool accidents, causing the nuclear industry and regulatory bodies to reexamine several assumptions concerning beyond-design-basis events such as a station blackout. The tsunami brought about the loss of coolant accident, leading to the explosion of hydrogen in the SFP building. Analyses of SFP accident processes in the case of a loss of coolant with no heat removal have been studied. Few studies, however, have focused on the long-term process of an SFP severe accident with no mitigation action such as water makeup to the SFP. The USNRC and OECD have worked together to examine the behavior of PWR fuel assemblies under severe accident conditions in a spent fuel rack. In support of the investigation, several new features have been added to the MELCOR model to simulate both a BWR fuel assembly and a PWR 17 x 17 assembly in a spent fuel pool rack undergoing severe accident conditions. The purpose of the study in this paper is to develop a methodology for the long-term analysis of a plant-level SFP severe accident by using the new-featured MELCOR program for the OPR-1000 Nuclear Power Plant. The study investigates the ability of MELCOR to predict the entire process of SFP severe accident phenomena, including the molten corium and concrete reaction. The

  14. Nonlinear model predictive control of a packed distillation column

    Energy Technology Data Exchange (ETDEWEB)

    Patwardhan, A.A.; Edgar, T.F. (Univ. of Texas, Austin, TX (United States). Dept. of Chemical Engineering)

    1993-10-01

    A rigorous dynamic model based on fundamental chemical engineering principles was formulated for a packed distillation column separating a mixture of cyclohexane and n-heptane. This model was simplified to a form suitable for use in on-line model predictive control calculations. A packed distillation column was operated at several operating conditions to estimate two unknown model parameters in the rigorous and simplified models. The actual column response to step changes in the feed rate, distillate rate, and reboiler duty agreed well with dynamic model predictions. One unusual characteristic observed was that the packed column exhibited gain-sign changes, which are very difficult to treat using conventional linear feedback control. Nonlinear model predictive control was used to control the distillation column at an operating condition where the process gain changed sign. An on-line, nonlinear model-based scheme was used to estimate unknown/time-varying model parameters.

  15. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    T. Wu; E. Lester; M. Cloke [University of Nottingham, Nottingham (United Kingdom). Nottingham Energy and Fuel Centre

    2005-07-01

    Poor burnout in a coal-fired power plant has marked penalties in the form of reduced energy efficiency and elevated waste material that cannot be utilized. The prediction of coal combustion behaviour in a furnace is of great significance in providing valuable information not only for process optimization but also for coal buyers in the international market. Coal combustion models have been developed that can make predictions about burnout behaviour and burnout potential. Most of these kinetic models require standard parameters such as volatile content, particle size and assumed char porosity in order to make a burnout prediction. This paper presents a new model called the Char Burnout Model (ChB) that also uses detailed information about char morphology in its prediction. The model can use data input from one of two sources, both derived from image analysis techniques. The first is from individual analysis and characterization of real char types using an automated program. The second is from predicted char types based on data collected during the automated image analysis of coal particles. Modelling results were compared with a different carbon burnout kinetic model and with burnout data from re-firing the chars in a drop tube furnace operating at 1300°C and 5% oxygen across several residence times. The improved agreement between the ChB model and the DTF experimental data proved that the inclusion of char morphology in combustion models can improve model predictions. 27 refs., 4 figs., 4 tabs.

  16. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel;

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions...... day (using the area under the receiver operating characteristic curve (AUC) and kappa statistics) and by assessing consistency in predictions of range size changes under future climate (using cluster analysis). Results Our analyses show significant differences between predictions from different models......, with predicted changes in range size by 2030 differing in both magnitude and direction (e.g. from 92% loss to 322% gain). We explain differences with reference to two characteristics of the modelling techniques: data input requirements (presence/absence vs. presence-only approaches) and assumptions made by each...

  17. A new ensemble model for short term wind power prediction

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Razvan-Daniel; Felea, Ioan;

    2012-01-01

    As the objective of this study, a non-linear ensemble system is used to develop a new model for predicting wind speed on a short-term time scale. Short-term wind power prediction has become an extremely important field of research for the energy sector. Regardless of the recent advancements in the research on prediction models, it was observed that different models have different capabilities and also that no single model is suitable under all situations. The idea behind EPS (ensemble prediction systems) is to take advantage of the unique features of each subsystem to capture the diverse patterns that exist in the dataset. … The obtained results show that the prediction errors can be decreased, while the computation time is reduced.

  18. Heat up and potential failure of BWR upper internals during a severe accident

    Energy Technology Data Exchange (ETDEWEB)

    Robb, Kevin R [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    In boiling water reactors, the steam dome, steam separators, and dryers above the core are comprised of approximately 100 tons of stainless steel. During a severe accident in which the coolant boils away and exothermic oxidation of zirconium occurs, gases (steam and hydrogen) are superheated in the core region and pass through the upper internals. Historically, the upper internals have been modeled using severe accident codes with relatively simple approximations. The upper internals are typically modeled in MELCOR as two lumped volumes with simplified heat transfer characteristics, with no structural integrity considerations, and with limited ability to oxidize, melt, and relocate. The potential for and the subsequent impact of the upper internals to heat up, oxidize, fail, and relocate during a severe accident was investigated. A higher fidelity representation of the shroud dome, steam separators, and steam driers was developed in MELCOR v1.8.6 by extending the core region upwards. This modeling effort entailed adding 45 additional core cells and control volumes, 98 flow paths, and numerous control functions. The model accounts for the mechanical loading and structural integrity, oxidation, melting, flow area blockage, and relocation of the various components. The results indicate that the upper internals can reach high temperatures during a severe accident; they are predicted to reach a high enough temperature such that they lose their structural integrity and relocate. The additional 100 tons of stainless steel debris influences the subsequent in-vessel and ex-vessel accident progression.

  19. Model Predictive Control for Smart Energy Systems

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus

    load shifting capabilities of the units that adapt to the given price predictions. We furthermore evaluated control performance in terms of economic savings for different control strategies and forecasts. Chapter 5 describes and compares the proposed large-scale Aggregator control strategies. … Aggregators are assumed to play an important role in the future Smart Grid and to coordinate a large portfolio of units. The developed economic MPC controllers interface each unit directly to an Aggregator. We developed several MPC-based aggregation strategies that coordinate the global behavior of a portfolio

  20. Climate predictability and prediction skill on seasonal time scales over South America from CHFP models

    Science.gov (United States)

    Osman, Marisol; Vera, C. S.

    2016-11-01

    This work presents an assessment of the predictability and skill of climate anomalies over South America. The study was made considering a multi-model ensemble of seasonal forecasts for surface air temperature, precipitation and regional circulation from coupled global circulation models included in the Climate Historical Forecast Project. Predictability was evaluated through the estimation of the signal-to-total variance ratio, while prediction skill was assessed by computing anomaly correlation coefficients. Over the continent, both indicators present higher values at the tropics than at the extratropics for both surface air temperature and precipitation. Moreover, predictability and prediction skill for temperature are slightly higher in DJF than in JJA, while for precipitation they exhibit similar levels in both seasons. The largest values of predictability and skill for both variables and seasons are found over northwestern South America, while modest but still significant values are found for extratropical precipitation over southeastern South America and the extratropical Andes. The predictability levels in ENSO years for both variables are slightly higher, although with the same spatial distribution, than those obtained considering all years. Nevertheless, predictability at the tropics for both variables and seasons diminishes in both warm and cold ENSO years with respect to that in all years. The latter can be attributed to changes in the signal rather than in the noise. Predictability and prediction skill for low-level winds and upper-level zonal winds over South America were also assessed. Maximum levels of predictability for low-level winds were found where maximum mean values are observed, i.e. the regions associated with the equatorial trade winds, the midlatitude westerlies and the South American Low-Level Jet. Predictability maxima for upper-level zonal winds are located where the subtropical jet peaks. Seasonal changes in wind predictability are observed that seem to be related to
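
    One common way to compute the two diagnostics named above is sketched below: the signal-to-total variance ratio compares the variance of the ensemble mean to the total variance including ensemble spread, and skill is the anomaly correlation between the ensemble mean and observations. The synthetic anomalies are illustrative, and the exact CHFP definitions may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(11)
n_years, n_members = 30, 10

# Synthetic seasonal anomalies at one grid point: a predictable signal plus member noise.
signal = rng.normal(0, 1.0, n_years)
ensemble = signal[:, None] + rng.normal(0, 1.5, (n_years, n_members))   # forecasts
observed = signal + rng.normal(0, 1.0, n_years)

ens_mean = ensemble.mean(axis=1)

# Signal-to-total variance ratio: var(ensemble mean) / (signal variance + ensemble spread).
signal_var = ens_mean.var(ddof=1)
noise_var = ensemble.var(axis=1, ddof=1).mean()
stn_ratio = signal_var / (signal_var + noise_var)

# Anomaly correlation coefficient between ensemble-mean forecast and observations.
acc = np.corrcoef(ens_mean, observed)[0, 1]

print(f"signal-to-total variance ratio: {stn_ratio:.2f}")
print(f"anomaly correlation coefficient: {acc:.2f}")
```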

  1. Assessment of performance of survival prediction models for cancer prognosis

    Directory of Open Access Journals (Sweden)

    Chen Hung-Chia

    2012-07-01

    Full Text Available Abstract Background Cancer survival studies are commonly analyzed using survival-time prediction models for cancer prognosis. A number of different performance metrics are used to ascertain the concordance between the predicted risk score of each patient and the actual survival time, but these metrics can sometimes conflict. Alternatively, patients are sometimes divided into two classes according to a survival-time threshold, and binary classifiers are applied to predict each patient’s class. Although this approach has several drawbacks, it does provide natural performance metrics such as positive and negative predictive values to enable unambiguous assessments. Methods We compare the survival-time prediction and survival-time threshold approaches to analyzing cancer survival studies. We review and compare common performance metrics for the two approaches. We present new randomization tests and cross-validation methods to enable unambiguous statistical inferences for several performance metrics used with the survival-time prediction approach. We consider five survival prediction models consisting of one clinical model, two gene expression models, and two models from combinations of clinical and gene expression models. Results A public breast cancer dataset was used to compare several performance metrics using five prediction models. (1) For some prediction models, the hazard ratio from fitting a Cox proportional hazards model was significant, but the two-group comparison was insignificant, and vice versa. (2) The randomization test and cross-validation were generally consistent with the p-values obtained from the standard performance metrics. (3) Binary classifiers highly depended on how the risk groups were defined; a slight change of the survival threshold for assignment of classes led to very different prediction results. Conclusions (1) Different performance metrics for evaluation of a survival prediction model may give different conclusions in
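
    The concordance index used above can be computed directly from its definition: among comparable patient pairs, the fraction in which the higher predicted risk goes with the shorter observed survival, counting only pairs whose earlier time is an event (the usual handling of right-censoring). The times, events and risk scores below are made up.

```python
import numpy as np

def concordance_index(times, events, risk_scores):
    """Harrell-style c-index: fraction of comparable pairs ordered correctly by risk."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable if subject i has an event before subject j's time.
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# Hypothetical survival times (months), event indicators (1 = death), and model risk scores.
times = np.array([5, 12, 20, 25, 33, 40, 48, 60])
events = np.array([1, 1, 0, 1, 1, 0, 1, 0])
risk = np.array([0.9, 0.8, 0.4, 0.7, 0.5, 0.3, 0.6, 0.1])

print(f"c-index: {concordance_index(times, events, risk):.2f}")
```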

  2. Combining logistic regression and neural networks to create predictive models.

    OpenAIRE

    Spackman, K. A.

    1992-01-01

    Neural networks are being used widely in medicine and other areas to create predictive models from data. The statistical method that most closely parallels neural networks is logistic regression. This paper outlines some ways in which neural networks and logistic regression are similar, shows how a small modification of logistic regression can be used in the training of neural network models, and illustrates the use of this modification for variable selection and predictive model building wit...

  3. Research review on unexpected accident management in China

    Institute of Scientific and Technical Information of China (English)

    王臣; 高俊山

    2012-01-01

    security methods aiming to combat such accidents may include: (1) the research and improvement of the regression prediction method, the time prediction method, the Markov chain prediction method, the gray prediction method, the Bayesian network prediction method and the neural network prediction method, and ways of bringing these methods into prediction practice; (2) ways to enhance major-hazard management and quantitative risk assessment for preventing and combating such disastrous accidents; (3) ways of applying Fault Tree Analysis, the Analytic Hierarchy Process and other methods in the analysis of actual disastrous accidents; (4) theories, models and methods for emergency response, evacuation and rescue approaches, supplies distribution and the treatment of accident aftermath. It can thus be concluded that future research in the accident management field is still likely to focus on quantified risk assessment, accident prevention, more practical emergency rescue measures and evacuation strategies.

  4. A THERMODYNAMIC MODEL TO PREDICT WAX FORMATION IN PETROLEUM FLUIDS

    Directory of Open Access Journals (Sweden)

    J.A.P. Coutinho

    2001-12-01

    Full Text Available Some years ago the authors proposed a model for the non-ideality of the solid phase, based on the Predictive Local Composition concept. This was first applied to the Wilson equation and later extended to the NRTL and UNIQUAC models. Predictive UNIQUAC proved to be extraordinarily successful in predicting the behaviour of both model and real hydrocarbon fluids at low temperatures. This work illustrates the ability of Predictive UNIQUAC in the description of the low temperature behaviour of petroleum fluids. It will be shown that, using Predictive UNIQUAC in the description of the solid phase non-ideality, a complete prediction of the low temperature behaviour of synthetic paraffin solutions, fuels and crude oils is achieved. The composition of both liquid and solid phases, the amount of crystals formed and the cloud points are predicted within the accuracy of the experimental data. The extension of Predictive UNIQUAC to high pressures, by coupling it with an EOS/GE model based on the SRK EOS used with the LCVM mixing rule, is proposed and predictions of phase envelopes for live oils are compared with experimental data.

  5. A thermodynamic model to predict wax formation in petroleum fluids

    Energy Technology Data Exchange (ETDEWEB)

    Coutinho, J.A.P. [Universidade de Aveiro (Portugal). Dept. de Quimica. Centro de Investigacao em Quimica]. E-mail: jcoutinho@dq.ua.pt; Pauly, J.; Daridon, J.L. [Universite de Pau et des Pays de l' Adour, Pau (France). Lab. des Fluides Complexes

    2001-12-01

    Some years ago the authors proposed a model for the non-ideality of the solid phase, based on the Predictive Local Composition concept. This was first applied to the Wilson equation and later extended to the NRTL and UNIQUAC models. Predictive UNIQUAC proved to be extraordinarily successful in predicting the behaviour of both model and real hydrocarbon fluids at low temperatures. This work illustrates the ability of Predictive UNIQUAC in the description of the low temperature behaviour of petroleum fluids. It will be shown that, using Predictive UNIQUAC in the description of the solid phase non-ideality, a complete prediction of the low temperature behaviour of synthetic paraffin solutions, fuels and crude oils is achieved. The composition of both liquid and solid phases, the amount of crystals formed and the cloud points are predicted within the accuracy of the experimental data. The extension of Predictive UNIQUAC to high pressures, by coupling it with an EOS/GE model based on the SRK EOS used with the LCVM mixing rule, is proposed and predictions of phase envelopes for live oils are compared with experimental data. (author)

  6. Alcohol as the main factor responsible for recurrent road accidents

    Directory of Open Access Journals (Sweden)

    Andrea Fabbri

    2006-12-01

    Full Text Available The identification of risk factors for recurrent road accidents is the basis for prevention, but very few studies have been published on predictors of recurrence. Our objective was to determine the main variables predicting recurrent crashes in subjects attending an Emergency Department for injuries after road accidents. Over a 5-year follow-up period, we studied 2,354 consecutive adult subjects treated in the Emergency Department following a road accident in 1998. The variables of the original event were tested for predicting recurrence in a Cox proportional hazards model. During follow-up, 390/2,325 (16.8%) survivors were treated for injury after a new crash. The overall event rate was 34 per 1,000 subject-years. Four variables (age ≤ 32 years, male sex, night-time crash and blood alcohol concentration > 50 mg/dl) were identified as independent predictors of recurrent crash. After adjustment for sex, age and night-time, alcohol was the leading predictor (relative risk 3.73; 95% confidence interval 3.00-4.64). In the presence of the four variables, the recurrence rate was as high as 145 (117-175) events per 1,000 subject-years, and alcohol per se accounted for over 75% of events. In the absence of the four variables, the rate was as low as 11 (7-17) events per 1,000 subject-years. Alcohol was the most powerful behavioural factor predicting recurrent events in subjects treated in an Emergency Department for injury after road accidents, along with young age, male gender and night-time. There is a call to action for preventing alcohol on the roads.
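
    A Cox proportional hazards analysis of this kind can be sketched with the lifelines package: binary predictors for young age, male sex, night-time crash and blood alcohol over the threshold, with time to the next crash as the outcome and administrative censoring at the end of follow-up. The simulated cohort below only mirrors the structure of such a study, not its records.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2024)
n = 2000

# Simulated cohort: binary risk factors and time (years) to a recurrent crash, with censoring.
df = pd.DataFrame({
    "young": rng.integers(0, 2, n),          # age <= 32 years
    "male": rng.integers(0, 2, n),
    "night": rng.integers(0, 2, n),
    "alcohol": rng.integers(0, 2, n),        # blood alcohol > 50 mg/dl
})
baseline_hazard = 0.02
hazard = baseline_hazard * np.exp(0.4 * df.young + 0.3 * df.male
                                  + 0.4 * df.night + 1.3 * df.alcohol)
event_time = rng.exponential(1.0 / hazard.to_numpy())
follow_up = 5.0
df["duration"] = np.minimum(event_time, follow_up)
df["event"] = (event_time <= follow_up).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # hazard ratios approximate the simulated effects; alcohol dominates
```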

  7. New technology for accident prevention

    Energy Technology Data Exchange (ETDEWEB)

    Byne, P. [Shiftwork Solutions, Vancouver, BC (Canada)

    2006-07-01

    This PowerPoint presentation examined the effects of fatigue in the workplace and presented 3 technologies designed to prevent or monitor fatigue. The relationship between mental fatigue, circadian rhythms and cognitive performance was explored. Details of vigilance-related degradations in the workplace were presented, as well as data on fatigue-related accidents and a time-line of meter-reading errors. It was noted that the direct cause of the Exxon Valdez disaster was sleep deprivation. Fatigue-related accidents during the Gulf War were reviewed. The effects of fatigue on workplace performance include impaired logical reasoning and decision-making; impaired vigilance and attention; slowed mental operations; loss of situational awareness; slowed reaction time; and short cuts and lapses in optional or self-paced behaviours. New technologies to prevent fatigue-related accidents include (1) the driver fatigue monitor, an infra-red camera and computer that tracks a driver's slow eye-lid closures to prevent fatigue-related accidents; (2) a fatigue avoidance scheduling tool (FAST) which collects actigraphs of sleep activity; and (3) SAFTE, a sleep, activity, fatigue and effectiveness model. refs., tabs., figs.

  8. Models for short term malaria prediction in Sri Lanka

    Directory of Open Access Journals (Sweden)

    Galappaththy Gawrie NL

    2008-05-01

    Full Text Available Abstract Background Malaria in Sri Lanka is unstable and fluctuates in intensity both spatially and temporally. Although the case counts are dwindling at present, given the past history of resurgence of outbreaks despite effective control measures, the control programmes have to stay prepared. The availability of long time series of monitored/diagnosed malaria cases allows for the study of forecasting models, with an aim to developing a forecasting system which could assist in the efficient allocation of resources for malaria control. Methods Exponentially weighted moving average models, autoregressive integrated moving average (ARIMA models with seasonal components, and seasonal multiplicative autoregressive integrated moving average (SARIMA models were compared on monthly time series of district malaria cases for their ability to predict the number of malaria cases one to four months ahead. The addition of covariates such as the number of malaria cases in neighbouring districts or rainfall were assessed for their ability to improve prediction of selected (seasonal ARIMA models. Results The best model for forecasting and the forecasting error varied strongly among the districts. The addition of rainfall as a covariate improved prediction of selected (seasonal ARIMA models modestly in some districts but worsened prediction in other districts. Improvement by adding rainfall was more frequent at larger forecasting horizons. Conclusion Heterogeneity of patterns of malaria in Sri Lanka requires regionally specific prediction models. Prediction error was large at a minimum of 22% (for one of the districts for one month ahead predictions. The modest improvement made in short term prediction by adding rainfall as a covariate to these prediction models may not be sufficient to merit investing in a forecasting system for which rainfall data are routinely processed.
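
    The comparison above involves (seasonal) ARIMA models with rainfall as an optional exogenous covariate. A minimal sketch of that workflow with statsmodels is given below; the model orders, file names and evaluation split are illustrative assumptions, not the specifications selected in the paper.

```python
# Hedged sketch: seasonal ARIMA forecast of monthly district malaria cases,
# with rainfall as an optional exogenous covariate. Model orders, file names
# and the 4-month horizon are illustrative assumptions, not the paper's choices.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

cases = pd.read_csv("district_cases.csv", index_col="month", parse_dates=True)["cases"]
rain = pd.read_csv("district_rainfall.csv", index_col="month", parse_dates=True)["rain"]

train_y, test_y = cases[:-4], cases[-4:]      # hold out the last 4 months
train_x, test_x = rain[:-4], rain[-4:]

model = SARIMAX(train_y, exog=train_x,
                order=(1, 0, 1), seasonal_order=(1, 0, 1, 12))
fit = model.fit(disp=False)

forecast = fit.forecast(steps=4, exog=test_x)  # 1- to 4-month-ahead predictions
mape = (abs(forecast.values - test_y.values) / test_y.values).mean() * 100
print(f"MAPE over the 4-month horizon: {mape:.1f}%")
```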

  9. Predictive error analysis for a water resource management model

    Science.gov (United States)

    Gallagher, Mark; Doherty, John

    2007-02-01

    Summary: In calibrating a model, a set of parameters is assigned to the model which will be employed for the making of all future predictions. If these parameters are estimated through solution of an inverse problem, formulated to be properly posed through either pre-calibration or mathematical regularisation, then solution of this inverse problem will, of necessity, lead to a simplified parameter set that omits the details of reality, while still fitting historical data acceptably well. Furthermore, estimates of parameters so obtained will be contaminated by measurement noise. Both of these phenomena will lead to errors in predictions made by the model, with the potential for error increasing with the hydraulic property detail on which the prediction depends. Integrity of model usage demands that model predictions be accompanied by some estimate of the possible errors associated with them. The present paper applies theory developed in a previous work to the analysis of predictive error associated with a real world, water resource management model. The analysis offers many challenges, including the fact that the model is a complex one that was partly calibrated by hand. Nevertheless, it is typical of models which are commonly employed as the basis for the making of important decisions, and for which such an analysis must be made. The potential errors associated with point-based and averaged water level and creek inflow predictions are examined, together with the dependence of these errors on the amount of averaging involved. Error variances associated with predictions made by the existing model are compared with "optimized error variances" that could have been obtained had calibration been undertaken in such a way as to minimize predictive error variance. The contributions by different parameter types to the overall error variance of selected predictions are also examined.
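
    As a minimal illustration of linear predictive error propagation of the kind discussed above (not the paper's actual regularised-inversion analysis), the sketch below propagates a parameter covariance matrix through a prediction sensitivity vector on synthetic quantities.

```python
# Hedged sketch: first-order (linear) propagation of parameter uncertainty to a
# model prediction, the basic quantity behind a predictive error variance
# analysis. All matrices and values below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_par = 5

# C_p: covariance matrix of the (simplified) estimated parameter set
A = rng.normal(size=(n_par, n_par))
C_p = A @ A.T / n_par

# y: sensitivity of the prediction to each parameter (d prediction / d parameter)
y = rng.normal(size=n_par)

# sigma2_eps: measurement-noise contribution transferred to the prediction
sigma2_eps = 0.05

pred_error_variance = y @ C_p @ y + sigma2_eps
print(f"predictive error variance: {pred_error_variance:.3f}")
print(f"predictive error std dev : {np.sqrt(pred_error_variance):.3f}")
```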

  10. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
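
    As a toy illustration of the presence/absence modelling and sensitivity/specificity evaluation described above (not the authors' actual workflow or data), the following sketch fits a random forest climate envelope model on one period and evaluates it on a later one.

```python
# Hedged sketch: a climate envelope model fitted with a random forest on one
# period and evaluated by sensitivity/specificity on a later period. File and
# predictor names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

t1 = pd.read_csv("occurrences_1967_1971.csv")   # calibration data (hypothetical file)
t2 = pd.read_csv("occurrences_1998_2002.csv")   # evaluation data (hypothetical file)
climate_vars = ["tmin", "tmax", "precip"]        # placeholder bioclimatic predictors

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(t1[climate_vars], t1["present"])

pred = clf.predict(t2[climate_vars])
tn, fp, fn, tp = confusion_matrix(t2["present"], pred).ravel()
sensitivity = tp / (tp + fn)   # correctly classified presences
specificity = tn / (tn + fp)   # correctly classified absences
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```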

  11. Ecosystem model-based approach for modeling the dynamics of 137Cs transfer to marine plankton populations: application to the western North Pacific Ocean after the Fukushima nuclear power plant accident

    OpenAIRE

    M. Belharet; Estournel, C.; Charmasson, S.

    2016-01-01

    Huge amounts of radionuclides, especially 137Cs, were released into the western North Pacific Ocean after the Fukushima nuclear power plant (FNPP) accident that occurred on 11 March 2011, resulting in contamination of the marine biota. In this study we developed a radioecological model to estimate 137Cs concentrations in phytoplankton and zooplankton populations representing the lower levels of the pelagic trophic chain. We coupled this model to a lower trophic level ecosyst...

  12. MJO prediction skill, predictability, and teleconnection impacts in the Beijing Climate Center Atmospheric General Circulation Model

    Science.gov (United States)

    Wu, Jie; Ren, Hong-Li; Zuo, Jinqing; Zhao, Chongbo; Chen, Lijuan; Li, Qiaoping

    2016-09-01

    This study evaluates performance of Madden-Julian oscillation (MJO) prediction in the Beijing Climate Center Atmospheric General Circulation Model (BCC_AGCM2.2). By using the real-time multivariate MJO (RMM) indices, it is shown that the MJO prediction skill of BCC_AGCM2.2 extends to about 16-17 days before the bivariate anomaly correlation coefficient drops to 0.5 and the root-mean-square error increases to the level of the climatological prediction. The prediction skill showed a seasonal dependence, with the highest skill occurring in boreal autumn, and a phase dependence with higher skill for predictions initiated from phases 2-4. The results of the MJO predictability analysis showed that the upper bounds of the prediction skill can be extended to 26 days by using a single-member estimate, and to 42 days by using the ensemble-mean estimate, which also exhibited an initial amplitude and phase dependence. The observed relationship between the MJO and the North Atlantic Oscillation was accurately reproduced by BCC_AGCM2.2 for most initial phases of the MJO, accompanied with the Rossby wave trains in the Northern Hemisphere extratropics driven by MJO convection forcing. Overall, BCC_AGCM2.2 displayed a significant ability to predict the MJO and its teleconnections without interacting with the ocean, which provided a useful tool for fully extracting the predictability source of subseasonal prediction.
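
    The skill measures quoted above (bivariate anomaly correlation falling to 0.5 and RMSE reaching the climatological level) are standard RMM verification metrics. The sketch below computes them in their commonly used form on synthetic series; it illustrates the metrics, not BCC_AGCM2.2 output.

```python
# Hedged sketch: bivariate anomaly correlation coefficient (ACC) and RMSE for
# forecasts of the two RMM indices, in the form commonly used for MJO
# verification. The arrays below are synthetic placeholders, not model output.
import numpy as np

def bivariate_acc(obs1, obs2, fc1, fc2):
    num = np.sum(obs1 * fc1 + obs2 * fc2)
    den = np.sqrt(np.sum(obs1**2 + obs2**2)) * np.sqrt(np.sum(fc1**2 + fc2**2))
    return num / den

def bivariate_rmse(obs1, obs2, fc1, fc2):
    return np.sqrt(np.mean((obs1 - fc1)**2 + (obs2 - fc2)**2))

rng = np.random.default_rng(1)
obs1, obs2 = rng.normal(size=100), rng.normal(size=100)   # "observed" RMM1, RMM2
fc1 = obs1 + 0.3 * rng.normal(size=100)                   # imperfect forecasts
fc2 = obs2 + 0.3 * rng.normal(size=100)

print("bivariate ACC :", round(bivariate_acc(obs1, obs2, fc1, fc2), 3))
print("bivariate RMSE:", round(bivariate_rmse(obs1, obs2, fc1, fc2), 3))
```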

  13. Performance Predictable ServiceBSP Model for Grid Computing

    Institute of Scientific and Technical Information of China (English)

    TONG Weiqin; MIAO Weikai

    2007-01-01

    This paper proposes a performance prediction model for the grid computing model ServiceBSP to support the development of high-quality applications in a grid environment. In the ServiceBSP model, the agents carrying computing tasks are dispatched to the local domain of the selected computation services. Using an IP (integer programming) approach, the Service Selection Agent selects the computation services with globally optimized QoS (quality of service). The performance of a ServiceBSP application can then be predicted with the performance prediction model based on the QoS of the selected services. The performance prediction model can help users analyze their applications and improve them by optimizing the factors that affect performance. The experiment shows that the Service Selection Agent can provide ServiceBSP users with satisfactory application QoS.

  14. Noncausal spatial prediction filtering based on an ARMA model

    Institute of Scientific and Technical Information of China (English)

    Liu Zhipeng; Chen Xiaohong; Li Jingye

    2009-01-01

    Conventional f-x prediction filtering methods are based on an autoregressive model. The error section is first computed as source noise but is removed as additive noise to obtain the signal, which results in an assumption inconsistency before and after filtering. In this paper, an autoregressive moving-average (ARMA) model is employed to avoid this inconsistency. Based on the ARMA model, a noncausal prediction filter is computed and a self-deconvolved projection filter is used for estimating additive noise in order to suppress random noise. The 1-D ARMA model is also extended to the 2-D spatial domain, which is the basis for noncausal spatial prediction filtering for random noise attenuation on 3-D seismic data. Synthetic and field data processing indicate that this method can suppress random noise more effectively while preserving the signal, and performs much better than other conventional prediction filtering methods.
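
    The sketch below illustrates only the underlying idea in 1-D: fit an ARMA model to a noisy trace, treat the one-step predictions as signal and the residual as the noise estimate. It is not the noncausal 2-D spatial implementation described above, and the ARMA order is an arbitrary assumption.

```python
# Hedged sketch, 1-D only: fit an ARMA model to a noisy trace, take its
# one-step predictions as the signal estimate and the residual as the noise
# estimate. This illustrates the idea, not the paper's noncausal 2-D spatial
# implementation.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
t = np.arange(500)
signal = np.sin(2 * np.pi * t / 50)
trace = signal + 0.4 * rng.normal(size=t.size)   # noisy seismic-like trace

fit = ARIMA(trace, order=(4, 0, 2)).fit()        # ARMA(4, 2), no differencing
predicted_signal = fit.predict()                 # in-sample one-step predictions
noise_estimate = trace - predicted_signal

print("input  SNR:", round(signal.var() / (trace - signal).var(), 2))
print("output SNR:", round(signal.var() / (predicted_signal - signal).var(), 2))
```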

  15. Predicting and Modelling of Survival Data when Cox's Regression Model does not hold

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

    Keywords: Aalen model; additive risk model; counting processes; competing risk; Cox regression; flexible modeling; goodness of fit; prediction of survival; survival analysis; time-varying effects.

  16. Ruthenium release from fuel in accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Brillant, G.; Marchetto, C.; Plumecocq, W. [Inst. de Radioprotection et de Surete Nucleaire, DPAM, SEMIC, LETR and LIMSI, Saint-Paul-Lez-Durance (France)

    2010-07-01

    During a hypothetical nuclear power plant accident, fission products may be released from the fuel matrix and then reach the containment building and the environment. Ruthenium is a very hazardous fission product that can be highly and rapidly released in some accident scenarios. The impact of the atmosphere redox properties, temperature, and fuel burn-up on the ruthenium release is discussed. In order to improve the evaluation of the radiological impact by accident codes, a model of the ruthenium release from fuel is proposed using thermodynamic equilibrium calculations. In addition, a model of fuel oxidation under air is described. Finally, these models have been integrated in the ASTEC accident code and validation calculations have been performed on several experimental tests. (orig.)

  17. Communication and industrial accidents

    NARCIS (Netherlands)

    As, Sicco van

    2001-01-01

    This paper deals with the influence of organizational communication on safety. Accidents are actually caused by individual mistakes. However the underlying causes of accidents are often organizational. As a link between these two levels - the organizational failures and mistakes - I suggest the conc

  18. Accidents - personal factors

    Energy Technology Data Exchange (ETDEWEB)

    Zaitsev, S.L.; Tsygankov, A.V.

    1982-03-01

    This paper evaluates the influence of selected personal factors on the accident rate in underground coal mines in the USSR. Investigations show that so-called organizational factors cause 80 to 85% of all accidents. About 70% of the organizational factors are associated with social, personal and economic features of personnel. Selected results of the investigations carried out in Donbass mines are discussed. Causes of miner dissatisfaction are reviewed: 14% is caused by unsatisfactory working conditions, 21% by repeated machine failures, 16% by forced labor during days off, 14% by unsatisfactory material supply, 16% by hard physical labor, and 19% by other reasons. About 25% of miners injured in work accidents are characterized as highly professionally qualified with automatic reactions, and about 41% as having medium qualifications. About 60% of accidents are caused by miners with less than 3 years of service. About 15% of accidents occur during the first month after a miner has returned from leave. More than 30% of accidents occur on the first work day after a day or days off. The distribution of accidents within a shift is also presented: 19% of accidents occur during the first 2 hours of a shift, 36% from the second to the fourth hour, and 45% after the fourth hour and before the shift ends.

  19. Accident investigation and analysis

    NARCIS (Netherlands)

    Kampen, J. van; Drupsteen, L.

    2013-01-01

    Many organisations and companies take extensive proactive measures to identify, evaluate and reduce occupational risks. However, despite these efforts things still go wrong and unintended events occur. After a major incident or accident, conducting an accident investigation is generally the next ste

  20. Application of Technology Acceptance Model in Predicting Behavioral Intention to Use Safety Helmet Reminder System

    Directory of Open Access Journals (Sweden)

    Kamarudin Ambak

    2013-01-01

    Full Text Available The motorcycle is a common and popular mode of transportation in many developing countries. However, road accident statistics from the Royal Malaysian Police reveal that motorcyclists are the most vulnerable road users compared with users of other vehicles, owing to the lack of safety protection and the instability of motorcycles themselves. Despite the usefulness and effectiveness of safety helmets in preventing head injuries, the majority of motorcycle users do not wear and fasten their helmets properly. This study presents a new approach to enhancing the safety of motorcycle riders through proper usage of safety helmets. The Technology Acceptance Model (TAM) was adopted to predict the behavioral intention to use a Safety Helmet Reminder (SHR) system and thereby promote more proper helmet usage among motorcyclists. A multivariate analysis technique known as Structural Equation Modeling (SEM) was used in the modeling exercise. Results showed that the construct variables in TAM were reliable and statistically significant. The evaluation of the full structural model (TAM) showed that goodness-of-fit indices such as the Goodness of Fit Index (GFI), Adjusted Goodness of Fit Index (AGFI), Comparative Fit Index (CFI) and Tucker-Lewis Index (TLI) were greater than 0.9, and the Root Mean Square Error of Approximation (RMSEA) was less than 0.08. Perceived ease of use was found to be a stronger predictor of behavioral intention to use SHR than perceived usefulness. In addition, this study postulates that behavioral intention to use SHR has a significant direct effect on the proper usage of safety helmets.

  1. Aerodynamic Noise Prediction Using stochastic Turbulence Modeling

    Directory of Open Access Journals (Sweden)

    Arash Ahmadzadegan

    2008-01-01

    Full Text Available Amongst the many approaches to determining the sound propagated from turbulent flows, hybrid methods, in which the turbulent noise source field is computed or modeled separately from the far-field calculation, are frequently used. For basic estimates of sound propagation, less computationally intensive methods can be developed using stochastic models of the turbulent fluctuations (the turbulent noise source field). A simple and easy-to-use stochastic model for generating turbulent velocity fluctuations, called the continuous filter white noise (CFWN) model, was used. This method is based on the classical Langevin equation to model the details of the fluctuating field superimposed on averaged computed quantities. The resulting sound field due to the generated unsteady flow field was evaluated using Lighthill's acoustic analogy. A volume integral method was used for evaluating the acoustic analogy. This formulation has the advantage that the contributions of the different integral terms, and of the different integration regions, to the radiated acoustic pressure can be determined separately. Our results were validated by comparing the directivity and the overall sound pressure level (OSPL) magnitudes with the available experimental results. Numerical results showed reasonable agreement with the experiments, both in maximum directivity and in magnitude of the OSPL. This method is a very suitable tool for noise calculations in engineering problems at early stages of the design process, where rough estimates using cheaper methods are needed for different geometries.
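
    A minimal sketch of the Langevin-type fluctuation generation step is given below, assuming an Ornstein-Uhlenbeck form with arbitrary time scale and variance; the actual CFWN parameters and the coupling to Lighthill's analogy are not reproduced.

```python
# Hedged sketch: synthetic turbulent velocity fluctuations from a Langevin
# (Ornstein-Uhlenbeck) process, in the spirit of a continuous filtered
# white-noise model. Time scale and rms level are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(3)
dt, n_steps = 1e-4, 20000
T_L = 5e-3           # assumed integral time scale [s]
sigma_u = 1.0        # assumed rms velocity fluctuation [m/s]

a = np.exp(-dt / T_L)                 # exact OU update coefficient
b = sigma_u * np.sqrt(1.0 - a**2)
u = np.zeros(n_steps)
for n in range(1, n_steps):
    u[n] = a * u[n - 1] + b * rng.normal()

# u(t) would be superimposed on the averaged (mean-flow) field and used as
# part of the unsteady source term in Lighthill's acoustic analogy.
print("sample rms:", round(u.std(), 3), "  target rms:", sigma_u)
```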

  2. A Predictive Model of High Shear Thrombus Growth.

    Science.gov (United States)

    Mehrabadi, Marmar; Casa, Lauren D C; Aidun, Cyrus K; Ku, David N

    2016-08-01

    The ability to predict the timescale of thrombotic occlusion in stenotic vessels may improve patient risk assessment for thrombotic events. In blood contacting devices, thrombosis predictions can lead to improved designs to minimize thrombotic risks. We have developed and validated a model of high shear thrombosis based on empirical correlations between thrombus growth and shear rate. A mathematical model was developed to predict the growth of thrombus based on the hemodynamic shear rate. The model predicts thrombus deposition based on initial geometric and fluid mechanic conditions, which are updated throughout the simulation to reflect the changing lumen dimensions. The model was validated by comparing predictions against actual thrombus growth in six separate in vitro experiments: stenotic glass capillary tubes (diameter = 345 µm) at three shear rates, the PFA-100(®) system, two microfluidic channel dimensions (heights = 300 and 82 µm), and a stenotic aortic graft (diameter = 5.5 mm). Comparison of the predicted occlusion times to experimental results shows excellent agreement. The model is also applied to a clinical angiography image to illustrate the time course of thrombosis in a stenotic carotid artery after plaque cap rupture. Our model can accurately predict thrombotic occlusion time over a wide range of hemodynamic conditions.
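
    To show the structure of such a simulation (and only the structure), the sketch below narrows the lumen radius of an idealized stenosis with a made-up growth-rate law; the empirical shear-rate correlation of the actual model is not reproduced and all parameter values are arbitrary.

```python
# Hedged sketch of the simulation structure only: the lumen radius of an
# idealized stenosis is narrowed by a growth-rate law driven by wall shear
# rate. growth_rate() is a made-up placeholder, NOT the published empirical
# correlation, and every parameter value is an arbitrary assumption.
import numpy as np

def growth_rate(shear_rate):
    # hypothetical monotone placeholder [m/s]
    return 1e-9 * shear_rate

Q = 1e-8          # assumed constant volumetric flow rate [m^3/s]
R = 150e-6        # initial lumen radius [m]
dt, t = 0.1, 0.0  # time step and elapsed time [s]

while R > 10e-6 and t < 3600:
    wall_shear = 4 * Q / (np.pi * R**3)   # Poiseuille wall shear rate [1/s]
    R -= growth_rate(wall_shear) * dt     # thrombus deposition narrows the lumen
    t += dt

print(f"predicted occlusion (lumen < 10 um) after ~{t:.0f} s")
```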

  3. The application of modeling and prediction with MRA wavelet network

    Institute of Scientific and Technical Information of China (English)

    LU Shu-ping; YANG Xue-jing; ZHAO Xi-ren

    2004-01-01

    As there are many non-linear systems in real engineering, it is very important to carry out further research on the modeling and prediction of non-linear systems. Based on the multi-resolution analysis (MRA) of wavelet theory, this paper combined wavelet theory with neural networks and established an MRA wavelet network with the scaling function and wavelet function as its neurons. Analysis in the frequency domain indicated that the MRA wavelet network was better than other wavelet networks in its ability to approximate signals. An essential study was carried out on modeling and prediction with the MRA wavelet network for non-linear systems. Using lengthwise sway data from a ship-model experiment, an offline prediction model was established and applied to the short-time prediction of ship motion. The simulation results indicated that the forecasting model improved the prediction precision effectively, lengthened the forecasting time and gave better predictions than an AR linear model. The research indicates that it is feasible to use the MRA wavelet network for short-time prediction of ship motion.

  4. Integrating geophysics and hydrology for reducing the uncertainty of groundwater model predictions and improved prediction performance

    DEFF Research Database (Denmark)

    Christensen, Nikolaj Kruse; Christensen, Steen; Ferre, Ty

    constructed from geological and hydrological data. However, geophysical data are increasingly used to inform hydrogeologic models because they are collected at lower cost and much higher density than geological and hydrological data. Despite increased use of geophysics, it is still unclear whether the integration of geophysical data in the construction of a groundwater model increases the prediction performance. We suggest that modelers should perform a hydrogeophysical “test-bench” analysis of the likely value of geophysics data for improving groundwater model prediction performance before actually collecting geophysical data. At a minimum, an analysis should be conducted assuming settings that are favorable for the chosen geophysical method. If the analysis suggests that data collected by the geophysical method is unlikely to improve model prediction performance under these favorable settings...

  5. Ecosystem model-based approach for modelling the dynamics of 137Cs transfer to marine plankton populations: application to the western North Pacific Ocean after the Fukushima nuclear power plant accident

    Science.gov (United States)

    Belharet, M.; Estournel, C.; Charmasson, S.

    2015-06-01

    Huge amounts of radionuclides, especially 137Cs, were released into the western North Pacific Ocean after the Fukushima nuclear power plant (FNPP) accident that occurred on 11 March 2011, resulting in contamination of the marine biota. In this study we developed a radioecological model to estimate 137Cs concentrations in phytoplankton and zooplankton populations representing the lower levels of the pelagic trophic chain. We coupled this model to a lower trophic level ecosystem model and an ocean circulation model to take into account the site-specific environmental conditions in the area. The different radioecological parameters of the model were estimated by calibration, and a sensitivity analysis to parameter uncertainties was carried out, showing a high sensitivity of the model results, especially to the 137Cs concentration in seawater, to the rates of uptake from water and to the radionuclide assimilation efficiency for zooplankton. The 137Cs concentrations in planktonic populations simulated in this study were then validated through comparison with some of the data available in the region after the accident. The model results show that the maximum concentrations in plankton after the accident were about two to four orders of magnitude higher than those observed before the accident, depending on the distance from the FNPP. Finally, the maximum 137Cs absorbed dose rate for phyto- and zooplankton populations was estimated to be about 10^-2 μGy h^-1, and was therefore lower than the 10 μGy h^-1 benchmark value defined in the ERICA assessment approach, above which a measurable effect on the marine biota can be observed.

  6. Model Predictive Control of Sewer Networks

    DEFF Research Database (Denmark)

    Pedersen, Einar B.; Herbertsson, Hannes R.; Niemann, Henrik;

    2016-01-01

    Developments in solutions for the management of urban drainage are of vital importance, as the amount of sewer water from urban areas continues to increase due to the growth of the world’s population and changing climate conditions. How a sewer network is structured, monitored and controlled has thus become an essential factor for the efficient performance of waste water treatment plants. This paper examines methods for simplified modelling and control of a sewer network. A practical approach to the problem is used by analysing a simplified design model, which is based on the Barcelona...

  7. Prediction model for spring dust weather frequency in North China

    Institute of Scientific and Technical Information of China (English)

    LANG XianMei

    2008-01-01

    Reliable prediction of dust weather frequency (DWF) in North China is of great social and scientific importance, but is also very difficult. In this paper, the correlations between spring DWF at the Beijing and Tianjin observation stations, taken as examples for North China, and seasonally averaged surface air temperature, precipitation, the Arctic Oscillation, the Antarctic Oscillation, the Southern Oscillation, the near-surface meridional wind and the Eurasian westerly index are calculated in order to construct a prediction model for spring DWF in North China using these climatic factors. Two prediction models, i.e. model-I and model-II, are then set up based respectively on observed climate data and on 32-year (1970-2001) extra-seasonal hindcast data reproduced by the nine-level Atmospheric General Circulation Model developed at the Institute of Atmospheric Physics (IAP9L-AGCM). The correlation coefficient between the observed and predicted DWF reaches 0.933 for model-I, indicating high prediction skill one season ahead. The corresponding value is as high as 0.948 for model-II, which, unlike model-I, involves synchronous spring climate data reproduced by the IAP9L-AGCM. Model-II not only makes more precise predictions but also extends the lead time of real-time prediction from one season (model-I) to half a year. Finally, the real-time predictability of the two models is evaluated. Both models display high prediction skill for both the interannual variation and the linear trend of spring DWF in North China, and each has its own advantages. For model-II, the prediction skill is much higher than that of the original approach using the IAP9L-AGCM alone. Therefore, the prediction approach put forward here could be extended to other regions of China where dust weather occurs frequently.

  8. Prediction model for spring dust weather frequency in North China

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Reliable prediction of dust weather frequency (DWF) in North China is of great social and scientific importance, but is also very difficult. In this paper, the correlations between spring DWF at the Beijing and Tianjin observation stations, taken as examples for North China, and seasonally averaged surface air temperature, precipitation, the Arctic Oscillation, the Antarctic Oscillation, the Southern Oscillation, the near-surface meridional wind and the Eurasian westerly index are calculated in order to construct a prediction model for spring DWF in North China using these climatic factors. Two prediction models, i.e. model-I and model-II, are then set up based respectively on observed climate data and on 32-year (1970-2001) extra-seasonal hindcast data reproduced by the nine-level Atmospheric General Circulation Model developed at the Institute of Atmospheric Physics (IAP9L-AGCM). The correlation coefficient between the observed and predicted DWF reaches 0.933 for model-I, indicating high prediction skill one season ahead. The corresponding value is as high as 0.948 for model-II, which, unlike model-I, involves synchronous spring climate data reproduced by the IAP9L-AGCM. Model-II not only makes more precise predictions but also extends the lead time of real-time prediction from one season (model-I) to half a year. Finally, the real-time predictability of the two models is evaluated. Both models display high prediction skill for both the interannual variation and the linear trend of spring DWF in North China, and each has its own advantages. For model-II, the prediction skill is much higher than that of the original approach using the IAP9L-AGCM alone. Therefore, the prediction approach put forward here could be extended to other regions of China where dust weather occurs frequently.

  9. Bus accident severity and passenger injury: evidence from Denmark

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Kaplan, Sigal

    2014-01-01

    ...principle of sustainable transit and advance the vision “every accident is one too many”. Methods: Bus accident data were retrieved from the national accident database for the period 2002–2011. A generalized ordered logit model allows analyzing bus accident severity and a logistic regression enables examining occurrence of injury to bus passengers. Results: Bus accident severity is positively related to (i) the involvement of vulnerable road users, (ii) high speed limits, (iii) night hours, (iv) elderly drivers of the third party involved, and (v) bus drivers and other drivers crossing in yellow or red...
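
    For illustration, the sketch below fits a proportional-odds ordered logit for accident severity with statsmodels; the study estimates a generalized ordered logit, which relaxes that assumption, and all file and column names here are hypothetical.

```python
# Hedged sketch: a proportional-odds ordered logit for accident severity using
# statsmodels. The paper estimates a *generalized* ordered logit, which relaxes
# the parallel-odds assumption made here; file and column names are hypothetical.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("bus_accidents_2002_2011.csv")   # hypothetical file
sev_type = pd.CategoricalDtype(categories=["damage_only", "injury", "fatal"],
                               ordered=True)
y = df["severity"].astype(sev_type)
X = df[["vulnerable_road_user", "speed_limit_high", "night", "elderly_third_party"]]

model = OrderedModel(y, X, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```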

  10. New Approaches for Channel Prediction Based on Sinusoidal Modeling

    Directory of Open Access Journals (Sweden)

    Ekman Torbjörn

    2007-01-01

    Full Text Available Long-range channel prediction is considered to be one of the most important enabling technologies for future wireless communication systems. The prediction of Rayleigh fading channels is studied in the framework of sinusoidal modeling in this paper. A stochastic sinusoidal model to represent a Rayleigh fading channel is proposed. Three different predictors based on the statistical sinusoidal model are proposed. These methods outperform the standard linear predictor (LP) in Monte Carlo simulations, but underperform with real measurement data, probably due to nonstationary model parameters. To mitigate these modeling errors, a joint moving average and sinusoidal (JMAS) prediction model and the associated joint least-squares (LS) predictor are proposed. It combines the sinusoidal model with an LP to handle unmodeled dynamics in the signal. The joint LS predictor outperforms all the other sinusoidal LMMSE predictors in suburban environments, but still performs slightly worse than the standard LP in urban environments.

  11. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Directory of Open Access Journals (Sweden)

    Saerom Park

    Full Text Available Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models: neural networks, a Bayesian neural network, a Gaussian process, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data for the US stock market from a Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.

  12. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Science.gov (United States)

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

    Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models: neural networks, a Bayesian neural network, a Gaussian process, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data for the US stock market from a Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.
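
    As a toy illustration of comparing such nonparametric regressors (not the paper's Bloomberg data, features or I-star benchmark), the following sketch fits SVR, a Gaussian process and a neural network on synthetic features and compares their mean absolute error.

```python
# Hedged sketch: comparing nonparametric regressors for market impact cost on
# synthetic data. The three features stand in for the paper's engineered
# variables; this is not the Bloomberg data set or the I-star benchmark.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 3))   # e.g. order size, volatility, liquidity proxies
y = 0.5 * X[:, 0] + 0.2 * X[:, 1] ** 2 + 0.1 * rng.normal(size=2000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "GP": GaussianProcessRegressor(),
    "MLP": make_pipeline(StandardScaler(), MLPRegressor(max_iter=2000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "MAE:", round(mean_absolute_error(y_te, model.predict(X_te)), 4))
```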

  13. Consequences of Windscale accident (October 1957) and study of the validity of the Sutton's mathematical model of atmospheric diffusion (1960); Etude des consequences de l'accident de Windscale (Octobre 1957) et de la validite du modele mathematique de diffusion atmospherique de Sutton (1960)

    Energy Technology Data Exchange (ETDEWEB)

    Doury, A. [Commissariat a l' Energie Atomique (S.C.R.G.R.) Saclay (France).Centre d' Etudes Nucleaires; Martin, J.J. [Electricite de France (EDF)(S.L.P.R.), 37 - Chinon (France)

    1960-07-01

    The reactor accident that occurred at the No. 1 pile at Windscale in 1957 was followed by a discharge of radioactive products into the atmosphere from 10 October 1957 at 4:30 PM to 12 October 1957 at 3:10 PM. By 11 October it was possible to state that there was no longer any risk of external irradiation or of inhalation. However, adopting a M.A.C. of 0.1 {mu}curie of iodine-131 per litre of milk, the authorities had to control milk deliveries until 23 November over an area of about 500 km{sup 2}. In addition, this exceptional accident made it possible to verify that Sutton's atmospheric diffusion model provides a convenient means of predicting, with sufficient approximation, the consequences of a dispersion of radioactive products into the atmosphere. (author)

  14. An evaluation of mathematical models for predicting skin permeability.

    Science.gov (United States)

    Lian, Guoping; Chen, Longjian; Han, Lujia

    2008-01-01

    A number of mathematical models have been proposed for predicting skin permeability, mostly empirical and very few are deterministic. Early empirical models use simple lipophilicity parameters. The recent trend is to use more complicated molecular structure descriptors. There has been much debate on which models best predict skin permeability. This article evaluates various mathematical models using a comprehensive experimental dataset of skin permeability for 124 chemical compounds compiled from various sources. Of the seven models compared, the deterministic model of Mitragotri gives the best prediction. The simple quantitative structure permeability relationships (QSPR) model of Potts and Guy gives the second best prediction. The two models have many features in common. Both assume the lipid matrix as the pathway of transdermal permeation. Both use octanol-water partition coefficient and molecular size. Even the mathematical formulae are similar. All other empirical QSPR models that use more complicated molecular structure descriptors fail to provide satisfactory prediction. The molecular structure descriptors in the more complicated QSPR models are empirically related to skin permeation. The mechanism on how these descriptors affect transdermal permeation is not clear. Mathematically it is an ill-defined approach to use many colinearly related parameters rather than fewer independent parameters in multi-linear regression.
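
    The Potts and Guy QSPR referred to above expresses permeability as a linear function of the octanol-water partition coefficient and molecular size. The sketch below implements that general form; the default coefficients are those commonly quoted for the Potts and Guy correlation and should be checked against the original publication.

```python
# Hedged sketch: a Potts-Guy-type QSPR for skin permeability,
#   log10(kp) = a + b * logP - c * MW
# The default coefficients are the values commonly quoted for the Potts & Guy
# (1992) correlation with kp in cm/h; check them against the original paper.
def log_kp_potts_guy(log_p: float, mw: float,
                     a: float = -2.74, b: float = 0.71, c: float = 0.0061) -> float:
    """Return log10 of the skin permeability coefficient kp [cm/h]."""
    return a + b * log_p - c * mw

# Illustrative numbers only: a small, moderately lipophilic permeant
print("log10 kp =", round(log_kp_potts_guy(log_p=2.0, mw=150.0), 2))
```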

  15. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    Tao Wu; Edward Lester; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical, Environmental and Mining Engineering

    2006-05-15

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model is based on information derived from two different image analysis techniques. One technique generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300{sup o}C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300{sup o}C, 5% oxygen, and residence times of 200, 400, and 600 ms. A good agreement between ChB model and experimental data indicates that the inclusion of char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.

  16. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method of analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed with a binomial model in the first-stage. As smoothing priors for the age, period and cohort parameters random walks of first and second order, with and without an additional unstructured component are available. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.

  17. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  18. Prediction of bypass transition with differential Reynolds stress models

    NARCIS (Netherlands)

    Westin, K.J.A.; Henkes, R.A.W.M.

    1998-01-01

    Boundary layer transition induced by high levels of free-stream turbulence (FST), so-called bypass transition, cannot be predicted with conventional stability calculations (e.g. the e^N method). The use of turbulence models for transition prediction has shown some success for this type of flow, and

  19. Improving Environmental Model Calibration and Prediction

    Science.gov (United States)

    2011-01-18

    1. model comparison and selection, 2. identification of the best water management strategies that reflect the likelihood of outcomes, 3. data collection aimed at

  20. Space Weather: Measurements, Models and Predictions

    Science.gov (United States)

    2014-03-21

    • Freezes with valid Optional Storm added in a fresh AF-GEOSpace session; results in an "invalid storm peak" message. • Raytrace App: Save Model message... plot windows created by LET-APP, RAYTRACE-APP, and PPS cause a crash. • LET-APP: Use of "Trapped Protons: CRRESPRO Quiet" or "Active" results in no

  1. Predicting Magazine Audiences with a Loglinear Model.

    Science.gov (United States)

    1987-07-01

    An important use of e.d. estimates is in media selection (Aaker 1975; Lee 1962, 1963; Little and Lodish 1969). All advertising campaigns have a budget. It... Reference: Aaker, D.A. (1975), "ADMOD: An Advertising Decision Model," Journal of Marketing Research, February, 37-45.

  2. Behaviour of oceanic 137Cs following the Fukushima Daiichi Nuclear Power Plant accident for four years simulated numerically by a regional ocean model

    Science.gov (United States)

    Tsumune, D.; Tsubono, T.; Aoyama, M.; Misumi, K.; Tateda, Y.

    2015-12-01

    A series of accidents at the Fukushima Dai-ichi Nuclear Power Plant (1F NPP) following the earthquake and tsunami of 11 March 2011 resulted in the release of radioactive materials to the ocean by two major pathways: direct release from the accident site and atmospheric deposition. We reconstructed the spatiotemporal variability of 137Cs activity in the regional ocean over four years with numerical models, including regional-scale and North Pacific-scale oceanic dispersion models, an atmospheric transport model, a sediment transport model, a dynamic biological compartment model for marine biota, and a river runoff model. Direct release rates of 137Cs were estimated for four years after the accident by comparing simulated results with activities observed very close to the site. The estimated total amount of direct release was 3.6±0.7 PBq. The direct release rate of 137Cs decreased exponentially with time until the end of December 2012 and was almost constant thereafter, with a very small rate of decrease after 2013. The daily release rate of 137Cs was estimated to be on the order of 10^10 Bq/day by the end of March 2015. The activity of directly released 137Cs was detectable only in the coastal zone after December 2012. Simulated 137Cs activities attributable to direct release were in good agreement with observed activities, implying that the estimated direct release rate was reasonable. There are no observations of 137Cs activity in the ocean from 11 to 21 March 2011, but observed data for marine biota should reflect the history of 137Cs activity in this early period. We reconstructed the history of 137Cs activity in this early period by considering atmospheric deposition, river input and rain water runoff from the 1F NPP site. Comparisons between the 137Cs activity of marine biota simulated by a dynamic biological compartment model and observed data also suggest that the simulated 137Cs activity attributable to atmospheric deposition was underestimated in this early period.

  3. Dynamics Model for Bird Strike Accident Early Warning (飞机抗鸟撞的预警动力学模型)

    Institute of Scientific and Technical Information of China (English)

    刘双燕; 邓琼; 陈春林

    2012-01-01

    To help prevent bird strike accidents, and noting the strong similarity between the process by which an accident develops and the entropy evolution of an open system, information entropy theory is used to explain the occurrence of bird strikes, and a system dynamics model for the occurrence and early warning of bird strike accidents is established, revealing the main causes of such accidents and countermeasures for preventing them. On this basis, a comprehensive evaluation method based on information entropy is used to compare and evaluate the factors contributing to the accidents. Entropy values are obtained for six factors in five categories: flight time (two factors, season and day/night), flight altitude, aircraft structure, the bird species encountered, and the area of occurrence. The results show that the bird species is the most important factor in bird strike accidents, with an entropy value of 0.4106, followed by the aircraft structure factor. Bird strike accident prevention should therefore start from bird control and aircraft structural design.

  4. Scanpath Based N-Gram Models for Predicting Reading Behavior

    DEFF Research Database (Denmark)

    Mishra, Abhijit; Bhattacharyya, Pushpak; Carl, Michael

    2013-01-01

    Predicting reading behavior is a difficult task. Reading behavior depends on various linguistic factors (e.g. sentence length, structural complexity etc.) and other factors (e.g individual's reading style, age etc.). Ideally, a reading model should be similar to a language model where the model i...

  5. Model Predictive Control of Wind Turbines using Uncertain LIDAR Measurements

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood; Soltani, Mohsen; Poulsen, Niels Kjølstad;

    2013-01-01

    The problem of Model predictive control (MPC) of wind turbines using uncertain LIDAR (LIght Detection And Ranging) measurements is considered. A nonlinear dynamical model of the wind turbine is obtained. We linearize the obtained nonlinear model for different operating points, which are determine...

  6. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR algorithm to solve the problem of the overfitting of training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.
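
    For illustration, the sketch below tunes an SVR on synthetic financial/technology-style features; a randomized search stands in for the paper's genetic algorithm, and every feature name and parameter range is an assumption.

```python
# Hedged sketch: SVR for corporate-performance prediction with hyperparameter
# search. The paper tunes SVR with a genetic algorithm; a randomized search is
# used here as a simple stand-in, and the features are placeholders for the
# financial and patent (technology) indicators.
import numpy as np
from scipy.stats import loguniform
from sklearn.model_selection import RandomizedSearchCV, TimeSeriesSplit
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 6))    # e.g. ROA, leverage, R&D ratio, patent counts, ...
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=300)

pipe = Pipeline([("scale", StandardScaler()), ("svr", SVR(kernel="rbf"))])
search = RandomizedSearchCV(
    pipe,
    param_distributions={
        "svr__C": loguniform(1e-1, 1e3),
        "svr__gamma": loguniform(1e-3, 1e1),
        "svr__epsilon": loguniform(1e-3, 1e0),
    },
    n_iter=50,
    cv=TimeSeriesSplit(n_splits=5),   # respects the time ordering of the panel
    random_state=0,
)
search.fit(X, y)
print("best parameters:", search.best_params_)
print("best CV score (R^2):", round(search.best_score_, 3))
```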

  7. Submission Form for Peer-Reviewed Cancer Risk Prediction Models

    Science.gov (United States)

    If you have information about a peer-reviewed cancer risk prediction model that you would like to be considered for inclusion on this list, submit as much information as possible through the form on this page.

  8. On the Predictiveness of Single-Field Inflationary Models

    CERN Document Server

    Burgess, C.P.; Trott, Michael

    2014-01-01

    We re-examine the predictiveness of single-field inflationary models and discuss how an unknown UV completion can complicate determining inflationary model parameters from observations, even from precision measurements. Besides the usual naturalness issues associated with having a shallow inflationary potential, we describe another issue for inflation, namely, unknown UV physics modifies the running of Standard Model (SM) parameters and thereby introduces uncertainty into the potential inflationary predictions. We illustrate this point using the minimal Higgs Inflationary scenario, which is arguably the most predictive single-field model on the market, because its predictions for $A_s$, $r$ and $n_s$ are made using only one new free parameter beyond those measured in particle physics experiments, and run up to the inflationary regime. We find that this issue can already have observable effects. At the same time, this UV-parameter dependence in the Renormalization Group allows Higgs Inflation to occur (in prin...

  9. Compensatory versus noncompensatory models for predicting consumer preferences

    Directory of Open Access Journals (Sweden)

    Anja Dieckmann

    2009-04-01

    Full Text Available Standard preference models in consumer research assume that people weigh and add all attributes of the available options to derive a decision, while there is growing evidence for the use of simplifying heuristics. Recently, a greedoid algorithm has been developed (Yee, Dahan, Hauser and Orlin, 2007; Kohli and Jedidi, 2007) to model lexicographic heuristics from preference data. We compare the predictive accuracies of the greedoid approach and standard conjoint analysis in an online study with a rating and a ranking task. The lexicographic model derived from the greedoid algorithm was better at predicting ranking than rating data, but overall it achieved lower predictive accuracy for hold-out data than the compensatory model estimated by conjoint analysis. However, a considerable minority of participants was better predicted by lexicographic strategies. We conclude that the new algorithm will not replace standard tools for analyzing preferences, but can boost the study of situational and individual differences in preferential choice processes.

  10. A Composite Model Predictive Control Strategy for Furnaces

    Institute of Scientific and Technical Information of China (English)

    Hao Zang; Hongguang Li; Jingwen Huang; Jia Wang

    2014-01-01

    Tube furnaces are essential and primary energy-intensive facilities in petrochemical plants. Operational optimization of furnaces can not only help to improve product quality but also reduce energy consumption and exhaust emissions. Inspired by this idea, this paper presents a composite model predictive control (CMPC) strategy which, taking advantage of distributed model predictive control architectures, combines tracking nonlinear model predictive control and economic nonlinear model predictive control metrics to keep the process running smoothly and to optimize operational conditions. The controllers, connected by two kinds of communication networks, are easy to organize and maintain, and robust to process disturbances. A fast solution algorithm combining interior-point solvers and Newton's method is adapted to the CMPC realization, with reasonable CPU computing time, making it suitable for online applications. Simulation of an industrial case demonstrates that the proposed approach can ensure stable operation of furnaces, improve heat efficiency, and reduce emissions effectively.
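
    As a generic illustration of the receding-horizon idea underlying the strategy above (not a furnace model, and without the economic objective or distributed architecture), the sketch below solves a small linear tracking MPC problem with cvxpy.

```python
# Hedged sketch: a generic linear tracking MPC solved with cvxpy, illustrating
# only the receding-horizon idea behind the composite strategy. The plant,
# weights and constraints are arbitrary assumptions, not a furnace model.
import numpy as np
import cvxpy as cp

A = np.array([[1.0, 0.1], [0.0, 0.95]])   # assumed discrete-time plant
B = np.array([[0.0], [0.1]])
x0 = np.array([1.0, 0.0])                 # current state
x_ref = np.array([0.0, 0.0])              # tracking target
N = 20                                    # prediction horizon

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
cost = 0
constraints = [x[:, 0] == x0]
for k in range(N):
    cost += cp.sum_squares(x[:, k] - x_ref) + 0.01 * cp.sum_squares(u[:, k])
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    cp.abs(u[:, k]) <= 1.0]   # actuator limits

cp.Problem(cp.Minimize(cost), constraints).solve()
print("first control move to apply:", u.value[:, 0])   # receding-horizon step
```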

  11. Using Pareto points for model identification in predictive toxicology.

    Science.gov (United States)

    Palczewska, Anna; Neagu, Daniel; Ridley, Mick

    2013-03-22

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology.
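
    As a minimal illustration of the Pareto-optimality idea used above (not the paper's algorithm or endpoints), the sketch below filters a collection of model scores down to its non-dominated set.

```python
# Hedged sketch: selecting Pareto-optimal models from a collection scored on
# two criteria that are both to be maximized (e.g. an accuracy measure and an
# applicability/relevance measure). Scores below are synthetic placeholders.
import numpy as np

def pareto_front(scores: np.ndarray) -> np.ndarray:
    """Boolean mask of non-dominated rows (all criteria maximized)."""
    n = scores.shape[0]
    is_optimal = np.ones(n, dtype=bool)
    for i in range(n):
        if not is_optimal[i]:
            continue
        dominated = (np.all(scores <= scores[i], axis=1)
                     & np.any(scores < scores[i], axis=1))
        is_optimal[dominated] = False
    return is_optimal

rng = np.random.default_rng(6)
model_scores = rng.random((20, 2))   # columns: accuracy-like, relevance-like
mask = pareto_front(model_scores)
print("Pareto-optimal model indices:", np.where(mask)[0])
```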

  12. A Multistep Chaotic Model for Municipal Solid Waste Generation Prediction.

    Science.gov (United States)

    Song, Jingwei; He, Jiaying

    2014-08-01

    In this study, a univariate local chaotic model is proposed to make one-step and multistep forecasts for daily municipal solid waste (MSW) generation in Seattle, Washington. For MSW generation prediction with long history data, this forecasting model was created based on a nonlinear dynamic method called phase-space reconstruction. Compared with other nonlinear predictive models, such as artificial neural network (ANN) and partial least square-support vector machine (PLS-SVM), and a commonly used linear seasonal autoregressive integrated moving average (sARIMA) model, this method has demonstrated better prediction accuracy from 1-step ahead prediction to 14-step ahead prediction assessed by both mean absolute percentage error (MAPE) and root mean square error (RMSE). Max error, MAPE, and RMSE show that chaotic models were more reliable than the other three models. As chaotic models do not involve random walk, their performance does not vary while ANN and PLS-SVM make different forecasts in each trial. Moreover, this chaotic model was less time consuming than ANN and PLS-SVM models.
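
    As an illustration of the phase-space reconstruction idea behind such a local chaotic model (not the authors' implementation), the sketch below embeds a series with time delays and makes a one-step forecast from nearest neighbours.

```python
# Hedged sketch: phase-space reconstruction (time-delay embedding) with a
# nearest-neighbour local predictor, the basic idea behind a univariate chaotic
# forecasting model. Embedding dimension, delay and k are arbitrary here; in
# practice they would be chosen by criteria such as false nearest neighbours.
import numpy as np

def embed(x, dim, tau):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def local_forecast(x, dim=4, tau=1, k=5):
    """One-step forecast: average the successors of the k nearest neighbours."""
    vectors = embed(x, dim, tau)
    current, history = vectors[-1], vectors[:-1]
    successors = x[(dim - 1) * tau + 1:]          # value following each history vector
    dists = np.linalg.norm(history - current, axis=1)
    nearest = np.argsort(dists)[:k]
    return successors[nearest].mean()

# toy daily series standing in for MSW generation data
rng = np.random.default_rng(7)
series = np.sin(0.3 * np.arange(600)) + 0.1 * rng.normal(size=600)
print("next-step forecast:", round(local_forecast(series), 3))
```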

  13. Study of Factors Related to Accidents Occuring during the Construction Phase of Oil, Gas and Petrochemical Projects

    Directory of Open Access Journals (Sweden)

    H Asilian Mahabadi

    2008-01-01

    Full Text Available Introduction: The construction phase in industry is a dynamic process that is intrinsically dangerous, and as it becomes more complicated the accident rate also increases. Without a model, one cannot obtain useful and reliable information and methods to prevent accidents; to achieve useful prevention methods it is therefore desirable to work from a model. The general goal of this study was the presentation of such a model. A model is the reflection of a fact; in other words, it represents a system or process whose behavior can be predicted. Models are used for understanding the behavior of real systems and express a theory in a way that covers the variables important for describing the phenomena of interest, while ignoring factors of low importance. Methods: This study was conducted in 2004-2005 in the Assaluyeh region. Data were gathered from accident reports in the security and health records of the projects and from statistics kept at the treatment centers. An analytical model (multiple regression) was developed to describe the impact of underlying factors on the likelihood of accidents leading to death, by measuring the effects of independent variables on the dependent variable. For this purpose, the structure of 50 accidents that led to death was studied along with another 2700 accidents; after studying the accident reports and related documents, observing operations and equipment, consulting accident witnesses and an expert team of managers, supervisors and engineers, and simulating some accidents, unsafe conditions and acts, mismanagement, and the use of worn-out and defective tools, equipment, devices and machinery were considered as the four independent variables, and job accidents leading to death were considered as the dependent variable.

  14. Haskell financial data modeling and predictive analytics

    CERN Document Server

    Ryzhov, Pavel

    2013-01-01

    This book is a hands-on guide that teaches readers how to use Haskell's tools and libraries to analyze data from real-world sources in an easy-to-understand manner. This book is great for developers who are new to financial data modeling using Haskell. A basic knowledge of functional programming is not required but will be useful. An interest in high-frequency finance is essential.

  15. Mesoscale Wind Predictions for Wave Model Evaluation

    Science.gov (United States)

    2016-06-07

    N0001400WX20041(B) http://www.nrlmry.navy.mil LONG TERM GOALS: The long-term goal is to demonstrate the significance and importance of high...ocean waves by an appropriate wave model. OBJECTIVES: The main objectives of this project are to: 1. Build the infrastructure to generate the...temperature for all COAMPS grids at the resolution of each of these grids. These analyses are important for the proper specification of the lower

  16. A Predictive Multiscale Model of Wear

    Science.gov (United States)

    2011-03-09

    theoretical tensile strength, and by fitting the calculated data to universal binding energy relationships (UBERs), which permit the extrapolation of the...calculated results to arbitrary length scales. The results demonstrate the ability of an UBER that accounts for fracture surface relaxation to yield a...materials subjected to shear up to the point at which slip occurs. The model we devised is analogous to the tensile-load UBER and leads to a size

  17. Modeling Seizure Self-Prediction: An E-Diary Study

    Science.gov (United States)

    Haut, Sheryl R.; Hall, Charles B.; Borkowski, Thomas; Tennen, Howard; Lipton, Richard B.

    2013-01-01

    Purpose A subset of patients with epilepsy successfully self-predicted seizures in a paper diary study. We conducted an e-diary study to ensure that prediction precedes seizures, and to characterize the prodromal features and time windows that underlie self-prediction. Methods Subjects 18 or older with LRE and ≥3 seizures/month maintained an e-diary, reporting AM/PM data daily, including mood, premonitory symptoms, and all seizures. Self-prediction was rated by asking, "How likely are you to experience a seizure [time frame]?" Five choices ranged from almost certain (>95% chance) to very unlikely. Relative odds of seizure (OR) within time frames were examined using Poisson models with log normal random effects to adjust for multiple observations. Key Findings Nineteen subjects reported 244 eligible seizures. The OR for prediction choices within 6 hrs was as high as 9.31 (1.92, 45.23) for "almost certain". Prediction was most robust within 6 hrs of diary entry, and remained significant up to 12 hrs. For the 9 best predictors, average sensitivity was 50%. Older age contributed to successful self-prediction, and self-prediction appeared to be driven by mood and premonitory symptoms. In multivariate modeling of seizure occurrence, self-prediction (2.84; 1.68, 4.81), favorable change in mood (0.82; 0.67, 0.99) and number of premonitory symptoms (1.11; 1.00, 1.24) were significant. Significance Some persons with epilepsy can self-predict seizures. In these individuals, the odds of a seizure following a positive prediction are high. Predictions were robust, not attributable to recall bias, and were related to self-awareness of mood and premonitory features. The 6-hour prediction window is suitable for the development of pre-emptive therapy. PMID:24111898
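    A hedged sketch of a repeated-measures analysis in the spirit of the study: because the paper's Poisson model with log-normal random effects needs specialized fitting, the sketch below uses a Poisson GEE with an exchangeable working correlation (statsmodels) as a simpler stand-in that still adjusts for multiple observations per subject. All column names and diary data are fabricated for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.genmod.cov_struct import Exchangeable

# Fabricated e-diary records (illustration only): one row per diary entry.
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "subject": rng.integers(0, 19, n),        # 19 subjects
    "predicted": rng.integers(0, 2, n),       # positive self-prediction flag
    "premonitory": rng.integers(0, 4, n),     # premonitory symptom count
})
# Seizures made more likely after a positive self-prediction (toy data only).
rate = np.exp(-2.0 + 1.0 * df["predicted"] + 0.1 * df["premonitory"])
df["seizure"] = rng.poisson(rate)

# GEE Poisson with exchangeable correlation: a simpler stand-in for the
# paper's Poisson model with log-normal random effects.
model = sm.GEE.from_formula("seizure ~ predicted + premonitory",
                            groups="subject", data=df,
                            family=sm.families.Poisson(),
                            cov_struct=Exchangeable())
res = model.fit()
print(np.exp(res.params))   # rate ratios, read analogously to the reported ORs
```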

  18. Webinar of paper 2013, Which method predicts recidivism best? A comparison of statistical, machine learning and data mining predictive models

    NARCIS (Netherlands)

    Tollenaar, N.; Van der Heijden, P.G.M.

    2013-01-01

    Using conviction history information for the criminal population, prediction models are developed that predict three types of criminal recidivism: general recidivism, violent recidivism and sexual recidivism. The research question is whether prediction techniques from modern statistics, data mining

  19. Models for fleet sizing and location of fire-fighting vessels for responding to accidents on platforms; Modelos para dimensionamento de frota e localizacao de embarcacoes fire-fighting para atendimento a acidentes em plataformas

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Afonso Celso; Brinati, Marco Antonio [Sao Paulo Univ., SP (Brazil). Escola Politecnica

    1996-12-31

    The increasing use of maritime resources calls for a modern protection and assistance system to prevent and control maritime accidents. Safety systems for maritime accidents generally rely on specialized fleets for response. This work presents models to determine the location and profile of a specialized fire-fighting fleet so as to guarantee adequate response to the accidents expected in an offshore oil field. Two approaches are considered: a deterministic integer programming model and a probabilistic model. Taking the geographic location and size of the platforms as input data, the deterministic model establishes, from among the available vessels, the fleet profile and location that minimize fleet cost while ensuring that each platform can be attended within the standard requirements. The probabilistic model starts from a given solution for the fleet profile and vessel location and, by estimating the utilization factor of each vessel, proposes possible improvements in the fleet location in order to maximize the probability of responding to accidents. A simulation model was developed to validate the results of the probabilistic model. The results obtained indicate the usefulness of each model, not only for a rational solution of the location problem, but also for analyzing the operational performance of the fleet. (author) 11 refs., 4 figs., 4 tabs.
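    A toy brute-force version of the deterministic problem described above: choose, for each base, at most one vessel type so that every platform can be reached within a response-time standard at minimum fleet cost. All base names, distances, speeds and costs are invented; a realistic instance would be solved as an integer program, as in the record.

```python
from itertools import product

# Toy data (all numbers invented for illustration).
bases = ["A", "B"]
platforms = ["P1", "P2", "P3"]
dist = {("A", "P1"): 10, ("A", "P2"): 30, ("A", "P3"): 45,
        ("B", "P1"): 40, ("B", "P2"): 15, ("B", "P3"): 20}   # nautical miles
vessels = {"small": {"speed": 10, "cost": 1.0},               # knots, cost units
           "large": {"speed": 14, "cost": 1.8}}
max_response_h = 2.5

def covers(vessel, base, platform):
    """True if a vessel of this type at this base meets the time standard."""
    return dist[(base, platform)] / vessels[vessel]["speed"] <= max_response_h

best = None
# Each base gets one of: no vessel, a small vessel, or a large vessel.
for assignment in product([None, "small", "large"], repeat=len(bases)):
    fleet = [(v, b) for v, b in zip(assignment, bases) if v]
    if all(any(covers(v, b, p) for v, b in fleet) for p in platforms):
        cost = sum(vessels[v]["cost"] for v, _ in fleet)
        if best is None or cost < best[0]:
            best = (cost, fleet)
print("cheapest covering fleet:", best)
```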

  20. Prediction of cloud droplet number in a general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Ghan, S.J.; Leung, L.R. [Pacific Northwest National Lab., Richland, WA (United States)

    1996-04-01

    We have applied the Colorado State University Regional Atmospheric Modeling System (RAMS) bulk cloud microphysics parameterization to the treatment of stratiform clouds in the National Center for Atmospheric Research Community Climate Model (CCM2). The RAMS parameterization predicts mass concentrations of cloud water, cloud ice, rain and snow, and the number concentration of ice. We have introduced the droplet number conservation equation to predict droplet number and its dependence on aerosols.

  1. Model predictive control of P-time event graphs

    Science.gov (United States)

    Hamri, H.; Kara, R.; Amari, S.

    2016-12-01

    This paper deals with model predictive control of discrete event systems modelled by P-time event graphs. First, the model is obtained by writing the dater evolution model in standard algebra. Then, for the control law, finite-horizon model predictive control is used, and for the closed-loop control, infinite-horizon model predictive control (IH-MPC) is used. The latter approach calculates static feedback gains that ensure the stability of the closed-loop system while respecting the constraints on the control vector. The IH-MPC problem is formulated as a convex linear program subject to a linear matrix inequality. Finally, the proposed methodology is applied to a transportation system.
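    The record's controller is formulated for P-time event graphs with dater equations and an LMI-based infinite-horizon step; the hedged sketch below shows only the generic receding-horizon mechanic it relies on (solve a finite-horizon problem, apply the first input, repeat), on a scalar linear system with invented dynamics and weights, not the paper's formulation.

```python
import numpy as np

# Generic receding-horizon loop (not the paper's P-time event-graph model):
# scalar system x(k+1) = a*x(k) + b*u(k), horizon N, quadratic costs.
a, b, N = 0.9, 0.5, 10
q, r = 1.0, 0.1            # state and input weights

def mpc_input(x0):
    """Solve the unconstrained finite-horizon problem by least squares."""
    # Stack predictions: x = F*x0 + G*u  for u = [u(0) ... u(N-1)]
    F = np.array([a ** (k + 1) for k in range(N)])
    G = np.array([[a ** (k - j) * b if j <= k else 0.0 for j in range(N)]
                  for k in range(N)])
    # Minimize q*||F*x0 + G*u||^2 + r*||u||^2 as a ridge-style least squares.
    A = np.vstack([np.sqrt(q) * G, np.sqrt(r) * np.eye(N)])
    y = np.concatenate([-np.sqrt(q) * F * x0, np.zeros(N)])
    u, *_ = np.linalg.lstsq(A, y, rcond=None)
    return u[0]            # receding horizon: apply only the first input

x = 5.0
for k in range(15):
    u = mpc_input(x)
    x = a * x + b * u
    print(f"step {k:2d}  u = {u:6.3f}  x = {x:6.3f}")
```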

  2. Personalized Predictive Modeling and Risk Factor Identification using Patient Similarity.

    Science.gov (United States)

    Ng, Kenney; Sun, Jimeng; Hu, Jianying; Wang, Fei

    2015-01-01

    Personalized predictive models are customized for an individual patient and trained using information from similar patients. Compared to global models trained on all patients, they have the potential to produce more accurate risk scores and capture more relevant risk factors for individual patients. This paper presents an approach for building personalized predictive models and generating personalized risk factor profiles. A locally supervised metric learning (LSML) similarity measure is trained for diabetes onset and used to find clinically similar patients. Personalized risk profiles are created by analyzing the parameters of the trained personalized logistic regression models. A 15,000-patient data set, derived from electronic health records, is used to evaluate the approach. The predictive results show that the personalized models can outperform the global model. Cluster analysis of the risk profiles shows groups of patients with similar risk factors, differences in the top risk factors for different groups of patients, and differences between the individual and global risk factors.
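    A hedged sketch of the patient-similarity idea: find the k most similar patients (plain Euclidean distance here, standing in for the learned LSML metric) and fit a local logistic regression for the index patient, reading its largest coefficients as a personalized risk-factor profile. The synthetic data, feature count, choice of k and the Euclidean stand-in are all assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(15000, 20))                     # synthetic patient features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=15000) > 0).astype(int)  # onset flag

def personalized_risk(index_patient, X, y, k=500):
    """Train a local model on the k most similar patients (Euclidean here,
    standing in for a learned LSML metric) and return risk + top factors."""
    nn = NearestNeighbors(n_neighbors=k).fit(X)
    _, idx = nn.kneighbors(index_patient.reshape(1, -1))
    local = LogisticRegression(max_iter=1000).fit(X[idx[0]], y[idx[0]])
    risk = local.predict_proba(index_patient.reshape(1, -1))[0, 1]
    top = np.argsort(-np.abs(local.coef_[0]))[:3]    # personalized risk factors
    return risk, top

risk, top_features = personalized_risk(X[0], X, y)
print(f"risk = {risk:.3f}, top feature indices = {top_features}")
```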

  3. Traffic Prediction Scheme based on Chaotic Models in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Xiangrong Feng

    2013-09-01

    Full Text Available Based on the local support vector algorithm for chaotic time series analysis, the Hannan-Quinn information criterion and SAX symbolization are introduced, and a novel prediction algorithm is proposed and successfully applied to the prediction of wireless network traffic. For the problem of correctly predicting short-term traffic from a small data set, the weaknesses of existing algorithms during model construction are analyzed through a study of, and comparison with, the LDK prediction algorithm. It is verified that the Hannan-Quinn information criterion can be used to calculate the number of neighbor points in place of the previous empirical method, yielding a more accurate prediction model. Finally, actual traffic data are used to confirm the accuracy of the proposed algorithm, LSDHQ. Our experiments also show that it outperforms the LDK algorithm in adaptability.

  4. Toward a predictive model for elastomer seals

    Science.gov (United States)

    Molinari, Nicola; Khawaja, Musab; Sutton, Adrian; Mostofi, Arash

    Nitrile butadiene rubber (NBR) and hydrogenated-NBR (HNBR) are widely used elastomers, especially as seals in oil and gas applications. During exposure to well-hole conditions, ingress of gases causes degradation of performance, including mechanical failure. We use computer simulations to investigate this problem at two different length and time-scales. First, we study the solubility of gases in the elastomer using a chemically-inspired description of HNBR based on the OPLS all-atom force-field. Starting with a model of NBR, C=C double bonds are saturated with either hydrogen or intramolecular cross-links, mimicking the hydrogenation of NBR to form HNBR. We validate against trends for the mass density and glass transition temperature for HNBR as a function of cross-link density, and for NBR as a function of the fraction of acrylonitrile in the copolymer. Second, we study mechanical behaviour using a coarse-grained model that overcomes some of the length and time-scale limitations of an all-atom approach. Nanoparticle fillers added to the elastomer matrix to enhance mechanical response are also included. Our initial focus is on understanding the mechanical properties at the elevated temperatures and pressures experienced in well-hole conditions.

  5. Practical approaches in accident analysis

    Science.gov (United States)

    Stock, M.

    An accident analysis technique based on the successive application of structural response, explosion dynamics, gas cloud formation, and plant operation failure mode models is proposed. The method takes into account the non-ideal explosion characteristic of a deflagration in an unconfined cloud. The resulting pressure wave differs significantly from a shock wave, and the response of structures such as lamp posts and walls differs correspondingly. This gives a more realistic insight into the course of an explosion than a simple TNT-equivalent approach.

  6. Using a Prediction Model to Manage Cyber Security Threats.

    Science.gov (United States)

    Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya

    2015-01-01

    Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  7. Using a Prediction Model to Manage Cyber Security Threats

    Directory of Open Access Journals (Sweden)

    Venkatesh Jaganathan

    2015-01-01

    Full Text Available Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  8. Active diagnosis of hybrid systems - A model predictive approach

    DEFF Research Database (Denmark)

    Tabatabaeipour, Seyed Mojtaba; Ravn, Anders P.; Izadi-Zamanabadi, Roozbeh;

    2009-01-01

    A method for active diagnosis of hybrid systems is proposed. The main idea is to predict the future output of both the normal and the faulty model of the system; then at each time step an optimization problem is solved with the objective of maximizing the difference between the predicted normal and faulty outputs, constrained by tolerable performance requirements. As in standard model predictive control, the first element of the optimal input is applied to the system and the whole procedure is repeated until the fault is detected by a passive diagnoser. It is demonstrated how the generated excitation signal

  9. Aero-acoustic noise of wind turbines. Noise prediction models

    Energy Technology Data Exchange (ETDEWEB)

    Maribo Pedersen, B. [ed.

    1997-12-31

    Semi-empirical and CAA (Computational AeroAcoustics) noise prediction techniques are the subject of this expert meeting. The meeting presents and discusses models and methods, and may provide answers to the following questions: Which noise sources are the most important? How are the sources best modeled? What needs to be done to make better predictions? Does it boil down to correct prediction of the unsteady aerodynamics around the rotor? Or is the difficult part converting the aerodynamics into acoustics? (LN)

  10. Model Predictive Control of a Wave Energy Converter

    DEFF Research Database (Denmark)

    Andersen, Palle; Pedersen, Tom Søndergård; Nielsen, Kirsten Mølgaard;

    2015-01-01

    In this paper, reactive control and Model Predictive Control (MPC) for a Wave Energy Converter (WEC) are compared. The analysis is based on a WEC from Wave Star A/S designed as a point absorber. The model predictive controller uses wave models based on the dominating sea states combined with a model connecting undisturbed wave sequences to sequences of torque. Losses in the conversion from mechanical to electrical power are taken into account in two ways. Conventional reactive controllers are tuned for each sea state under the assumption that the converter has the same efficiency back and forth. MPC

  11. Modelling and prediction of non-stationary optical turbulence behaviour

    Science.gov (United States)

    Doelman, Niek; Osborn, James

    2016-07-01

    There is a strong need to model the temporal fluctuations in turbulence parameters, for instance for scheduling, simulation and prediction purposes. This paper aims at modelling the dynamic behaviour of the turbulence coherence length r0, utilising measurement data from the Stereo-SCIDAR instrument installed at the Isaac Newton Telescope at La Palma. Based on an estimate of the power spectral density function, a low order stochastic model to capture the temporal variability of r0 is proposed. The impact of this type of stochastic model on the prediction of the coherence length behaviour is shown.
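    A hedged sketch of a low-order stochastic model in the spirit of the record: an AR(1) process fitted to the logarithm of a synthetic r0 sequence via the lag-1 autocorrelation, then used for a one-step-ahead prediction. The synthetic data, the AR(1) choice and the parameter values are assumptions; the paper derives its model from the Stereo-SCIDAR power spectral density and may use a different order.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic log(r0) sequence with slow drift (placeholder for SCIDAR data).
n = 2000
true_phi = 0.97
x = np.empty(n)
x[0] = np.log(0.12)                       # r0 ~ 12 cm
for t in range(1, n):
    x[t] = (1 - true_phi) * np.log(0.12) + true_phi * x[t - 1] + 0.02 * rng.normal()

# Fit AR(1): x_t - mu = phi * (x_{t-1} - mu) + eps_t
mu = x.mean()
xc = x - mu
phi = np.dot(xc[1:], xc[:-1]) / np.dot(xc[:-1], xc[:-1])

# One-step-ahead prediction of r0 (back on the linear scale).
pred_log_r0 = mu + phi * (x[-1] - mu)
print(f"phi = {phi:.3f}, predicted next r0 = {np.exp(pred_log_r0) * 100:.1f} cm")
```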

  12. Research on Drag Torque Prediction Model for the Wet Clutches

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Considering the surface tension effect and the centrifugal effect, a mathematical model based on the Reynolds equation is presented for predicting the drag torque of disengaged wet clutches. The model indicates that the equivalent radius is a function of clutch speed and flow rate. The drag torque reaches its peak at a critical speed; above this speed, drag torque drops due to shrinking of the oil film. The model also shows how viscosity and flow rate affect the drag torque. Experimental results indicate that the model is reasonable and that it performs well in predicting the drag torque peak.
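    The record's model extends the classical full-film result with surface tension, centrifugal effects and an equivalent radius; as a baseline only, the classical Newtonian shear estimate for one fully flooded, disengaged clutch gap is T = pi * mu * omega * (Ro^4 - Ri^4) / (2 h). The sketch below evaluates that baseline with invented numbers; it is not the paper's model.

```python
import math

def full_film_drag_torque(mu, omega, r_o, r_i, h):
    """Classical Newtonian full-film estimate (baseline only; the record's
    model corrects this via an equivalent radius once the film shrinks)."""
    return math.pi * mu * omega * (r_o**4 - r_i**4) / (2.0 * h)

# Illustrative numbers, not taken from the paper.
mu = 0.08                          # oil viscosity, Pa*s
omega = 2 * math.pi * 1500 / 60    # 1500 rpm in rad/s
r_o, r_i = 0.09, 0.07              # outer/inner friction radius, m
h = 0.2e-3                         # clearance per gap, m
print(f"drag torque ~ {full_film_drag_torque(mu, omega, r_o, r_i, h):.2f} N*m per gap")
```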

  13. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online one or two days in advance has significant advantages for utilities, such as improved scheduling of fossil-fuelled power plants and a better position in electricity spot markets. In this paper, prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; when statistical methods are used for this correction, the approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
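    A minimal sketch of the MOS idea using one of the adaptive estimators mentioned in the record, recursive least squares with a forgetting factor, to correct a raw NWP wind-speed forecast toward local observations. The data, the two-parameter linear correction and the forgetting factor are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented data: NWP forecast wind speed vs. locally observed wind speed.
nwp = 6 + 3 * rng.random(500)                        # m/s
obs = 0.8 * nwp + 1.2 + 0.4 * rng.normal(size=500)   # local systematic deviation

# Recursive least squares with forgetting factor lam:
# model obs ~ theta0 + theta1 * nwp, updated one forecast at a time.
lam = 0.995
theta = np.zeros(2)
P = 1e3 * np.eye(2)
for f, y in zip(nwp, obs):
    x = np.array([1.0, f])
    k = P @ x / (lam + x @ P @ x)                # gain
    theta = theta + k * (y - x @ theta)          # correction
    P = (P - np.outer(k, x @ P)) / lam           # covariance update

print("adaptive MOS correction: obs ≈ %.2f + %.2f * nwp" % (theta[0], theta[1]))
```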

  14. Development and application of chronic disease risk prediction models.

    Science.gov (United States)

    Oh, Sun Min; Stefani, Katherine M; Kim, Hyeon Chang

    2014-07-01

    Currently, non-communicable chronic diseases are a major cause of morbidity and mortality worldwide, and a large proportion of chronic diseases are preventable through risk factor management. However, the prevention efficacy at the individual level is not yet satisfactory. Chronic disease prediction models have been developed to assist physicians and individuals in clinical decision-making. A chronic disease prediction model assesses multiple risk factors together and estimates an absolute disease risk for the individual. Accurate prediction of an individual's future risk for a certain disease enables the comparison of benefits and risks of treatment, the costs of alternative prevention strategies, and selection of the most efficient strategy for the individual. A large number of chronic disease prediction models, especially targeting cardiovascular diseases and cancers, have been suggested, and some of them have been adopted in the clinical practice guidelines and recommendations of many countries. Although few chronic disease prediction tools have been suggested in the Korean population, their clinical utility is not as high as expected. This article reviews methodologies that are commonly used for developing and evaluating a chronic disease prediction model and discusses the current status of chronic disease prediction in Korea.
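    A tiny illustration of how such a prediction model turns several risk factors into an individual absolute risk via a logistic model. The coefficients, risk factors and intercept below are invented for illustration and are not taken from any published score; real tools adopted in guidelines use their own published coefficients and baseline risks.

```python
import math

# Invented coefficients for illustration only (log-odds per unit/category).
intercept = -7.5
coef = {"age_years": 0.06, "systolic_bp": 0.015,
        "smoker": 0.7, "diabetes": 0.6}

def absolute_risk(person):
    """Absolute risk from a logistic model: risk = 1 / (1 + exp(-linear))."""
    linear = intercept + sum(coef[k] * person[k] for k in coef)
    return 1.0 / (1.0 + math.exp(-linear))

person = {"age_years": 58, "systolic_bp": 145, "smoker": 1, "diabetes": 0}
print(f"estimated absolute risk: {absolute_risk(person):.1%}")
```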

  15. A balance procedure for calculating the model fuel assemblies reflooding during design basis accident and its verification on PARAMETER test facility

    Science.gov (United States)

    Bazyuk, S. S.; Ignat'ev, D. N.; Parshin, N. Ya.; Popov, E. B.; Soldatkin, D. M.; Kuzma-Kichta, Yu. A.

    2013-05-01

    A balance procedure is proposed for estimating the main parameters characterizing the reflooding of VVER model fuel assemblies built at different scales under the conditions of a design basis accident with bottom reflooding. The proposed procedure satisfactorily describes the experimental data obtained on the PARAMETER test facility in the temperature range up to 1200°C. The fuel assembly quenching times under bottom reflooding calculated using the proposed procedure are in satisfactory agreement with the experimental data obtained on model fuel assemblies of VVER- and PWR-type reactors and can be used in developing measures aimed at enhancing the safety of nuclear power stations.

  16. Evaluation of burst pressure prediction models for line pipes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xian-Kui, E-mail: zhux@battelle.org [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States); Leis, Brian N. [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States)

    2012-01-15

    Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487-492). It is found that these models can