WorldWideScience

Sample records for distributed analysis test

  1. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  2. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).
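
    As a rough illustration of behaviour (b) above, the sketch below keeps a fixed number of test jobs running per site by topping each site up every poll cycle. It is a self-contained Python toy with a simulated backend; the site names, target count and completion model are invented here, and the real HammerCloud drives Ganga's grid API instead.

```python
import random
import time

# Toy stand-in for HammerCloud's steady-state behaviour: the real service
# submits grid jobs through Ganga; a simulated backend is used here instead.
SITES = ["SITE_A", "SITE_B"]       # hypothetical site names
TARGET_RUNNING = 50                # steady-state job count to hold per site
running = {site: 0 for site in SITES}

def count_running(site):
    # Simulate some jobs finishing since the last poll (80% survive).
    running[site] = sum(random.random() < 0.8 for _ in range(running[site]))
    return running[site]

def submit_jobs(site, n):
    running[site] += n             # pretend all submissions start at once

for cycle in range(3):             # a few poll cycles instead of a daemon loop
    for site in SITES:
        deficit = TARGET_RUNNING - count_running(site)
        if deficit > 0:
            submit_jobs(site, deficit)
        print(f"cycle {cycle}: {site} running {running[site]}")
    time.sleep(0.1)                # a real service would poll every few minutes
```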

  3. Distributed analysis functional testing using GangaRobot in the ATLAS experiment

    Science.gov (United States)

    Legger, Federica; ATLAS Collaboration

    2011-12-01

    Automated distributed analysis tests are necessary to ensure smooth operations of the ATLAS grid resources. The HammerCloud framework allows for easy definition, submission and monitoring of grid test applications. Both functional and stress test applications can be defined in HammerCloud. Stress tests are large-scale tests meant to verify the behaviour of sites under heavy load. Functional tests are light user applications running at each site with high frequency, to ensure that the site functionalities are available at all times. Success or failure rates of these test jobs are individually monitored. Test definitions and results are stored in a database and made available to users and site administrators through a web interface. In this work we present the recent developments of the GangaRobot framework. GangaRobot monitors the outcome of functional tests, creates a blacklist of sites failing the tests, and exports the results to the ATLAS Site Status Board (SSB) and to the Service Availability Monitor (SAM), providing on the one hand a fast way to identify systematic or temporary site failures, and on the other hand allowing for an effective distribution of the workload on the available resources.

  4. A subchannel and CFD analysis of void distribution for the BWR fuel bundle test benchmark

    International Nuclear Information System (INIS)

    In, Wang-Kee; Hwang, Dae-Hyun; Jeong, Jae Jun

    2013-01-01

    Highlights: ► We analyzed subchannel void distributions using subchannel, system and CFD codes. ► The mean error and standard deviation at steady states were compared. ► The deviation of the CFD simulation was greater than those of the other codes. ► The large deviation of the CFD prediction is due to interface model uncertainties.

    Abstract: The subchannel grade and microscopic void distributions in the NUPEC (Nuclear Power Engineering Corporation) BFBT (BWR Full-Size Fine-Mesh Bundle Tests) facility have been evaluated with the subchannel analysis code MATRA, the system code MARS and the CFD code CFX-10. Sixteen test series from five different test bundles were selected for the analysis of the steady-state subchannel void distributions. Four test cases for a high burn-up 8 × 8 fuel bundle with a single water rod were simulated using CFX-10 for the microscopic void distribution benchmark. Two transient cases, a turbine trip without a bypass as a typical power transient and a re-circulation pump trip as a flow transient, were also chosen for this analysis. It was found that the steady-state void distributions calculated by both the MATRA and MARS codes coincided well with the measured data in the range of thermodynamic qualities from 5 to 25%. The results of the transient calculations were also similar to each other and very reasonable. The CFD simulation reproduced the overall radial void distribution trend, which produces less vapor in the central part of the bundle and more vapor in the periphery. However, the predicted variation of the void distribution inside the subchannels is small, while the measured one is large, showing a very high concentration in the center of the subchannels. The variations of the void distribution between the center of the subchannels and the subchannel gap are estimated to be about 5–10% for the CFD prediction and more than 20% for the experiment.

  5. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, only a few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  6. The Analysis of process optimization during the loading distribution test for steam turbine

    International Nuclear Information System (INIS)

    Li Jiangwei; Cao Yuhua; Li Dawei

    2014-01-01

    The loading distribution for the steam turbine must be completed six times in total: the first is completed when the turbine cylinder is buckled, and the rest must be completed in order during the installation of the GVP pipe. Completing the five loading distribution tests and the installation of the GVP pipe usually takes around 90 days at most nuclear plants, while Unit 1 of Fuqing Nuclear Power Station compressed it into about 45 days by optimizing the installation process. This article describes the successful experience of how Unit 1 of Fuqing Nuclear Power Station finished the five loading distribution tests and the installation of the GVP pipe in 45 days by optimizing the process. The advantages and disadvantages are analysed by comparing it with the process provided by the suppliers, which brings up some rationalization proposals for the installation work on the follow-up units of the plant. (authors)

  7. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    Science.gov (United States)

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  8. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of the homogeneity and scale hypotheses when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed exclusively. A photoemulsion experiment analysis and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed.
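
    The abstract concerns exact small-sample distributions; as a simpler, hedged illustration of the underlying likelihood-ratio machinery, the sketch below tests a scale hypothesis in the exponential special case and falls back on the usual asymptotic chi-square reference rather than the exact distributions derived in the paper. The data and the null scale theta0 are synthetic.

```python
import numpy as np
from scipy import stats

# LR test of H0: scale = theta0 for exponential observations. For Exp(theta),
# -2 log LR = 2n [ log(theta0/theta_hat) + theta_hat/theta0 - 1 ],
# compared here to the asymptotic chi-square(1) reference (an assumption;
# the paper works with exact distributions instead).
rng = np.random.default_rng(1)
x = rng.exponential(scale=1.3, size=40)   # synthetic failure times
theta0 = 1.0                              # hypothesized scale
theta_hat = x.mean()                      # MLE of the exponential scale
n = x.size
lr = 2 * n * (np.log(theta0 / theta_hat) + theta_hat / theta0 - 1)
p_value = stats.chi2.sf(lr, df=1)
print(f"-2 log LR = {lr:.3f}, asymptotic p = {p_value:.3f}")
```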

  9. An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests

    Science.gov (United States)

    Attali, Yigal

    2010-01-01

    Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…

  10. Model-Driven Test Generation of Distributed Systems

    Science.gov (United States)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  11. Local flow distribution analysis inside the reactor pools of KALIMER-600 and PDRC performance test facility

    International Nuclear Information System (INIS)

    Jeong, Ji Hwan; Hwang, Seong Won; Choi, Kyeong Sik

    2010-05-01

    In this study, a 3-dimensional thermal hydraulic analysis was carried out focusing on the thermal hydraulic behavior inside the reactor pools of both KALIMER-600 and the one-fifth scale-down test facility. STAR-CD, one of the commercial CFD codes, was used to analyze 3-dimensional incompressible steady-state thermal hydraulic behavior in both designs. In the KALIMER-600 CFD analysis, the pressure drops in the core and IHX agreed within a 1% error range. It was found that the porous media model was appropriate for analyzing the pressure distribution inside the reactor core and IHX. A validation analysis also showed that the pressure drop through the porous media under the condition of 80% flow rate and thermal power was calculated to be 64% less than under the 100% condition, a physically reasonable analytic result. Since the temperatures in the hot-side pool and cold-side pool were estimated to be very close to the design values of 540 and 390 °C, respectively, the CFD models of the heat source and sink were confirmed. Through the study, the methodology of 3-dimensional CFD analysis for KALIMER-600 has been established and proven. Using this methodology, analysis data such as the flow velocity, temperature and pressure distributions were normalized and compared between the full-size model and the scale-down model. As a result, the characteristics of the thermal hydraulic behavior were almost identical for the two models, and the similarity scaling law used by KAERI in the design of the sodium test facility was found to be correct.

  12. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  13. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  14. Modification of Kolmogorov-Smirnov test for DNA content data analysis through distribution alignment.

    Science.gov (United States)

    Huang, Shuguang; Yeo, Adeline A; Li, Shuyu Dan

    2007-10-01

    The Kolmogorov-Smirnov (K-S) test is a statistical method often used for comparing two distributions. In high-throughput screening (HTS) studies, such distributions usually arise from the phenotype of independent cell populations. However, the K-S test has been criticized for being overly sensitive in applications, and it often detects a statistically significant difference that is not biologically meaningful. One major reason is that there is a common phenomenon in HTS studies that systematic drifting exists among the distributions due to reasons such as instrument variation, plate edge effects, accidental differences in sample handling, etc. In particular, in high-content cellular imaging experiments, the location shift can be dramatic since some compounds themselves are fluorescent. This oversensitivity of the K-S test is particularly pronounced in cellular assays where the sample sizes are very large (usually several thousand). In this paper, a modified K-S test is proposed to deal with the nonspecific location-shift problem in HTS studies. Specifically, we propose that the distributions are "normalized" by density curve alignment before the K-S test is conducted. In applications to simulation data and real experimental data, the results show that the proposed method has improved specificity.
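
    A minimal sketch of the alignment idea described above: remove the nonspecific location shift before applying the two-sample K-S test. The authors align full density curves; simple median-centring below is a stand-in for that step, and the data are synthetic.

```python
import numpy as np
from scipy import stats

# Two large samples differing only by a location drift, as in the plate-drift
# scenario described in the abstract.
rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, 5000)
treated = rng.normal(0.4, 1.0, 5000)      # pure location shift, same shape

raw = stats.ks_2samp(control, treated)    # overly sensitive at this sample size
aligned = stats.ks_2samp(control - np.median(control),
                         treated - np.median(treated))
print(f"raw K-S p = {raw.pvalue:.2e}, aligned K-S p = {aligned.pvalue:.2f}")
```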

  15. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  16. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  17. Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis

    Science.gov (United States)

    Das, Samiran

    2018-04-01

    The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unknown and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramer von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered and its performance is assessed against other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression equation form to show their dependence on the shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
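
    The critical values above are approximated by Monte Carlo; as a hedged illustration of that general recipe (not the paper's GNO/L-moment machinery), the sketch below simulates the null distribution of the Anderson-Darling statistic with parameters re-estimated from each sample and reads off an approximate 95% critical value.

```python
import numpy as np
from scipy import stats

# Parametric Monte Carlo for a critical value, using the normal distribution
# and scipy's Anderson-Darling statistic as illustrative stand-ins.
rng = np.random.default_rng(42)
n, n_sim = 50, 2000
ad_stats = []
for _ in range(n_sim):
    sample = rng.normal(size=n)
    # anderson() re-estimates parameters from each sample, mirroring the
    # "parameters estimated from the sample" case discussed above.
    ad_stats.append(stats.anderson(sample, dist='norm').statistic)
crit_95 = np.quantile(ad_stats, 0.95)
print(f"simulated 95% critical value for n={n}: {crit_95:.3f}")
```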

  18. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best-fitting model. Some graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data, which is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large data set is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
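
    A minimal sketch of the workflow described above: fit several candidate distributions and order them by a goodness-of-fit "distance". The candidate set, the synthetic data, and the use of scipy's K-S statistic are illustrative choices.

```python
import numpy as np
from scipy import stats

# Fit a few candidate distributions to one data set and rank them by the
# K-S statistic; smaller means the fitted curve sits closer to the data.
rng = np.random.default_rng(7)
data = rng.gamma(shape=2.0, scale=3.0, size=1000)

for name in ["gamma", "lognorm", "norm"]:
    dist = getattr(stats, name)
    params = dist.fit(data)                       # maximum-likelihood fit
    ks = stats.kstest(data, name, args=params)    # distance to the fitted CDF
    print(f"{name:8s} K-S statistic = {ks.statistic:.4f}")
```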

  19. Preliminary Calculations of Bypass Flow Distribution in a Multi-Block Air Test

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Tak, Nam Il

    2011-01-01

    The development of a methodology for the bypass flow assessment in a prismatic VHTR (Very High Temperature Reactor) core has been conducted at KAERI. A preliminary estimation of the variation of the local bypass flow gap size between graphite blocks in the NHDD core was carried out. With the predicted gap sizes, their influence on the bypass flow distribution and the core hot spot was assessed. Due to the complexity of the gap distributions, a system thermo-fluid analysis code is suggested as a tool for the core thermo-fluid analysis, the model and correlations of which should be validated. In order to generate data for validating the bypass flow analysis model, an experimental facility for a multi-block air test was constructed at Seoul National University (SNU). This study is focused on the preliminary evaluation of the flow distribution in the test section, to understand how the flow is distributed and to help the selection of experimental cases. A commercial CFD code, ANSYS CFX, is used for the analyses.

  20. Wavelet analysis to decompose a vibration simulation signal to improve pre-distribution testing of packaging

    Science.gov (United States)

    Griffiths, K. R.; Hicks, B. J.; Keogh, P. S.; Shires, D.

    2016-08-01

    In general, vehicle vibration is non-stationary and has a non-Gaussian probability distribution; yet existing testing methods for packaging design employ Gaussian distributions to represent vibration induced by road profiles. This frequently results in over-testing and/or over-design of the packaging to meet a specification and correspondingly leads to wasteful packaging and product waste, which represent $15bn per year in the USA and €3bn per year in the EU. The purpose of the paper is to enable a measured non-stationary acceleration signal to be replaced by a constructed signal that includes as far as possible any non-stationary characteristics from the original signal. The constructed signal consists of a concatenation of decomposed shorter duration signals, each having its own kurtosis level. Wavelet analysis is used for the decomposition process into inner and outlier signal components. The constructed signal has a similar PSD to the original signal, without incurring excessive acceleration levels. This allows an improved and more representative simulated input signal to be generated that can be used on the current generation of shaker tables. The wavelet decomposition method is also demonstrated experimentally through two correlation studies. It is shown that significant improvements over current international standards for packaging testing are achievable; hence the potential for more efficient packaging system design is possible.
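
    As a hedged sketch of the decomposition idea, the code below splits a synthetic vibration record into "inner" and "outlier" components by thresholding wavelet detail coefficients with PyWavelets. The wavelet, level and threshold rule are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
import pywt                      # PyWavelets
from scipy import stats

# Build a mostly Gaussian record with one transient burst (high kurtosis).
rng = np.random.default_rng(3)
signal = rng.normal(size=4096)
signal[1000:1020] += 8.0

# Decompose, then split detail coefficients into "inner" (below threshold)
# and "outlier" (above threshold) parts; the threshold rule is illustrative.
coeffs = pywt.wavedec(signal, 'db4', level=5)
thr = 3.0 * np.median([np.std(c) for c in coeffs[1:]])
inner = [coeffs[0]] + [np.where(np.abs(c) <= thr, c, 0.0) for c in coeffs[1:]]
outlier = [np.zeros_like(coeffs[0])] + [np.where(np.abs(c) > thr, c, 0.0)
                                        for c in coeffs[1:]]

inner_sig = pywt.waverec(inner, 'db4')
outlier_sig = pywt.waverec(outlier, 'db4')
print("kurtosis inner:  ", stats.kurtosis(inner_sig, fisher=False))
print("kurtosis outlier:", stats.kurtosis(outlier_sig, fisher=False))
```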

  1. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web systems modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.
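
    QPN models are solved with dedicated tools (e.g., QPME/SimQPN), which are not reproduced here; as a minimal stand-in for the response-time quantity such models predict, the sketch below evaluates the textbook M/M/1 mean response time R = 1/(mu - lambda), an assumption far simpler than a full QPN.

```python
# Mean response time of an M/M/1 queue: R = 1 / (mu - lambda), valid only
# while the utilization rho = lambda / mu stays below 1.
def mm1_response_time(arrival_rate: float, service_rate: float) -> float:
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        raise ValueError("system is unstable (rho >= 1)")
    return 1.0 / (service_rate - arrival_rate)

# e.g. 80 req/s arriving at a server that can handle 100 req/s:
print(f"mean response time: {mm1_response_time(80.0, 100.0) * 1000:.1f} ms")
```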

  2. Distributed Energy Resources Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NREL's Distributed Energy Resources Test Facility (DERTF) is a working laboratory for interconnection and systems integration testing. This state-of-the-art facility...

  3. Similarity Analysis for Reactor Flow Distribution Test and Its Validation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Soon Joon; Ha, Jung Hui [Heungdeok IT Valley, Yongin (Korea, Republic of); Lee, Taehoo; Han, Ji Woong [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    facility. It was clearly found in Hong et al. In this study the feasibility of the similarity analysis of Hong et al. was examined. The similarity analysis was applied to SFR which has been designed in KAERI (Korea Atomic Energy Research Institute) in order to design the reactor flow distribution test. The length scale was assumed to be 1/5, and the velocity scale 1/2, which bounds the square root of the length scale (1/√5). The CFX calculations for both prototype and model were carried out and the flow field was compared.

  4. Goodness-of-fit tests for the Gompertz distribution

    DEFF Research Database (Denmark)

    Lenart, Adam; Missov, Trifon

    The Gompertz distribution is often fitted to lifespan data; however, testing whether the fit satisfies theoretical criteria has been neglected. Here five goodness-of-fit measures are discussed: the Anderson-Darling statistic, the Kullback-Leibler discrimination information, the correlation coefficient test, a test for the mean of the sample hazard, and a nested test against the generalized extreme value distributions. Along with an application to laboratory rat data, critical values calculated by the empirical distribution of the test statistics are also presented.

  5. Accelerated life testing design using geometric process for Pareto distribution

    OpenAIRE

    Mustafa Kamal; Shazia Zarrin; Arif Ul Islam

    2013-01-01

    In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for the Pareto distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained by using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution using the Fisher information matrix are also obtained. The statistical properties of the parameters ...

  6. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  7. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers and modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  8. Sodium flow distribution in test fuel assembly P-23B

    International Nuclear Information System (INIS)

    Taylor, J.P.S.

    1978-08-01

    Relatively large cladding diametral increases in the exterior fuel pins of HEDL's test fuel subassembly P-23B were successfully explained by a thermal-hydraulic/solid mechanics analysis. This analysis indicates that while at power, the subassembly flow was less than planned and that the fuel pins were considerably displaced and bowed from their nominal position. In accomplishing this analysis, a method was developed to estimate the sodium flow distribution and pin distortions in a fuel subassembly at power

  9. Distribution of ground rigidity and ground model for seismic response analysis in Hualian project of large scale seismic test

    International Nuclear Information System (INIS)

    Kokusho, T.; Nishi, K.; Okamoto, T.; Tanaka, Y.; Ueshima, T.; Kudo, K.; Kataoka, T.; Ikemi, M.; Kawai, T.; Sawada, Y.; Suzuki, K.; Yajima, K.; Higashi, S.

    1997-01-01

    An international joint research program called HLSST is proceeding. HLSST is a large-scale seismic test (LSST) to investigate soil-structure interaction (SSI) during large earthquakes in the field at Hualien, a high-seismicity region of Taiwan. A 1/4-scale model building was constructed on the gravelly soil at this site, and backfill material of crushed stone was placed around the model building after the excavation for the construction. The model building and the foundation ground were extensively instrumented to monitor structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed during the construction process, namely before/after the base excavation, after the structure construction and after the backfilling. The distributions of the mechanical properties of the gravelly soil and the backfill were measured after the completion of the construction by penetration tests, PS-logging, etc. This paper describes the distribution and the change of the shear wave velocity (Vs) measured by the field tests. The effect of overburden pressure during the construction process on Vs in the neighbouring soil is discussed, as well as the numerical soil model for SSI analysis. (orig.)

  10. Testing the anisotropy in the angular distribution of Fermi/GBM gamma-ray bursts

    Science.gov (United States)

    Tarnopolski, M.

    2017-12-01

    Gamma-ray bursts (GRBs) were confirmed to be of extragalactic origin due to their isotropic angular distribution, combined with the fact that they exhibited an intensity distribution that deviated strongly from the -3/2 power law. This finding was later confirmed with the first redshift, equal to at least z = 0.835, measured for GRB970508. Despite this result, the data from CGRO/BATSE and Swift/BAT indicate that long GRBs are indeed distributed isotropically, but the distribution of short GRBs is anisotropic. Fermi/GBM has detected 1669 GRBs to date, and their sky distribution is examined in this paper. A number of statistical tests are applied: nearest neighbour analysis, fractal dimension, dipole and quadrupole moments of the distribution function decomposed into spherical harmonics, the binomial test and the two-point angular correlation function. Monte Carlo benchmark testing of each test is performed in order to evaluate its reliability. It is found that short GRBs are distributed anisotropically in the sky, and long ones have an isotropic distribution. The probability that these results are not a chance occurrence is equal to at least 99.98 per cent and 30.68 per cent for short and long GRBs, respectively. The cosmological context of this finding and its relation to large-scale structures is discussed.
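
    One of the statistical tests listed above, the dipole moment, can be sketched compactly: compute the norm of the mean unit vector of the sky positions and calibrate it against isotropic Monte Carlo catalogues, mirroring the benchmark-testing step. The "observed" catalogue below is synthetic, not the Fermi/GBM sample.

```python
import numpy as np

rng = np.random.default_rng(11)

def random_isotropic(n):
    # Uniform cos(theta) and uniform phi give an isotropic sky distribution.
    z = rng.uniform(-1.0, 1.0, n)
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    s = np.sqrt(1.0 - z**2)
    return np.column_stack([s * np.cos(phi), s * np.sin(phi), z])

def dipole(points):
    # Norm of the mean unit vector; zero in expectation for isotropy.
    return np.linalg.norm(points.mean(axis=0))

obs = random_isotropic(1669)          # stand-in for an observed catalogue
d_obs = dipole(obs)
d_mc = np.array([dipole(random_isotropic(1669)) for _ in range(2000)])
p = (d_mc >= d_obs).mean()            # fraction of isotropic skies as extreme
print(f"dipole = {d_obs:.4f}, isotropic p = {p:.3f}")
```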

  11. Auxiliary Heat Exchanger Flow Distribution Test

    International Nuclear Information System (INIS)

    Kaufman, J.S.; Bressler, M.M.

    1983-01-01

    The Auxiliary Heat Exchanger Flow Distribution Test was the first part of a test program to develop a water-cooled (tube-side), compact heat exchanger for removing heat from the circulating gas in a high-temperature gas-cooled reactor (HTGR). Measurements of velocity and pressure were made with various shell side inlet and outlet configurations. A flow configuration was developed which provides acceptable velocity distribution throughout the heat exchanger without adding excessive pressure drop

  12. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these general distributions are among the best choices for frequency analysis. The entropy-based derivation led to a new way of performing frequency analysis of hydrometeorological extremes.
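
    A hedged sketch of the model-comparison step: fit candidate distributions to (synthetic) extreme rainfall data and rank them by AIC. scipy's gengamma stands in for the GG distribution, and ordinary maximum likelihood replaces the paper's entropy-based parameter estimation.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for extreme rainfall depths; real use would load a
# series of annual maxima instead.
rng = np.random.default_rng(5)
rain = rng.gamma(shape=1.5, scale=20.0, size=500)

# Fit each candidate by MLE and compare via AIC = 2k - 2 log L.
for dist in [stats.gengamma, stats.gamma, stats.genextreme]:
    params = dist.fit(rain)
    loglik = dist.logpdf(rain, *params).sum()
    aic = 2 * len(params) - 2 * loglik
    print(f"{dist.name:10s} AIC = {aic:.1f}")
```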

  13. First experience and adaptation of existing tools to ATLAS distributed analysis

    International Nuclear Information System (INIS)

    De La Hoz, S.G.; Ruiz, L.M.; Liko, D.

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale in ATLAS. Up to 10000 jobs were processed on about 100 sites in one day. The experience obtained operating the system on several grid flavours was essential to perform user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, only a few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval. (orig.)

  14. 10 CFR 431.198 - Enforcement testing for distribution transformers.

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), Commercial and Industrial Equipment, Distribution Transformers, Compliance and Enforcement: § 431.198 Enforcement testing for distribution transformers. (a) Test notice. Upon receiving information in writing...

  15. Real time testing of intelligent relays for synchronous distributed generation islanding detection

    Science.gov (United States)

    Zhuang, Davy

    As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders, gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be islanded for safety reasons when disconnected or isolated from the main feeder as distributed generator islanding may create hazards to utility and third-party personnel, and possibly damage the distribution system infrastructure, including the distributed generators. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay, against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays on a real time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay, by running a large number of tests, reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms frequency, voltage and rate of change of frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.
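
    The "multivariable analysis and data mining methods to arrive at decision trees" can be sketched as follows: train a small decision tree on relay measurements to separate islanded from grid-connected operation. The two features (frequency and voltage deviation) and the synthetic training data are illustrative assumptions, not the relay's actual protection handles or training set.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic relay measurements: small excursions while grid-connected,
# larger frequency/voltage excursions after an islanding event.
rng = np.random.default_rng(2)
n = 2000
grid = np.column_stack([rng.normal(0.0, 0.05, n),    # df (Hz), tight when connected
                        rng.normal(0.0, 0.01, n)])   # dV (pu)
island = np.column_stack([rng.normal(0.6, 0.3, n),
                          rng.normal(0.08, 0.05, n)])
X = np.vstack([grid, island])
y = np.array([0] * n + [1] * n)                      # 1 = islanding

# A shallow tree keeps the learned thresholds interpretable as relay settings.
tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(tree.predict([[0.02, 0.005], [0.7, 0.1]]))     # expect [0 1]
```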

  16. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  17. A practical test for the choice of mixing distribution in discrete choice models

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Bierlaire, Michel

    2007-01-01

    The choice of a specific distribution for the random parameters of discrete choice models is a critical issue in transportation analysis. Indeed, various pieces of research have demonstrated that an inappropriate choice of the distribution may lead to serious bias in model forecasts and in the estimated means of the random parameters. In this paper, we propose a practical test based on seminonparametric techniques. The test is analyzed on both synthetic and real data, and is shown to be simple and powerful. (c) 2007 Elsevier Ltd. All rights reserved.

  18. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    Science.gov (United States)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
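
    The core comparison lends itself to a small simulation: estimate a linear model's slope with the same test budget spent either clustered at two conditions or distributed across many, and compare the scatter of the estimates. The model, noise level and budget below are illustrative assumptions, not the paper's simulations.

```python
import numpy as np

# Same budget of 20 test points, spent two ways; repeat many times and
# compare the standard deviation of the fitted slope.
rng = np.random.default_rng(9)
model = lambda x: 2.0 + 3.0 * x          # assumed "truth" for the simulation
budget, reps = 20, 4000

def slope_sd(xs):
    est = []
    for _ in range(reps):
        y = model(xs) + rng.normal(0.0, 1.0, xs.size)
        est.append(np.polyfit(xs, y, 1)[0])   # fitted slope
    return np.std(est)

clustered = np.repeat([0.25, 0.75], budget // 2)   # many repeats, 2 conditions
distributed = np.linspace(0.0, 1.0, budget)        # one point per condition
print(f"slope sd, clustered:   {slope_sd(clustered):.4f}")
print(f"slope sd, distributed: {slope_sd(distributed):.4f}")
```

    Under this linear model the distributed design spreads the x-values more widely, so the slope estimate comes out noticeably tighter, which is the direction of the paper's conclusion.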

  19. Final comparison report on ISP-35: Nupec hydrogen mixing and distribution test (Test M-7-1)

    International Nuclear Information System (INIS)

    1994-12-01

    This final comparison report summarizes the results of the OECD/CSNI-sponsored ISP-35 exercise, which was based on NUPEC's Hydrogen Mixing and Distribution Test M-7-1. Twelve organizations from 10 different countries took part in the exercise. For the ISP-35 test, a steam/light gas (helium) mixture was released into the lower region of a simplified model of a PWR containment. At the same time, the dome cooling spray was also activated. The transient time histories for gas temperature and concentrations were recorded for each of the 25 compartments of the model containment. The wall temperatures as well as the dome pressure were also recorded. The ISP-35 participants simulated the test conditions and attempted to predict the time histories using their accident analysis codes. Results of these analyses are presented, and comparisons are made between the experimental data and the calculated data. In general, predictions for pressure, helium concentration and gas distribution patterns were achieved with acceptable accuracy.

  20. Factors affecting daughters distribution among progeny testing Holstein bulls

    Directory of Open Access Journals (Sweden)

    Martino Cassandro

    2012-01-01

    The aim of this study was to investigate factors influencing the number of daughters of Holstein bulls during progeny testing, using data provided by the Italian Holstein Friesian Cattle Breeders Association. The hypothesis is that there are no differences among artificial insemination studs (AIS) in the distribution of daughters among progeny testing bulls. For each bull, beginning from 21 months of age, the distribution of daughters over the progeny testing period was calculated. Data were available on 1973 bulls born between 1986 and 2004, progeny tested in Italy and with at least 4 paternal half-sibs. On average, bulls exited the genetic centre at 11.3±1.1 months and reached their first official genetic proof at 58.0±3.1 months of age. An analysis of variance was performed on the cumulative frequency of daughters at 24, 36, 48, and 60 months. The generalized linear model included the fixed effects of year of birth of the bull (18 levels), artificial insemination stud (4 levels) and sire of bull (137 levels). All effects significantly affected the variability of the studied traits. Artificial insemination stud was the most important source of variation, followed by year of birth and sire of bull. Significant differences among AI studs exist, probably reflecting different strategies adopted during progeny testing.

  1. Moisture distribution in sludges based on different testing methods

    Institute of Scientific and Technical Information of China (English)

    Wenyi Deng; Xiaodong Li; Jianhua Yan; Fei Wang; Yong Chi; Kefa Cen

    2011-01-01

    Moisture distributions in municipal sewage sludge, printing and dyeing sludge and paper mill sludge were experimentally studied using four different methods, i.e., a drying test, a thermogravimetric-differential thermal analysis (TG-DTA) test, a thermogravimetric-differential scanning calorimetry (TG-DSC) test and a water activity test. The results indicated that the moisture in the mechanically dewatered sludges consisted of interstitial water, surface water and bound water. The interstitial water accounted for more than 50% wet basis (wb) of the total moisture content. The bond strength of the sludge moisture increased with decreasing moisture content, especially when the moisture content was lower than 50% wb. Furthermore, a comparison among the four testing methods is presented. The drying test has the advantage of being able to quantify free water, interstitial water, surface water and bound water, while the TG-DSC, TG-DTA and water activity tests are capable of determining the bond strength of moisture in sludge. It was found that the results from the TG-DSC and TG-DTA tests are more persuasive than those from the water activity test.

  2. A brief overview of the distribution test grids with a distributed generation inclusion case study

    Directory of Open Access Journals (Sweden)

    Stanisavljević Aleksandar M.

    2018-01-01

    The paper presents an overview of the electric distribution test grids issued by different technical institutions. They are used for testing different scenarios in the operation of a grid, for research, benchmarking, comparison and other purposes. Their types, main characteristics, features and application possibilities are shown. Recently, these grids have been modified to include distributed generation. An example of the modification and application of the IEEE 13-bus grid for testing the effects of faults, in cases without and with a distributed generator connected to the grid, is presented. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. III 042004: Smart Electricity Distribution Grids Based on Distribution Management System and Distributed Generation]

  3. Approximations to the distribution of a test statistic in covariance structure analysis: A comprehensive study.

    Science.gov (United States)

    Wu, Hao

    2018-05-01

    In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.

  4. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    This paper applies non-parametric and parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as outputs. The data cover 307 (out of 553) ...

  5. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The demands on resource management are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences with distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration and interaction of GANGA with the ATLAS data management system DQ2 is a key functionality. Intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2-supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports, amongst other things, user analysis with reconstructed data and small-scale production of Monte Carlo data.

  6. Distributed Analysis Experience using Ganga on an ATLAS Tier2 infrastructure

    International Nuclear Information System (INIS)

    Fassi, F.; Cabrera, S.; Vives, R.; Fernandez, A.; Gonzalez de la Hoz, S.; Sanchez, J.; March, L.; Salt, J.; Kaci, M.; Lamas, A.; Amoros, G.

    2007-01-01

    The ATLAS detector will explore the high-energy frontier of particle physics, collecting the proton-proton collisions delivered by the LHC (Large Hadron Collider). Starting in spring 2008, the LHC will produce more than 10 petabytes of data per year. The tiered hierarchy adopted for the computing model at the LHC is: Tier-0 (CERN), and Tier-1 and Tier-2 centres distributed around the world. The ATLAS Distributed Analysis (DA) system has the goal of enabling physicists to perform Grid-based analysis on distributed data using distributed computing resources. The IFIC Tier-2 facility is participating in several aspects of DA. In support of the ATLAS DA activities, a prototype is being tested, deployed and integrated. The analysis data processing applications are based on the Athena framework. GANGA, developed by the LHCb and ATLAS experiments, allows simple switching between testing on a local batch system and large-scale processing on the Grid, hiding Grid complexities. GANGA provides physicists with an integrated environment for job preparation, bookkeeping and archiving, and job splitting and merging. The experience with the deployment, configuration and operation of the DA prototype will be presented. Experience gained using the DA system and GANGA in top physics analysis will be described. (Author)

  7. Distribution of base rock depth estimated from Rayleigh wave measurement by forced vibration tests

    International Nuclear Information System (INIS)

    Hiroshi Hibino; Toshiro Maeda; Chiaki Yoshimura; Yasuo Uchiyama

    2005-01-01

    This paper presents an application of Rayleigh wave methods to a real site, performed to determine the spatial distribution of base rock depth below the ground surface. At a site in the Sagami Plain in Japan, the base rock depth is assumed, from boring investigations, to range up to 10 m. An accurate picture of the base rock depth distribution was needed for pile design and construction. In order to measure Rayleigh wave phase velocity, forced vibration tests were conducted with a 500 N vertical shaker and linear arrays of three vertical sensors situated at several points in two zones around the edges of the site. An inversion analysis for the soil profile was then carried out with a genetic algorithm, matching the measured Rayleigh wave phase velocity with its computed counterpart. The distribution of base rock depth obtained from the inversion was consistent with the inclination of the base rock roughly estimated from the boring tests: the base rock is shallow near the edge of the site and gradually inclines towards its centre. The inversion analysis placed the base rock depth at 5 m to 6 m at the edge of the site and 10 m at its centre. The base rock depths determined by this method agreed well at most of the points where boring investigations were performed. As a result, it was confirmed that forced vibration tests using Rayleigh wave methods can be a useful, practical technique for estimating surface soil profiles to depths of up to 10 m. (authors)

  8. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Studying the content of images is an important topic, requiring reasonable and accurate image analysis. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life, making it a highlighted research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean for both colour and grey images, and the final step compares the results obtained in the different cases of the test phase. The statistical parameters are used to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean via the implemented system. The major issue addressed in this work is brightness distribution, analysed via statistical measures under different types of lighting.
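
    The mean and standard deviation the abstract refers to are one-line computations on an image array. A minimal sketch in Python with NumPy, using a random array as a stand-in for real image data:

```python
import numpy as np

# Grey image stand-in: brightness values in 0..255.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(480, 640)).astype(float)

print(img.mean(), img.std())  # brightness statistics of the whole image

# For a colour image of shape (H, W, 3), per-channel statistics:
# img.mean(axis=(0, 1)), img.std(axis=(0, 1))
```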

  9. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing differentiability of the density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics.
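
    The normalized spacings the abstract mentions are simple to compute. A minimal sketch of this one ingredient (not the authors' full procedure): under a constant hazard rate the normalized spacings are i.i.d. exponential, while an increasing hazard rate tends to make later spacings stochastically smaller.

```python
import numpy as np

def normalized_spacings(x):
    """D_i = (n - i + 1) * (X_(i) - X_(i-1)) for the sorted sample,
    with X_(0) = 0."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    gaps = np.diff(np.concatenate(([0.0], xs)))
    return (n - np.arange(n)) * gaps

rng = np.random.default_rng(0)
print(normalized_spacings(rng.weibull(2.0, size=10)))  # IHR example (shape > 1)
```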

  10. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Due to the unbalanced nature of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on these systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; the computational performance should therefore also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents caused by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD) of the bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming inversion of the full admittance matrix employed by commonly used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method provides novel viewpoints for calculating the branch currents and bus voltages under harmonic pollution, which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
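
    Once the harmonic propagation has been solved for each order, the THD itself is a one-line formula: the RMS of the harmonic magnitudes divided by the fundamental. A minimal sketch of this generic definition (not the paper's relationship-matrix method):

```python
import numpy as np

def thd(harmonic_magnitudes):
    """THD of a bus voltage given |V_h| for h = 1 (fundamental), 2, 3, ..."""
    v = np.asarray(harmonic_magnitudes, dtype=float)
    fundamental, harmonics = v[0], v[1:]
    return np.sqrt(np.sum(harmonics ** 2)) / fundamental

# Example: 1.0 pu fundamental with 5th and 7th harmonic components.
print(thd([1.0, 0.0, 0.0, 0.0, 0.03, 0.0, 0.02]))  # ~0.036, i.e. 3.6% THD
```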

  11. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    Science.gov (United States)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

    In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test, or its single-block version, may find applications in many areas such as psychology, education, medicine and genetics, being important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypothesis of independence of groups of variables and the hypothesis of equicorrelation and equivariance, we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.

  12. Post-test analysis of ROSA-III experiment Run 702

    International Nuclear Information System (INIS)

    Koizumi, Yasuo; Kikuchi, Osamu; Soda, Kunihisa

    1980-01-01

    The purpose of the ROSA-III experiments with a scaled BWR test facility is to examine the primary coolant thermal-hydraulic behavior and the performance of the ECCS during a postulated loss-of-coolant accident of a BWR. The results provide information for the verification and improvement of reactor safety analysis codes. Run 702 assumed a 200% split break at the recirculation pump suction line at average core power without ECCS activation. Post-test analysis of the Run 702 experiment was made with the computer code RELAP4J. The calculated system pressure agreed well with the experimental one. However, the calculated heater surface temperatures were higher than the measured ones, and the axial temperature distribution differed in tendency from the experimental one. These results indicate the need to improve the analytical model of the void distribution in the core and the nodalization in the pressure vessel, in order to make the analysis more realistic. They also indicate the need for characterization tests of the ROSA-III test facility components, such as the jet pump and the piping form loss coefficients; likewise, the flow rate measurements must be extended and refined. (author)

  13. Asymptotically Distribution-Free Goodness-of-Fit Testing for Copulas

    NARCIS (Netherlands)

    Can, S.U.; Einmahl, John; Laeven, R.J.A.

    2017-01-01

    Consider a random sample from a continuous multivariate distribution function F with copula C. In order to test the null hypothesis that C belongs to a certain parametric family, we construct a process that is asymptotically distribution-free under H0 and serves as a test generator.

  14. 21 CFR 211.165 - Testing and release for distribution.

    Science.gov (United States)

    2010-04-01

    21 Food and Drugs (2010-04-01 edition), Drugs: General - Current Good Manufacturing Practice for Finished Pharmaceuticals, Laboratory Controls, § 211.165 Testing and release for distribution: (a) For each batch of drug product, there shall be...

  15. Test Protocol for Room-to-Room Distribution of Outside Air by Residential Ventilation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Barley, C. D.; Anderson, R.; Hendron, B.; Hancock, E.

    2007-12-01

    This test and analysis protocol has been developed as a practical approach for measuring outside air distribution in homes. It has been used successfully in field tests and has led to significant insights on ventilation design issues. Performance advantages of more sophisticated ventilation systems over simpler, less-costly designs have been verified, and specific problems, such as airflow short-circuiting, have been identified.

  16. A well test analysis method accounting for pre-test operations

    International Nuclear Information System (INIS)

    Silin, D.B.; Tsang, C.-F.

    2003-01-01

    We propose to use regular monitoring data from a production or injection well to estimate the formation hydraulic properties in the vicinity of the wellbore without interrupting operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from an analysis of these data. A distinctive feature that sets the proposed approach apart from conventional methods is the introduction of an additional parameter, an effective pre-test pumping rate. This parameter is derived from a rigorous asymptotic analysis of the flow model. We thereby account for the non-uniform pressure distribution at the beginning of the testing interval caused by pre-test operations at the well. With synthetic and field examples, we demonstrate that the deviation of the matching curve from the data that is usually attributed to skin and wellbore storage effects can also be interpreted through this new parameter. Moreover, with our method the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for finding the best-fitting parameters. We augment the analysis with a procedure for estimating the ambient reservoir pressure and the dimensionless wellbore radius. The methods reported here have been implemented in the code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties.

  17. Statistical test for the distribution of galaxies on plates

    International Nuclear Information System (INIS)

    Garcia Lambas, D.

    1985-01-01

    A statistical test for the distribution of galaxies on plates is presented. We apply the test to synthetic astronomical plates obtained by numerical simulation (Garcia Lambas and Sersic 1983) with three different models for the 3-dimensional distribution. Comparison with an observational plate suggests the presence of filamentary structure. (author)

  18. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems such as Xrootd, where data are distributed over the computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and to get real-time feedback on analysis progress and intermediate results. We discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We also describe PROOF integration with the ATLAS distributed data management system and the prospects for running PROOF on geographically distributed analysis farms.

  20. Sodium flow distribution test of the air cooler tubes

    International Nuclear Information System (INIS)

    Uchida, Hiroyuki; Ohta, Hidehisa; Shimazu, Hisashi

    1980-01-01

    In the heat transfer tubes of the air cooler installed in the auxiliary core cooling system of the prototype fast breeder reactor ''Monju'', sodium freezing could be caused by undercooling of the sodium induced by an extremely unbalanced sodium flow among the tubes. A sodium flow distribution test of the air cooler tubes was therefore performed to examine the flow distribution among the tubes and to estimate the possibility of sodium freezing in them. The test was performed using a one-fourth-scale air cooler model installed in a water flow test facility. The test results show that the flow distribution from the inlet header to each tube is almost equal at all operating conditions, with the velocity deviation from the normalized mean velocity being less than 6%, and that sodium freezing does not occur for air velocity deviations of up to 250% at the stand-by condition. It is clear that the proposed air cooler design for ''Monju'' will have a good sodium flow distribution at all operating conditions. (author)

  1. Formability analysis of sheet metals by cruciform testing

    Science.gov (United States)

    Güler, B.; Alkan, K.; Efe, M.

    2017-09-01

    Cruciform biaxial tests are becoming increasingly popular for testing the formability of sheet metals, as they achieve frictionless, in-plane, multi-axial stress states with a single sample geometry. However, premature fracture of the samples during testing prevents the large-strain deformation necessary for formability analysis. In this work, we introduce a miniature cruciform sample design (test region of a few mm) and a test setup that achieve centre fracture and large uniform strains. With its excellent surface finish and optimized geometry, the sample deforms with diagonal strain bands intersecting at the test region. These bands prevent local necking and concentrate the strains at the sample centre. Imaging and strain analysis during testing confirm that uniform strain distributions and centre fracture are achievable for various strain paths ranging from plane-strain to equibiaxial tension. Moreover, the sample deforms without deviating from the predetermined strain ratio at all test conditions, allowing formability analysis at large strains. We demonstrate these features of the cruciform test for three sample materials, Aluminium 6061-T6 alloy, DC-04 steel and Magnesium AZ31 alloy, and investigate their formability at both the millimetre scale and the microstructure scale.

  2. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability is a key aspect of modern power system planning, design, and operation. The rise of the smart grid concept has raised hopes of developing an intelligent, self-healing network that can overcome the interruption problems that face utilities and cost them tens of millions in repairs and losses. To address reliability concerns, power utilities and interested parties have spent extensive time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection between power providers and consumers where most electricity problems occur. In this work, we examine the effect of smart grid applications on the reliability of power distribution networks. The test system used in this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system, measured by changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
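
    The indices named above follow the standard IEEE 1366 definitions, so the quantification step reduces to simple sums over interruption records. A minimal sketch with invented outage data:

```python
# Each record: (customers_affected, outage_duration_hours); data are invented.
interruptions = [(120, 1.5), (40, 0.5), (300, 2.0)]
total_customers = 1000

saifi = sum(n for n, _ in interruptions) / total_customers      # interruptions per customer
saidi = sum(n * d for n, d in interruptions) / total_customers  # outage hours per customer
caidi = saidi / saifi                                           # average hours per interruption

print(saifi, saidi, caidi)  # 0.46, 0.8, ~1.74
```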

  3. Goodness-of-fit tests for a heavy tailed distribution

    NARCIS (Netherlands)

    A.J. Koning (Alex); L. Peng (Liang)

    2005-01-01

    For testing whether a distribution function is heavy tailed, we study the Kolmogorov test, the Berk-Jones test, the score test and their integrated versions. A comparison is conducted via Bahadur efficiency and simulations. The score test and the integrated score test show the best performance.

  4. Multipath interference test method for distributed amplifiers

    Science.gov (United States)

    Okada, Takahiro; Aida, Kazuo

    2005-12-01

    A method for testing distributed amplifiers is presented in which the multipath interference (MPI) is detected as a beat spectrum between the multipath signal and the direct signal, using a binary frequency-shift-keyed (FSK) test signal. The lightwave source is a DFB-LD directly modulated by a pulse stream passed through an equalizer, emitting an FSK signal with a frequency deviation of about 430 MHz at a repetition rate of 80-100 kHz. The receiver consists of a photodiode and an electrical spectrum analyzer (ESA). The baseband power spectrum peak appearing at the FSK frequency deviation can be converted to an amount of MPI using a calibration chart. The method improves the minimum detectable MPI to as low as -70 dB, compared with -50 dB for the conventional test method. The detailed design and performance of the proposed method are discussed, including the MPI simulator for the calibration procedure, computer simulations evaluating the error caused by the FSK repetition rate and the length of the fiber under test, and experiments on single-mode fibers and a distributed Raman amplifier.

  5. Effect of distributive mass of spring on power flow in engineering test

    Science.gov (United States)

    Sheng, Meiping; Wang, Ting; Wang, Minqing; Wang, Xiao; Zhao, Xuan

    2018-06-01

    The mass of a spring is usually neglected in theoretical and simulation analyses, yet it may be significant in practical engineering. This paper is concerned with the distributed mass of a steel spring used as an isolator to simulate the isolation performance of a water pipe in a heating system. A theoretical derivation of the effect of the spring's distributed mass on vibration is presented, and multiple eigenfrequencies are obtained, showing that distributed mass gives rise to extra modes and complex impedance properties. Numerical simulation further shows several anti-resonances of the steel spring in the impedance and power flow curves. When anti-resonances occur, the spring stores a large amount of energy, which may cause damage and unexpected consequences in practical engineering and therefore needs to be avoided. Finally, experimental tests are conducted, and the results are consistent with the simulation of the spring with distributed mass.

  6. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner, with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise the benefits, it is crucial for a DISCO to ensure that these DG units are of optimal size and sited at the best locations on the network. This paper gives an overview of a software package developed in this study, called the Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical user-interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to obtain the DG optimal size and location simultaneously. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results were evaluated against a similar existing package cited in the literature; they are impressive and computationally efficient.

  7. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  8. Materials Science Research Rack-1 Fire Suppressant Distribution Test Report

    Science.gov (United States)

    Wieland, P. O.

    2002-01-01

    Fire suppressant distribution testing was performed on the Materials Science Research Rack-1 (MSRR-1), a furnace facility payload that will be installed in the U.S. Lab module of the International Space Station. Unlike racks that were tested previously, the MSRR-1 uses the Active Rack Isolation System (ARIS) to reduce vibration on experiments, so the effects of ARIS on fire suppressant distribution were unknown. Two tests were performed to map the distribution of CO2 fire suppressant throughout a mockup of the MSRR-1 designed to have the same component volumes and flowpath restrictions as the flight rack. For the first test, the average maximum CO2 concentration for the rack was 60 percent, achieved within 45 s of discharge initiation, meeting the requirement to reach 50 percent throughout the rack within 1 min. For the second test, one of the experiment mockups was removed to provide a worst-case configuration, and the average maximum CO2 concentration for the rack was 58 percent. Comparing the results of this testing with results from previous testing leads to several general conclusions that can be used to evaluate future racks. The MSRR-1 will meet the requirements for fire suppressant distribution. Primary factors that affect the ability to meet the CO2 distribution requirements are the free air volume in the rack and the total area and distribution of openings in the rack shell. The length of the suppressant flowpath and degree of tortuousness has little correlation with CO2 concentration. The total area of holes in the rack shell could be significantly increased. The free air volume could be significantly increased. To ensure the highest maximum CO2 concentration, the PFE nozzle should be inserted to the stop on the nozzle.

  9. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure the performance of user jobs.

  10. An adaptive Mantel-Haenszel test for sensitivity analysis in observational studies.

    Science.gov (United States)

    Rosenbaum, Paul R; Small, Dylan S

    2017-06-01

    In a sensitivity analysis in an observational study with a binary outcome, is it better to use all of the data or to focus on subgroups that are expected to experience the largest treatment effects? The answer depends on features of the data that may be difficult to anticipate, a trade-off between unknown effect sizes and known sample sizes. We propose a sensitivity analysis for an adaptive test similar to the Mantel-Haenszel test. The adaptive test performs two highly correlated analyses: one focused analysis using a subgroup, and one combined analysis using all of the data, correcting for multiple testing using the joint distribution of the two test statistics. Because the two component tests are highly correlated, this correction for multiple testing is small compared with, for instance, the Bonferroni inequality. The test has the maximum design sensitivity of the two component tests. A simulation evaluates the power of a sensitivity analysis using the adaptive test. Two examples are presented. An R package, sensitivity2x2xk, implements the procedure. © 2016, The International Biometric Society.
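
    For orientation, the classical Mantel-Haenszel statistic underlying the component analyses can be sketched in a few lines. This is not the authors' adaptive procedure (for that, see their R package), just the standard stratified 2x2 test:

```python
import numpy as np
from scipy.stats import norm

def mantel_haenszel_z(tables):
    """Mantel-Haenszel Z statistic over k 2x2 tables ((a, b), (c, d))
    (no continuity correction)."""
    a_sum = e_sum = v_sum = 0.0
    for (a, b), (c, d) in tables:
        n = a + b + c + d
        a_sum += a
        e_sum += (a + b) * (a + c) / n
        v_sum += (a + b) * (c + d) * (a + c) * (b + d) / (n ** 2 * (n - 1))
    return (a_sum - e_sum) / np.sqrt(v_sum)

z = mantel_haenszel_z([((10, 5), (3, 12)), ((8, 7), (4, 11))])
print(z, 2 * norm.sf(abs(z)))  # statistic and two-sided p-value
```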

  11. Distributed Sensor Network Software Development Testing through Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Sean M. [Univ. of New Mexico, Albuquerque, NM (United States)

    2003-12-01

    The distributed sensor network (DSN) presents a novel and highly complex computing platform with difficulties and opportunities that are just beginning to be explored. The potential of sensor networks extends from monitoring for threat reduction, to conducting instant and remote inventories, to ecological surveys. Developing and testing for robust and scalable applications is currently practiced almost exclusively in hardware. The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for DSNs independent of hardware constraints. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness and scaling issues, to explore arbitrary algorithms for distributed sensors, and to defeat those algorithms through simulated failure. The user specifies the topology, the environment, the application, and any number of arbitrary failures; DSS provides the virtual environmental embedding.

  12. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation and load. This increases the number of stochastic inputs, and the dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues, and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  13. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    The location of distribution facilities is highly important for survival in today's highly competitive business world. Companies can operate multiple distribution centers to mitigate supply chain risk, which raises new problems, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta, which ranks among the top five fast-food restaurant chains by retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center, and the network analysis results show a more efficient process, reflected in a shorter distance in the distribution process.
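
    The clustering stage of such a study can be prototyped in a few lines: cluster the store coordinates and read off the centroids as candidate distribution-center sites. A minimal sketch with invented coordinates (not the study's data):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical store locations as (longitude, latitude) pairs.
stores = np.array([
    [106.80, -6.20], [106.85, -6.22], [106.70, -6.35],
    [106.95, -6.15], [106.90, -6.30], [106.75, -6.10],
])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(stores)
print(km.cluster_centers_)  # candidate distribution-center sites
print(km.labels_)           # store-to-center assignment
```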

  14. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques are used to study the behaviour of hazard distribution models. The fundamentals of hazard analysis are discussed using failure criteria, and we illustrate the flexibility of a hazard modelling distribution that approximates different distributions.

  15. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.

  16. Renewable Distributed Generation Models in Three-Phase Load Flow Analysis for Smart Grid

    Directory of Open Access Journals (Sweden)

    K. M. Nor

    2013-11-01

    The paper presents renewable distributed generation (RDG) models as three-phase resources in load flow computation and analyzes their effect when they are connected in composite networks. The RDG models considered comprise photovoltaic (PV) generation and wind turbine generation (WTG). Voltage-controlled nodes and complex power injection nodes are used in the models. These improved models are suitable for smart grid power system analysis. A combination of IEEE transmission data and IEEE test feeders is used to test the algorithm on balanced and unbalanced multi-phase distribution system problems. The simulation results show that increasing the number and size of RDG units improves the voltage profile and reduces system losses.

  17. 242A Distributed Control System Year 2000 Acceptance Test Report

    Energy Technology Data Exchange (ETDEWEB)

    TEATS, M.C.

    1999-08-31

    This report documents the acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance, obtained by acceptance testing as directed by procedure HNF-2695. The verification procedure documents the initial testing and evaluation of potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data are recorded using the current, as-is operating system software. Data are also collected for operating system software that has been modified to correct year 2000 problems. The verification procedure is intended to be generic, such that it may be performed on any D/3 (trademark of GSE Process Solutions, Inc.) distributed control system that runs the VMS (trademark of Digital Equipment Corporation) operating system. The test may be run on simulation or production systems, depending upon facility status. On production systems, DCS outages will occur nine times throughout the performance of the test; these outages are expected to last about 10 minutes each.

  19. A multivariate rank test for comparing mass size distributions

    KAUST Repository

    Lombard, F.

    2012-04-01

    Particle size analyses of a raw material are commonplace in the mineral processing industry. Knowledge of particle size distributions is crucial in planning milling operations to enable an optimum degree of liberation of valuable mineral phases, to minimize plant losses due to an excess of oversize or undersize material, or to attain a size distribution that fits a contractual specification. The problem addressed in the present paper is how to test the equality of two or more underlying size distributions. A distinguishing feature of these size distributions is that they are not based on counts of individual particles. Rather, they are mass size distributions giving the fractions of the total mass of a sampled material lying in each of a number of size intervals. As such, the data are compositional in nature, in the terminology of Aitchison [1]; that is, multivariate vectors whose components add to 100%. In the literature, various versions of Hotelling's T² have been used to compare matched pairs of such compositional data. In this paper, we propose a robust test procedure based on ranks as a competitor to Hotelling's T². In contrast to the latter statistic, the power of the rank test is not unduly affected by the presence of outliers or of zeros among the data. © 2012 Copyright Taylor and Francis Group, LLC.

  20. A survey of residual analysis and a new test of residual trend.

    Science.gov (United States)

    McDowell, J J; Calvin, Olivia L; Klapes, Bryan

    2016-05-01

    A survey of residual analysis in behavior-analytic research reveals that existing methods are problematic in one way or another. A new test for residual trends is proposed that avoids the problematic features of the existing methods. It entails fitting cubic polynomials to sets of residuals and comparing their effect sizes to those that would be expected if the sets of residuals were random. To this end, sampling distributions of effect sizes for fits of a cubic polynomial to random data were obtained by generating sets of random standardized residuals of various sizes, n. A cubic polynomial was then fitted to each set of residuals and its effect size was calculated. This yielded a sampling distribution of effect sizes for each n. To test for a residual trend in experimental data, the median effect size of cubic-polynomial fits to sets of experimental residuals can be compared to the median of the corresponding sampling distribution of effect sizes for random residuals using a sign test. An example from the literature, which entailed comparing mathematical and computational models of continuous choice, is used to illustrate the utility of the test. © 2016 Society for the Experimental Analysis of Behavior.
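
    The proposed test is easy to reproduce in outline: fit a cubic to each set of residuals, record the effect size, and build the null sampling distribution from random residuals. A minimal sketch of that null distribution, using R² as the effect size (an assumption, since the paper's exact effect-size measure is not specified here):

```python
import numpy as np

def cubic_r2(residuals):
    """Effect size (R^2) of a cubic-polynomial fit to one set of residuals."""
    x = np.arange(len(residuals))
    fit = np.polyval(np.polyfit(x, residuals, 3), x)
    ss_res = np.sum((residuals - fit) ** 2)
    ss_tot = np.sum((residuals - np.mean(residuals)) ** 2)
    return 1.0 - ss_res / ss_tot

# Null sampling distribution for sets of n random standardized residuals.
rng = np.random.default_rng(0)
n, reps = 20, 10_000
null_r2 = np.array([cubic_r2(rng.standard_normal(n)) for _ in range(reps)])

# Compare the median effect size of experimental residual sets against
# this null median with a sign test.
print(np.median(null_r2))
```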

  1. The Test for Flow Characteristics of Tubular Fuel Assembly(II) - Experimental results and CFD analysis

    International Nuclear Information System (INIS)

    Park, Jong Hark; Chae, H. T.; Park, C.; Kim, H.

    2006-12-01

    A test facility had been established for experiments on the velocity distribution and pressure drop in a tubular fuel assembly, and a basic test had been conducted to examine the performance of the test loop and to verify the accuracy of pitot-tube measurements. In this report, the test results and a CFD analysis of the hydraulic characteristics of the tubular fuel, following the previous tests, are described. The coolant velocities in all channels were measured using a pitot tube, and the effect of flow rate changes on the velocity distribution was examined. The pressure drop through the tubular fuel was measured for flow rates in the range of 1 kg/s to 21 kg/s to obtain a correlation of pressure drop with flow rate. In addition, a CFD (Computational Fluid Dynamics) analysis was performed to determine the hydraulic characteristics of the tubular fuel, such as the velocity distribution and pressure drop. As the results of a CFD analysis can give detailed insight into the coolant flow in the tubular fuel, the CFD method is a very useful tool for understanding the flow structure and the phenomena induced by the fluid flow. CFX-10, a commercial CFD code, was used in this study. The experimental and CFD results were compared with each other. The overall trend of the velocity distribution from the CFD analysis was somewhat different from that of the experiment, but the agreement is reasonable considering the measurement uncertainties. The CFD prediction of the pressure drop of the tubular fuel shows tolerably good agreement with the experiment, within an 8% difference.

  2. Uncovering Bugs in Distributed Storage Systems during Testing (not in Production!)

    OpenAIRE

    Deligiannis, P; McCutchen, M; Thomson, P; Chen, S; Donaldson, AF; Erickson, J; Huang, C; Lal, A; Mudduluru, R; Qadeer, S; Schulte, W

    2016-01-01

    Testing distributed systems is challenging due to multiple sources of nondeterminism. Conventional testing techniques, such as unit, integration and stress testing, are ineffective in preventing serious but subtle bugs from reaching production. Formal techniques, such as TLA+, can only verify high-level specifications of systems at the level of logic-based models, and fall short of checking the actual executable code. In this paper, we present a new methodology for testing distributed systems...

  3. Project W-320 acceptance test report for AY-farm electrical distribution

    International Nuclear Information System (INIS)

    Bevins, R.R.

    1998-01-01

    This Acceptance Test Procedure (ATP) has been prepared to demonstrate that the AY-Farm Electrical Distribution System functions as required by the design criteria. The test is divided into three parts to support the planned construction schedule: Section 8 tests Mini-Power Panel AY102-PPI and the EES; Section 9 tests the SSS support systems; Section 10 tests the SSS and the Multi-Pak Group Control Panel. This test does not include the operation of end-use components (loads) supplied from the distribution system; tests of the end-use components will be performed by other W-320 ATPs.

  4. Testing the Pareto against the lognormal distributions with the uniformly most powerful unbiased test applied to the distribution of cities.

    Science.gov (United States)

    Malevergne, Yannick; Pisarenko, Vladilen; Sornette, Didier

    2011-03-01

    Fat-tail distributions of sizes abound in natural, physical, economic, and social systems. The lognormal and the power laws have historically competed for recognition, with sometimes closely related generating processes and hard-to-distinguish tail properties. This state of affairs is illustrated by the debate between Eeckhout [Amer. Econ. Rev. 94, 1429 (2004)] and Levy [Amer. Econ. Rev. 99, 1672 (2009)] on the validity of Zipf's law for US city sizes. By using a uniformly most powerful unbiased (UMPU) test between the lognormal and the power laws, we show that conclusive results can be achieved to end this debate. We advocate the UMPU test as a systematic tool to address similar controversies in the literature of many disciplines involving power laws, scaling, "fat" or "heavy" tails. In order to demonstrate that our procedure works for data sets other than the US city size distribution, we also briefly present the results obtained for the power-law tail of the distribution of personal identity (ID) losses, which constitute one of the major emergent risks at the interface between cyberspace and reality.
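
    To see why the two families are hard to distinguish, one can fit both to the same sample and compare log-likelihoods. The sketch below is a plain likelihood comparison, not the UMPU test of the paper, and the data are simulated:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sizes = stats.lognorm.rvs(s=1.2, scale=5e4, size=5000, random_state=rng)

# Lognormal fit by maximum likelihood (location fixed at zero).
ln_params = stats.lognorm.fit(sizes, floc=0)
ll_lognorm = stats.lognorm.logpdf(sizes, *ln_params).sum()

# Pareto MLE in closed form: x_min is the sample minimum and the tail
# exponent is n / sum(log(x / x_min)).
x_min = sizes.min()
b_hat = len(sizes) / np.log(sizes / x_min).sum()
ll_pareto = stats.pareto.logpdf(sizes, b_hat, loc=0, scale=x_min).sum()

print(ll_lognorm - ll_pareto)  # positive values favour the lognormal
```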

  5. Post-test analysis for the APR1400 LBLOCA DVI performance test using MARS

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Lee, Y. J.; Kim, H. C.; Bae, Y. Y.; Park, J. K.; Lee, W.

    2002-03-01

    Post-test analyses using the multi-dimensional best-estimate analysis code MARS were performed for the APR1400 LBLOCA DVI (Direct Vessel Injection) performance tests. This report describes the code evaluation results for the various void height tests and direct bypass tests that were performed at the MIDAS test facility. MIDAS is a scaled test facility of the APR1400 with the objective of identifying multi-dimensional thermal-hydraulic phenomena in the downcomer during the reflood conditions of a large break LOCA. A modified linear scale ratio was applied in its construction and test conditions. The major thermal-hydraulic parameters, such as the ECC bypass fraction, the steam condensation fraction, and the temperature distributions in the downcomer, are compared and evaluated. The evaluation results of the MARS code for the various test cases show that: (a) the MARS code has an advanced modeling capability that predicts well the major multi-dimensional thermal-hydraulic phenomena occurring in the downcomer; (b) the MARS code under-predicts the steam condensation rates, which in turn causes it to over-predict the ECC bypass rates. However, the trends of decreasing steam condensation rate and increasing ECC bypass rate with increasing steam flow rate, and the calculated ECC bypass rates under the EM analysis conditions, generally agree with the test data.

  6. Comparisons of uniform and discrete source distributions for use in bioassay laboratory performance testing

    International Nuclear Information System (INIS)

    Scherpelz, R.I.; MacLellan, J.A.

    1987-09-01

    The Pacific Northwest Laboratory (PNL) is sending a torso phantom with radioactive material uniformly distributed in the lungs to in vivo bioassay laboratories for analysis. Although the radionuclides ultimately chosen for the studies had relatively long half-lives, future accreditation testing will require repeated tests with short half-life test nuclides. Computer modeling was used to simulate the major components of the phantom. Radiation transport calculations were then performed using the computer models to calculate dose rates either 15 cm from the chest or at its surface. For ¹⁴⁴Ce and ⁶⁰Co, three configurations were used for the lung comparison tests. Calculations show that, for most detector positions, a single plug containing ⁴⁰K located in the back of the heart provides a good approximation to a uniform distribution of ⁴⁰K. The approximation would lead, however, to a positive bias for the detector reading if the detector were located at the chest surface near the center. Loading the ⁴⁰K in a uniform layer inside the chest wall is not a good approximation of the uniform distribution in the lungs, because most of the radionuclides would be situated close to the detector location and the only shielding would be the thickness of the chest wall. The calculated dose rates for ⁶⁰Co and ¹⁴⁴Ce were similar at all calculated reference points. 3 refs., 5 figs., 10 tabs

  7. Transient dynamic finite element analysis of hydrogen distribution test chamber structure for hydrogen combustion loads

    International Nuclear Information System (INIS)

    Singh, R.K.; Redlinger, R.; Breitung, W.

    2005-09-01

    Design and analysis of blast-resistant structures is an important area of safety research in the nuclear, aerospace, chemical process and vehicle industries. The Institute for Nuclear and Energy Technologies (IKET) of Research Centre Karlsruhe (Forschungszentrum Karlsruhe, or FZK) in Germany pursues active research on the entire spectrum of safety evaluation for efficient hydrogen management in the case of postulated design-basis and beyond-design-basis severe accidents, for nuclear and non-nuclear applications. This report concentrates on the consequence analysis of hydrogen combustion accidents, with emphasis on structural safety assessment. The transient finite element simulation results obtained for the 2 g, 4 g, 8 g and 16 g hydrogen combustion experiments recently concluded on the test-cell structure are described. The frequencies and damping of the test cell observed during the hammer tests and the combustion experiments are used to qualify the present three-dimensional finite element model. For the numerical transient dynamic evaluation of the test-cell structure, the pressure time-history data computed with the CFD code COM-3D are used for the four combustion experiments. Detailed comparisons of the present numerical results with the observed time signals are carried out for the four combustion experiments to evaluate the structural connection behavior. For all the combustion experiments, excellent agreement is noted for the computed accelerations and displacements at the standard transducer locations where measurements were made during the different combustion tests. In addition, an inelastic analysis is presented for the test-cell structure to evaluate the limiting impulsive and quasi-static pressure loads. These results are used to evaluate the response of the test-cell structure to the postulated over-pressurization of the test cell by the blast load generated by the ignition of 64 g of hydrogen, for which additional sets of computations were performed.

  8. Extension of the pseudo dynamic method to test structures with distributed mass

    International Nuclear Information System (INIS)

    Renda, V.; Papa, L.; Bellorini, S.

    1993-01-01

    The PsD method is a mixed numerical and experimental procedure. At each time step, the dynamic deformation of the structure, computed by solving the equation of motion for a given input signal, is reproduced in the laboratory by means of actuators attached to the sample at specific points. The reaction forces at those points are measured and used to compute the deformation for the next time step. Since the reaction forces are known, knowledge of the stiffness of the structure is not needed, so the method can be effective even for deformations leading to strongly nonlinear behaviour of the structure. The mass matrix and the applied forces, on the other hand, must be well known. For this reason the PsD method can be applied without approximation when the masses can be considered as lumped at the testing points of the sample. The present work investigates the possibility of extending the PsD method to test structures with distributed mass. A standard procedure is proposed to provide an equivalent mass matrix and force vector reduced to the testing points and to verify the reliability of the model. The verification is obtained by comparing the results of a multi-degree-of-freedom dynamic analysis, done by means of a finite element (FE) program, with a simulation of the PsD method based on the reduced mass matrix and external forces, using in place of the experimental reactions those computed with the general FE model. The method has been applied to a numerical simulation of the behaviour of a realistic and complex structure with distributed mass: a masonry building of two floors. The FE model consists of about two thousand degrees of freedom, and the condensation has been made for four testing points. A dynamic analysis has been performed with the general FE model, and the reactions of the structure have been recorded in a file and used as input for the PsD simulation with the four-degree-of-freedom model.
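
    The time-stepping logic at the heart of the method can be sketched compactly. Below is a minimal single-degree-of-freedom illustration using an explicit central-difference step, with the restoring force measured on the specimen rather than computed from a stiffness matrix (damping omitted; the choice of scheme is an assumption, since PsD implementations vary):

```python
def psd_step(d, d_prev, f_ext, r_measured, m, dt):
    """One central-difference step of a pseudo-dynamic test:
    m * a_n = f_n - r(d_n), with r measured on the physical specimen.
    Returns the displacement to impose on the specimen at the next step."""
    a = (f_ext - r_measured) / m
    return 2.0 * d - d_prev + dt ** 2 * a
```

    At each step the returned displacement is imposed by the actuators, the reaction force is measured, and the loop repeats; no stiffness matrix of the specimen is ever needed.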

  9. Observations in the statistical analysis of NBG-18 nuclear graphite strength tests

    International Nuclear Information System (INIS)

    Hindley, Michael P.; Mitchell, Mark N.; Blaine, Deborah C.; Groenwold, Albert A.

    2012-01-01

    Highlights: ► Statistical analysis of NBG-18 nuclear graphite strength tests. ► A Weibull distribution and a normal distribution are tested for all data. ► A bimodal distribution in the CS data is confirmed. ► The CS data set has the lowest variance. ► A combined data set is formed and has a Weibull distribution. - Abstract: The purpose of this paper is to report on the selection of a statistical distribution to represent the experimental material strength of NBG-18 nuclear graphite. Three large sets of samples were tested during the material characterisation of the Pebble Bed Modular Reactor and Core Structure Ceramics materials: tensile strength, flexural strength and compressive strength (CS) measurements. A relevant statistical fit is determined and the goodness of fit is evaluated for each data set. The data sets are also normalised for ease of comparison and combined into one representative data set, and the validity of this approach is demonstrated. A second failure mode distribution is found in the CS test data. Identifying this failure mode supports similar observations made in the past. The success of fitting the Weibull distribution to the normalised data sets improves the basis for the estimates of the variability. This could also imply that the variability in graphite strength for the different strength measures derives from the same flaw distribution and is thus a property of the material.
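
    A fit of the kind described, with a goodness-of-fit check, takes only a few lines. A minimal sketch on simulated strength data (the parameters are invented, not NBG-18 values):

```python
import numpy as np
from scipy import stats

# Simulated, normalised strength data standing in for the real measurements.
rng = np.random.default_rng(1)
strengths = stats.weibull_min.rvs(c=8.0, scale=1.0, size=200, random_state=rng)

# Two-parameter Weibull fit (location fixed at zero) and a K-S check.
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
ks = stats.kstest(strengths, 'weibull_min', args=(shape, loc, scale))
print(shape, scale, ks.pvalue)
```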

  10. Test report light duty utility arm power distribution system (PDS)

    International Nuclear Information System (INIS)

    Clark, D.A.

    1996-01-01

    The Light Duty Utility Arm (LDUA) Power Distribution System has completed vendor and post-delivery acceptance testing. The Power Distribution System has been found to be acceptable and is now ready for integration with the overall LDUA system

  11. Harmonic Analysis of Electric Vehicle Loadings on Distribution System

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Yijun A [University of Southern California, Department of Electrical Engineering; Xu, Yunshan [University of Southern California, Department of Electrical Engineering; Chen, Zimin [University of Southern California, Department of Electrical Engineering; Peng, Fei [University of Southern California, Department of Electrical Engineering; Beshir, Mohammed [University of Southern California, Department of Electrical Engineering

    2014-12-01

    With the increasing number of Electric Vehicles (EVs), the power system is facing huge challenges from the high penetration rates of EV charging stations. A technical study of the impact of EV charging on the distribution system is therefore required. This paper uses the PSCAD software and is aimed at analyzing the Total Harmonic Distortion (THD) introduced by Electric Vehicle charging stations in power systems. The paper starts by choosing the IEEE 34 node test feeder as the distribution system and building an electric vehicle level-two charging battery model, together with four different test scenarios: overhead transmission line and underground cable, industrial area, transformer, and photovoltaic (PV) system. Statistical methods are then used to analyze the characteristics of the THD in the plug-in transient, plug-out transient and steady-state charging conditions associated with these four scenarios. Finally, the factors influencing the THD in the different scenarios are identified, and the paper concludes with constructive suggestions for both the construction of Electric Vehicle charging stations and customers' charging habits.

  12. Lognormal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of α-Particle Track Autoradiography

    Science.gov (United States)

    Neti, Prasad V.S.V.; Howell, Roger W.

    2010-01-01

    Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
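    A minimal sketch of the kind of model comparison the study performs, using synthetic track counts rather than the autoradiography data; applying a continuous lognormal to half-shifted counts is a rough device, and AIC stands in for the study's formal tests.

```python
# Hedged sketch: compare Poisson and lognormal descriptions of
# alpha-particle tracks per cell. Counts below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
tracks = np.round(rng.lognormal(mean=1.5, sigma=0.6, size=500)).astype(int)

# Poisson log-likelihood at the MLE (the sample mean).
lam = tracks.mean()
ll_pois = stats.poisson.logpmf(tracks, lam).sum()

# Lognormal log-likelihood; counts are shifted by 0.5 to avoid zeros
# (a crude continuity device, not the paper's procedure).
shape, loc, scale = stats.lognorm.fit(tracks + 0.5, floc=0)
ll_logn = stats.lognorm.logpdf(tracks + 0.5, shape, loc, scale).sum()

# AIC comparison: lower is better (1 Poisson parameter vs 2 lognormal).
aic_pois = 2 * 1 - 2 * ll_pois
aic_logn = 2 * 2 - 2 * ll_logn
print(f"AIC Poisson={aic_pois:.1f}, lognormal={aic_logn:.1f}")
```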

  13. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
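    The central quantity of such a procedure, the value at a specified average exceedance probability, can be read off a fitted log-normal together with a χ² goodness-of-fit check; the sketch below uses synthetic annual rainfall, not the Top End records.

```python
# Hedged sketch: log-normal fit, exceedance-probability value, and a
# chi-square goodness-of-fit check. Data are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
annual_rain = rng.lognormal(mean=7.2, sigma=0.25, size=80)  # mm/year

# Fit the log-normal via the log-transformed sample.
mu = np.log(annual_rain).mean()
sigma = np.log(annual_rain).std(ddof=1)

# Value exceeded on average once in 100 years (exceedance probability 0.01).
p_exc = 0.01
x_p = np.exp(mu + sigma * stats.norm.ppf(1 - p_exc))
print(f"1-in-100-year annual rainfall estimate: {x_p:.0f} mm")

# Chi-square GOF over equal-probability bins of the fitted model.
probs = np.linspace(0.0, 1.0, 9)
edges = stats.lognorm.ppf(probs, s=sigma, scale=np.exp(mu))
edges[0], edges[-1] = 0.0, annual_rain.max() + 1.0
obs, _ = np.histogram(annual_rain, bins=edges)
expected = len(annual_rain) * np.diff(probs)       # 10 per bin here
chi2 = ((obs - expected) ** 2 / expected).sum()    # compare to chi2, df = 8-1-2
print(f"chi-square statistic: {chi2:.2f}")
```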

  14. Overview of Current Turbine Aerodynamic Analysis and Testing at MSFC

    Science.gov (United States)

    Griffin, Lisa W.; Hudson, Susan T.; Zoladz, Thomas F.

    1999-01-01

    An overview of the current turbine aerodynamic analysis and testing activities at NASA/Marshall Space Flight Center (MSFC) is presented. The presentation is divided into three areas. The first area is the three-dimensional (3D), unsteady Computational Fluid Dynamics (CFD) analysis of the Fastrac turbine. Results from a coupled nozzle, blade, and exit guide vane analysis and from an uncoupled nozzle and coupled blade and exit guide vane will be presented. Unsteady pressure distributions, frequencies, and exit profiles from each analysis will be compared and contrasted. The second area is the testing and analysis of the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP) turbine with instrumented first stage blades. The SSME HPFTP turbine was tested in air at the MSFC Turbine Test Equipment (TTE). Pressure transducers were mounted on the first stage blades. Unsteady, 3D CFD analysis was performed for this geometry and flow conditions. A sampling of the results will be shown. The third area is a status of the Turbine Performance Optimization task. The objective of this task is to improve the efficiency of a turbine for potential use on a next generation launch vehicle. This task includes global optimization for the preliminary design, detailed optimization for blade shapes and spacing, and application of advanced CFD analysis. The final design will be tested in the MSFC TTE.

  15. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not too simple to lose the information from the alternative hypothesis. The first step is to transform the distributions of different test statistics (e.g., t, chi-square or F) to the distributions of the corresponding p-values. We then use a step function to approximate each p-value distribution by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
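    The first step described here, mapping a test statistic's distribution under the alternative into a p-value distribution, can be made concrete for a one-sided z-test; the effect size and sample size below are assumptions, and the two moments computed are those the paper's step function is built to match.

```python
# Hedged sketch: p-value distribution under the alternative for a
# one-sided z-test, with power and the first two moments of p.
import numpy as np
from scipy import stats
from scipy.integrate import quad

delta, n, alpha = 0.4, 50, 0.05
ncp = delta * np.sqrt(n)               # noncentrality of the z statistic

# p = 1 - Phi(Z) with Z ~ N(ncp, 1) under the alternative, so
# P(p <= t) = 1 - Phi(Phi^{-1}(1 - t) - ncp).
def p_value_cdf(t):
    return 1.0 - stats.norm.cdf(stats.norm.ppf(1.0 - t) - ncp)

print(f"power at alpha={alpha}: {p_value_cdf(alpha):.3f}")

# Mean and variance of the p-value distribution via quadrature,
# using E[p] = int P(p > t) dt and E[p^2] = int 2t P(p > t) dt on [0, 1].
mean_p = quad(lambda t: 1.0 - p_value_cdf(t), 0.0, 1.0)[0]
ex2 = quad(lambda t: 2.0 * t * (1.0 - p_value_cdf(t)), 0.0, 1.0)[0]
print(f"E[p]={mean_p:.4f}, Var[p]={ex2 - mean_p**2:.4f}")
```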

  16. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  17. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There has been a large body of statements claiming that the large scale deployment of Distributed Energy Resources (DERs) will eventually reshape future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent the power system...... operation will be altered. In this paper, quantitative results in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles are presented. The analysis is based on the conditions for both a radial and a meshed distribution...... network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden....

  18. Geographically distributed hybrid testing & collaboration between geotechnical centrifuge and structures laboratories

    Science.gov (United States)

    Ojaghi, Mobin; Martínez, Ignacio Lamata; Dietz, Matt S.; Williams, Martin S.; Blakeborough, Anthony; Crewe, Adam J.; Taylor, Colin A.; Madabhushi, S. P. Gopal; Haigh, Stuart K.

    2018-01-01

    Distributed Hybrid Testing (DHT) is an experimental technique designed to capitalise on advances in modern networking infrastructure to overcome traditional laboratory capacity limitations. By coupling the heterogeneous test apparatus and computational resources of geographically distributed laboratories, DHT provides the means to take on complex, multi-disciplinary challenges with new forms of communication and collaboration. To introduce the opportunity and practicability afforded by DHT, here an exemplar multi-site test is addressed in which a dedicated fibre network and suite of custom software is used to connect the geotechnical centrifuge at the University of Cambridge with a variety of structural dynamics loading apparatus at the University of Oxford and the University of Bristol. While centrifuge time-scaling prevents real-time rates of loading in this test, such experiments may be used to gain valuable insights into physical phenomena, test procedure and accuracy. These and other related experiments have led to the development of the real-time DHT technique and the creation of a flexible framework that aims to facilitate future distributed tests within the UK and beyond. As a further example, a real-time DHT experiment between structural labs using this framework for testing across the Internet is also presented.

  19. Numerical distribution functions of fractional unit root and cointegration tests

    DEFF Research Database (Denmark)

    MacKinnon, James G.; Nielsen, Morten Ørregaard

    We calculate numerically the asymptotic distribution functions of likelihood ratio tests for fractional unit roots and cointegration rank. Because these distributions depend on a real-valued parameter, b, which must be estimated, simple tabulation is not feasible. Partly due to the presence...

  20. Statistical Tests for Frequency Distribution of Mean Gravity Anomalies

    African Journals Online (AJOL)

    The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ² and the unit normal deviate tests. However, the 5° equal-area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...

  1. Statistical Analysis of Video Frame Size Distribution Originating from Scalable Video Codec (SVC)

    Directory of Open Access Journals (Sweden)

    Sima Ahmadpour

    2017-01-01

    Designing an effective and high-performance network requires accurate characterization and modeling of network traffic. The modeling of video frame sizes is commonly applied in simulation studies and mathematical analysis, and in generating streams for testing and compliance purposes. Moreover, video traffic is expected to be a major source of multimedia traffic in future heterogeneous networks. The statistical distribution of video data can therefore be used as an input for performance modeling of networks. This paper identifies the theoretical distribution that is relevant to a video trace in terms of its statistical properties, and finds the best-fitting distribution using both a graphical method and a hypothesis test. The data set used in this article consists of layered video traces generated with the Scalable Video Codec (SVC) video compression technique for three different movies.

  2. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

    A statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimations. Age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
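    A minimal sketch of a truncated normal used as a time-to-failure model, via scipy's truncnorm; the parameter values are illustrative assumptions, and the age-dependent hazard mentioned above is computed directly from the frozen distribution.

```python
# Hedged sketch: truncated normal time-to-failure model with scipy's
# parametrisation (a, b are in standard-deviation units from mu).
from scipy import stats

mu, sigma = 1000.0, 200.0          # hours; untruncated mean and spread (assumed)
a = (0.0 - mu) / sigma             # truncate at t = 0: no negative lifetimes
b = float('inf')
ttf = stats.truncnorm(a, b, loc=mu, scale=sigma)

# Reliability (survival) at 800 h and the age-dependent hazard rate,
# which for the truncated normal increases with age.
t = 800.0
reliability = ttf.sf(t)
hazard = ttf.pdf(t) / ttf.sf(t)
print(f"R({t:.0f} h) = {reliability:.3f}, hazard = {hazard:.5f} /h")
```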

  3. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data-perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
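    The survival-analysis connection alluded to here rests on a flipping construction: left-censored concentrations can be handed to a standard right-censoring Kaplan-Meier routine by reflecting values about a constant larger than the data. A sketch using the lifelines package (an assumption; the paper's own software is in S), with invented concentrations:

```python
# Hedged sketch: Kaplan-Meier for left-censored data via flipping.
import numpy as np
from lifelines import KaplanMeierFitter

conc = np.array([0.5, 0.5, 1.0, 1.2, 2.3, 3.1, 5.0])    # observed values
detected = np.array([0, 0, 0, 1, 1, 1, 1], dtype=bool)   # 0 = "<DL" (left-censored)

M = conc.max() + 1.0          # flip constant, larger than all data
flipped = M - conc            # left-censoring becomes right-censoring

kmf = KaplanMeierFitter()
kmf.fit(flipped, event_observed=detected)

# Flip the survival curve back: P(X < x) = S_flipped(M - x).
x = 1.5
prob_below = kmf.predict(M - x)   # estimated P(concentration < 1.5)
print(f"P(conc < {x}) ~ {prob_below:.2f}")
```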

  4. 10 CFR 431.193 - Test procedures for measuring energy consumption of distribution transformers.

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), 2010 edition. DEPARTMENT OF ENERGY, ENERGY CONSERVATION... § 431.193 Test procedures for measuring energy consumption of distribution transformers. The test...

  5. Assessment of the Nevada Test Site as a Site for Distributed Resource Testing and Project Plan: March 2002

    Energy Technology Data Exchange (ETDEWEB)

    Horgan, S.; Iannucci, J.; Whitaker, C.; Cibulka, L.; Erdman, W.

    2002-05-01

    The objective of this project was to evaluate the Nevada Test Site (NTS) as a location for performing dedicated, in-depth testing of distributed resources (DR) integrated with the electric distribution system. In this large scale testing, it is desired to operate multiple DRs and loads in an actual operating environment, in a series of controlled tests to concentrate on issues of interest to the DR community. This report includes an inventory of existing facilities at NTS, an assessment of site attributes in relation to DR testing requirements, and an evaluation of the feasibility and cost of upgrades to the site that would make it a fully qualified DR testing facility.

  6. A CLASS OF DISTRIBUTION-FREE TESTS FOR INDEPENDENCE AGAINST POSITIVE QUADRANT DEPENDENCE

    Directory of Open Access Journals (Sweden)

    Parameshwar V Pandit

    2014-02-01

    A class of distribution-free tests based on a convex combination of two U-statistics is considered for testing independence against positive quadrant dependence. The class of tests proposed by Kochar and Gupta (1987) and Kendall's test are members of the proposed class. The performance of the proposed class is evaluated in terms of Pitman asymptotic relative efficiency for the Block-Basu (1974) model and the Woodworth family of distributions. It has been observed that some members of the class perform better than existing tests in the literature. Unbiasedness and consistency of the proposed class of tests have been established.

  7. In-core flow rate distribution measurement test of the JOYO irradiation core

    International Nuclear Information System (INIS)

    Suzuki, Toshihiro; Isozaki, Kazunori; Suzuki, Soju

    1996-01-01

    A flow rate distribution measurement test was carried out for the JOYO irradiation core (the MK-II core) after the 29th duty cycle operation. The main objective of the test was to confirm the proper flow rate distribution in the final phase of the MK-II core. The flow rate at the outlet of each subassembly was measured by a permanent magnetic flowmeter inserted through the fuel exchange hole in the rotating plug. This is the third such test in the MK-II core, conducted 10 years after the previous test (1985). A total of 550 subassemblies were exchanged and the accumulated reactor operation time reached 38,000 hours since the previous test. In conclusion, the test confirmed that the flow rate distribution has remained suitable in the final phase of the MK-II core. (author)

  8. Loss optimization in distribution networks with distributed generation

    DEFF Research Database (Denmark)

    Pokhrel, Basanta Raj; Nainar, Karthikeyan; Bak-Jensen, Birgitte

    2017-01-01

    This paper presents a novel power loss minimization approach in distribution grids considering network reconfiguration, distributed generation and storage installation. Identification of the optimum configuration in such scenarios is one of the main challenges faced by distribution system operators...... in highly active distribution grids. This issue is tackled by formulating a hybrid loss optimization problem, solved using the Interior Point Method. Sensitivity analysis is used to identify the optimum location of storage units. Different scenarios of reconfiguration, storage and distributed generation...... penetration are created to test the proposed algorithm. It is tested in a benchmark medium voltage network to show the effectiveness and performance of the algorithm. Results obtained are found to be encouraging for radial distribution systems. It shows that we can reduce the power loss by more than 30% using......

  9. Posterior cerebral artery Wada test: sodium amytal distribution and functional deficits

    Energy Technology Data Exchange (ETDEWEB)

    Urbach, H.; Schild, H.H. [Dept. of Radiology/Neuroradiology, Univ. of Bonn (Germany); Klemm, E.; Biersack, H.J. [Bonn Univ. (Germany). Klinik fuer Nuklearmedizin; Linke, D.B.; Behrends, K.; Schramm, J. [Dept. of Neurosurgery, Univ. of Bonn (Germany)

    2001-04-01

    Inadequate sodium amytal delivery to the posterior hippocampus during the intracarotid Wada test has led to the development of selective tests. Our purpose was to show the sodium amytal distribution in the posterior cerebral artery (PCA) Wada test and to relate it to functional deficits during the test. We simultaneously injected 80 mg sodium amytal and 14.8 MBq 99mTc-hexamethylpropyleneamine oxime (HMPAO) into the P2-segment of the PCA in 14 patients with temporal lobe epilepsy. To show the skull, we injected 116 MBq 99mTc-HDP intravenously. Sodium amytal distribution was determined by high-resolution single-photon emission computed tomography (SPECT). In all patients, HMPAO was distributed throughout the parahippocampal gyrus and hippocampus; it was also seen in the occipital lobe in all cases and in the thalamus in 11. Eleven patients were awake and cooperative; one was slightly uncooperative due to speech comprehension difficulties and perseveration. All patients showed contralateral hemianopia during the test. Four patients had nominal dysphasia for 1-3 min. None developed motor deficits or had permanent neurological deficits. Neurological deficits due to inactivation of extrahippocampal areas thus do not grossly interfere with neuropsychological testing during the test. (orig.)

  10. Distributed analysis using GANGA on the EGEE/LCG infrastructure

    International Nuclear Information System (INIS)

    Elmsheuser, J; Brochu, F; Harrison, K; Egede, U; Gaidioz, B; Liko, D; Maier, A; Moscicki, J; Muraru, A; Lee, H-C; Romanovsky, V; Soroko, A; Tan, C L

    2008-01-01

    The distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to facilitate access to the resources is very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without too much expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment and the EGEE/LCG infrastructure. The integration of the ATLAS data management system DQ2 into GANGA is a key functionality. In combination with the job splitting mechanism, large numbers of jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports tasks of user analysis with reconstructed data and small scale production of Monte Carlo data
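    A minimal sketch of a Ganga session of the kind described, typed at the ganga prompt, assuming the classic Ganga built-ins (Job, Executable, Local, LCG); the echo payload stands in for an Athena analysis job.

```python
# Hedged sketch of a classic Ganga session: the same job definition is
# switched between a local test run and the grid.
j = Job(name='analysis-test')
j.application = Executable(exe='/bin/echo', args=['hello', 'grid'])
j.backend = Local()      # trivial switch: replace with LCG() to run on the grid
j.submit()

jobs                     # the Ganga registry: lists all jobs and their statuses
print(j.status)          # 'submitted' -> 'running' -> 'completed'
```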

  11. Thermal test and analysis of a spent fuel storage cask

    International Nuclear Information System (INIS)

    Yamakawa, H.; Gomi, Y.; Ozaki, S.; Kosaki, A.

    1993-01-01

    A thermal test simulating normal storage conditions was performed with a full-scale cask model to verify the spent fuel storage capability of the cask. The maximum temperature at each measurement point was lower than the allowable temperature, and the integrity of the cask was maintained. Checks of the seal before and after the thermal test showed that the safety of the containment system was also preserved. It was therefore shown that, with the present technology, spent fuels can be stored safely in the dry-type cask. Moreover, the good agreement between analysis and experimental results showed that the analysis model was successfully established to estimate the temperature distribution of the fuel cladding and the seal portion. (J.P.N.)

  12. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back to back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.

  13. TESTING THE GRAIN-SIZE DISTRIBUTION DETERMINED BY LASER DIFFRACTOMETRY FOR SICILIAN SOILS

    Directory of Open Access Journals (Sweden)

    Costanza Di Stefano

    2012-06-01

    In this paper the soil grain-size distribution determined by the Laser Diffraction method (LDM) is tested against the Sieve-Hydrometer method (SHM) for 747 soil samples with different texture classifications, sampled in Sicily. The analysis showed that the sand content measured by SHM can be assumed equal to the one determined by LDM. An underestimation of the clay fraction measured by LDM was obtained with respect to the SHM, and a set of equations useful to refer laser diffraction measurements to SHM was calibrated using the measurements carried out for 635 soil samples. Finally, the proposed equations were tested using independent measurements carried out by LDM and SHM for 112 soil samples with different texture classifications.

  14. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  15. Distributed Rocket Engine Testing Health Monitoring System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The on-ground and Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) provides a system architecture and software tools for performing diagnostics...

  16. Distributed Rocket Engine Testing Health Monitoring System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Leveraging the Phase I achievements of the Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) including its software toolsets and system building...

  17. Experimental investigation of localized stress-induced leakage current distribution in gate dielectrics using array test circuit

    Science.gov (United States)

    Park, Hyeonwoo; Teramoto, Akinobu; Kuroda, Rihito; Suwa, Tomoyuki; Sugawa, Shigetoshi

    2018-04-01

    Localized stress-induced leakage current (SILC) has become a major problem in the reliability of flash memories. To reduce it, clarifying the SILC mechanism is important, and statistical measurement and analysis have to be carried out. In this study, we applied an array test circuit that can measure the SILC distribution of more than 80,000 nMOSFETs with various gate areas at high speed (within 80 s) and high accuracy (on the order of 10⁻¹⁷ A). The results clarified that the distributions of localized SILC for different gate areas follow a universal distribution assuming the same SILC defect density distribution per unit area, and that the current of localized SILC defects does not scale down with the gate area. Moreover, the distribution of SILC defect density and its dependence on the oxide field for measurement (E_OX-Measure) were experimentally determined for the fabricated devices.

  18. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the
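    The battery of tests named here maps directly onto scipy routines (skewtest and kurtosistest correspond to the D'Agostino and Anscombe-Glynn procedures); the sketch below runs them on one synthetic MUAC sample and adds a crude digit-preference check of the kind the quality discussion refers to.

```python
# Hedged sketch: normality test battery on a synthetic MUAC sample (mm).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
muac = rng.normal(145, 12, 600)            # placeholder survey sample

w, p_sw = stats.shapiro(muac)              # overall departure from normality
z_s, p_skew = stats.skewtest(muac)         # skewness (D'Agostino)
z_k, p_kurt = stats.kurtosistest(muac)     # kurtosis (Anscombe-Glynn)
print(f"Shapiro-Wilk p={p_sw:.3f}, skew p={p_skew:.3f}, kurtosis p={p_kurt:.3f}")

# Digit preference (heaping on 0/5 mm endings) can drive rejections:
share_05 = np.isin(np.round(muac).astype(int) % 10, [0, 5]).mean()
print(f"share of values ending in 0 or 5: {share_05:.2f}")
```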

  19. Wavefront-error evaluation by mathematical analysis of experimental Foucault-test data

    Science.gov (United States)

    Wilson, R. G.

    1975-01-01

    The diffraction theory of the Foucault test provides an integral formula expressing the complex amplitude and irradiance distribution in the Foucault pattern of a test mirror (lens) as a function of wavefront error. Recent literature presents methods of inverting this formula to express wavefront error in terms of irradiance in the Foucault pattern. The present paper describes a study in which the inversion formulation was applied to photometric Foucault-test measurements on a nearly diffraction-limited mirror to determine wavefront errors for direct comparison with ones determined from scatter-plate interferometer measurements. The results affirm the practicability of the Foucault test for quantitative wavefront analysis of very small errors, and they reveal the fallacy of the prevalent belief that the test is limited to qualitative use only. Implications of the results with regard to optical testing and the potential use of the Foucault test for wavefront analysis in orbital space telescopes are discussed.

  20. statistical tests for frequency distribution of mean gravity anomalies

    African Journals Online (AJOL)

    ES Obe

    1980-03-01

    Mar 1, 1980 ... Kaula [1,2] discussed the method of applying statistical techniques in the ... mathematical foundation of physical ...

  1. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to take into account the uncertainties within these input variables, but probabilistic analysis requires many assumptions due to the lack of data on initial flaw distributions. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. The results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so initial flaw distributions can be derived from in-service flaw distributions
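    A minimal sketch of the Monte Carlo approach described: sample an assumed initial flaw distribution and grow each flaw with a Paris-type LEFM law under cyclic load; all constants and units below are illustrative assumptions, not values from the study.

```python
# Hedged sketch: Monte Carlo propagation of a flaw population under a
# Paris-type crack growth law, da/dN = C * dK^m.
import numpy as np

rng = np.random.default_rng(4)
n_flaws = 10_000
a0 = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n_flaws)  # initial depth [mm]

C, m = 1.0e-8, 3.0      # Paris constants (assumed: mm/cycle, MPa*sqrt(m) units)
d_sigma = 100.0         # applied stress range [MPa]
Y = 1.12                # geometry factor (assumed constant)
total_cycles, dN = 1_000_000, 1_000

a = a0.copy()
for _ in range(total_cycles // dN):
    dK = Y * d_sigma * np.sqrt(np.pi * a * 1e-3)   # stress intensity range [MPa*sqrt(m)]
    a += C * dK ** m * dN                          # Paris law increment

print(f"initial flaw depth: mean={a0.mean():.3f} mm, 99th pct={np.quantile(a0, 0.99):.3f} mm")
print(f"in-service depth:   mean={a.mean():.3f} mm, 99th pct={np.quantile(a, 0.99):.3f} mm")
```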

  2. Statistical Analysis of Wave Climate Data Using Mixed Distributions and Extreme Wave Prediction

    Directory of Open Access Journals (Sweden)

    Wei Li

    2016-05-01

    The investigation of various aspects of the wave climate at a wave energy test site is essential for the development of reliable and efficient wave energy conversion technology. This paper presents studies of the wave climate based on nine years of wave observations from the 2005–2013 period, measured with a wave measurement buoy at the Lysekil wave energy test site located off the west coast of Sweden. A detailed analysis of the wave statistics is carried out to reveal the characteristics of the wave climate at this specific test site. The long-term extreme waves are estimated by applying the Peak over Threshold (POT) method to the measured wave data. The significant wave height and the maximum wave height at the test site for different return periods are also compared. In this study, a new approach using a mixed-distribution model is proposed to describe the long-term behavior of the significant wave height, and it shows an impressive goodness of fit to wave data from the test site. The mixed-distribution model is also applied to measured wave data from four other sites, illustrating the general applicability of the proposed model. The methodologies used in this paper can be applied to general wave climate analysis of wave energy test sites to estimate extreme waves for the survivability assessment of wave energy converters, and to characterize the long-term wave climate to forecast the wave energy resource of the test sites and the energy production of the wave energy converters.
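    The Peak over Threshold step can be sketched with scipy's generalised Pareto fit; the synthetic series below stands in for the Lysekil buoy record, and the return-level formula assumes a non-zero fitted shape parameter.

```python
# Hedged sketch: POT analysis of significant wave height Hs [m].
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
hs = rng.gumbel(loc=1.5, scale=0.5, size=9 * 365 * 8)  # ~9 years of 3-hourly Hs

u = np.quantile(hs, 0.98)                 # threshold
exc = hs[hs > u] - u
xi, loc, beta = stats.genpareto.fit(exc, floc=0)

# N-year return level: the level exceeded once every N years on average
# (formula valid for xi != 0).
n_per_year = len(hs) / 9
rate = len(exc) / len(hs)                 # exceedance probability per sample
N = 50
m = N * n_per_year                        # samples per N years
h_N = u + beta / xi * ((m * rate) ** xi - 1)
print(f"50-year significant wave height estimate: {h_N:.2f} m")
```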

  3. A statistical test for outlier identification in data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Morteza Khodabin

    2010-09-01

    In the use of peer group data to assess individual, typical or best practice performance, the effective detection of outliers is critical for achieving useful results. In these "deterministic" frontier models, statistical theory is now mostly available. This paper deals with the statistical pared-sample method and its capability of detecting outliers in data envelopment analysis. In the presented method, each observation is deleted from the sample once and the resulting linear program is solved, leading to a distribution of efficiency estimates. Based on the obtained distribution, a pared test is designed to identify the potential outlier(s). We illustrate the method with a real data set. The method can be used as a first step, as an exploratory data analysis, before using any frontier estimation.

  4. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  5. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Separating two probability distributions from a mixture model made up of a combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then DSM's linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
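    The linear combination assumption at the heart of DSM makes the separation step itself very short; a toy sketch with an assumed mixing coefficient lam and invented term distributions:

```python
# Hedged sketch of the linear separation idea: if
# p_mix = lam * p_irr + (1 - lam) * p_rel, then p_rel follows by
# subtraction and renormalisation. All vectors and lam are assumptions.
import numpy as np

p_mix = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # observed mixture (terms)
p_irr = np.array([0.40, 0.30, 0.15, 0.10, 0.05])   # seed irrelevance dist.
lam = 0.5                                          # mixing coefficient

p_rel = (p_mix - lam * p_irr) / (1 - lam)
p_rel = np.clip(p_rel, 0, None)                    # guard against negatives
p_rel /= p_rel.sum()                               # renormalise
print(p_rel)
```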

  6. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether or not the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to both the training and the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed in every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to the training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
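    A rough sketch of the described check with scikit-learn's K-means: cluster the descriptor space and verify that every cluster contains both training and test compounds; the descriptor matrix and split below are random placeholders.

```python
# Hedged sketch: K-means check of a train/test assignment.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
X = rng.normal(size=(37, 5))                 # 37 compounds x 5 variables
is_train = rng.random(37) < 0.7              # random train/test split

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for k in range(3):
    n_tr = np.sum(is_train & (labels == k))
    n_te = np.sum(~is_train & (labels == k))
    print(f"cluster {k}: {n_tr} training, {n_te} test compounds")
```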

  7. A New Wind Turbine Generating System Model for Balanced and Unbalanced Distribution Systems Load Flow Analysis

    Directory of Open Access Journals (Sweden)

    Ahmet Koksoy

    2018-03-01

    Wind turbine generating systems (WTGSs), which are conventionally connected to high voltage transmission networks, are now frequently employed as distributed generation units in distribution networks. In practice, distribution networks always have unbalanced bus voltages and line currents due to the uneven distribution of single- or two-phase loads over the three phases, asymmetry of the lines, etc. Accordingly, in this study, for the load flow analysis of distribution networks, the Conventional Fixed-speed Induction Generator (CFIG) based WTGS, one of the most widely used WTGS types, is modelled under unbalanced voltage conditions. The developed model has active and reactive power expressions in terms of induction machine impedance parameters, terminal voltages and input power. The validity of the developed model is confirmed by experimental results obtained in a test system. The results of the slip calculation based phase-domain model (SCP model), previously proposed in the literature for CFIG based WTGSs under unbalanced voltages, are also given for comparison. Finally, the developed model and the SCP model are implemented in the load flow analysis of the IEEE 34-bus test system with CFIG based WTGSs and unbalanced loads. It is clearly shown that the results of the load flow analyses with both models are very close to each other, and that the developed model is computationally more efficient than the SCP model.

  8. Preliminary investigation on determination of radionuclide distribution in field tracing test site

    International Nuclear Information System (INIS)

    Tanaka, Tadao; Mukai, Masayuki; Takebe, Shinichi; Guo Zede; Li Shushen; Kamiyama, Hideo.

    1993-12-01

    Field tracing tests for radionuclide migration have been conducted using 3H, 60Co, 85Sr and 134Cs in the natural unsaturated loess zone at the field test site of the China Institute for Radiation Protection. Reliable distribution data for the radionuclides at the test site are necessary in order to evaluate accurately the migration behavior of the radionuclides in situ. A suitable method to determine the distribution was proposed on the basis of a preliminary discussion of the method for sampling soils from the test site and the method for analyzing radioactivity in the soils. (author)

  9. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Van der Ster, Daniel

    2012-01-01

    The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users submitted jobs in the year 2011 and more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper will review the state of the DA tools and services, summarize the past year of distributed analysis activity, and present the directions for future improvements to the system.

  10. Uniform approximation is more appropriate for Wilcoxon Rank-Sum Test in gene set analysis.

    Directory of Open Access Journals (Sweden)

    Zhide Fang

    Gene set analysis is widely used to facilitate biological interpretation in analyses of differential expression from high-throughput profiling data. The Wilcoxon Rank-Sum (WRS) test is one of the commonly used methods in gene set enrichment analysis. It compares the ranks of genes in a gene set against those of genes outside the gene set. This method is easy to implement and it eliminates the dichotomization of genes into significant and non-significant in a competitive hypothesis test. Due to the large number of genes being examined, it is impractical to calculate the exact null distribution for the WRS test. Therefore, the normal distribution is commonly used as an approximation. However, as we demonstrate in this paper, the normal approximation is problematic when a gene set with a relatively small number of genes is tested against the large number of genes in the complementary set. In this situation, a uniform approximation is substantially more powerful, more accurate, and less computationally intensive. We demonstrate the advantage of the uniform approximation in Gene Ontology (GO) term analysis using simulations and real data sets.
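    The failure mode discussed, a small gene set against a large complement, can be made concrete by comparing the normal-approximation p-value of the rank-sum statistic with a permutation estimate; the gene scores below are synthetic.

```python
# Hedged sketch: normal approximation vs permutation for a tiny gene set.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_set, n_total = 5, 2_000
scores = rng.normal(size=n_total)          # per-gene test statistics
in_set = np.zeros(n_total, dtype=bool)
in_set[:n_set] = True

res = stats.mannwhitneyu(scores[in_set], scores[~in_set],
                         alternative='greater', method='asymptotic')
print(f"normal-approximation p = {res.pvalue:.4f}")

# Permutation reference: redraw the gene-set labels at random.
ranks = stats.rankdata(scores)
t_obs = ranks[in_set].sum()
n_perm = 10_000
perm = np.array([rng.choice(ranks, n_set, replace=False).sum()
                 for _ in range(n_perm)])
p_perm = (np.sum(perm >= t_obs) + 1) / (n_perm + 1)
print(f"permutation p = {p_perm:.4f}")
```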

  11. Finite element analysis and fracture resistance testing of a new intraradicular post

    Directory of Open Access Journals (Sweden)

    Eron Toshio Colauto Yamamoto

    2012-08-01

    OBJECTIVES: The objective of the present study was to evaluate a prefabricated intraradicular threaded pure titanium post, designed and developed at the São José dos Campos School of Dentistry - UNESP, Brazil. This new post was designed to minimize the stresses observed with prefabricated post systems and to improve cost-benefit. MATERIAL AND METHODS: Fracture resistance testing of the post/core/root complex, fracture analysis by microscopy and stress analysis by the finite element method were used for post evaluation. The following four prefabricated metal post systems were analyzed: group 1, experimental post; group 2, modification of the experimental post; group 3, Flexi Post; and group 4, Para Post. For the analysis of fracture resistance, 40 bovine teeth were randomly assigned to the four groups (n=10) and used for the fabrication of test specimens simulating the situation in the mouth. The test specimens were subjected to compressive strength testing until fracture in an EMIC universal testing machine. After fracture of the test specimens, their roots were sectioned and analyzed by microscopy. For the finite element method, specimens of the fracture resistance test were simulated by computer modeling to determine the stress distribution pattern in the post systems studied. RESULTS: The fracture test yielded the following means and standard deviations: G1 (45.63±8.77), G2 (49.98±7.08), G3 (43.84±5.52), G4 (47.61±7.23). Stress was homogeneously distributed along the body of the intraradicular post in group 1, whereas high stress concentrations in certain regions were observed in the other groups. These stress concentrations in the body of the post induced the same stress concentration in root dentin. CONCLUSIONS: The experimental post (original and modified versions) presented similar fracture resistance and better results in the stress analysis when compared with the commercial post systems tested (08/2008-PA/CEP).

  12. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness in small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book are used to illustrate the real-world performance of this statistic.

  13. The Space Station Module Power Management and Distribution automation test bed

    Science.gov (United States)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module Power Management And Distribution (SSM/PMAD) automation test bed project was begun at NASA/Marshall Space Flight Center (MSFC) in the mid-1980s to develop an autonomous, user-supportive power management and distribution test bed simulating the Space Station Freedom Hab/Lab modules. As the test bed has matured, many new technologies and projects have been added. The author focuses on three primary areas. The first area is the overall accomplishments of the test bed itself. These include a much-improved user interface, a more efficient expert system scheduler, improved communication among the three expert systems, and initial work on adding intermediate levels of autonomy. The second area is the addition of a more realistic power source to the SSM/PMAD test bed; this project is called the Large Autonomous Spacecraft Electrical Power System (LASEPS). The third area is the completion of a virtual link between the SSM/PMAD test bed at MSFC and the Autonomous Power Expert at Lewis Research Center.

  14. Simple Algorithms to Calculate Asymptotic Null Distributions of Robust Tests in Case-Control Genetic Association Studies in R

    Directory of Open Access Journals (Sweden)

    Wing Kam Fung

    2010-02-01

    The case-control study is an important design for testing association between genetic markers and a disease. The Cochran-Armitage trend test (CATT) is one of the most commonly used statistics for the analysis of case-control genetic association studies. The asymptotically optimal CATT can be used when the underlying genetic model (mode of inheritance) is known. However, for most complex diseases, the underlying genetic models are unknown. Thus, tests robust to genetic model misspecification are preferable to the model-dependent CATT. Two robust tests, MAX3 and the genetic model selection (GMS), were recently proposed. Their asymptotic null distributions are often obtained by Monte-Carlo simulations, because they either have not been fully studied or involve multiple integrations. In this article, we study how the components of each robust statistic are correlated, and find a linear dependence among the components. Using this new finding, we propose simple algorithms to calculate the asymptotic null distributions of MAX3 and GMS, which greatly reduce the computing intensity. Furthermore, we have developed the R package Rassoc implementing the proposed algorithms to calculate the empirical and asymptotic p-values for MAX3 and GMS as well as other commonly used tests in case-control association studies. For illustration, Rassoc is applied to the analysis of case-control data for the 17 most significant SNPs reported in four genome-wide association studies.
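    For reference, the CATT itself reduces to a short score test for trend; a sketch with invented genotype counts and additive scores (the paper's Rassoc package is in R, so the Python below is purely illustrative):

```python
# Hedged sketch: Cochran-Armitage trend test for a 2x3 case-control table.
import numpy as np
from scipy import stats

cases    = np.array([120, 95, 30])    # genotype counts aa, Aa, AA in cases
controls = np.array([160, 80, 15])
x = np.array([0.0, 1.0, 2.0])         # scores; (0, 1, 2) = additive model

n = cases + controls                  # genotype column totals
N, R = n.sum(), cases.sum()

# Score test for trend in proportions (conditional on the margins).
U = np.sum(x * (cases - n * R / N))
xbar = np.sum(x * n) / N
varU = R * (N - R) / N * np.sum(n * (x - xbar) ** 2) / (N - 1)
z = U / np.sqrt(varU)
p = 2 * stats.norm.sf(abs(z))
print(f"CATT z = {z:.3f}, two-sided p = {p:.4f}")
```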

  15. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analysis power, although they are distribution-free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about the read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases.

  16. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated

  17. Analysis of techniques for measurement of the size distribution of solid particles

    Directory of Open Access Journals (Sweden)

    F. O. Arouca

    2005-03-01

    Determination of the size distribution of solid particles is fundamental for analysis of the performance of several pieces of equipment used for solid-fluid separation. The main objective of this work is to compare the results obtained with two traditional methods for determination of the size grade distribution of powdery solids: the gamma-ray attenuation technique (GRAT) and the LADEQ test tube technique. The effect of draining the suspension in the two techniques was also analyzed. The GRAT can supply the particle size distribution of solids through the monitoring of solid concentration in experiments on batch settling of dilute suspensions. The results show that use of the peristaltic pump in the GRAT and LADEQ methods produced a significant difference between the values obtained for the parameters of the particle size model.

  18. Is Middle-Upper Arm Circumference “normally” distributed? Secondary data analysis of 852 nutrition surveys

    Directory of Open Access Journals (Sweden)

    Severine Frison

    2016-05-01

    Abstract Background Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. Methods This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise “non-normal” distributions. Results The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro–Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D’Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe–Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were “normalised” and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality, with 57 % of distributions approximating “normal” after transformation. Applying Box-Cox transformation after Spline or LOESS smoothing techniques increased that proportion to 82.4 and 82.7 % respectively.

  19. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems: after modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption and communication overhead. Our results show that distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  20. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  1. Performance Analysis of Radial Distribution Systems with UPQC and D-STATCOM

    Science.gov (United States)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-08-01

    This paper presents an effective method for finding the optimum location of a unified power quality conditioner (UPQC) and a distributed static compensator (D-STATCOM) in a radial distribution system. The bus having the minimum losses is selected as the candidate bus for UPQC placement, and the optimal location of the D-STATCOM is found by the power loss index (PLI) method. The PLI values of all the buses are calculated, and the bus having the highest PLI value is the most favorable and thus selected as the candidate bus for D-STATCOM placement. The main contributions of this paper are: (i) finding the optimum location of the UPQC in a radial distribution system (RDS) based on minimum power loss; (ii) finding the optimal size of UPQC which offers minimum losses; (iii) calculation of annual energy savings using UPQC and D-STATCOM; (iv) cost analysis with and without UPQC and D-STATCOM placement; and (v) comparison of results with and without UPQC and D-STATCOM placement in the RDS. The algorithm is tested on the IEEE 33-bus and 69-bus radial distribution systems using MATLAB software.
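
    For illustration, one common form of the power loss index ranks each candidate bus by the normalized loss reduction obtained when that bus is fully compensated. The loss-reduction values below are invented; in practice each comes from a separate load-flow run.

      import numpy as np

      # Hypothetical total loss reduction (kW) with full compensation at buses 1..5.
      loss_reduction = np.array([12.4, 30.1, 25.7, 8.9, 41.3])

      x_min, x_max = loss_reduction.min(), loss_reduction.max()
      pli = (loss_reduction - x_min) / (x_max - x_min)   # normalize to [0, 1]

      candidate_bus = int(np.argmax(pli)) + 1            # highest PLI wins
      print(f"PLI = {np.round(pli, 3)}, candidate bus for D-STATCOM = {candidate_bus}")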

  2. Impact of peak electricity demand in distribution grids: a stress test

    NARCIS (Netherlands)

    Hoogsteen, Gerwin; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria; Schuring, Friso; Kootstra, Ben

    2015-01-01

    The number of (hybrid) electric vehicles is growing, leading to a higher demand for electricity in distribution grids. To investigate the effects of the expected peak demand on distribution grids, a stress test with 15 electric vehicles in a single street is conducted and described in this paper.

  3. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network, modeled in Matlab/Simulink, takes into account detailed dynamic models of the generators. Fault simulations at various locations are performed.

  4. Uncertainty Visualization Using Copula-Based Analysis in Mixed Distribution Models.

    Science.gov (United States)

    Hazarika, Subhashis; Biswas, Ayan; Shen, Han-Wei

    2018-01-01

    Distributions are often used to model uncertainty in many scientific datasets. To preserve the correlation among the spatially sampled grid locations in the dataset, various standard multivariate distribution models have been proposed in visualization literature. These models treat each grid location as a univariate random variable which models the uncertainty at that location. Standard multivariate distributions (both parametric and nonparametric) assume that all the univariate marginals are of the same type/family of distribution. But in reality, different grid locations show different statistical behavior which may not be modeled best by the same type of distribution. In this paper, we propose a new multivariate uncertainty modeling strategy to address the needs of uncertainty modeling in scientific datasets. Our proposed method is based on a statistically sound multivariate technique called Copula, which makes it possible to separate the process of estimating the univariate marginals and the process of modeling dependency, unlike the standard multivariate distributions. The modeling flexibility offered by our proposed method makes it possible to design distribution fields which can have different types of distribution (Gaussian, Histogram, KDE etc.) at the grid locations, while maintaining the correlation structure at the same time. Depending on the results of various standard statistical tests, we can choose an optimal distribution representation at each location, resulting in a more cost efficient modeling without significantly sacrificing on the analysis quality. To demonstrate the efficacy of our proposed modeling strategy, we extract and visualize uncertain features like isocontours and vortices in various real world datasets. We also study various modeling criterion to help users in the task of univariate model selection.
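
    A minimal sketch of the copula construction above, assuming a Gaussian copula over two grid locations whose marginals come from different families; all values are illustrative.

      import numpy as np
      from scipy import stats

      rho = 0.8                                   # assumed spatial correlation
      cov = [[1.0, rho], [rho, 1.0]]
      z = stats.multivariate_normal([0, 0], cov).rvs(size=10_000, random_state=1)

      u = stats.norm.cdf(z)                       # uniforms: the copula sample

      x0 = stats.norm.ppf(u[:, 0], loc=5.0, scale=2.0)   # location 0: Gaussian marginal
      x1 = stats.gamma.ppf(u[:, 1], a=3.0, scale=1.5)    # location 1: gamma marginal

      # The dependency structure survives while the marginal families differ.
      print(np.corrcoef(x0, x1)[0, 1])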

  5. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance is bounded at zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.

  6. LEDA RF distribution system design and component test results

    International Nuclear Information System (INIS)

    Roybal, W.T.; Rees, D.E.; Borchert, H.L.; McCarthy, M.; Toole, L.

    1998-01-01

    The 350 MHz and 700 MHz RF distribution systems for the Low Energy Demonstration Accelerator (LEDA) have been designed and are currently being installed at Los Alamos National Laboratory. Since 350 MHz is a familiar frequency used at other accelerator facilities, most of the major high-power components were available. The 700 MHz, 1.0 MW, CW RF delivery system designed for LEDA is a new development. Therefore, high-power circulators, waterloads, phase shifters, switches, and harmonic filters had to be designed and built for this application. The final Accelerator Production of Tritium (APT) RF distribution system design will be based on much of the same technology as the LEDA systems and will incorporate many of the RF components tested for LEDA. Low-power and high-power tests performed on various components of these LEDA systems, and their results, are presented here.

  7. Thermomechanical analysis of the DFLL test blanket module for ITER

    International Nuclear Information System (INIS)

    Chen Hongli; Wu Yican; Bai Yunqing

    2006-01-01

    A finite element code is used to simulate two kinds of blanket design structure, the SLL (quasi-static lithium lead) and DLL (dual-cooled lithium lead) blanket concepts for the Dual Functional Lithium Lead Test Blanket Module (DFLL-TBM) submitted to the ITER test blanket working group. The temperature and stress distributions are presented for the two kinds of blanket structure on the basis of the structural design, thermal-hydraulic design and neutronics analysis. The mechanical performance of the high-temperature components of the blanket structure is also assessed according to the ITER Structural Design Criteria (ISDC). The rationality and feasibility of the two kinds of DFLL-TBM blanket structure design are analyzed based on the above results, which also serve as the theoretical basis for further optimization analysis. (authors)

  8. Independent test assessment using the extreme value distribution theory.

    Science.gov (United States)

    Almeida, Marcio; Blondell, Lucy; Peralta, Juan M; Kent, Jack W; Jun, Goo; Teslovich, Tanya M; Fuchsberger, Christian; Wood, Andrew R; Manning, Alisa K; Frayling, Timothy M; Cingolani, Pablo E; Sladek, Robert; Dyer, Thomas D; Abecasis, Goncalo; Duggirala, Ravindranath; Blangero, John

    2016-01-01

    The new generation of whole genome sequencing platforms offers great possibilities and challenges for dissecting the genetic basis of complex traits. With a very high number of sequence variants, a naïve multiple hypothesis threshold correction hinders the identification of reliable associations by the overreduction of statistical power. In this report, we examine 2 alternative approaches to improve the statistical power of a whole genome association study to detect reliable genetic associations. The approaches were tested using the Genetic Analysis Workshop 19 (GAW19) whole genome sequencing data. The first tested method estimates the real number of effective independent tests actually being performed in a whole genome association project by the use of an extreme value distribution and a set of phenotype simulations. Given the familial nature of the GAW19 data and the finite number of pedigree founders in the sample, the number of correlations between genotypes is greater than in a set of unrelated samples. Using our procedure, we estimate that the effective number represents only 15 % of the total number of independent tests performed. However, even using this corrected significance threshold, no genome-wide significant association could be detected for systolic and diastolic blood pressure traits. The second approach implements a biological relevance-driven hypothesis test by exploiting prior computational predictions on the effect of nonsynonymous genetic variants detected in a whole genome sequencing association study. This guided testing approach was able to identify 2 promising single-nucleotide polymorphisms (SNPs), 1 for each trait, targeting biologically relevant genes that could help shed light on the genesis of human hypertension. The first gene, PHF14, associated with systolic blood pressure, interacts directly with genes involved in calcium-channel formation and the second gene, MAP4, encodes a microtubule-associated protein and had already
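
    As a rough illustration of the first idea, the effective number of independent tests can be estimated from per-simulation minimum p-values: under M_eff truly independent tests the minimum p-value follows a Beta(1, M_eff) distribution, whose maximum-likelihood fit is one line. This is a simplified variant on invented data, not the authors' exact extreme-value procedure.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n_sims, n_tests = 200, 10_000

      # Correlated test statistics: a shared factor mimics familial correlation.
      shared = rng.normal(size=(n_sims, 1))
      z = 0.9 * shared + np.sqrt(1 - 0.9**2) * rng.normal(size=(n_sims, n_tests))
      p_min = (2 * stats.norm.sf(np.abs(z))).min(axis=1)   # minimum p per simulation

      m_eff = -n_sims / np.log1p(-p_min).sum()   # MLE of M_eff under Beta(1, M_eff)
      print(f"effective tests ≈ {m_eff:.0f} of {n_tests} nominal")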

  9. Test and analysis of thermal ratcheting deformation for 316L stainless steel cylindrical structure

    International Nuclear Information System (INIS)

    Lee, Hyeong Yeon; Kim, Jong Bum; Lee, Jae Han

    2002-01-01

    In this study, the progressive inelastic deformation, the so-called thermal ratchet phenomenon, which can occur in high-temperature structures of a liquid metal reactor, was simulated with the thermal ratchet structural test facility and a 316L stainless steel test cylinder. The thermal ratchet deformation at the reactor baffle cylinder of the liquid metal reactor can occur due to the moving temperature distribution along the axial direction as the sodium free surface moves up and down under cyclic heat-up and cool-down transients. The ratchet deformation was measured with a laser displacement sensor and LVDTs after cooling the structural specimen, which was heated up to 550 °C with steep temperature gradients along the axial direction. The temperature distribution of the test cylinder along the axial direction was measured with 28 channels of thermocouples and used for the ratchet analysis. The thermal ratchet deformation was analyzed with the constitutive equation of a nonlinear combined hardening model, implemented as an ABAQUS user subroutine, and the analysis results were compared with those of the test. The thermal ratchet load was applied 9 times, and the residual displacement after 9 cycles of thermal load was measured to be 1.79 mm. The ratcheting deformation shapes obtained by the analysis with the combined hardening model were in reasonable agreement with those of the structural tests.

  10. A Generic Danish Distribution Grid Model for Smart Grid Technology Testing

    DEFF Research Database (Denmark)

    Cha, Seung-Tae; Wu, Qiuwei; Østergaard, Jacob

    2012-01-01

    This paper describes the development of a generic Danish distribution grid model for smart grid technology testing based on the Bornholm power system. The frequency dependent network equivalent (FDNE) method has been used in order to accurately preserve the desired properties and characteristics of the original system. The equivalent was verified by comparing the transient response of the original Bornholm power system model and the developed generic model under significant fault conditions. The results clearly show that the equivalent generic distribution grid model retains the dynamic characteristics of the original system, and can be used as a generic Smart Grid benchmark model for testing purposes.

  11. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and the distribution of healthcare facilities. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. Correlation analysis between the categorized hospitals and street centrality shows that the distribution of these hospitals correlates highly with street centralities, and that the correlations are higher for private and small hospitals than for public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  12. Development of a web service for analysis in a distributed network.

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    We describe functional specifications and practicalities in the software development process for a web service that allows the construction of a multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. That paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach, first implementing the backbone system and then incrementally improving the user experience through interactions with potential users during development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We worked out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
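
    The heart of a GLORE-style fit is that each site returns only its gradient and Hessian contributions for the current coefficient vector, never patient-level rows; the server sums the pieces and takes a Newton step. A self-contained sketch with simulated sites (all data and settings invented):

      import numpy as np

      def site_contribution(X, y, beta):
          """Per-site Newton-Raphson pieces for logistic regression."""
          p = 1.0 / (1.0 + np.exp(-X @ beta))
          grad = X.T @ (y - p)                  # score vector
          hess = -(X.T * (p * (1 - p))) @ X     # negated information matrix
          return grad, hess

      rng = np.random.default_rng(3)
      true_beta = np.array([0.5, -1.0, 2.0])
      sites = []
      for _ in range(3):                        # three hypothetical sites
          X = np.column_stack([np.ones(400), rng.normal(size=(400, 2))])
          y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))
          sites.append((X, y))

      beta = np.zeros(3)
      for _ in range(25):                       # central server iterations
          parts = [site_contribution(X, y, beta) for X, y in sites]
          grad = sum(g for g, _ in parts)
          hess = sum(h for _, h in parts)
          beta = beta - np.linalg.solve(hess, grad)   # Newton update
      print(np.round(beta, 3))                  # should approach true_beta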

  13. Development of a Web Service for Analysis in a Distributed Network

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. That paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach, first implementing the backbone system and then incrementally improving the user experience through interactions with potential users during development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We worked out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.

  14. Post-test analysis of ROSA-III experiment RUNs 705 and 706

    International Nuclear Information System (INIS)

    Koizumi, Yasuo; Soda, Kunihisa; Kikuchi, Osamu; Tasaka, Kanji; Shiba, Masayoshi

    1980-07-01

    The purpose of the ROSA-III experiment with a scaled BWR test facility is to examine primary coolant thermal-hydraulic behavior and the performance of the ECCS during a postulated loss-of-coolant accident of a BWR. The results provide information for verification and improvement of reactor safety analysis codes. RUNs 705 and 706 assumed a 200% double-ended break at the recirculation pump suction. RUN 705 was an isothermal blowdown test without initial power and initial core flow. In RUN 706, with average core power and no ECCS, the main steam line and feed water line were isolated immediately on the break. Post-test analysis of RUNs 705 and 706 was made with the computer code RELAP4J. The agreement in system pressure between calculation and experiment was satisfactory. However, the calculated heater rod surface temperatures were significantly higher than the experimental ones, and the calculated axial temperature profile differed in tendency from the experimental one. The calculated mixture level behavior in the core also differed from the liquid void distribution observed in the experiment. The rapid rise of fuel rod surface temperature was caused by the reduction of the heat transfer coefficient attributed to the increase of quality. The results indicated the need to improve the analytical model of void distribution in the core, to perform a characteristic test of the recirculation line under reverse flow, and to examine the core inlet flow rate experimentally and analytically. (author)

  15. The Application of Hardware in the Loop Testing for Distributed Engine Control

    Science.gov (United States)

    Thomas, George L.; Culley, Dennis E.; Brand, Alex

    2016-01-01

    The essence of a distributed control system is the modular partitioning of control function across a hardware implementation. This type of control architecture requires embedding electronics in a multitude of control element nodes for the execution of those functions, and their integration as a unified system. As the field of distributed aeropropulsion control moves toward reality, questions about building and validating these systems remain. This paper focuses on the development of hardware-in-the-loop (HIL) test techniques for distributed aero engine control, and the application of HIL testing as it pertains to potential advanced engine control applications that may now be possible due to the intelligent capability embedded in the nodes.

  16. Boundary conditions on the Jointed Block Test: A two-dimensional and three-dimensional finite element analysis of stresses and temperatures

    International Nuclear Information System (INIS)

    Hardy, M.P.; Mitchell, S.J.

    1983-12-01

    This report presents the results from a numerical modeling study which was performed in support of the analysis of data from the Near-Surface Test Facility Block Test. The objective of the work was to investigate the potential for features of the test geometry and construction to influence the uniformity of the stress distribution across the test block and generate anomalous deformational response characteristics during loading. The analysis results indicated that the components of the test set-up can modify the imposed boundary conditions and affect the stress distribution in the block. However, the influence of these conditions was not sufficient to generate the anomalous conditions observed in actual field data. 5 refs

  17. Achievements of the ATLAS Distributed Analysis during the first run period

    CERN Document Server

    Farida, Fassi; The ATLAS collaboration

    2013-01-01

    In the LHC operations era, analysis of the large data volumes by geographically distributed physicists becomes a challenging task. The Computing Model of the ATLAS experiment at the LHC at CERN was designed around the concepts of grid computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with these challenges a global network known as the Worldwide LHC Computing Grid (WLCG) was built, the most sophisticated data taking and analysis system ever built. Since the start of data-taking, the ATLAS Distributed Analysis (ADA) service has been running stably while handling huge amounts of data. The reliability of the ADA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. The ATLAS Grid Computing Model is reviewed in this talk, with emphasis on the ADA system.

  18. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group which are used in texture analysis. Those NDs are: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of these NDs are central functions on the SO(3) group. The CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.

  19. Gene set analysis using variance component tests.

    Science.gov (United States)

    Huang, Yen-Tsung; Lin, Xihong

    2013-06-28

    Gene set analyses have become increasingly important in genomic research, as many complex diseases are influenced jointly by alterations of numerous genes. Genes often coordinate together as a functional repertoire, e.g., a biological pathway/network, and are highly correlated. However, most of the existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to exploit this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects obtained by assuming a common distribution for the regression coefficients in the multivariate linear regression model, and calculate the p-values using permutation and a scaled chi-square approximation. We show using simulations that the type I error is protected under different choices of working covariance matrices and power is improved as the working covariance approaches the true covariance. The global test is a special case of TEGS when correlation among genes in a gene set is ignored. Using both simulated data and a published diabetes dataset, we show that our test outperforms the commonly used approaches, the global test and gene set enrichment analysis (GSEA). In summary, we develop a gene set analysis method (TEGS) under the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and the global test, in both simulation and the diabetes microarray data.

  20. Three-beam interferogram analysis method for surface flatness testing of glass plates and wedges

    Science.gov (United States)

    Sunderland, Zofia; Patorski, Krzysztof

    2015-09-01

    When testing transparent plates with high-quality flat surfaces and a small angle between them, the three-beam interference phenomenon is observed. Since the reference beam and the object beams reflected from both the front and back surface of a sample are detected, the recorded intensity distribution may be regarded as a sum of three fringe patterns. Images of that type cannot be successfully analyzed with standard interferogram analysis methods. They contain, however, useful information on the tested plate's surface flatness and its optical thickness variations. Several methods have been elaborated to decode the plate parameters. Our technique represents a competitive solution which allows for retrieval of the phase components of the three-beam interferogram. It requires recording two images: a three-beam interferogram and a two-beam one with the reference beam blocked. Mutually subtracting these images leads to an intensity distribution which, under some assumptions, provides access to the two component fringe sets which encode surface flatness. At various stages of processing we take advantage of nonlinear operations as well as single-frame interferogram analysis methods. The two-dimensional continuous wavelet transform (2D CWT) is used to separate a particular fringe family from the overall interferogram intensity distribution as well as to estimate the phase distribution of a pattern. We distinguish two processing paths depending on the relative density of the fringe sets, which is connected with the geometry of the sample and optical setup. The proposed method is tested on simulated data.

  1. Non-regularized inversion method from light scattering applied to ferrofluid magnetization curves for magnetic size distribution analysis

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2014-01-01

    A numerical inversion method known from the analysis of light scattering by colloidal dispersions is now applied to magnetization curves of ferrofluids. The distribution of magnetic particle sizes or dipole moments is determined without assuming that the distribution is unimodal or of a particular shape. The inversion method enforces positive number densities via a non-negative least squares procedure. It is tested successfully on experimental and simulated data for ferrofluid samples with known multimodal size distributions. The created computer program MINORIM is made available on the web. - Highlights: • A method from light scattering is applied to analyze ferrofluid magnetization curves. • A magnetic size distribution is obtained without prior assumption of its shape. • The method is tested successfully on ferrofluids with a known size distribution. • The practical limits of the method are explored with simulated data including noise. • This method is implemented in the program MINORIM, freely available online
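
    The inversion itself reduces to non-negative least squares once the magnetization curve is written as a superposition of Langevin responses on a fixed grid of dipole moments. A sketch of that general technique under invented grid and noise settings (this is not the MINORIM code):

      import numpy as np
      from scipy.optimize import nnls

      kB_T = 1.38e-23 * 295.0                   # thermal energy at ~295 K (J)
      mu0 = 4e-7 * np.pi
      langevin = lambda x: 1.0 / np.tanh(x) - 1.0 / x

      H = np.linspace(1e2, 1e6, 60)             # applied field grid (A/m)
      moments = np.logspace(-20, -18, 40)       # candidate dipole moments (A m^2)

      # Design matrix: column j is the Langevin response of moment m_j.
      A = langevin(mu0 * np.outer(H, moments) / kB_T)

      # Synthetic "measured" curve from a log-normal-like weight vector.
      true_w = np.exp(-0.5 * ((np.log(moments) - np.log(3e-19)) / 0.4) ** 2)
      M = A @ true_w + 0.002 * np.random.default_rng(5).normal(size=H.size)

      w_fit, resid = nnls(A, M)                 # enforce non-negative densities
      print(f"residual norm = {resid:.3e}")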

  2. Probabilistic analysis of glass elements with three-parameter Weibull distribution

    International Nuclear Information System (INIS)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-01-01

    Glass and ceramics present brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used in fitting material strength results, although the method of its use is not always correct. In this work, first, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was performed. Then the finite element models made for each type of test, the fitting of the parameters of the three-parameter Weibull cumulative distribution function (λ: location, β: shape, δ: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. Summarizing, this work aims to generalize the use of the three-parameter Weibull function to structural glass elements with stress distributions not analytically described, allowing the proposed probabilistic model to be applied under general loading distributions. (Author)
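
    As a sketch, fitting the three-parameter Weibull cdf F(x) = 1 − exp(−((x − λ)/δ)^β) to a set of failure stresses is direct with scipy; the synthetic strengths below merely stand in for the bending-test results.

      import numpy as np
      from scipy import stats

      # Synthetic failure stresses (MPa) from an assumed true model.
      strength = stats.weibull_min.rvs(c=5.0, loc=30.0, scale=40.0,
                                       size=200, random_state=11)

      beta, lam, delta = stats.weibull_min.fit(strength)   # shape, location, scale
      print(f"beta={beta:.2f}, lambda={lam:.1f} MPa, delta={delta:.1f} MPa")

      # Failure probability at a design stress of, say, 45 MPa:
      print(stats.weibull_min.cdf(45.0, beta, loc=lam, scale=delta))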

  3. On-line test of power distribution prediction system for boiling water reactors

    International Nuclear Information System (INIS)

    Nishizawa, Y.; Kiguchi, T.; Kobayashi, S.; Takumi, K.; Tanaka, H.; Tsutsumi, R.; Yokomi, M.

    1982-01-01

    A power distribution prediction system for boiling water reactors has been developed and its on-line performance test has proceeded at an operating commercial reactor. This system predicts the power distribution or thermal margin in advance of control rod operations and core flow rate change. This system consists of an on-line computer system, an operator's console with a color cathode-ray tube, and plant data input devices. The main functions of this system are present power distribution monitoring, power distribution prediction, and power-up trajectory prediction. The calculation method is based on a simplified nuclear thermal-hydraulic calculation, which is combined with a method of model identification to the actual reactor core state. It has been ascertained by the on-line test that the predicted power distribution (readings of traversing in-core probe) agrees with the measured data within 6% root-mean-square. The computing time required for one prediction calculation step is less than or equal to 1.5 min by an HIDIC-80 on-line computer

  4. Testing nuclear parton distributions with pA collisions at the LHC

    CERN Document Server

    Quiroga-Arias, Paloma; Wiedemann, Urs Achim

    2010-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution function (PDF) in the proton. The extension of these analyses to nuclear parton distributions (nPDF) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of non-linear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distribution, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here we argue that a proton-nucleus collision program at...

  5. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  6. Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    Science.gov (United States)

    Rakovec, O.; Hill, M. C.; Clark, M. P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.

    2014-01-01

    This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based "local" methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative "bucket-style" hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km²) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
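
    In rough outline, DELSA evaluates derivative-based first-order indices at many points of the parameter space instead of producing a single global number. A toy sketch with a two-parameter model and finite-difference gradients (model, ranges and prior variances are stand-ins, following our reading of the method):

      import numpy as np

      def model(theta):                          # toy 2-parameter "bucket" model
          k, s_max = theta
          return s_max * (1.0 - np.exp(-k))      # scalar output, e.g. peak flow

      rng = np.random.default_rng(2)
      lo, hi = np.array([0.1, 10.0]), np.array([2.0, 200.0])
      var_prior = (hi - lo) ** 2 / 12.0          # variance of uniform priors

      samples = lo + (hi - lo) * rng.uniform(size=(1000, 2))
      indices = np.empty_like(samples)
      for i, theta in enumerate(samples):
          grad = np.empty(2)
          for j in range(2):                     # forward finite differences
              step = np.zeros(2)
              step[j] = 1e-6 * (hi[j] - lo[j])
              grad[j] = (model(theta + step) - model(theta)) / step[j]
          contrib = grad ** 2 * var_prior
          indices[i] = contrib / contrib.sum()   # local first-order indices

      # A distribution of sensitivity across parameter space, not one number:
      print(np.percentile(indices[:, 0], [10, 50, 90]))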

  7. On the state of acoustic emission analysis in pressure vessel and model vessel testing

    International Nuclear Information System (INIS)

    Morgner, W.; Theis, K.; Henke, F.; Imhof, D.

    1985-01-01

    In the GDR, acoustic emission analysis is applied primarily in connection with hydraulic pressure testing of vessels in the chemical industry. It is, however, also used for testing and monitoring of equipment and components in other branches of industry. The state of the art is presented with regard to the equipment needed, training of personnel, licensing of testing methods and appropriate testing procedures. In particular, the evaluation of the sum curves and amplitude distributions is explained, using rupture tests of two oxygen cylinders and a compressed-air bottle as examples. (author)

  8. Impulse tests on distribution transformers protected by means of spark gaps

    Energy Technology Data Exchange (ETDEWEB)

    Pykaelae, M L; Palva, V [Helsinki Univ. of Technology, Otaniemi (Finland). High Voltage Institute; Niskanen, K [ABB Corporate Research, Vaasa (Finland)

    1998-12-31

    Distribution transformers in rural networks have to cope with transient overvoltages, even with those caused by direct lightning strokes to the lines. In Finland the 24 kV network conditions, such as wooden pole lines, high soil resistivity and an isolated neutral network, lead to fast transient overvoltages. Impulse testing of pole-mounted distribution transformers (≤ 200 kVA) protected by means of spark gaps was studied. Different failure detection methods were used. The results can be used as background information for standardization work dealing with distribution transformers protected by means of spark gaps. (orig.) 9 refs.

  9. Impulse tests on distribution transformers protected by means of spark gaps

    Energy Technology Data Exchange (ETDEWEB)

    Pykaelae, M.L.; Palva, V. [Helsinki Univ. of Technology, Otaniemi (Finland). High Voltage Institute; Niskanen, K. [ABB Corporate Research, Vaasa (Finland)

    1997-12-31

    Distribution transformers in rural networks have to cope with transient overvoltages, even with those caused by direct lightning strokes to the lines. In Finland the 24 kV network conditions, such as wooden pole lines, high soil resistivity and an isolated neutral network, lead to fast transient overvoltages. Impulse testing of pole-mounted distribution transformers (≤ 200 kVA) protected by means of spark gaps was studied. Different failure detection methods were used. The results can be used as background information for standardization work dealing with distribution transformers protected by means of spark gaps. (orig.) 9 refs.

  10. LDV measurement, flow visualization and numerical analysis of flow distribution in a close-coupled catalytic converter

    International Nuclear Information System (INIS)

    Kim, Duk Sang; Cho, Yong Seok

    2004-01-01

    Results from an experimental study of flow distribution in a close-coupled catalytic converter (CCC) are presented. The experiments were carried out with a flow measurement system specially designed for this study under steady and transient flow conditions. A pitot tube was used to measure the flow distribution at the exit of the first monolith. The flow distribution of the CCC was also measured by an LDV system and flow visualization. Results from numerical analysis are also presented. Experimental results showed that the flow uniformity index decreases as the flow Reynolds number increases. In steady flow conditions, the flow through each exhaust pipe produced flow concentrations on a specific region of the CCC inlet. The transient test results showed that the flows through the exhaust pipes, following the engine firing order, interacted with each other such that the flow distribution became more uniform. The numerical results agreed qualitatively with the experimental results, and they supported and helped explain the flow in the entry region of the CCC.

  11. Static and Dynamic Stability Analysis of Distributed Energy Resources Components with Storage Devices and Loads for Smart Grids

    DEFF Research Database (Denmark)

    Mihet-Popa, Lucian; Groza, V.

    2011-01-01

    Distributed energy resources (DER) comprise several technologies, such as diesel engines, small wind turbines, photovoltaic inverters, etc. The control of DER components with storage devices and (controllable) loads, such as batteries, capacitors and dump loads, is central to the concept of the Smart Grid (SG). A SG can operate interconnected to the main distribution grid or in islanded mode. This paper presents experimental tests for static and dynamic stability analysis carried out in a dedicated laboratory for research in distributed control and smart grids with a high share of renewable energy production. Moreover, to point out, on a laboratory scale, the coupling between DER and storage and to effectively compensate wind fluctuations, a number of tests have been done. In order to find out the parameters of various types of DER components for dynamic simulation models, a number of tests were also performed.

  12. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    Directory of Open Access Journals (Sweden)

    Waleed Lagrab

    2015-03-01

    This study analyzes the spatial distribution of kindergarten facilities in the study area based on Geographic Information Systems (GIS), in order to test the efficiency of GIS technology for redistributing existing kindergartens, choosing the best locations in the future, and applying standard criteria for selecting suitable kindergarten locations. To achieve this goal, data and information were collected via interviews and comprehensive statistics on the education facilities in the Mukalla districts in Yemen, which contributed to building a geographic database for the study area. The kindergarten spatial patterns were then analyzed in terms of proximity to each other and to surrounding land uses such as streets, highways and factories. The concentration, dispersion, clustering and distribution direction of the kindergartens were also measured. This study showed the effectiveness of GIS for spatial data analysis. One of the most important findings is that most of the kindergartens established in Mukalla city did not take into account the criteria set by the authorities. Furthermore, almost every district suffers from a shortage in the number of kindergartens, and the pattern of distribution of those kindergartens is dominated by spatial dispersion.

  13. Measurement based scenario analysis of short-range distribution system planning

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probabilistic distributions of load demand and distributed generation are derived from historical measurement data and incorporated into the system planning. Simulations with various feasible scenarios are performed based on a local distribution system at Støvring in Denmark. Simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than when using the worst-case analysis, so that a better planning can be achieved.

  14. Development and Field Test of Voltage VAR Optimization in the Korean Smart Distribution Management System

    Directory of Open Access Journals (Sweden)

    Sang-Yun Yun

    2014-02-01

    This paper is a summary of the development and demonstration of an optimization program, voltage VAR optimization (VVO), in the Korean Smart Distribution Management System (KSDMS). KSDMS was developed to address the lack of receptivity to distributed generators (DGs), the lack of standardization and compatibility, and manual failure recovery in the existing Korean automated distribution system. Focusing on the receptivity of DGs, we developed a real-time system analysis and control program. The KSDMS VVO enhances the manual operation of the existing distribution system and provides a solution with all control equipment operated at a system level. The developed VVO is an optimal power flow (OPF) method that resolves violations, minimizes switching costs, and minimizes loss, and its function can vary depending on the operator's command. The sequential mixed integer linear programming (SMILP) method was adopted to find the solution of the OPF. We tested the precision of the proposed VVO on selected simulated systems and its applicability to actual systems at two substations on Jeju Island. Running the KSDMS VVO on a regular basis improved system stability, and it also raised no issues regarding its applicability to actual systems.

  15. Percentiles of the null distribution of 2 maximum lod score tests.

    Science.gov (United States)

    Ulgen, Ayse; Yoo, Yun Joo; Gordon, Derek; Finch, Stephen J; Mendell, Nancy R

    2004-01-01

    We here consider the null distribution of the maximum lod score (LOD-M) obtained upon maximizing over transmission model parameters (penetrance values, dominance, and allele frequency) as well as the recombination fraction. Also considered is the lod score maximized over a fixed choice of genetic model parameters and recombination-fraction values set prior to the analysis (MMLS), as proposed by Hodge et al. The objective is to fit parametric distributions to MMLS and LOD-M. Our results are based on 3,600 simulations of samples of n = 100 nuclear families ascertained for having one affected member and at least one other sibling available for linkage analysis. Each null distribution is approximately a mixture p·χ²(0) + (1 − p)·χ²(ν). The values of MMLS appear to fit the mixture 0.20·χ²(0) + 0.80·χ²(1.6). The mixture distribution 0.13·χ²(0) + 0.87·χ²(2.8) appears to describe the null distribution of LOD-M. From these results we derive a simple method for obtaining critical values of LOD-M and MMLS. Copyright 2004 S. Karger AG, Basel
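
    Given such a mixture, critical values follow from the standard relation χ² = 2 ln(10) · lod. A short sketch using the mixture weights quoted above (scipy's chi2 accepts non-integer degrees of freedom):

      import numpy as np
      from scipy.stats import chi2

      def lod_critical(alpha, p_zero, df):
          """Critical lod for the mixture p_zero*chi2(0) + (1 - p_zero)*chi2(df)."""
          # Solve (1 - p_zero) * P(chi2_df > 2*ln(10)*c) = alpha for c.
          q = chi2.ppf(1.0 - alpha / (1.0 - p_zero), df)
          return q / (2.0 * np.log(10.0))

      print(f"MMLS  5% critical value: {lod_critical(0.05, 0.20, 1.6):.2f}")
      print(f"LOD-M 5% critical value: {lod_critical(0.05, 0.13, 2.8):.2f}")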

  16. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  17. Optimal design of accelerated life tests for an extension of the exponential distribution

    International Nuclear Information System (INIS)

    Haghighi, Firoozeh

    2014-01-01

    Accelerated life tests provide information quickly on the lifetime distribution of products by testing them at higher-than-usual levels of stress. In this paper, the lifetime of a product at any level of stress is assumed to follow an extension of the exponential distribution. This new family was recently introduced by Nadarajah and Haghighi (2011 [1]); it can be used as an alternative to the gamma, Weibull and exponentiated exponential distributions. The scale parameter of the lifetime distribution at constant stress levels is assumed to be a log-linear function of the stress levels, and a cumulative exposure model holds. For this model, the maximum likelihood estimates (MLEs) of the parameters, as well as the Fisher information matrix, are derived. The asymptotic variance of the scale parameter at a design stress is adopted as the optimization objective, and its expression is derived using the maximum likelihood method. A Monte Carlo simulation study is carried out to examine the performance of these methods. Asymptotic confidence intervals for the parameters and a hypothesis test for the parameter of interest are constructed.
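
    For reference, one common parameterization of this extension has survival function S(x) = exp(1 − (1 + λx)^α), reducing to the exponential when α = 1. The sketch below simulates lifetimes by inverting the cdf and recovers (α, λ) by maximum likelihood; the parameter values are illustrative, and censoring and the stress model are omitted for brevity.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(4)
      alpha_true, lam_true = 1.8, 0.05
      u = rng.uniform(size=1000)
      x = ((1.0 - np.log(u)) ** (1.0 / alpha_true) - 1.0) / lam_true  # inverse cdf

      def neg_loglik(params):
          a, lam = params
          if a <= 0 or lam <= 0:
              return np.inf
          z = 1.0 + lam * x
          # log f(x) = log(a*lam) + (a - 1)*log(z) + 1 - z**a
          return -np.sum(np.log(a * lam) + (a - 1) * np.log(z) + 1.0 - z ** a)

      fit = minimize(neg_loglik, x0=[1.0, 0.1], method="Nelder-Mead")
      print(fit.x)   # should be close to (1.8, 0.05)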

  18. Performance and life time test on a 5 kW SOFC system for distributed cogeneration

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, Rosa; De Biase, Sabrina; Ginocchio, Stefano [Edison S.p.A, Via Giorgio La Pira, 2, 10028 Trofarello (Italy); Bedogni, Stefano; Montelatici, Lorenzo [Edison S.p.A, Foro Bonaparte 31, 20121 Milano (Italy)

    2008-06-15

    The Edison R&D Centre is committed to testing a wide range of commercial and prototype fuel cell systems. The activities aim to evaluate the available state of the art of these technologies and their maturity for the relevant market. The laboratory is equipped with ad hoc test benches designed to study single cells, stacks and systems. The characterization of commercial and new-generation PEMFCs, including high-temperature operation (160 °C), together with the analysis of the behaviour of SOFCs, represents the core activity of the laboratory. In January 2007 a new 5 kW SOFC system supplied by Acumentrics was installed. The claimed electrical power output is 5 kW and the thermal power is 3 kW. The aim of the test is the achievement of a technical and economical assessment for future applications of small SOFC plants for distributed cogeneration. Performance and lifetime tests of the system are presented. (author)

  19. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN of test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption about the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against a negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product to the reference product when both possess multimodal PSDs.
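
    For one-dimensional profiles the earth mover's distance coincides with the first Wasserstein distance, which scipy computes directly from binned weights. A sketch with invented bimodal reference and test PSDs:

      import numpy as np
      from scipy.stats import wasserstein_distance

      sizes = np.linspace(0.1, 10.0, 50)        # particle size bins (um)

      def bimodal(m1, m2):                      # toy two-peak PSD profile
          return (np.exp(-0.5 * ((sizes - m1) / 0.5) ** 2)
                  + 0.6 * np.exp(-0.5 * ((sizes - m2) / 0.8) ** 2))

      ref = bimodal(2.0, 6.0); ref /= ref.sum()
      test = bimodal(2.3, 6.2); test /= test.sum()

      emd = wasserstein_distance(sizes, sizes, u_weights=ref, v_weights=test)
      print(f"EMD = {emd:.3f} um")   # one distance per lot pair feeds the PBE test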

  20. Dynamic stall characterization using modal analysis of phase-averaged pressure distributions

    Science.gov (United States)

    Harms, Tanner; Nikoueeyan, Pourya; Naughton, Jonathan

    2017-11-01

    Dynamic stall characterization by means of surface pressure measurements can reduce the time and cost associated with experimental investigation of unsteady airfoil aerodynamics. A unique test capability has been developed at the University of Wyoming over the past few years that allows time- and cost-efficient measurement of dynamic stall. A variety of rotorcraft and wind turbine airfoils have been tested under a variety of pitch oscillation conditions, resulting in a range of dynamic stall behavior. The formation, development and separation of different flow structures are responsible for the complex aerodynamic loading behavior experienced during dynamic stall, and these structures have unique signatures on the pressure distribution over the airfoil. This work investigates the statistical behavior of the phase-averaged pressure distributions for different types of dynamic stall by means of modal analysis. The use of different modes to identify specific flow structures is being investigated; the use of these modes for different types of dynamic stall can provide a new approach for understanding and categorizing these flows. This work uses airfoil data acquired under Army contract W911W60160C-0021, DOE Grant DE-SC0001261, and a gift from BP Alternative Energy North America, Inc.
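
    One standard route to such a modal analysis is proper orthogonal decomposition via an SVD of the mean-subtracted, phase-averaged pressure matrix (phases × taps). A sketch on stand-in data (real Cp matrices would replace the random block):

      import numpy as np

      rng = np.random.default_rng(6)
      n_phase, n_taps = 360, 64                # phases of the pitch cycle x taps
      cp = rng.normal(size=(n_phase, n_taps))  # stand-in for phase-averaged Cp

      fluct = cp - cp.mean(axis=0)             # remove the mean distribution

      # Rows of vt are spatial modes; u[:, k] * s[k] is mode k's phase history.
      u, s, vt = np.linalg.svd(fluct, full_matrices=False)

      energy = s**2 / np.sum(s**2)             # variance fraction per mode
      print(np.round(energy[:5], 3))           # leading modes flag flow structures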

  1. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  2. Wind tunnel test IA300 analysis and results, volume 1

    Science.gov (United States)

    Kelley, P. B.; Beaufait, W. B.; Kitchens, L. L.; Pace, J. P.

    1987-01-01

    The analysis and interpretation of wind tunnel pressure data from the Space Shuttle wind tunnel test IA300 are presented. The primary objective of the test was to determine the effects of the Space Shuttle Main Engine (SSME) and Solid Rocket Booster (SRB) plumes on the integrated vehicle forebody pressure distributions, the elevon hinge moments, and wing loads. The results of this test will be combined with flight test results to form a new data base to be employed in the IVBC-3 airloads analysis. A secondary objective was to obtain solid plume data for correlation with the results of gaseous plume tests. Data from the power level portion were used in conjunction with flight base pressures to evaluate nominal power levels to be used during the investigation of changes in model attitude, elevon deflection, and nozzle gimbal angle. The plume-induced aerodynamic loads were developed for the Space Shuttle bases and forebody areas. A computer code was developed to integrate the pressure data. Using simplified geometrical models of the Space Shuttle elements and components, the pressure data were integrated to develop plume-induced force and moment coefficients that can be combined with a power-off data base to develop a power-on data base.
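
    The record mentions a computer code that integrates surface pressures into force and moment coefficients. As a hedged, much-simplified sketch of that kind of calculation (not the actual IA300 code, which worked on full 3-D panel geometry), the example below integrates a hypothetical chordwise pressure-coefficient distribution on a 2-D strip to obtain a section normal-force coefficient.

    ```python
    # Simplified sketch of pressure integration on a 2-D strip; geometry and Cp
    # values are hypothetical illustrations only.
    import numpy as np

    x_c = np.linspace(0.0, 1.0, 101)              # chordwise stations x/c, assumed grid
    cp_lower = 0.4 * (1.0 - x_c)                  # hypothetical lower-surface Cp
    cp_upper = -1.2 * np.exp(-5.0 * x_c) - 0.1    # hypothetical upper-surface Cp

    # Section normal-force coefficient: trapezoidal integration of the Cp
    # difference along the chord, cn = integral of (Cp_lower - Cp_upper) d(x/c)
    d = cp_lower - cp_upper
    cn = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(x_c))
    print(f"section normal-force coefficient cn = {cn:.3f}")
    ```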

  3. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve P. Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...
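
    The record's Newton-step estimators are truncated above and are not reproduced here. As a hedged illustration of the underlying task only, the sketch below compares two sample coefficients of variation with a simple percentile-bootstrap interval on their ratio; this bootstrap approach is our own stand-in, not the authors' method.

    ```python
    # Bootstrap sketch: percentile confidence interval for the ratio of two
    # coefficients of variation (CV = s / xbar) from normal samples.
    # The samples and the bootstrap approach are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(42)
    x = rng.normal(loc=10.0, scale=2.0, size=40)   # sample 1
    y = rng.normal(loc=25.0, scale=4.0, size=40)   # sample 2

    def cv(a):
        return a.std(ddof=1) / a.mean()

    ratios = []
    for _ in range(5000):
        xb = rng.choice(x, size=x.size, replace=True)
        yb = rng.choice(y, size=y.size, replace=True)
        ratios.append(cv(xb) / cv(yb))

    lo, hi = np.percentile(ratios, [2.5, 97.5])
    print(f"CV ratio = {cv(x)/cv(y):.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
    # If the interval excludes 1, equality of the two CVs is doubtful.
    ```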

  4. Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results

    Science.gov (United States)

    Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul

    1992-01-01

    The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used to perform loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for the creation of a TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten-bay cantilevered truss structure.
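
    Guyan reduction, the first TAM method named above, condenses the FEM onto the measured (analysis-set) degrees of freedom by static condensation. A minimal sketch with a small hypothetical stiffness matrix is shown below; the DOF partition and matrix values are illustrative assumptions, not a NASTRAN workflow.

    ```python
    # Minimal Guyan (static) reduction sketch. Partition the stiffness matrix into
    # analysis-set DOFs 'a' (retained, matching test sensors) and omitted DOFs 'o';
    # the reduced matrix is Kaa - Kao @ inv(Koo) @ Koa.
    import numpy as np

    # Hypothetical 4-DOF spring-chain stiffness matrix (consistent units assumed)
    K = np.array([[ 2., -1.,  0.,  0.],
                  [-1.,  2., -1.,  0.],
                  [ 0., -1.,  2., -1.],
                  [ 0.,  0., -1.,  1.]])

    a = [0, 3]          # retained DOFs (e.g., sensor locations)
    o = [1, 2]          # omitted DOFs

    Kaa = K[np.ix_(a, a)]
    Kao = K[np.ix_(a, o)]
    Koo = K[np.ix_(o, o)]

    # Static condensation; the same transformation T also reduces the mass matrix
    # (T.T @ M @ T, with M permuted to [a; o] ordering), which is where a Guyan
    # TAM approximates the dynamics.
    K_red = Kaa - Kao @ np.linalg.solve(Koo, Kao.T)
    T = np.vstack([np.eye(len(a)), -np.linalg.solve(Koo, Kao.T)])  # a-set to [a; o] map
    print(K_red)
    ```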

  5. Development of Drop/Shock Test in Microelectronics and Impact Dynamic Analysis for Uniform Board Response

    Science.gov (United States)

    Kallolimath, Sharan Chandrashekar

    For the past several years, many researchers have been developing and improving board-level drop test procedures and specifications to quantify the solder joint reliability performance of consumer electronics products. Predictive finite element analysis (FEA) using simulation software has become a widely accepted verification method which can reduce the time and cost of the real-time test process. However, due to testing and metrological limitations, it is not only difficult to simulate the exact drop condition and capture critical measurement data, but also tedious to calibrate the system to improve test methods. Moreover, some important, ever-changing factors such as board flexural rigidity, damping, drop height, and drop orientation result in non-uniform stress/strain distribution throughout the test board. In addition, one of the most challenging tasks is to quantify uniform stress and strain distribution throughout the test board and identify critical failure factors. The major contributions of this work lie in the following four aspects of drop testing in electronics. First of all, an analytical FEA model was developed to study the board natural frequencies and responses of the system with consideration of the dynamic stiffness, the damping behavior of the material, and the effect of the impact loading condition. An approach to find the key parameters that affect stress and strain distributions under predominant mode responses was proposed and verified with theoretical solutions. The Input-G method was adopted to study board response behavior, and cut-boundary interpolation methods were used to analyze local-model solder joint stresses with the development of a global/local FEA model in ANSYS software. Second, the no-ring phenomenon during the drop test was identified theoretically when the test board was modeled as both a discrete system and a continuous system. Numerical analysis was then conducted by the FEA method for the detailed geometry of attached chips with solder

  6. Post-test analysis of PANDA test P4

    International Nuclear Information System (INIS)

    Hart, J.; Woudstra, A.; Koning, H.

    1999-01-01

    The results of a post-test analysis of the integral system test P4, which was executed in the PANDA facility at PSI in Switzerland within the framework of Work Package 2 of the TEPSS project, are presented. The post-test analysis comprises an evaluation of the PANDA test P4 and a comparison of the test results with the results of simulations using the RELAP5/MOD3.2, TRAC-BF1, and MELCOR 1.8.4 codes. The PANDA test P4 has provided adequate data on how trapped air released from the drywell later in the transient affects PCCS performance. The well-defined measurements can serve as an important database for the assessment of thermal hydraulic system analysis codes, especially for conditions that could be met in passively operated advanced reactors, i.e. low pressure and small driving forces. Based on the analysis of the test data, the test acceptance criteria have been met. Test P4 has been successfully completed and the instrument readings were within the permitted ranges. The PCCs showed a favorable and robust performance and a wide margin for decay heat removal from the containment. The PANDA P4 test demonstrated that trapped air, released from the drywell later in the transient, only temporarily and only slightly affected the performance of the passive containment cooling system. The analysis of the results of the RELAP5 code showed that the overall behaviour of the test has been calculated quite well with regard to pressure, mass flow rates, and pool boil-down. This holds for both the pre-test and the post-test simulations. However, due to the one-dimensional, stacked-volume modeling of the PANDA DW, WW, and GDCS vessels, 3D effects such as in-vessel mixing and recirculation could not be calculated. The post-test MELCOR simulation showed an overall behaviour that is comparable to RELAP5. However, MELCOR calculated almost no air trapping in the PCC tubes that could hinder the steam condensation rate. This resulted in lower calculated

  7. Basic distribution free identification tests for small size samples of environmental data

    International Nuclear Information System (INIS)

    Federico, A.G.; Musmeci, F.

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples contain a small number of data points, and the assumption of normal distributions is often not realistic. On the other hand, the diffusion of today's powerful personal computers opens new opportunities based on a massive use of CPU resources. The paper reviews the problem, introducing the feasibility of two non-parametric approaches based on intrinsic equiprobability properties of the data samples. The first one is based on a full resampling, while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given based on the Chernobyl children contamination data.
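
    The record does not give the algorithms in detail. As a hedged sketch of the full-resampling idea, the example below runs a two-sample permutation test on the difference of means, which is distribution-free for small samples under exchangeability; the sample values are hypothetical.

    ```python
    # Distribution-free two-sample permutation test (full resampling of labels).
    # Sample values are hypothetical; the test statistic is the difference of means.
    import numpy as np

    rng = np.random.default_rng(7)
    sample_a = np.array([1.2, 0.8, 1.5, 2.1, 0.9, 1.7])   # assumed small data set
    sample_b = np.array([1.9, 2.4, 1.6, 2.8, 2.2])        # assumed small data set

    observed = sample_a.mean() - sample_b.mean()
    pooled = np.concatenate([sample_a, sample_b])
    n_a = sample_a.size

    count = 0
    n_perm = 20000
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        stat = perm[:n_a].mean() - perm[n_a:].mean()
        if abs(stat) >= abs(observed):
            count += 1

    print(f"two-sided permutation p-value = {count / n_perm:.4f}")
    ```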

  8. Analysis of the Effects of Streamwise Lift Distribution on Sonic Boom Signature

    Science.gov (United States)

    Yoo, Paul

    2013-01-01

    Investigation of sonic boom has been one of the major areas of study in aeronautics due to the benefits a low-boom aircraft has in both civilian and military applications. This work conducts a numerical analysis of the effects of streamwise lift distribution on the shock coalescence characteristics. A simple wing-canard-stabilator body model is used in the numerical simulation. The streamwise lift distribution is varied by fixing the canard at a deflection angle while trimming the aircraft with the wing and the stabilator at the desired lift coefficient. The lift and the pitching moment coefficients are computed using Missile DATCOM v. 707. The flow field around the wing-canard-stabilator body model is resolved using the OVERFLOW-2 flow solver. Overset/chimera grid topology is used to simplify the grid generation of various configurations representing different streamwise lift distributions. The numerical simulations are performed without viscosity unless it is required for numerical stability. All configurations are simulated at Mach 1.4, an angle of attack of 1.5°, a lift coefficient of 0.05, and a pitching moment coefficient of approximately 0. Four streamwise lift distribution configurations were tested.

  9. A Test Generation Framework for Distributed Fault-Tolerant Algorithms

    Science.gov (United States)

    Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.

    2009-01-01

    Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.

  10. On the asymptotic distribution of a unit root test against ESTAR alternatives

    NARCIS (Netherlands)

    Hanck, Christoph

    We derive the null distribution of the nonlinear unit root test proposed in Kapetanios et al. [Kapetanios, G., Shin, Y., Snell, A., 2003. Testing for a unit root in the nonlinear STAR framework, Journal of Econometrics 112, 359-379] when nonzero means or both means and deterministic trends are

  11. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  12. Reliability demonstration test planning using bayesian analysis

    International Nuclear Information System (INIS)

    Chandran, Senthil Kumar; Arul, John A.

    2003-01-01

    In nuclear power plants, the reliability of all the safety systems is very critical from the safety viewpoint, and it is essential that the required reliability requirements be met while satisfying the design constraints. From practical experience, it is found that the reliability of complex systems such as the Safety Rod Drive Mechanism is of the order of 10⁻⁴ with an uncertainty factor of 10. To demonstrate the reliability of such systems is prohibitive in terms of cost and time, as the number of tests needed is very large. The purpose of this paper is to develop a Bayesian reliability demonstration test procedure for exponentially distributed failure times with a gamma prior distribution on the failure rate, which can be easily and effectively used to demonstrate component/subsystem/system reliability conformance to stated requirements. The important questions addressed in this paper are: with zero failures, how long should one perform the tests, and how many components are required, to conclude with a given degree of confidence that the component under test meets the reliability requirement? The procedure is explained with an example. It can also be extended to demonstration with a larger number of failures. The approach presented is applicable for deriving test plans for demonstrating component failure rates of nuclear power plants, as failure data for similar components are becoming available from existing plants elsewhere. The advantages of this procedure are that the criterion upon which the procedure is based is simple and pertinent, that the fitting of the prior distribution is an integral part of the procedure and is based on the use of information regarding two percentiles of this distribution, and finally, that the procedure is straightforward and easy to apply in practice. (author)
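
    As a hedged sketch of the conjugate calculation behind such a plan (exponential lifetimes, gamma prior on the failure rate), the example below finds the cumulative test time needed, with zero failures, to claim a failure-rate requirement at 90% posterior confidence. The prior parameters, step size, and confidence level are illustrative assumptions, not the paper's values.

    ```python
    # Bayesian zero-failure demonstration test sketch.
    # Model: exponential failure times with rate lam; prior lam ~ Gamma(a, rate b).
    # With total accumulated test time T and r observed failures, the posterior is
    # lam ~ Gamma(a + r, rate b + T).  Here r = 0.
    from scipy.stats import gamma

    a, b = 0.5, 1.0e3         # assumed prior shape and rate (rate in hours)
    lam_req = 1.0e-4          # required failure rate (per hour), order 1e-4 as in the record
    confidence = 0.90

    T = 0.0
    step = 1.0e3              # search granularity in hours (illustrative)
    while gamma.cdf(lam_req, a, scale=1.0 / (b + T)) < confidence:
        T += step

    print(f"Accumulated test time needed with zero failures: {T:.0f} hours")
    # P(lam <= lam_req | 0 failures in T) >= 0.90 at this T; several units tested
    # simultaneously accumulate T as the sum of their individual test times.
    ```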

  13. Different goodness of fit tests for Rayleigh distribution in ranked set sampling

    Directory of Open Access Journals (Sweden)

    Amer Al-Omari

    2016-03-01

    In this paper, different goodness of fit tests for the Rayleigh distribution are considered based on simple random sampling (SRS) and ranked set sampling (RSS) techniques. The performance of the suggested tests is evaluated in terms of the power of the tests by using Monte Carlo simulation. It is found that the suggested RSS tests perform better than their SRS counterparts.
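
    As a hedged illustration of one such goodness-of-fit check under plain SRS (the RSS variants studied in the record are not reproduced here), the sketch below applies a Kolmogorov-Smirnov test to a sample against a Rayleigh distribution whose scale is estimated from the data.

    ```python
    # Goodness-of-fit sketch: Kolmogorov-Smirnov test of a sample against a
    # Rayleigh distribution with scale estimated from the data. Illustrative only;
    # estimating the parameter first makes the nominal KS p-value approximate.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    data = stats.rayleigh.rvs(scale=2.0, size=60, random_state=rng)  # assumed sample

    # MLE of the Rayleigh scale: sqrt(sum(x^2) / (2n))
    sigma_hat = np.sqrt(np.sum(data**2) / (2 * data.size))
    stat, p = stats.kstest(data, "rayleigh", args=(0, sigma_hat))
    print(f"KS statistic = {stat:.3f}, approximate p-value = {p:.3f}")
    ```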

  14. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  15. A distribution-free test for anomalous gamma-ray spectra

    International Nuclear Information System (INIS)

    Chan, Kung-sik; Li, Jinzheng; Eichinger, William; Bai, Er-Wei

    2014-01-01

    Gamma-ray spectra are increasingly acquired in monitoring cross-border traffic, or in an area search for lost or orphan special nuclear material (SNM). The signal in such data is generally weak, resulting in poorly resolved spectra, thereby making it hard to detect the presence of SNM. We develop a new test for detecting anomalous spectra by characterizing the complete shape change in a spectrum from background radiation; the proposed method may serve as a tripwire for routine screening for SNM. We show that, with increasing detection time, the limiting distribution of the test is given by some functional of the Brownian bridge. The efficacy of the proposed method is illustrated by simulations. - Highlights: • We develop a new non-parametric test for detecting anomalous gamma-ray spectra. • The proposed test has good empirical power for detecting weak signals. • It can serve as an effective tripwire for invoking more thorough scrutiny of the source
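
    The record does not reproduce the test statistic. As a hedged sketch in the same spirit, the example below computes the maximum deviation between the normalized cumulative spectra of a measured spectrum and a background template, a KS-type statistic whose null distribution is related to the Brownian bridge; the channel counts and injected line are hypothetical.

    ```python
    # Sketch: KS-type shape test of a gamma-ray spectrum against background.
    # Channel counts are hypothetical; both spectra are reduced to normalized
    # cumulative shapes so only the spectral shape (not total rate) is compared.
    import numpy as np

    rng = np.random.default_rng(11)
    n_ch = 256
    background = rng.poisson(50, size=n_ch).astype(float)   # assumed template
    measured = rng.poisson(50, size=n_ch).astype(float)
    measured[100:110] += rng.poisson(8, size=10)             # weak injected line

    def cum_shape(counts):
        c = np.cumsum(counts)
        return c / c[-1]

    d_stat = np.max(np.abs(cum_shape(measured) - cum_shape(background)))
    print(f"max cumulative-shape deviation D = {d_stat:.4f}")
    # In practice D would be compared with a threshold calibrated by simulation
    # (or the Brownian-bridge limit) before flagging a spectrum as anomalous.
    ```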

  16. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibration patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally occurring distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
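
    Envelope analysis, named above as the discriminating tool, can be sketched briefly: take the Hilbert-transform envelope of the vibration signal and inspect its spectrum for fault-related lines. The signal, sampling rate, and fault frequency below are hypothetical stand-ins for measured data.

    ```python
    # Envelope spectrum sketch for bearing vibration analysis.
    # A synthetic signal stands in for measured vibration: a carrier resonance
    # amplitude-modulated at a hypothetical fault frequency.
    import numpy as np
    from scipy.signal import hilbert

    fs = 20_000                                   # sampling rate (Hz), assumed
    t = np.arange(0, 1.0, 1 / fs)
    fault_f, resonance_f = 87.0, 3_000.0          # hypothetical fault / resonance (Hz)
    signal = (1 + 0.5 * np.cos(2 * np.pi * fault_f * t)) * np.sin(2 * np.pi * resonance_f * t)
    signal += 0.2 * np.random.default_rng(1).standard_normal(t.size)

    # Envelope via the analytic signal, then its spectrum
    envelope = np.abs(hilbert(signal))
    env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(envelope.size, 1 / fs)

    peak = freqs[np.argmax(env_spec[freqs < 500])]   # look below 500 Hz
    print(f"dominant envelope line at ~{peak:.1f} Hz (fault frequency was {fault_f} Hz)")
    ```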

  17. Statistical analysis of COMPTEL maximum likelihood-ratio distributions: evidence for a signal from previously undetected AGN

    International Nuclear Information System (INIS)

    Williams, O. R.; Bennett, K.; Much, R.; Schoenfelder, V.; Blom, J. J.; Ryan, J.

    1997-01-01

    The maximum likelihood-ratio method is frequently used in COMPTEL analysis to determine the significance of a point source at a given location. In this paper we do not consider whether the likelihood-ratio at a particular location indicates a detection, but rather whether distributions of likelihood-ratios derived from many locations depart from that expected for source-free data. We have constructed distributions of likelihood-ratios by reading values from standard COMPTEL maximum-likelihood ratio maps at positions corresponding to the locations of different categories of AGN. Distributions derived from the locations of Seyfert galaxies are indistinguishable, according to a Kolmogorov-Smirnov test, from those obtained from 'random' locations, but differ slightly from those obtained from the locations of flat spectrum radio loud quasars, OVVs, and BL Lac objects. This difference is not due to known COMPTEL sources, since regions near these sources are excluded from the analysis. We suggest that it might arise from a number of sources with fluxes below the COMPTEL detection threshold.

  18. To test or not to test

    DEFF Research Database (Denmark)

    Rochon, Justine; Gondan, Matthias; Kieser, Meinhard

    2012-01-01

    Background: Student's two-sample t test is generally used for comparing the means of two independent samples, for example, two treatment arms. Under the null hypothesis, the t test assumes that the two samples arise from the same normally distributed population with unknown variance. Adequate control of the Type I error requires that the normality assumption holds, which is often examined by means of a preliminary Shapiro-Wilk test. The following two-stage procedure is widely accepted: if the preliminary test for normality is not significant, the t test is used; if the preliminary test rejects the null hypothesis of normality, a nonparametric test is applied in the main analysis. Methods: Equally sized samples were drawn from exponential, uniform, and normal distributions. The two-sample t test was conducted if either both samples (Strategy I) or the collapsed set of residuals from both samples
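
    A minimal sketch of the two-stage procedure described above, using scipy; the pretest significance level and the choice of the Mann-Whitney U test as the nonparametric fallback are assumptions for illustration.

    ```python
    # Two-stage test sketch: Shapiro-Wilk pretest for normality, then either the
    # two-sample t test or a nonparametric Mann-Whitney U test.
    import numpy as np
    from scipy import stats

    def two_stage_test(x, y, alpha_pre=0.05):
        """Pretest both samples for normality; choose the main test accordingly."""
        _, p_x = stats.shapiro(x)
        _, p_y = stats.shapiro(y)
        if p_x > alpha_pre and p_y > alpha_pre:
            _, p = stats.ttest_ind(x, y)          # Student's t test (equal variances)
            return "t test", p
        _, p = stats.mannwhitneyu(x, y, alternative="two-sided")
        return "Mann-Whitney U", p

    rng = np.random.default_rng(5)
    x = rng.exponential(scale=1.0, size=30)       # non-normal sample
    y = rng.exponential(scale=1.5, size=30)
    test_used, p = two_stage_test(x, y)
    print(f"{test_used}: p = {p:.4f}")
    ```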

  19. ON ESTIMATION AND HYPOTHESIS TESTING OF THE GRAIN SIZE DISTRIBUTION BY THE SALTYKOV METHOD

    Directory of Open Access Journals (Sweden)

    Yuri Gulbin

    2011-05-01

    The paper considers the problem of the validity of unfolding the grain size distribution with the back-substitution method. Due to the ill-conditioned nature of unfolding matrices, it is necessary to evaluate the accuracy and precision of parameter estimation and to verify whether the expected grain size distribution can be tested on the basis of intersection size histogram data. In order to review these questions, computer modeling was used to compare size distributions obtained stereologically with those possessed by three-dimensional model aggregates of grains with a specified shape and random size. Results of the simulations are reported and ways of improving the conventional stereological techniques are suggested. It is shown that new improvements in the estimating and testing procedures enable grain size distributions to be unfolded more efficiently.

  20. Analysis of Radial Plutonium Isotope Distribution in Irradiated Test MOX Fuel Rods

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae Yong; Lee, Byung Ho; Koo, Yang Hyun; Kim, Han Soo

    2009-01-15

    After Rods 3 and 6 (KAERI MOX) were irradiated in the Halden reactor, their post-irradiation examinations are now being carried out. In this report, the PLUTON code was used to analyze Rods 3 and 6 (KAERI MOX). In both rods, the ratio of the maximum burnup to the average burnup in the radial distribution was 1.3, and the content of ²³⁹Pu tended to increase as the radial position approached the periphery of the fuel pellet. The detailed radial distributions of ²³⁹Pu and ²⁴⁰Pu, however, were somewhat different. To find the reason for this difference, the contents of Pu isotopes were investigated as the burnup increased. The content of ²³⁹Pu decreased with burnup. The content of ²⁴⁰Pu increased with burnup up to 20 GWd/tM but decreased beyond 20 GWd/tM. The local burnup of Rod 3 is higher than that of Rod 6 due to the hole penetrating through the fuel rod. The content of ²³⁹Pu decreased more rapidly than that of ²⁴⁰Pu in Rod 6 with increasing burnup. This resulted in a radial distribution of ²³⁹Pu and ²⁴⁰Pu similar to that of Rod 3. The ratio of Xe to Kr is a parameter used to find where the fissions occur in the nuclear fuel. In both Rods 3 and 6, it was 18.3 over the whole fuel rod cross section, which showed that the fissions occurred in the plutonium.

  1. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States Standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, by suggesting the use of the hypergeometric distribution to calculate the parameters of sampling plans avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot quality percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise, the same question arises with consumer risk which is necessarily associated with type II error. The resolution of these questions is new to the literature. The
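
    As a hedged sketch of the paper's first point, the example below evaluates a single-sampling plan exactly with the hypergeometric distribution and compares it with the binomial approximation; the lot size, sample size, and acceptance number are illustrative choices, not values from the standards cited above.

    ```python
    # Acceptance-sampling sketch: exact hypergeometric vs. binomial approximation.
    # Plan: lot of N items, sample n without replacement, accept if <= c defectives.
    from scipy.stats import hypergeom, binom

    N, n, c = 500, 50, 2            # assumed lot size, sample size, acceptance number

    for pct_defective in (0.01, 0.04, 0.08):
        D = round(N * pct_defective)              # defectives in the lot
        p_accept_exact = hypergeom.cdf(c, N, D, n)
        p_accept_binom = binom.cdf(c, n, pct_defective)
        print(f"p={pct_defective:.2f}: exact accept prob = {p_accept_exact:.4f}, "
              f"binomial approx = {p_accept_binom:.4f}")
    ```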

  2. Weighted Lomax distribution.

    Science.gov (United States)

    Kilany, N M

    2016-01-01

    The Lomax distribution (Pareto Type-II) is widely applicable in reliability and life testing problems in engineering, as well as in survival analysis, as an alternative distribution. In this paper, the Weighted Lomax distribution is proposed and studied. The density function and its behavior, moments, hazard and survival functions, mean residual life and reversed failure rate, extreme value distributions and order statistics are derived and studied. The parameters of this distribution are estimated by the method of moments and the maximum likelihood estimation method, and the observed information matrix is derived. Moreover, simulation schemes are derived. Finally, an application of the model to a real data set is presented and compared with some other well-known distributions.
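
    The weighted variant proposed in the record is not implemented in standard libraries. As a hedged baseline illustration only, the sketch below fits the ordinary Lomax (Pareto Type-II) distribution to simulated data by maximum likelihood using scipy.

    ```python
    # Baseline sketch: maximum-likelihood fit of the ordinary Lomax (Pareto II)
    # distribution. Data are simulated; the weighted Lomax of the record would
    # need a custom likelihood and is not shown here.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    data = stats.lomax.rvs(c=2.5, scale=1.2, size=200, random_state=rng)

    # Fit shape c and scale with location pinned at 0 (lifetimes are nonnegative)
    c_hat, loc_hat, scale_hat = stats.lomax.fit(data, floc=0)
    print(f"fitted shape c = {c_hat:.2f}, scale = {scale_hat:.2f}")

    # Fitted survival function at a few points, e.g. for reliability reporting
    for x in (0.5, 1.0, 2.0):
        print(f"S({x}) = {stats.lomax.sf(x, c_hat, loc=loc_hat, scale=scale_hat):.3f}")
    ```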

  3. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models of the current situation and of the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing, and it is important that gas distributors design new distribution systems to support this growth, considering the financial constraints of the company as well as local legislation and regulation. In this study some steps of developing a flexible system that attends to those needs will be described. The analysis of distribution requires geographically referenced data for the models, as well as accurate connectivity and equipment attributes. GIS systems are often used as a repository that holds the majority of this information. GIS systems are constantly updated as distribution network equipment is modified. The distribution network model gathered from this system ensures that the model represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the distribution network model. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  4. A Review of Power Distribution Test Feeders in the United States and the Need for Synthetic Representative Networks

    Directory of Open Access Journals (Sweden)

    Fernando E. Postigo Marcos

    2017-11-01

    Under the increasing penetration of distributed energy resources and new smart network technologies, distribution utilities face new challenges and opportunities to ensure reliable operations, manage service quality, and reduce operational and investment costs. Simultaneously, the research community is developing algorithms for advanced controls and distribution automation that can help to address some of these challenges. However, there is a shortage of realistic test systems that are publicly available for development, testing, and evaluation of such new algorithms. Concerns around revealing critical infrastructure details and customer privacy have severely limited the number of actual networks published and available for testing. In recent decades, several distribution test feeders and US-featured representative networks have been published, but their scale, complexity, and control data vary widely. This paper presents a first-of-a-kind structured literature review of published distribution test networks, with a special emphasis on classifying their main characteristics and identifying the types of studies for which they have been used. This both aids researchers in choosing suitable test networks for their needs and highlights the opportunities and directions for further test system development. In particular, we highlight the need for building large-scale synthetic networks to overcome the identified drawbacks of current distribution test feeders.

  5. Analysis of results obtained from field tracing test under natural rain condition

    International Nuclear Information System (INIS)

    Mukai, M.; Kamiyama, H.; Tanaka, T.; Wang Zhiming; Zhao Yingjie; Li Zhengtang

    1993-01-01

    As one of the tests arranged by the cooperative research between CIRP and JAERI, field tracing tests using ³H, ⁶⁰Co, ⁸⁵Sr and ¹³⁴Cs were conducted in pits at the CIRP field test site, located on a loess tableland, under natural rain conditions. Precipitation amount and evaporation rate were measured to study the complicated spatial-temporal behavior of soil water movement under that condition. The evaporation rate was obtained through an analysis of the measured data by a combined method of heat balance and eddy correlation. A numerical model based on the piston flow assumption of soil water movement was developed and applied to determine the behavior of the soil water movement in the pits. Using the determined water movement, ³H migration was evaluated by numerical simulation. The change of the ³H distribution as a function of elapsed time was well explained by the careful evaluation of the soil water movement carried out before the analysis. (5 figs.)

  6. Component evaluation testing and analysis algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    Hart, Darren M.; Merchant, Bion John

    2011-10-01

    The Ground-Based Monitoring R&E Component Evaluation project performs testing on the hardware components that make up Seismic and Infrasound monitoring systems. The majority of the testing is focused on the Digital Waveform Recorder (DWR), Seismic Sensor, and Infrasound Sensor. In order to guarantee consistency, traceability, and visibility into the results of the testing process, it is necessary to document the test and analysis procedures that are in place. Other reports document the testing procedures that are in place (Kromer, 2007). This document serves to provide a comprehensive overview of the analysis and the algorithms that are applied to the Component Evaluation testing. A brief summary of each test is included to provide the context for the analysis that is to be performed.

  7. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system is affected by DERs, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that in general the DER deployment brings possibilities to reduce the power losses and voltage drops by compensating power from local generation and optimizing the local load profiles.

  8. Testing nuclear parton distributions with pA collisions at the TeV scale

    International Nuclear Information System (INIS)

    Quiroga-Arias, Paloma; Milhano, Jose Guilherme; Wiedemann, Urs Achim

    2010-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution function (PDF) in the proton. The extension of these analyses to nuclear parton distribution functions (nPDFs) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of nonlinear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distribution, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here, we argue that a proton-nucleus collision program at the Large Hadron Collider would provide a set of measurements, which allow for unprecedented tests of the factorization assumption, underlying global nPDF fits.

  9. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally

  10. Attitudes towards genetic testing: analysis of contradictions

    DEFF Research Database (Denmark)

    Jallinoja, P; Hakonen, A; Aro, A R

    1998-01-01

    A survey study was conducted among 1169 people to evaluate attitudes towards genetic testing in Finland. Here we present an analysis of the contradictions detected in people's attitudes towards genetic testing. This analysis focuses on the approval of genetic testing as an individual choice and on the confidence in control of the process of genetic testing and its implications. Our analysis indicated that some of the respondents have contradictory attitudes towards genetic testing. It is proposed that contradictory attitudes towards genetic testing should be given greater significance both in scientific studies on attitudes towards genetic testing and in the health care context, e.g. in genetic counselling.

  11. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.

  12. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented into clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with head and neck (H and N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases is presented. The RHH IMRT criterion was established on the assumption that the gamma distribution obtained through inter-comparison of 2D dose maps between planned and delivered doses is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck is well under the normal distribution, particularly for prostate cases. The applied pass/fail criterion is not overly sensitive to 'false fails' but can be further tightened up for smaller fields, while for larger fields, found in both H and N and prostate cases, the criterion was correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors behind variation in gamma distribution among clinical cases. This criterion derived from clinical statistics is superior and more accurate than a single-valued criterion for IMRT QA acceptance procedures. (author)
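
    The gamma index underlying such QA compares dose difference and distance-to-agreement jointly. A minimal 1-D sketch is shown below; the dose profiles and the 3 mm / 3% tolerances are illustrative assumptions, and clinical tools such as MapCheck operate on 2-D or 3-D dose grids.

    ```python
    # Minimal 1-D gamma-index sketch for dose comparison (illustrative only).
    # For each measured point, gamma is the minimum over reference positions of
    # sqrt((dx/dta)^2 + (dD/dd)^2); a point passes if gamma <= 1.
    import numpy as np

    x = np.linspace(0.0, 100.0, 201)                          # position (mm), assumed
    ref = 100.0 * np.exp(-0.5 * ((x - 50.0) / 15.0) ** 2)     # hypothetical planned dose
    meas = 100.0 * np.exp(-0.5 * ((x - 51.0) / 15.0) ** 2)    # hypothetical measured dose

    dta_mm, dd_pct = 3.0, 3.0                                 # 3 mm / 3% criteria, a common choice
    dose_norm = dd_pct / 100.0 * ref.max()                    # global dose normalization

    gammas = np.empty_like(x)
    for i, (xi, di) in enumerate(zip(x, meas)):
        # gamma at this measured point: minimum combined distance over all
        # reference positions, in units of the acceptance criteria
        dist_term = (x - xi) / dta_mm
        dose_term = (ref - di) / dose_norm
        gammas[i] = np.sqrt(dist_term**2 + dose_term**2).min()

    print(f"gamma pass rate = {100.0 * np.mean(gammas <= 1.0):.1f}%")
    ```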

  13. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  14. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  15. A study on stress analysis of small punch-creep test and its experimental correlations with uniaxial-creep test

    International Nuclear Information System (INIS)

    Lee, Song In; Baek, Seoung Se; Kwon, Il Hyun; Yu, Hyo Sun

    2002-01-01

    Basic research was performed to ensure the usefulness of the Small Punch creep (SP-creep) test for the effective residual life evaluation of heat resistant components. This paper presents analytical results for the initial stress and strain distributions in the SP specimen caused by the constant loading of the SP-creep test, and their experimental correlations with the uniaxial creep (Ten-creep) test on 9Cr1MoVNb steel. It was shown that the initial maximum equivalent stress, σ_eq,max, from FE analysis was correlated with the steady-state equivalent creep strain rate, ε̇_eq,ss, the rupture time, t_r, the activation energy, Q, and the Larson-Miller parameter, LMP, during SP-creep deformation. The simple correlation laws σ_SP-σ_TEN, P_SP-σ_TEN and Q_SP-Q_TEN were adopted to establish a quantitative correlation between SP-creep and Ten-creep test data. In particular, the activation energy obtained from the SP-creep test is linearly related to that from the Ten-creep test at 650 °C as follows: Q_SP-P = 1.37 Q_TEN, Q_SP-σ = 1.53 Q_TEN.
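
    As a hedged aside on the Larson-Miller parameter mentioned above, the sketch below evaluates LMP = T(C + log10 t_r); the constant C = 20 is a common textbook assumption and the temperature/rupture-time pairs are illustrative, not the paper's fitted values.

    ```python
    # Larson-Miller parameter sketch: LMP = T * (C + log10(t_r)),
    # with T in kelvin and rupture time t_r in hours. C = 20 is a common
    # assumption; the paper's fitted constants are not reproduced here.
    import math

    C = 20.0
    for T_c, t_r in [(650.0, 1.0e3), (650.0, 1.0e4), (600.0, 1.0e4)]:
        T_k = T_c + 273.15
        lmp = T_k * (C + math.log10(t_r))
        print(f"T = {T_c:.0f} °C, t_r = {t_r:.0e} h  ->  LMP = {lmp:.0f}")
    ```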

  16. Analysis of calculating methods for failure distribution function based on maximal entropy principle

    International Nuclear Information System (INIS)

    Guo Chunying; Lin Yuangen; Jiang Meng; Wu Changli

    2009-01-01

    The computation of failure distribution functions of electronic devices exposed to gamma rays is discussed here. First, the possible device failure distribution models are determined through tests of statistical hypotheses using the test data. The results show that the devices' failure distribution can obey several distributions when the test data are few. In order to decide the optimum failure distribution model, the maximal entropy principle is used and the elementary failure models are determined. Then, the Bootstrap estimation method is used to simulate the interval estimation of the mean and the standard deviation. On this basis, the maximal entropy principle is used again and the simulated annealing method is applied to find the optimum values of the mean and the standard deviation. Accordingly, the electronic devices' optimum failure distributions are finally determined and the survival probabilities are calculated. (authors)
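
    The bootstrap step named above can be sketched briefly; the data values are hypothetical and the percentile method is one of several interval constructions.

    ```python
    # Bootstrap sketch: percentile interval estimates for the mean and standard
    # deviation of a small data set (values are hypothetical stand-ins).
    import numpy as np

    rng = np.random.default_rng(21)
    data = np.array([12.3, 15.1, 9.8, 14.2, 11.7, 13.5, 10.9])   # assumed test data

    means, stds = [], []
    for _ in range(10_000):
        boot = rng.choice(data, size=data.size, replace=True)
        means.append(boot.mean())
        stds.append(boot.std(ddof=1))

    m_lo, m_hi = np.percentile(means, [2.5, 97.5])
    s_lo, s_hi = np.percentile(stds, [2.5, 97.5])
    print(f"mean 95% CI: ({m_lo:.2f}, {m_hi:.2f}); std 95% CI: ({s_lo:.2f}, {s_hi:.2f})")
    ```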

  17. Nevada test site radionuclide inventory and distribution: project operations plan

    International Nuclear Information System (INIS)

    Kordas, J.F.; Anspaugh, L.R.

    1982-01-01

    This document is the operational plan for conducting the Radionuclide Inventory and Distribution Program (RIDP) at the Nevada Test Site (NTS). The basic objective of this program is to inventory the significant radionuclides of NTS origin in NTS surface soil. The expected duration of the program is five years. This plan includes the program objectives, methods, organization, and schedules

  18. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  19. COBRA/TRAC analysis of two-dimensional thermal-hydraulic behavior in SCTF reflood tests

    International Nuclear Information System (INIS)

    Iwamura, Takamichi; Ohnuki, Akira; Sobajima, Makoto; Adachi, Hiromichi

    1987-01-01

    The effects of radial power distribution and non-uniform upper plenum water accumulation on the thermal-hydraulic behavior in the core were observed in reflood tests with the Slab Core Test Facility (SCTF). In order to examine the predictability of these two effects by a multi-dimensional analysis code, COBRA/TRAC calculations were made. The calculated results indicated that the heat transfer enhancement in high power bundles above the quench front was caused by the high vapor flow rate in those bundles due to the radial power distribution. On the other hand, the heat transfer degradation in the peripheral bundles under the condition of non-uniform upper plenum water accumulation was caused by the lower flow rates of vapor and entrained liquid above the quench front in those bundles, because vapor concentrated in the center bundles due to the cross flow induced by the horizontal pressure gradient in the core. The two-dimensional heat transfer behaviors calculated with the COBRA/TRAC code are similar to those observed in the SCTF tests, and therefore these calculations are useful for investigating the mechanism of the two-dimensional effects in SCTF reflood tests. (author)

  20. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to social-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture their internal diversity of social-economic characteristics. By overlaying poverty-related social-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of minorities, and larger family size.

  1. AZ-101 Mixer Pump Demonstration and Tests Data Management Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    DOUGLAS, D.G.

    2000-02-22

    This document provides a plan for the analysis of the data collected during the AZ-101 Mixer Pump Demonstration and Tests. This document was prepared after a review of the AZ-101 Mixer Pump Test Plan (Revision 4) [1] and other materials. The plan emphasizes a structured and well-ordered approach towards handling and examining the data. This plan presumes that the data will be collected and organized into a unified body of data, well annotated and bearing the date and time of each record. The analysis of this data will follow a methodical series of steps that are focused on well-defined objectives. Section 2 of this plan describes how the data analysis will proceed from the real-time monitoring of some of the key sensor data to the final analysis of the three-dimensional distribution of suspended solids. This section also identifies the various sensors or sensor systems and associates them with the various functions they serve during the test program. Section 3 provides an overview of the objectives of the AZ-101 test program and describes the data that will be analyzed to support that test. The objectives are: (1) to demonstrate that the mixer pumps can be operated within the operating requirements; (2) to demonstrate that the mixer pumps can mobilize the sludge in sufficient quantities to provide feed to the private contractor facility, and (3) to determine if the in-tank instrumentation is sufficient to monitor sludge mobilization and mixer pump operation. Section 3 also describes the interim analysis that organizes the data during the test, so the analysis can be more readily accomplished. Section 4 describes the spatial orientation of the various sensors in the tank. This section is useful in visualizing the relationship of the sensors in terms of their location in the tank and how the data from these sensors may be related to the data from other sensors. Section 5 provides a summary of the various analyses that will be performed on the data during the test.

  2. AZ-101 Mixer Pump Demonstration and Tests: Data Management (Analysis) Plan

    International Nuclear Information System (INIS)

    DOUGLAS, D.G.

    2000-01-01

    This document provides a plan for the analysis of the data collected during the AZ-101 Mixer Pump Demonstration and Tests. This document was prepared after a review of the AZ-101 Mixer Pump Test Plan (Revision 4) [1] and other materials. The plan emphasizes a structured and well-ordered approach towards handling and examining the data. This plan presumes that the data will be collected and organized into a unified body of data, well annotated and bearing the date and time of each record. The analysis of this data will follow a methodical series of steps that are focused on well-defined objectives. Section 2 of this plan describes how the data analysis will proceed from the real-time monitoring of some of the key sensor data to the final analysis of the three-dimensional distribution of suspended solids. This section also identifies the various sensors or sensor systems and associates them with the various functions they serve during the test program. Section 3 provides an overview of the objectives of the AZ-101 test program and describes the data that will be analyzed to support that test. The objectives are: (1) to demonstrate that the mixer pumps can be operated within the operating requirements; (2) to demonstrate that the mixer pumps can mobilize the sludge in sufficient quantities to provide feed to the private contractor facility, and (3) to determine if the in-tank instrumentation is sufficient to monitor sludge mobilization and mixer pump operation. Section 3 also describes the interim analysis that organizes the data during the test, so the analysis can be more readily accomplished. Section 4 describes the spatial orientation of the various sensors in the tank. This section is useful in visualizing the relationship of the sensors in terms of their location in the tank and how the data from these sensors may be related to the data from other sensors. Section 5 provides a summary of the various analyses that will be performed on the data during the test.

  3. Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

    Science.gov (United States)

    Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

    2012-07-01

    We use the Fourier transform based Warren-Averbach (WA) analysis to separate the contributions of X-ray diffraction (XRD) profile broadening due to crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from the WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles, which are compared with the size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution of crystallites obtained from the WA analysis are explained based on the experimental parameters employed in the preparation of these magnetic iron oxide nanoparticles. The variation of the volume weighted diameter (D_v, from WA analysis) with saturation magnetization (M_s) fits well to a core-shell model wherein it is known that M_s = M_bulk(1 - 6g/D_v), with M_bulk the bulk magnetization of iron oxide and g the magnetic shell disorder thickness.
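
    The core-shell relation quoted above is linear in 1/D_v, so the disorder-shell thickness g can be estimated by a simple nonlinear fit. The sketch below uses hypothetical (D_v, M_s) pairs, not the paper's measurements.

    ```python
    # Core-shell model fit sketch: Ms = Mbulk * (1 - 6*g/Dv).
    # The (Dv, Ms) data points are hypothetical; the fit returns the bulk
    # magnetization Mbulk and the magnetic disorder shell thickness g.
    import numpy as np
    from scipy.optimize import curve_fit

    def core_shell(dv_nm, m_bulk, g_nm):
        return m_bulk * (1.0 - 6.0 * g_nm / dv_nm)

    dv = np.array([8.0, 10.0, 12.0, 15.0, 20.0])        # nm, assumed
    ms = np.array([38.0, 45.0, 50.0, 55.0, 61.0])       # emu/g, assumed

    (m_bulk, g), _ = curve_fit(core_shell, dv, ms, p0=(80.0, 1.0))
    print(f"Mbulk = {m_bulk:.1f} emu/g, shell thickness g = {g:.2f} nm")
    ```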

  4. A statistical analysis of angular distribution of neutrino events observed in Kamiokande II and IMB detectors from supernova SN 1987 A

    International Nuclear Information System (INIS)

    Krivoruchenko, M.I.

    1989-01-01

    A detailed statistical analysis of the angular distribution of neutrino events observed in the Kamiokande II and IMB detectors at UT 07:35 on 23 February 1987 is carried out. Distribution functions of the mean scattering angles in the reactions ν̄_e p → e⁺n and ν e → ν e are constructed, taking into account multiple Coulomb scattering and the experimental angular errors. The Smirnov and Wald-Wolfowitz runs tests are used to test the hypothesis that the angular distributions of events from the two detectors agree with each other. The Kolmogorov and von Mises statistical criteria are used to test the hypothesis that the recorded events all represent ν̄_e p → e⁺n inelastic scatterings. The Neyman-Pearson test is then applied to each event to test the hypothesis ν̄_e p → e⁺n against the alternative ν e → ν e. The hypotheses that the number of elastic events equals s = 0, 1, 2, ... against the alternatives s ≠ 0, 1, 2, ... are tested on the basis of the generalized likelihood ratio criterion. Confidence intervals for the number of elastic events are also constructed. The current supernova models fail to give a satisfactory account of the angular distribution data. (orig.)
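
    The detector-agreement check described above (the Smirnov two-sample test) is easy to reproduce for any pair of angle samples. A minimal sketch with scipy, using randomly generated angles in place of the actual Kamiokande II and IMB events:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical scattering angles in degrees; the real analysis
        # used the observed events (11 from Kamiokande II, 8 from IMB).
        angles_kam = rng.uniform(0.0, 180.0, size=11)
        angles_imb = rng.uniform(0.0, 180.0, size=8)

        # Two-sample Kolmogorov-Smirnov (Smirnov) test of the hypothesis
        # that both detectors sampled the same angular distribution.
        stat, p_value = stats.ks_2samp(angles_kam, angles_imb)
        print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")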

  5. Analysis of the percentage voids of test and field specimens using computerized tomography

    International Nuclear Information System (INIS)

    Braz, D.; Lopes, R.T.; Motta, L.M.G. da

    1999-01-01

    Computerized tomography has been an excellent tool for the analysis of asphaltic mixtures, because it allows comparison of the quality and integrity of test and field specimens. It was used to detect and follow the evolution of cracks when these mixtures were submitted to fatigue tests, and also to help interpret the distribution of stresses and deformations which occur under the several types of loading imposed on the mixtures. Comparing the mean values of percentage voids obtained from tomographic images with the design values, it can be observed that the values of test and field specimens for the wearing course are closer to the design values than those of the binder course. It can be verified that the wearing course specimens always present a quite homogeneous distribution of aggregate and voids over the whole profile of the sample, while the binder course specimens show an accentuated differentiation of the same factors at the several heights of the sample. Therefore, when choosing a slice for tomography, these considerations should be taken into account.

  6. Distributed training, testing, and decision aids within one solution

    Science.gov (United States)

    Strini, Robert A.; Strini, Keith

    2002-07-01

    Military air operations in the European theater require U.S. and NATO participants to send various mission experts to 10 Combined Air Operations Centers (CAOCs). Little or no training occurs prior to their arrival for tours of duty ranging from 90 days to 3 years. When training does occur, there is little assessment of its effectiveness in raising CAOC mission readiness. A comprehensive training management system has been developed that utilizes traditional and web-based distance-learning methods for providing instruction and task practice, as well as distributed simulation to provide mission rehearsal training opportunities on demand for the C2 warrior. This system incorporates new technologies, such as voice interaction and virtual tutors, and a Learning Management System (LMS) that tracks trainee progress from academic learning through procedural practice and mission training exercises. Supervisors can monitor their subordinates' progress through synchronous or asynchronous methods. Embedded within this system are virtual tutors, which provide automated performance measurement as well as tutoring. The training system offers true time savings for current instructors and training providers, who today must perform On the Job Training (OJT) duties before, during and after each event. Many units do not have the resources to support OJT and are forced to maintain an overlap of several days to minimally maintain unit readiness. One CAOC Commander affected by this paradigm has advocated supporting a beta version of this system to test its ability to offer training on demand and track the progress of its personnel and unit readiness. If successful, aircrew simulation devices can be connected through either Distributed Interactive Simulation or High Level Architecture methods to provide a DMT-C2 air operations training environment in Europe. This paper presents an approach to establishing a training, testing and decision aid capability and means to assess

  7. Structural test and analysis of a model of a BWR suppression chamber support in the plastic regime

    International Nuclear Information System (INIS)

    Blumer, U.R.; Klaeui, E.; Bosshard, E.P.

    1991-01-01

    A BWR Mark I suppression pool support has been analysed and tested in the laboratory. The aim was to demonstrate the acceptability of hypothetical dynamic loadings resulting from simultaneous steam blowdown through all safety relief valves. The analysis showed that plastic deformation will occur locally, which is difficult to assess purely theoretically. Therefore reduced-scale tests were performed that show the amount and distribution of plastic flow in the supports. The paper describes the elastic analysis, the theory of the scaling laws for the reduced-scale test, the test and its results. It also presents the thermographic method that was used to determine the plastic material flow in the support structure. (author)

  8. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-09-01

    Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher-than-normal levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the assumed acceleration model is the log-linear model. Constant-stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Numerical methods, such as the Laplace method and Markov chain Monte Carlo, are used to evaluate the complicated integrals.

  9. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-12-01

    Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher-than-normal levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the assumed acceleration model is the log-linear model. Constant-stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Numerical methods, such as the Laplace method and Markov chain Monte Carlo, are used to evaluate the complicated integrals.

  10. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SEDs) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SEDs is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SEDs are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SEDs. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SEDs are presently nonexistent. Therefore, methods are described that allow rough error estimates from estimated SED uncertainties, based on integral SED sensitivities.

  11. Cryogenic distribution system for ITER proto-type cryoline test

    International Nuclear Information System (INIS)

    Bhattacharya, R.; Shah, N.; Badgujar, S.; Sarkar, B.

    2012-01-01

    Design validation for the ITER cryoline will be carried out by a prototype test. The major objectives of the test are to verify mechanical integrity, reliability, thermal stress and heat load, as well as to check assembly and fabrication procedures. The cryogenic system has to satisfy the functional operating scenario of the cryoline. A cryoplant and a distribution box (DB), including a liquid helium (LHe) tank, constitute the cryogenic system for the test. A conceptual system architecture is proposed with a commercially available refrigerator/liquefier and a custom-designed DB housing a cold compressor, a cold circulator and a phase separator with a submerged heat exchanger. System-level optimization, mainly of the DB and LHe tank options, has been studied to minimize the cold power required for the system. Aspen HYSYS is used for process simulation. The paper describes the system architecture and the optimized design, as well as the process simulation with associated results. (author)

  12. Test determination with tritium as a radioactive tracer of the residence time distribution in the stability pool for Cabrero sewage

    International Nuclear Information System (INIS)

    Diaz, Francisco; Duran, Oscar; Henriquez, Pedro; Vega, Pedro; Padilla, Liliana; Gonzalez, David; Garcia Agudo, Edmundo

    2000-01-01

    This work, prepared by the Chilean atomic energy agency and the International Atomic Energy Agency, covers the hydrodynamic functioning of sewage stabilization ponds using tracers. The plant selected is in the city of Cabrero, 500 km south of Santiago, and is a rectangular facultative pond with a surface area of 7100 m², a maximum volume of 12,327 m³, an average inflow of 20 l/s, and a served population of 7000 inhabitants. The work aims to characterize the flow through the pond by means of a radioactive tracer test, in which the incoming water is marked and its passage out of the pond is determined, in order to establish the residence time distribution. Tritium in the form of tritiated water was selected as the tracer; it is injected into the water flow at the distribution channel at the pond inlet. Samples are taken at the outflow and, after distillation, their tritium concentration is measured in a liquid scintillation counter, with the flow measured simultaneously. An average residence time of 5.3 days was obtained, and the analysis of the residence time distribution shows that the tracer leaves quickly, indicating poor flow distribution in the pond, with a major short circuit and probable dead zones.
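
    The average residence time quoted above is the first moment of the outlet tracer curve, t_mean = ∫ t·c(t) dt / ∫ c(t) dt. A short sketch with hypothetical outlet concentrations (not the Cabrero measurements):

        import numpy as np

        # Hypothetical sampling times [days] and tritium concentrations at
        # the pond outlet (arbitrary units).
        t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 8.0, 10.0, 12.0])
        c = np.array([0.1, 0.9, 1.8, 1.6, 1.2, 0.8, 0.5, 0.25, 0.1, 0.05])

        # Mean residence time as the first moment of the tracer response.
        t_mean = np.trapz(t * c, t) / np.trapz(c, t)
        print(f"mean residence time ≈ {t_mean:.1f} days")

    A mean residence time well below the nominal hydraulic residence time V/Q is the signature of the short-circuiting reported above.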

  13. Analysis of Static Load Test of a Masonry Arch Bridge

    Science.gov (United States)

    Shi, Jing-xian; Fang, Tian-tian; Luo, Sheng

    2018-03-01

    In order to determine whether the carrying capacity of a masonry arch bridge, built in the 1980s on the shipping channel entering and leaving the factory of a cement company, meets the current requirements of highway Level II loading, a load test was conducted in which the test vehicles were arranged according to the equivalent load distribution of the current design specifications. The bearing capacity of the in-service stone arch bridge was evaluated, and a theoretical analysis was made with Midas Civil. The results showed that under the most unfavorable load conditions the measured strains and deflections of the test sections were less than the calculated values, and the bridge remained in the elastic stage under the design load; the structural strength and stiffness of the bridge had a certain margin, and under the current highway Level II load conditions the bridge structure was in a safe state.

  14. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine the data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for the heterogeneous clinico-epidemiologic characteristics of the data, and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of the distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN; for simplicity of comparison, age and gender were used to adjust for population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests, and in the real datasets SAN performed better than normalization using the other methods. The SAN method is applicable in a DRN environment and should facilitate the analysis of data integrated across DRN partners for retrospective observational studies.
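
    The abstract does not spell out the full SAN algorithm, so the sketch below only illustrates the underlying idea: standardizing a laboratory value within clinico-epidemiologic subgroups (here age band and gender; all column names and values are hypothetical):

        import pandas as pd

        def subgroup_standardize(df, value, keys):
            # Z-score `value` within subgroups defined by `keys`; a
            # simplified stand-in for subgroup-adjusted normalization.
            g = df.groupby(keys)[value]
            return (df[value] - g.transform("mean")) / g.transform("std")

        df = pd.DataFrame({
            "age_band": ["40s", "40s", "40s", "40s", "60s", "60s", "60s", "60s"],
            "gender":   ["F", "F", "M", "M", "F", "F", "M", "M"],
            "creatinine": [0.7, 0.9, 1.0, 1.2, 0.8, 1.0, 1.1, 1.3],
        })
        df["creatinine_z"] = subgroup_standardize(df, "creatinine", ["age_band", "gender"])
        print(df)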

  15. Test of methods for retrospective activity size distribution determination from filter samples

    International Nuclear Information System (INIS)

    Meisenberg, Oliver; Tschiersch, Jochen

    2015-01-01

    Determining the activity size distribution of radioactive aerosol particles requires sophisticated and heavy equipment, which makes measurements at a large number of sites difficult and expensive. Therefore, three methods for the retrospective determination of size distributions from aerosol filter samples in the laboratory were tested for their applicability. Extraction into a carrier liquid with subsequent nebulisation yielded size distributions with a slight but correctable bias towards larger diameters compared with the original size distribution. Yields on the order of 1% could be achieved. Sonication-assisted extraction into a carrier liquid caused a coagulation mode to appear in the size distribution. Sonication-assisted extraction into the air did not show acceptable results due to small yields. The method of extraction into a carrier liquid without sonication was applied to aerosol samples from Chernobyl in order to calculate inhalation dose coefficients for 137Cs based on the individual size distribution. The effective dose coefficient is about half of that calculated with a default reference size distribution. - Highlights: • Activity size distributions can be recovered after aerosol sampling on filters. • Extraction into a carrier liquid and subsequent nebulisation is appropriate. • This facilitates the determination of activity size distributions for individuals. • Size distributions from this method can be used for individual dose coefficients. • Dose coefficients were calculated for the workers at the new Chernobyl shelter

  16. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method, based on a data-driven K-means clustering algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in a distribution network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV units as decision variables, a Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and the solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected after detailed analysis, according to different planning emphases. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
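
    A minimal sketch of the scenario-generation step: cluster daily PV output profiles with K-means and treat each cluster centre as one planning scenario, weighted by its share of days. The profiles here are random stand-ins, not the Gansu network data:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        # Hypothetical data: 365 days x 24 hourly values of normalized PV
        # output; stand-ins for measured irradiance/feeder profiles.
        profiles = rng.random((365, 24))

        # Each cluster centre is one representative planning scenario,
        # weighted by the share of days it represents.
        km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(profiles)
        scenarios = km.cluster_centers_
        weights = np.bincount(km.labels_) / len(profiles)
        print(scenarios.shape, weights)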

  17. Dynamics of railway bridges, analysis and verification by field tests

    Directory of Open Access Journals (Sweden)

    Andersson Andreas

    2015-01-01

    The following paper discusses different aspects of railway bridge dynamics, comprising analysis, modelling procedures and experimental testing. The importance of realistic models is discussed, especially regarding boundary conditions, load distribution and soil-structure interaction. Two theoretical case studies are presented, involving both deterministic and probabilistic assessment of a large number of railway bridges using simplified and computationally efficient models. A total of four experimental case studies are also introduced, illustrating different aspects and phenomena in bridge dynamics. The excitation consists of both ambient vibrations, train induced vibrations, free vibrations after train passages and controlled forced excitation.

  18. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences]

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation, is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot-scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
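
    For reference, fitting a plain two-parameter Weibull cumulative distribution to sieve data takes only a few lines of scipy; the paper's modified Weibull form is not reproduced here, and the sieve values below are hypothetical:

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical sieve analysis: lump size x [mm] and cumulative
        # fraction of coke passing each sieve.
        x = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])
        f = np.array([0.02, 0.10, 0.27, 0.48, 0.67, 0.82, 0.92, 0.97])

        def weibull_cdf(x, scale, shape):
            return 1.0 - np.exp(-((x / scale) ** shape))

        (scale, shape), _ = curve_fit(weibull_cdf, x, f, p0=(40.0, 2.0))
        print(f"scale ≈ {scale:.1f} mm, shape ≈ {shape:.2f}")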

  19. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modul...

  20. Westinghouse-GOTHIC distributed parameter modelling for HDR test E11.2

    International Nuclear Information System (INIS)

    Narula, J.S.; Woodcock, J.

    1994-01-01

    The Westinghouse-GOTHIC (WGOTHIC) code is a sophisticated mathematical computer code designed specifically for the thermal-hydraulic analysis of nuclear power plant containment and auxiliary buildings. The code is capable of sophisticated flow analysis via the solution of mass, momentum, and energy conservation equations. Westinghouse has investigated the use of subdivided noding to model the flow patterns of hydrogen following its release into a containment atmosphere. For the investigation, several simple models were constructed at a scale similar to the German HDR containment. The calculational models were simplified to test the basic capability of the plume modeling methods to predict stratification while minimizing the number of parameters. A large empty volume was modeled, with the same volume and height as HDR. A scenario was selected that would be expected to stratify stably, and the effect of noding on the prediction of stratification was studied. A single-phase hot gas was injected into the volume at a height similar to that of HDR test E11.2, and no heat sinks were modeled. Helium was released into the calculational models, and the resulting flow patterns were judged relative to the expected results. For each model, only the number of subdivisions within the containment volume was varied. The investigation of noding schemes has provided evidence of the capability of subdivided (distributed parameter) noding. The results also showed that highly inaccurate flow patterns could be obtained by using an insufficient number of subdivided nodes. This presents a significant challenge to the containment analyst, who must weigh the benefits of increased noding against the penalties the noding may incur in computational efficiency. Clearly, however, an incorrect noding choice may yield erroneous results even if great care has been taken in modeling all other characteristics of the containment accurately. (author). 9 refs., 9 figs

  1. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  2. SRM Internal Flow Tests and Computational Fluid Dynamic Analysis. Volume 2; CFD RSRM Full-Scale Analyses

    Science.gov (United States)

    2001-01-01

    This document presents the full-scale analyses of the CFD RSRM. The RSRM model was developed with a 20 second burn time. The following are presented as part of the full-scale analyses: (1) RSRM embedded inclusion analysis; (2) RSRM igniter nozzle design analysis; (3) Nozzle Joint 4 erosion anomaly; (4) RSRM full motor port slag accumulation analysis; (5) RSRM motor analysis of two-phase flow in the aft segment/submerged nozzle region; (6) Completion of 3-D Analysis of the hot air nozzle manifold; (7) Bates Motor distributed combustion test case; and (8) Three Dimensional Polysulfide Bump Analysis.

  3. Measurement of distribution coefficients using a radial injection dual-tracer test

    International Nuclear Information System (INIS)

    Pickens, J.F.; Jackson, R.E.; Inch, K.J.; Merritt, W.F.

    1981-01-01

    The dispersive and adsorptive properties of a sandy aquifer were evaluated by using a radial injection dual-tracer test with 131I as the nonreactive tracer and 85Sr as the reactive tracer. The tracer migration was monitored by using multilevel point-sampling devices located at various radial distances and depths. Nonequilibrium physical and chemical adsorption effects for 85Sr were treated as a spreading or dispersion mechanism in the breakthrough curve analysis. The resulting effective dispersivity values for 85Sr were typically a factor of 2 to 5 larger than those obtained for 131I. The distribution coefficient (K_d^Sr) values obtained from analysis of the breakthrough curves at three depths and two radial distances ranged from 2.6 to 4.5 ml/g. These compare favorably with values obtained by separation of fluids from solids in sediment cores, by batch experiments on core sediments and by analysis of a 25-year-old radioactive waste plume in another part of the same aquifer. Correlations of adsorbed 85Sr radioactivity with grain size fractions demonstrated preferential adsorption to the coarsest fraction and to the finest fraction. The relative amounts of electrostatically and specifically adsorbed 85Sr on the aquifer sediments were determined with desorption experiments on core sediments using selective chemical extractants. The withdrawal phase breakthrough curves for the well, obtained immediately following the injection phase, showed essentially full tracer recoveries for both 131I and 85Sr. Relatively slow desorption of 85Sr provided further indication of the nonequilibrium nature of the adsorption-desorption phenomena

  4. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, from birth to death, of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation; these data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis, which provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis is carried out throughout the various stages of the product life cycle and its reliability activities. Reliability data of Systems, Structures and Components (SSCs) in nuclear power plants is a key factor in probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering to represent manufacturing and delivery times; it is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of SSCs in nuclear power plants, and an example is given to present the results of the new method. The Weibull distribution fits reliability data of mechanical equipment in nuclear power plants very well and is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions; through comparison and analysis, the three-parameter Weibull distribution fits the data better, reflects the reliability characteristics of the equipment, and is more realistic. (author)
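
    The two- versus three-parameter comparison can be mimicked with scipy by fitting once with the location parameter fixed at zero and once with it free, then comparing log-likelihoods. A sketch on synthetic times-to-failure (all numbers hypothetical):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        # Hypothetical times-to-failure [h] with a non-zero failure-free
        # period, so a location parameter is genuinely present.
        ttf = stats.weibull_min.rvs(c=1.8, loc=500.0, scale=2000.0,
                                    size=200, random_state=rng)

        # Two-parameter fit (location fixed at zero) vs three-parameter fit.
        c2, loc2, s2 = stats.weibull_min.fit(ttf, floc=0.0)
        c3, loc3, s3 = stats.weibull_min.fit(ttf)

        # Compare by log-likelihood; the three-parameter form should win here.
        ll2 = stats.weibull_min.logpdf(ttf, c2, loc2, s2).sum()
        ll3 = stats.weibull_min.logpdf(ttf, c3, loc3, s3).sum()
        print(f"2-par logL = {ll2:.1f}, 3-par logL = {ll3:.1f}")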

  5. Standard test method for distribution coefficients of inorganic species by the batch method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the determination of distribution coefficients of chemical species to quantify uptake onto solid materials by a batch sorption technique. It is a laboratory method primarily intended to assess sorption of dissolved ionic species subject to migration through pores and interstices of site-specific geomedia. It may also be applied to other materials such as manufactured adsorption media and construction materials. Application of the results to long-term field behavior is not addressed in this method. Distribution coefficients for radionuclides in selected geomedia are commonly determined for the purpose of assessing potential migratory behavior of contaminants in the subsurface of contaminated sites and waste disposal facilities. This test method is also applicable to parametric studies of the variables and mechanisms which contribute to the measured distribution coefficient. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.
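
    For orientation, the batch distribution coefficient reduces to Kd = (C0 - Ceq)/Ceq × V/m. A sketch with hypothetical numbers (the standard itself prescribes the experimental procedure, not this code):

        def batch_kd(c0, c_eq, volume_ml, mass_g):
            # Distribution coefficient from a batch sorption test:
            # Kd = (C0 - Ceq)/Ceq * V/m, in mL/g, where C0 and Ceq are the
            # initial and equilibrium solution concentrations.
            return (c0 - c_eq) / c_eq * volume_ml / mass_g

        # Hypothetical test: 100 mL of solution on 5 g of geomedia, with the
        # species concentration falling from 1.00 to 0.20 (same units).
        print(batch_kd(1.00, 0.20, 100.0, 5.0))  # -> 80.0 mL/g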

  6. Experimental strain modal analysis for beam-like structure by using distributed fiber optics and its damage detection

    Science.gov (United States)

    Cheng, Liangliang; Busca, Giorgio; Cigada, Alfredo

    2017-07-01

    Modal analysis is commonly considered an effective tool for obtaining the intrinsic characteristics of structures, including natural frequencies, modal damping ratios, and mode shapes, which are significant indicators for monitoring the health status of engineering structures. The complex mode indicator function (CMIF) can be regarded as an effective numerical tool for performing modal analysis. In this paper, experimental strain modal analysis based on the CMIF is introduced. Moreover, a distributed fiber-optic sensor, as a dense measuring device, is applied to acquire strain data along a beam surface. Thanks to the dense spatial resolution of the distributed fiber optics, more detailed mode shapes can be obtained. In order to test the effectiveness of the method, a lumped mass, considered as a linear damage component, was attached to the surface of the beam, and damage detection based on strain mode shapes was carried out. The results show that strain modal parameters can be estimated effectively by utilizing the CMIF, based on the corresponding simulations and experiments. Furthermore, damage detection based on strain mode shapes benefits from the accuracy of strain mode shape recognition and the excellent performance of the distributed fiber optics.

  7. Development and Field Test of a Real-Time Database in the Korean Smart Distribution Management System

    Directory of Open Access Journals (Sweden)

    Sang-Yun Yun

    2014-03-01

    Recently, distribution management systems (DMS) that can conduct periodic system analysis and control by mounting various application programs have been actively developed. In this paper, we summarize the development and demonstration of a database structure that can perform real-time system analysis and control in the Korean smart distribution management system (KSDMS). The developed database structure consists of a common information model (CIM)-based off-line database (DB), a physical DB (PDB) for DB establishment on the operating server, a real-time DB (RTDB) for real-time server operation and remote terminal unit data interconnection, and an application common model (ACM) DB for running application programs. The ACM DB for real-time system analysis and control by the application programs was developed by using a parallel table structure and a linked-list model, thereby providing fast input and output as well as high execution speed of the application programs. Furthermore, the ACM DB was configured with hierarchical and non-hierarchical data models to reflect the system models, reducing the DB size and increasing operation speed by omitting system elements unnecessary for analysis and control. The proposed database model was implemented and tested at the Gochaing and Jeju offices using a real system. Through data measurement of the remote terminal units, and through the operation and control of the application programs using those measurements, the performance, speed, and integrity of the proposed database model were validated, demonstrating that this model can be applied to real systems.

  8. Analysis of Faraday Mirror in Auto-Compensating Quantum Key Distribution

    International Nuclear Information System (INIS)

    Wei Ke-Jin; Ma Hai-Qiang; Li Rui-Xue; Zhu Wu; Liu Hong-Wei; Zhang Yong; Jiao Rong-Zhen

    2015-01-01

    The ‘plug and play’ quantum key distribution system is the most stable and the earliest commercial system in the quantum communication field. The Jones matrix and Jones calculus are widely used in the analysis of this system and of its improved version, the auto-compensating quantum key distribution system. Unfortunately, existing analyses have two drawbacks: only the auto-compensating process is analyzed, and the laser phase affected by a Faraday mirror (FM) is not fully considered. In this work, we present a detailed analysis, by Jones calculus, of the output of a light pulse transmitted through a plug and play quantum key distribution system that contains only an FM. A similar analysis is made for a home-made auto-compensating system which contains two FMs to compensate for environmental effects. More importantly, we show that theoretical and experimental results differ in the plug and play interferometric setup, because the conventional Jones matrix of the FM neglects an additional phase π on the alternative polarization direction. To resolve this problem, we give a new Jones matrix of the FM according to the coordinate rotation. This new Jones matrix not only resolves the above contradiction in the plug and play interferometric setup, but is also suitable for the previous analyses of auto-compensating quantum key distribution. (paper)

  9. Testing collinear factorization and nuclear parton distributions with pA collisions at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Quiroga-Arias, Paloma [Departamento de Fisica de Particulas and IGFAE, Universidade de Santiago de Compostela, 15706 Santiago de Compostela (Spain); Milhano, Jose Guilherme [CENTRA, Departamento de Fisica, Instituto Superior Tecnico (IST), Av. Rovisco Pais 1, P-1049-001 Lisboa (Portugal); Wiedemann, Urs Achim, E-mail: pquiroga@fpaxpl.usc.es [Physics Department, Theory Unit, CERN, CH-1211 Geneve 23 (Switzerland)

    2011-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution function (PDF) in the proton. The extension of these analyses to nuclear parton distributions (nPDF) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of non-linear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distributions, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here we argue that a proton-nucleus collision program at the LHC would provide a set of measurements allowing for unprecedented tests of the factorization assumption underlying global nPDF fits.

  10. A new technique for testing distribution of knowledge and to estimate sampling sufficiency in ethnobiology studies.

    Science.gov (United States)

    Araújo, Thiago Antonio Sousa; Almeida, Alyson Luiz Santos; Melo, Joabe Gomes; Medeiros, Maria Franco Trindade; Ramos, Marcelo Alves; Silva, Rafael Ricardo Vasconcelos; Almeida, Cecília Fátima Castelo Branco Rangel; Albuquerque, Ulysses Paulino

    2012-03-15

    We propose a new quantitative measure that enables the researcher to make decisions and test hypotheses about the distribution of knowledge in a community and to estimate the richness and sharing of information among informants. In our study, this measure has two levels of analysis: intracultural and intrafamilial. Using data collected in northeastern Brazil, we evaluated how these new estimators of richness and sharing behave for different categories of use. We observed trends in the distribution of the characteristics of informants. We were also able to evaluate how outliers interfere with these analyses and how other analyses may be conducted using these indices, such as determining the distance between the knowledge of a community and that of experts, as well as showing the importance to the community of these individuals' information about biological resources. One of the primary applications of these indices is to supply the researcher with an objective tool to evaluate the scope and behavior of the collected data.

  11. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    El-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The distribution applied depends on the nature of the data being analyzed. The present paper deals with the analysis of several statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC), whereas the Exponential distribution is found to be the best fit for modeling the failure rate.

  12. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  13. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of the strength of ceramic and glass materials, a statistical approach is necessary. The strength of ceramics and glasses depends on the size and size distribution of flaws in these materials. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramics and glasses follows the Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative failure probability as a function of material strength, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. The statistical criteria supporting the strength analysis of silicon nitride were computed utilizing MATLAB. (author)

  14. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Elllen B.

    2015-01-01

    Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend not only to give answers on how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirically Testing Thematic Analysis (ETTA), a step-by-step approach to thematic text analysis, discussing its strengths and weaknesses so that others might assess its potential as an approach that they might utilize/develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.

  15. Integration of the SSPM and STAGE with the MPACT Virtual Facility Distributed Test Bed.

    Energy Technology Data Exchange (ETDEWEB)

    Cipiti, Benjamin B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shoman, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Material Protection Accounting and Control Technologies (MPACT) program within DOE NE is working toward a 2020 milestone to demonstrate a Virtual Facility Distributed Test Bed. The goal of the Virtual Test Bed is to link all MPACT modeling tools, technology development, and experimental work to create a Safeguards and Security by Design capability for fuel cycle facilities. The Separation and Safeguards Performance Model (SSPM) forms the core safeguards analysis tool, and the Scenario Toolkit and Generation Environment (STAGE) code forms the core physical security tool. These models are used to design and analyze safeguards and security systems and generate performance metrics. Work over the past year has focused on how these models will integrate with the other capabilities in the MPACT program and specific model changes to enable more streamlined integration in the future. This report describes the model changes and plans for how the models will be used more collaboratively. The Virtual Facility is not designed to integrate all capabilities into one master code, but rather to maintain stand-alone capabilities that communicate results between codes more effectively.

  16. Post-test analysis for the MIDAS DVI tests using MARS

    International Nuclear Information System (INIS)

    Bae, K. H.; Lee, Y. J.; Kwon, T. S.; Lee, W. J.; Kim, H. C.

    2002-01-01

    Various DVI tests have been performed at the MIDAS test facility, which is a scaled facility of the APR1400 applying a modified linear scale ratio. The evaluation results for the various void height tests and direct bypass tests, using the multi-dimensional best-estimate analysis code MARS, show that: (a) the MARS code has an advanced modeling capability and predicts well the major multi-dimensional thermal-hydraulic phenomena occurring in the downcomer; (b) the MARS code under-predicts the steam condensation rates, which in turn causes it to over-predict the ECC bypass rates. However, the trends of decreasing steam condensation rate and increasing ECC bypass rate with increasing steam flow rate, and the calculated ECC bypass rates under the EM analysis conditions, generally agree with the test data.

  17. Type I error rates of rare single nucleotide variants are inflated in tests of association with non-normally distributed traits using simple linear regression methods.

    Science.gov (United States)

    Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F

    2016-01-01

    In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log10 and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits that increased as the minor allele frequency decreased. The inflation of average type I error rates increased as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the location of SNVs with a type I error.
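
    The study's setup is straightforward to mimic: simulate a rare SNV and a skewed trait with no true genetic effect, run simple linear regression, and count rejections. A sketch with hypothetical parameters (the inflation reported above is strongest at low minor allele frequencies and stringent thresholds):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n, maf, n_sim, alpha = 2000, 0.005, 5000, 0.05

        hits = 0
        for _ in range(n_sim):
            g = rng.binomial(2, maf, size=n)             # rare SNV genotypes
            y = rng.gamma(shape=1.0, scale=1.0, size=n)  # skewed trait, no effect
            slope, intercept, r, p, se = stats.linregress(g, y)
            hits += p < alpha

        print(f"empirical type I error ≈ {hits / n_sim:.3f} (nominal {alpha})")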

  18. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in the welded area, affected by geometrical inhomogeneity, irregular welded surface and weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using three different methods: strain gauge measurement, thermal stress analysis and FEM. Obtained results show good agreement - the TSA mutually confirmed the FEM model and stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  19. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in welded area affected by geometrical inhomogeneity, irregular welded surface and weld toe radius is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigations, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using the three different methods: strain gauges measurement, thermal stress analysis and FEM. Obtained results show good agreement - the TSA mutually confirmed the FEM model and stresses measured by strain gauges. According to obtained results, it may be stated that TSA, as a relatively new measurement technique may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  20. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
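
    The principle can be illustrated with a Monte Carlo test whose statistic is the smallest density value among the draws; this follows the paper's underlying idea, though it is not the paper's exact statistic:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        f = stats.norm()                        # specified null density
        draws = rng.standard_t(df=3, size=50)   # sample to be tested

        # Statistic: the smallest density value among the draws; drawing a
        # point where f is tiny is unlikely if the draws really come from f.
        t_obs = f.pdf(draws).min()

        # Null distribution of the statistic by Monte Carlo resampling from f.
        t_null = np.array([f.pdf(f.rvs(size=draws.size, random_state=rng)).min()
                           for _ in range(2000)])
        p_value = np.mean(t_null <= t_obs)
        print(f"p ≈ {p_value:.3f}")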

  1. Item Analysis in Introductory Economics Testing.

    Science.gov (United States)

    Tinari, Frank D.

    1979-01-01

    Computerized analysis of multiple choice test items is explained. Examples of item analysis applications in the introductory economics course are discussed with respect to three objectives: to evaluate learning; to improve test items; and to help improve classroom instruction. Problems, costs and benefits of the procedures are identified. (JMD)

  2. Distribution of crushing strength of tablets

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2002-01-01

    The distribution of a given set of data is important since most parametric statistical tests are based on the assumption that the studied data are normally distributed. In the analysis of fracture mechanics the Weibull distribution is widely used and the derived Weibull modulus is interpreted as a mate... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated.
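
    Checking the normality assumption for a batch of crushing-strength measurements is a one-call test in scipy; a sketch with simulated strengths in place of the tablet data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        # Simulated crushing-strength values [N] for one tablet batch.
        strength = rng.normal(loc=110.0, scale=8.0, size=60)

        # Shapiro-Wilk test of the normality assumption behind the
        # parametric tests mentioned in the abstract.
        w, p = stats.shapiro(strength)
        print(f"W = {w:.3f}, p = {p:.3f}")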

  3. Analysis of pin removal experiments conducted in an SCWR-like test lattice

    Energy Technology Data Exchange (ETDEWEB)

    Chawla, R. [Paul Scherrer Institute, CH-5232 Villigen PSI (Switzerland); Swiss Federal Inst. of Technology EPFL, CH-1015 Lausanne (Switzerland)]; Raetz, D. [Paul Scherrer Institute, CH-5232 Villigen PSI (Switzerland); Resun AG, CH-5001 Aarau (Switzerland)]; Jordan, K. A. [Paul Scherrer Institute, CH-5232 Villigen PSI (Switzerland); Univ. of Florida, Gainesville, FL (United States)]; Perret, G. [Paul Scherrer Institute, CH-5232 Villigen PSI (Switzerland)]

    2012-07-01

    A comprehensive program of integral experiments, largely based on the measurement of reaction rate distributions, was carried out recently on an SCWR-like fuel lattice in the central test zone of the PROTEUS zero-power research reactor at the Paul Scherrer Inst. in Switzerland. The present paper reports on the analysis of a complementary set of measurements, in which the reactivity effects of removing individual pins from the unperturbed, heterogeneously moderated reference lattice were investigated. It has been found that the detailed Monte Carlo modeling of the whole reactor using MCNPX is able - as in the case of the reaction rate distributions - to reproduce the experimental results for the pin removal worths within the achievable statistical accuracy. A comparison of reduced-geometry calculations between MCNPX and the deterministic LWR assembly code CASMO-4E has revealed certain discrepancies. On the basis of a reactivity decomposition analysis of the differences between the codes, it has been suggested that these could be due to CASMO-4E deficiencies in calculating the effect, upon pin removal, of the extra moderation in the neighboring fuel pins. (authors)

  4. Study of Solid State Drives performance in PROOF distributed analysis system

    Science.gov (United States)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid state drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited to situations where multiple jobs concurrently access data located on the same drive. SSDs also have lower energy consumption and higher vibration tolerance than hard disk drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which allows one to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the I/O performance of the local disk subsystem becomes a critical factor, especially when computing nodes use multi-core CPUs. We discuss our experience with SSDs in the PROOF environment, comparing the performance of HDDs with SSDs in I/O-intensive analysis scenarios. In particular, we discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  5. RECONSTRUCTING REDSHIFT DISTRIBUTIONS WITH CROSS-CORRELATIONS: TESTS AND AN OPTIMIZED RECIPE

    International Nuclear Information System (INIS)

    Matthews, Daniel J.; Newman, Jeffrey A.

    2010-01-01

    Many of the cosmological tests to be performed by planned dark energy experiments will require extremely well-characterized photometric redshift measurements. Current estimates for cosmic shear are that the true mean redshift of the objects in each photo-z bin must be known to better than 0.002(1 + z), and the width of the bin must be known to ∼0.003(1 + z) if errors in cosmological measurements are not to be degraded significantly. A conventional approach is to calibrate these photometric redshifts with large sets of spectroscopic redshifts. However, at the depths probed by Stage III surveys (such as DES), let alone Stage IV (LSST, JDEM, and Euclid), existing large redshift samples have all been highly (25%-60%) incomplete, with a strong dependence of success rate on both redshift and galaxy properties. A powerful alternative approach is to exploit the clustering of galaxies to perform photometric redshift calibrations. Measuring the two-point angular cross-correlation between objects in some photometric redshift bin and objects with known spectroscopic redshift, as a function of the spectroscopic z, allows the true redshift distribution of a photometric sample to be reconstructed in detail, even if it includes objects too faint for spectroscopy or if spectroscopic samples are highly incomplete. We test this technique using mock DEEP2 Galaxy Redshift survey light cones constructed from the Millennium Simulation semi-analytic galaxy catalogs. From this realistic test, which incorporates the effects of galaxy bias evolution and cosmic variance, we find that the true redshift distribution of a photometric sample can, in fact, be determined accurately with cross-correlation techniques. We also compare the empirical error in the reconstruction of redshift distributions to previous analytic predictions, finding that additional components must be included in error budgets to match the simulation results. This extra error contribution is small for surveys that sample

  6. Automated analysis of pumping tests; Analise automatizada de testes de bombeamento

    Energy Technology Data Exchange (ETDEWEB)

    Sugahara, Luiz Alberto Nozaki

    1996-01-01

    An automated procedure for the analysis of pumping test data from groundwater wells is described. Computer software was developed to be used under the Windows operating system. The software allows the choice of three mathematical models for representing the aquifer behavior: confined aquifer (Theis model); leaky aquifer (Hantush model); unconfined aquifer (Boulton model). The analysis of pumping test data using the proper aquifer model allows for the determination of model parameters such as transmissivity, storage coefficient, leakage coefficient and delay index. The program can be used to analyze data from both pumping tests, with one or more pumping rates, and recovery tests. In the multiple-rate case, a desuperposition procedure has been implemented in order to obtain the equivalent aquifer response for the first flow rate, which is used to obtain an initial estimate of the model parameters. Such an initial estimate is required by the non-linear regression analysis method. The solutions to the partial differential equations describing the aquifer behavior were obtained in Laplace space, followed by numerical inversion of the transformed solution using the Stehfest algorithm. The data analysis procedure is based on a non-linear regression method, matching the field data to the theoretical response of a selected aquifer model for a given type of test. A least-squares regression method was implemented using either Gauss-Newton or Levenberg-Marquardt procedures for the minimization of an objective function. The software can also be applied to multiple-rate test data in order to determine the non-linear well coefficient, allowing for the computation of the well inflow performance curve. (author)
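
    For the confined-aquifer case, the non-linear regression described above reduces to fitting the Theis solution to observed drawdowns. The sketch below is a minimal illustration using SciPy's bounded least squares rather than the paper's software; the pumping rate, radius and drawdown data are all hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import exp1  # Theis well function W(u) = E1(u)

    Q = 0.01   # pumping rate, m^3/s (hypothetical test)
    r = 30.0   # distance from pumped well to observation well, m

    def theis_drawdown(t, T, S):
        """Theis (confined aquifer) drawdown at radius r:
        s = Q/(4*pi*T) * W(u), with u = r^2*S/(4*T*t)."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # hypothetical observed drawdowns (time in s, drawdown in m)
    t_obs = np.array([60, 120, 300, 600, 1800, 3600, 7200], float)
    s_obs = np.array([0.12, 0.21, 0.33, 0.43, 0.61, 0.72, 0.83])

    # bounded trust-region least squares stands in for the paper's
    # Gauss-Newton / Levenberg-Marquardt minimization
    (T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs,
                                  p0=(1e-3, 1e-4), bounds=(1e-8, 1.0))
    print(f"T = {T_fit:.2e} m^2/s, S = {S_fit:.2e}")
    ```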

  7. An environmental testing facility for Space Station Freedom power management and distribution hardware

    Science.gov (United States)

    Jackola, Arthur S.; Hartjen, Gary L.

    1992-01-01

    The plans for a new test facility, including new environmental test systems presently under construction, and the major environmental Test Support Equipment (TSE) used therein are addressed. This all-new Rocketdyne facility will perform space simulation environmental tests on Power Management and Distribution (PMAD) hardware for Space Station Freedom (SSF) at the Engineering Model, Qualification Model, and Flight Model levels of fidelity. Testing will include random vibration in three axes, thermal vacuum, thermal cycling and thermal burn-in, as well as numerous electrical functional tests. The facility is designed to support a relatively high throughput of hardware under test, while maintaining the high standards required for a man-rated space program.

  8. DYNAMIC SOFTWARE TESTING MODELS WITH PROBABILISTIC PARAMETERS FOR FAULT DETECTION AND ERLANG DISTRIBUTION FOR FAULT RESOLUTION DURATION

    Directory of Open Access Journals (Sweden)

    A. D. Khomonenko

    2016-07-01

    Subject of Research. Software reliability and test planning models are studied, taking into account the probabilistic nature of fault detection and resolution. Modeling of software testing enables the planning of resources and final quality at early stages of project execution. Methods. Two dynamic models of testing processes (strategies) are suggested, using an error detection probability for each software module. The Erlang distribution is used to approximate an arbitrary distribution of the fault resolution duration, and the exponential distribution is used to approximate fault detection. For each strategy, modified labeled graphs are built, along with differential equation systems and their numerical solutions. The latter make it possible to compute probabilistic characteristics of the test processes and states: state probabilities, distribution functions for fault detection and elimination, mathematical expectations of random variables, and the number of detected or fixed errors. Evaluation of Results. Probabilistic characteristics for software development projects were calculated using the suggested models. The strategies were compared by their quality indexes, and the debugging time required to achieve the specified quality goals was calculated. The calculation results are used for time and resource planning for new projects. Practical Relevance. The proposed models make it possible to use reliability estimates for each individual module. The Erlang approximation removes restrictions on the use of an arbitrary time distribution for the fault resolution duration. It improves the accuracy of software test process modeling and helps to take into account the viability (power) of the tests. With the use of these models we can search for ways to improve software reliability by generating tests which detect errors with the highest probability.
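
    A common way to realize the Erlang approximation mentioned above is moment matching: choose the shape k and rate λ so that the Erlang reproduces the empirical mean and variance of the fault resolution time. The sketch below illustrates that idea under this assumption; it is not the paper's fitting procedure.

    ```python
    import math

    def erlang_from_moments(mean, var):
        """Match an Erlang(k, lam) to a given mean and variance:
        mean = k/lam, var = k/lam^2  =>  k = mean^2/var (rounded, >= 1)."""
        k = max(1, round(mean**2 / var))
        lam = k / mean
        return k, lam

    def erlang_cdf(t, k, lam):
        """P(T <= t) for Erlang(k, lam):
        1 - sum_{n<k} exp(-lam*t) * (lam*t)^n / n!."""
        s = sum((lam * t)**n / math.factorial(n) for n in range(k))
        return 1.0 - math.exp(-lam * t) * s

    # hypothetical repair-time statistics (hours)
    k, lam = erlang_from_moments(mean=4.0, var=8.0)
    print(f"k = {k}, lam = {lam:.3f}, P(repair <= 8 h) = {erlang_cdf(8.0, k, lam):.3f}")
    ```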

  9. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web-based distributed image analysis system that processes Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  10. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects which are captured by the gravitational field of the Earth. We have analyzed the mass distributions of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region, while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing the observed scaling exponents to exponents observed in laboratory experiments, and discuss the possibility that one can derive insight into the original shapes of the meteoroids.
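
    A scaling exponent of the kind discussed above can be estimated from fragment masses with a maximum-likelihood (Hill-type) estimator for a power-law tail. The sketch below is illustrative only, with hypothetical masses, and is not necessarily the authors' fitting method.

    ```python
    import numpy as np

    def scaling_exponent(masses, m_min):
        """Maximum-likelihood estimate of alpha for a power-law tail with
        pdf ~ m^(-alpha) for m >= m_min (so N(>m) ~ m^(1-alpha))."""
        m = np.asarray([x for x in masses if x >= m_min], float)
        alpha = 1.0 + len(m) / np.log(m / m_min).sum()
        err = (alpha - 1.0) / np.sqrt(len(m))  # standard error of the MLE
        return alpha, err

    # hypothetical fragment masses (grams) from one shower
    masses = [820, 410, 390, 205, 160, 95, 80, 44, 37, 21, 18, 9, 7, 5]
    alpha, err = scaling_exponent(masses, m_min=5)
    print(f"alpha = {alpha:.2f} +/- {err:.2f}")
    ```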

  11. Comparison of diffusion and transport theory analysis with experimental results in fast breeder test reactor

    International Nuclear Information System (INIS)

    Sathyabama, N.; Mohanakrishnan, P.; Lee, S.M.

    1994-01-01

    A systematic analysis has been performed by 3-dimensional diffusion and transport methods to calculate the measured control rod worths and subassembly-wise power distribution in the Fast Breeder Test Reactor. Geometry corrections (rectangular to hexagonal) and diffusion-to-transport corrections are estimated for the multiplication factors and control rod worths. The control rod worths calculated by diffusion and transport theory are nearly the same and 10% above the measured values. The power distribution at the core periphery is over-predicted (by 15%) by diffusion theory, but this over-prediction reduces to 8% with the use of the S_N method. (authors). 9 refs., 4 tabs., 3 figs.

  12. Statistical models for the analysis of water distribution system pipe break data

    International Nuclear Information System (INIS)

    Yamijala, Shridhar; Guikema, Seth D.; Brumbelow, Kelly

    2009-01-01

    The deterioration of pipes leading to pipe breaks and leaks in urban water distribution systems is of concern to water utilities throughout the world. Pipe breaks and leaks may result in a reduction in the water-carrying capacity of the pipes and contamination of water in the distribution systems. Water utilities incur large expenses in the replacement and rehabilitation of water mains, making it critical to evaluate the current and future condition of the system for maintenance decision-making. This paper compares different statistical regression models proposed in the literature for estimating the reliability of pipes in a water distribution system on the basis of short time histories. The goals of these models are to estimate the likelihood of pipe breaks in the future and to determine the parameters that most affect the likelihood of pipe breaks. The data set used for the analysis comes from a major US city, and includes approximately 85,000 pipe segments with nearly 2500 breaks from 2000 through 2005. The results show that the set of statistical models previously proposed for this problem does not provide good estimates with the test data set. However, logistic generalized linear models do provide good estimates of pipe reliability and can be useful for water utilities in planning pipe inspection and maintenance.
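
    A logistic generalized linear model of the kind the paper recommends can be fitted in a few lines. The sketch below uses simulated covariates (age, length, material) purely for illustration; a real study would use the utility's break records.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # hypothetical per-segment covariates standing in for utility records
    rng = np.random.default_rng(0)
    n = 5000
    df = pd.DataFrame({
        "age":       rng.uniform(0, 80, n),    # years in service
        "length":    rng.uniform(10, 500, n),  # segment length, m
        "cast_iron": rng.integers(0, 2, n),    # material indicator
    })
    # simulate break outcomes from an assumed logistic relationship
    logit_p = -6.0 + 0.03 * df.age + 0.004 * df.length + 0.8 * df.cast_iron
    df["broke"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

    # fit the logistic GLM: P(break) as a function of the covariates
    X = sm.add_constant(df[["age", "length", "cast_iron"]])
    model = sm.GLM(df["broke"].astype(int), X,
                   family=sm.families.Binomial()).fit()
    print(model.summary())  # odds ratios: np.exp(model.params)
    ```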

  13. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and a heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  14. Basic distribution free identification tests for small size samples of environmental data

    Energy Technology Data Exchange (ETDEWEB)

    Federico, A.G.; Musmeci, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically, the available samples contain a small number of data points, and often the assumption of normal distributions is not realistic. On the other hand, the spread of today's powerful personal computers opens new opportunities based on a massive use of CPU resources. The paper reviews the problem and introduces two feasible non-parametric approaches based on the intrinsic equiprobability properties of the data samples. The first is based on full resampling, while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given, based on the Chernobyl children contamination data.
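
    The resampling idea behind both approaches can be illustrated with a two-sample permutation test: under the hypothesis of a common population, every relabelling of the pooled data is equally probable. This is a generic sketch with hypothetical data, not the ENEA program itself.

    ```python
    import numpy as np

    def permutation_test(a, b, n_perm=10_000, rng=None):
        """Two-sample permutation test on the difference of means: the
        p-value is the fraction of random relabellings of the pooled data
        whose difference is at least as extreme as the observed one."""
        rng = rng or np.random.default_rng()
        a, b = np.asarray(a, float), np.asarray(b, float)
        pooled = np.concatenate([a, b])
        observed = abs(a.mean() - b.mean())
        count = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            count += abs(pooled[:len(a)].mean() - pooled[len(a):].mean()) >= observed
        return count / n_perm

    # hypothetical small environmental samples (e.g. Cs-137 activity, Bq/kg)
    x = [12.1, 9.4, 15.2, 11.8, 10.3]
    y = [14.0, 16.5, 13.2, 18.1, 15.7, 17.3]
    print(f"p = {permutation_test(x, y):.4f}")
    ```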

  15. Semen analysis and sperm function tests: How much to test?

    Directory of Open Access Journals (Sweden)

    S S Vasan

    2011-01-01

    Semen analysis as an integral part of infertility investigations is taken as a surrogate measure for male fecundity in clinical andrology, male fertility, and pregnancy risk assessments. Clearly, laboratory seminology is still very much in its infancy. Inasmuch as the creation of a conventional semen profile will always represent the foundation of male fertility evaluation, the 5th edition of the World Health Organization (WHO) manual is a definitive statement on how such assessments should be carried out and how their quality should be controlled. A major advance in this new edition of the WHO manual, resolving the most salient critique of previous editions, is the development of the first well-defined reference ranges for semen analysis, based on the analysis of over 1900 recent fathers. The methodology used in the assessment of the usual variables in semen analysis is described, as are many of the less common, but very valuable, sperm function tests. Sperm function testing is used to determine whether the sperm have the biologic capacity to perform the tasks necessary to reach and fertilize ova and ultimately result in live births. A variety of tests are available to evaluate different aspects of these functions. To accurately use these functional assays, the clinician must understand what the tests measure, what the indications are for the assays, and how to interpret the results to direct further testing or patient management.

  16. Testing DARKexp against energy and density distributions of Millennium-II halos

    Energy Technology Data Exchange (ETDEWEB)

    Nolting, Chris; Williams, Liliya L.R. [School of Physics and Astronomy, University of Minnesota, 116 Church Street SE, Minneapolis, MN, 55454 (United States); Boylan-Kolchin, Michael [Department of Astronomy, The University of Texas at Austin, 2515 Speedway, Stop C1400, Austin, TX, 78712 (United States); Hjorth, Jens, E-mail: nolting@astro.umn.edu, E-mail: llrw@astro.umn.edu, E-mail: mbk@astro.as.utexas.edu, E-mail: jens@dark-cosmology.dk [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, Copenhagen, DK-2100 Denmark (Denmark)

    2016-09-01

    We test the DARKexp model for relaxed, self-gravitating, collisionless systems against equilibrium dark matter halos from the Millennium-II simulation. While limited tests of DARKexp against simulations and observations have been carried out elsewhere, this is the first time the testing is done with a large sample of simulated halos spanning a factor of ∼50 in mass, and using independent fits to density and energy distributions. We show that DARKexp, a one-shape-parameter family, provides very good fits to the shapes of density profiles, ρ(r), and differential energy distributions, N(E), of individual simulated halos. The best-fit shape parameters φ_0 obtained from the two types of fits are correlated, though with scatter. Our most important conclusions come from ρ(r) and N(E) that have been averaged over many halos. These show that the bulk of the deviations between DARKexp and individual Millennium-II halos comes from halo-to-halo fluctuations, likely driven by substructure and other density perturbations. The average ρ(r) and N(E) are quite smooth and follow DARKexp very closely. The only deviation that remains after averaging is small, and located at the most bound energies for N(E) and the smallest radii for ρ(r). Since the deviation is confined to 3-4 smoothing lengths, and is larger for low-mass halos, it is likely due to numerical resolution effects.

  17. Testing collinear factorization and nuclear parton distributions with pA collisions at the LHC

    CERN Document Server

    Quiroga-Arias, Paloma; Wiedemann, Urs Achim

    2011-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution functions (PDFs) of the proton. The extension of these analyses to nuclear parton distributions (nPDFs) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and they attract further interest since they may show novel signatures of non-linear, density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distributions, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here we argue that a proton-nucleus collision program a...

  18. Selected hydraulic test analysis techniques for constant-rate discharge tests

    International Nuclear Information System (INIS)

    Spane, F.A. Jr.

    1993-03-01

    The constant-rate discharge test is the principal field method used in hydrogeologic investigations for characterizing the hydraulic properties of aquifers. To implement this test, the aquifer is stressed by withdrawing ground water from a well, using a downhole pump. Discharge during the withdrawal period is regulated and maintained at a constant rate. Water-level response within the well is monitored during the active pumping phase (i.e., drawdown) and during the subsequent recovery phase following termination of pumping. The analysis of drawdown and recovery response within the stress well (and any monitored, nearby observation wells) provides a means for estimating the hydraulic properties of the tested aquifer, as well as discerning formational and nonformational flow conditions (e.g., wellbore storage, wellbore damage, presence of boundaries, etc.). Standard analytical methods used for constant-rate pumping tests include both log-log type-curve matching and semi-log straight-line methods. This report presents a current "state of the art" review of selected transient analysis procedures for constant-rate discharge tests. Specific topics examined include: analytical methods for constant-rate discharge tests conducted within confined and unconfined aquifers; the effects of various nonideal formation factors (e.g., anisotropy, hydrologic boundaries) and well construction conditions (e.g., partial penetration, wellbore storage) on constant-rate test response; and the use of pressure derivatives in diagnostic analysis for the identification of specific formation, well construction, and boundary conditions.
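
    The pressure-derivative diagnostic mentioned above is simply the drawdown derivative with respect to log time, computed numerically. A minimal sketch with synthetic radial-flow data:

    ```python
    import numpy as np

    def log_time_derivative(t, s):
        """Drawdown derivative ds/d(ln t) by central differences in log
        time: the diagnostic used to identify flow regimes (infinite-acting
        radial flow plots as a flat derivative)."""
        lnt = np.log(t)
        ds = np.empty_like(s)
        ds[1:-1] = (s[2:] - s[:-2]) / (lnt[2:] - lnt[:-2])
        ds[0] = (s[1] - s[0]) / (lnt[1] - lnt[0])
        ds[-1] = (s[-1] - s[-2]) / (lnt[-1] - lnt[-2])
        return ds

    # for infinite-acting radial flow, s ~ (Q/(4*pi*T))*ln(t) + c, so the
    # derivative tends to Q/(4*pi*T); hypothetical data illustrate this
    t = np.logspace(1, 5, 40)            # time, s
    s = 0.08 * np.log(t) + 0.02          # drawdown, m, with Q/(4*pi*T) = 0.08
    print(log_time_derivative(t, s)[-5:])  # ~0.08 at late time
    ```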

  19. Development of the test facilities for the measurement of core flow and pressure distribution of SMART reactor

    International Nuclear Information System (INIS)

    Ko, Y.J.; Euh, D.J.; Youn, Y.J.; Chu, I.C.; Kwon, T.S.

    2011-01-01

    A design of the SMART reactor has been developed; its primary system is composed of four internal circulation pumps, a core of 57 fuel assemblies, eight steam generator cassettes, flow mixing head assemblies, and other internal structures. Since the primary design features differ greatly from those of conventional reactors, the characteristics of the flow and pressure distribution are expected to differ accordingly. In order to analyze the thermal margin and hydraulic design characteristics of the SMART reactor, design quantification tests of the flow and pressure distribution that preserve the flow geometry are necessary. In the present study, the design features of the test facility for investigating the flow and pressure distribution, named "SCOP", are described. In order to preserve the flow distribution characteristics, the SCOP is linearly reduced with a scaling ratio of 1/5. The core flow rate of each fuel assembly is measured by a venturi meter attached to the lower part of each core simulator, which preserves pressure drop similarity at nominally scaled flow conditions. All 57 core simulators and 8 steam generator simulators are precisely calibrated before assembly into the test facility. The major measured parameters are pressures, differential pressures, and the core flow distribution. (author)
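
    Converting each venturi reading into a flow rate uses the standard venturi equation; the sketch below illustrates it with hypothetical geometry and discharge coefficient, not the SCOP facility's actual calibration.

    ```python
    import math

    def venturi_flow(dp, rho=998.0, d_throat=0.02, d_pipe=0.035, cd=0.98):
        """Volumetric flow from a venturi differential pressure using the
        standard venturi equation Q = Cd*A_t*sqrt(2*dp/(rho*(1-beta^4))),
        with beta = d_throat/d_pipe.  All parameter values are hypothetical."""
        beta = d_throat / d_pipe
        a_throat = math.pi * d_throat**2 / 4.0
        return cd * a_throat * math.sqrt(2.0 * dp / (rho * (1.0 - beta**4)))

    # map each simulator's measured dp (Pa) to a flow rate (m^3/s)
    for dp in (1500.0, 2000.0, 2500.0):
        print(dp, "Pa ->", round(venturi_flow(dp) * 1000, 2), "L/s")
    ```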

  20. Parametric distribution approach for flow availability in small hydro potential analysis

    Science.gov (United States)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

    Small hydro systems are one of the important sources of renewable energy, and they have been recognized worldwide as a clean energy source. Small hydropower generation, which uses the potential energy in flowing water to produce electricity, is often questioned because the power generated is inconsistent and intermittent. Potential analysis of a small hydro system, which depends mainly on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution of a small hydro system. Considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the defining characteristic of the Pearson system: the direct relation between the distribution and its first four statistical moments. The advantage of applying the first four statistical moments in small hydro potential analysis is the ability to analyse the varying shapes of the stream flow distribution.
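
    One member of the Pearson system can be moment-matched in a few lines. The sketch below uses SciPy's Pearson type III for illustration (the paper works with the full Pearson family, selected from all four moments) and a synthetic flow record.

    ```python
    import numpy as np
    from scipy import stats

    # hypothetical daily stream flow record (m^3/s)
    rng = np.random.default_rng(1)
    flow = rng.gamma(shape=2.0, scale=3.0, size=3650)

    # the first four sample moments characterize the distribution's shape;
    # the full Pearson system would use kurtosis to select the family member
    mean, std = flow.mean(), flow.std()
    skew = stats.skew(flow)
    kurt = stats.kurtosis(flow, fisher=False)
    print(f"mean={mean:.2f} std={std:.2f} skew={skew:.2f} kurt={kurt:.2f}")

    # moment-matched Pearson type III member (parameterized by skew in SciPy)
    dist = stats.pearson3(skew, loc=mean, scale=std)

    # flow-availability curve: probability that flow exceeds a given level
    for q in (2.0, 5.0, 10.0):
        print(f"P(flow > {q}) = {dist.sf(q):.3f}")
    ```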

  1. Improvement of the CULTEX® exposure technology by radial distribution of the test aerosol.

    Science.gov (United States)

    Aufderheide, Michaela; Heller, Wolf-Dieter; Krischenowski, Olaf; Möhle, Niklas; Hochrainer, Dieter

    2017-07-05

    The exposure of cell-based systems cultivated on microporous membranes at the air-liquid interface (ALI) has been accepted as an appropriate approach to simulate the exposure of cells of the respiratory tract to native airborne substances. The efficiency of such an exposure procedure with regard to stability and reproducibility depends on the optimal design of the interface between the cellular test system and the exposure technique. Current exposure systems favor the dynamic guidance of the airborne substances to the surface of the cells in specially designed exposure devices. Two module types, based on a linear or radial feed of the test atmosphere to the test system, were used for these studies. In our technical history, the development started with the linearly designed version, the CULTEX® glass modules, fulfilling the basic requirements for running ALI exposure studies (Mohr and Durst, 2005). Instability in the distribution of different atmospheres to the cells led us to create a new exposure module, characterized by stable and reproducible radial guidance of the aerosol to the cells: the CULTEX® RFS (Mohr et al., 2010). In this study, we describe the differences between the two systems with regard to particle distribution and deposition, clarifying the advantages and disadvantages of a radial versus a linear aerosol distribution concept.

  2. Field test of a continuous-variable quantum key distribution prototype

    International Nuclear Information System (INIS)

    Fossier, S; Debuisschert, T; Diamanti, E; Villing, A; Tualle-Brouri, R; Grangier, P

    2009-01-01

    We have designed and realized a prototype that implements a continuous-variable quantum key distribution (QKD) protocol based on coherent states and reverse reconciliation. The system uses time and polarization multiplexing for optimal transmission and detection of the signal and phase reference, and employs sophisticated error-correction codes for reconciliation. The security of the system is guaranteed against general coherent eavesdropping attacks. The performance of the prototype was tested over preinstalled optical fibres as part of a quantum cryptography network combining different QKD technologies. The stable and automatic operation of the prototype over 57 h yielded an average secret key distribution rate of 8 kbit/s over a 3 dB loss optical fibre, including the key extraction process and all quantum and classical communication. This system is therefore ideal for securing communications in metropolitan-size networks with high-speed requirements.

  3. Development, Demonstration, and Field Testing of Enterprise-Wide Distributed Generation Energy Management System: Phase 1 Report

    Energy Technology Data Exchange (ETDEWEB)

    2003-04-01

    This report describes RealEnergy's evolving distributed generation command and control system, called the "Distributed Energy Information System" (DEIS). This system uses algorithms to determine how to operate distributed generation systems efficiently and profitably. The report describes the system and RealEnergy's experiences in installing and applying it to manage distributed generators for commercial building applications. The report is divided into six tasks. The first five describe the DEIS; the sixth describes RealEnergy's regulatory and contractual obligations: Task 1: Define Information and Communications Requirements; Task 2: Develop Command and Control Algorithms for Optimal Dispatch; Task 3: Develop Codes and Modules for Optimal Dispatch Algorithms; Task 4: Test Codes Using Simulated Data; Task 5: Install and Test Energy Management Software; Task 6: Contractual and Regulatory Issues.

  4. Analysis of Peach Bottom turbine trip tests

    International Nuclear Information System (INIS)

    Cheng, H.S.; Lu, M.S.; Hsu, C.J.; Shier, W.G.; Diamond, D.J.; Levine, M.M.; Odar, F.

    1979-01-01

    Current interest in the analysis of turbine trip transients has been generated by the recent tests performed at the Peach Bottom (Unit 2) reactor. Three tests, simulating turbine trip transients, were performed at different initial power and coolant flow conditions. The data from these tests provide considerable information to aid the qualification of computer codes that are currently used in BWR design analysis. The results of an analysis of the turbine trip transients using the RELAP-3B and BNL-TWIGL computer codes are presented. Specific results are provided comparing the calculated reactor power and system pressures with the test data. Excellent agreement for all three test transients is evident from the comparisons.

  5. Analysis of intergranular fission-gas bubble-size distributions in irradiated uranium-molybdenum alloy fuel

    Energy Technology Data Exchange (ETDEWEB)

    Rest, J. [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)], E-mail: jrest@anl.gov; Hofman, G.L.; Kim, Yeon Soo [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)

    2009-04-15

    An analytical model for the nucleation and growth of intra- and intergranular fission-gas bubbles is used to characterize fission-gas bubble development in low-enriched U-Mo alloy fuel irradiated in the Advanced Test Reactor in Idaho as part of the Reduced Enrichment for Research and Test Reactor (RERTR) program. Fuel burnup was limited to less than ∼7.8 at.% U in order to capture the fuel-swelling stage prior to irradiation-induced recrystallization. The model couples the calculation of the time evolution of the average intergranular bubble radius and number density to the calculation of the intergranular bubble-size distribution, based on differential growth rate and sputtering coalescence processes. Recent results from TEM analysis of intragranular bubbles in U-Mo were used to set the irradiation-induced diffusivity and re-solution rate in the bubble-swelling model. Using these values, good agreement was obtained for the intergranular bubble distribution compared against measured post-irradiation examination (PIE) data, using grain-boundary diffusion enhancement factors of 15-125, depending on the Mo concentration. This range of enhancement factors is consistent with values obtained in the literature.

  6. Analysis of intergranular fission-gas bubble-size distributions in irradiated uranium-molybdenum alloy fuel

    Science.gov (United States)

    Rest, J.; Hofman, G. L.; Kim, Yeon Soo

    2009-04-01

    An analytical model for the nucleation and growth of intra- and intergranular fission-gas bubbles is used to characterize fission-gas bubble development in low-enriched U-Mo alloy fuel irradiated in the Advanced Test Reactor in Idaho as part of the Reduced Enrichment for Research and Test Reactor (RERTR) program. Fuel burnup was limited to less than ∼7.8 at.% U in order to capture the fuel-swelling stage prior to irradiation-induced recrystallization. The model couples the calculation of the time evolution of the average intergranular bubble radius and number density to the calculation of the intergranular bubble-size distribution, based on differential growth rate and sputtering coalescence processes. Recent results from TEM analysis of intragranular bubbles in U-Mo were used to set the irradiation-induced diffusivity and re-solution rate in the bubble-swelling model. Using these values, good agreement was obtained for the intergranular bubble distribution compared against measured post-irradiation examination (PIE) data, using grain-boundary diffusion enhancement factors of 15-125, depending on the Mo concentration. This range of enhancement factors is consistent with values obtained in the literature.

  7. Privacy-preserving Kruskal-Wallis test.

    Science.gov (United States)

    Guo, Suxin; Zhong, Sheng; Zhang, Aidong

    2013-10-01

    Statistical tests are powerful tools for data analysis. The Kruskal-Wallis test is a non-parametric statistical test that evaluates whether two or more samples are drawn from the same distribution. It is commonly used in various areas. But sometimes, the use of the method is impeded by privacy issues raised in fields such as biomedical research and clinical data analysis because of the confidential information contained in the data. In this work, we give a privacy-preserving solution for the Kruskal-Wallis test which enables two or more parties to cooperatively perform the test on the union of their data without compromising their data privacy. To the best of our knowledge, this is the first work that solves the privacy issues in the use of the Kruskal-Wallis test on distributed data.
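
    For reference, the plaintext test that the paper makes privacy-preserving is available in SciPy; the cryptographic protocol itself is beyond a short sketch. Hypothetical data from three parties:

    ```python
    from scipy import stats

    # hypothetical measurements held by three parties (e.g. three clinics)
    a = [27.1, 22.4, 30.5, 26.8, 24.9]
    b = [31.2, 29.7, 35.0, 33.1]
    c = [21.0, 19.5, 23.3, 20.2, 22.8, 18.9]

    # Kruskal-Wallis H-test: ranks the pooled data and asks whether the mean
    # ranks differ more than chance allows under one common distribution
    h, p = stats.kruskal(a, b, c)
    print(f"H = {h:.2f}, p = {p:.4f}")  # small p => samples likely differ
    ```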

  8. Analysis of CCRL proficiency cements 151 and 152 using the Virtual Cement and Concrete Testing Laboratory

    International Nuclear Information System (INIS)

    Bullard, Jeffrey W.; Stutzman, Paul E.

    2006-01-01

    To test the ability of the Virtual Cement and Concrete Testing Laboratory (VCCTL) software to predict cement hydration properties, characterization of the mineralogy and phase distribution is necessary. The compositional and textural characteristics of Cement and Concrete Reference Laboratory (CCRL) cements 151 and 152 were determined via scanning electron microscopy (SEM) analysis, followed by computer modeling of hydration properties. The general procedure to evaluate a cement is as follows: (1) two-dimensional SEM backscattered electron and X-ray microanalysis images of the cement are obtained, along with a measured particle size distribution (PSD); (2) based on analysis of these images and the measured PSD, three-dimensional microstructures with various water-to-cement ratios are created and hydrated using the VCCTL; and (3) the model predictions for degree of hydration under saturated conditions, heat of hydration (ASTM C186), setting time (ASTM C191), and strength development of mortar cubes (ASTM C109) are compared to experimental measurements performed either at NIST or at the participating CCRL proficiency sample evaluation laboratories. For both cements, generally good agreement is observed between the model predictions and the experimental data.

  9. Pressure distribution data from tests of 2.29 M (7.5 feet) span EET high-lift transport aircraft model in the Ames 12-foot pressure tunnel

    Science.gov (United States)

    Kjelgaard, S. O.; Morgan, H. L., Jr.

    1983-01-01

    A high-lift transport aircraft model equipped with a full-span leading-edge slat and a part-span double-slotted trailing-edge flap was tested in the Ames 12-foot pressure tunnel to determine the low-speed performance characteristics of a representative high-aspect-ratio supercritical wing. These tests were performed in support of the Energy Efficient Transport (EET) program, which is one element of the Aircraft Energy Efficiency (ACEE) project. Static longitudinal forces and moments and chordwise pressure distributions at three spanwise stations were measured for cruise, climb, two take-off flap, and two landing flap wing configurations. The tabulated and plotted pressure distribution data are presented without analysis or discussion.

  10. R/S analysis of reaction time in Neuron Type Test for human activity in civil aviation

    Science.gov (United States)

    Zhang, Hong-Yan; Kang, Ming-Cui; Li, Jing-Qiang; Liu, Hai-Tao

    2017-03-01

    Human factors have become the most serious problem leading to accidents in civil aviation, which has stimulated the design and analysis of the Neuron Type Test (NTT) system to explore the intrinsic properties and patterns behind the behaviors of professionals and students in civil aviation. In the experiment, normal practitioners' reaction time sequences, collected from the NTT, approximately follow a log-normal distribution. We apply the χ2 test to assess goodness-of-fit after transforming the time sequences with the Box-Cox transformation, in order to cluster practitioners. The long-term correlation of each individual practitioner's time sequence is represented by the Hurst exponent obtained via Rescaled Range analysis, also known as Range/Standard deviation (R/S) analysis. The differing Hurst exponents suggest the existence of different collective behaviors and different intrinsic patterns of human factors in civil aviation.
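
    R/S analysis itself is short enough to sketch: average the rescaled range over windows of increasing size, then read the Hurst exponent off the log-log slope. This is a minimal version, simplified relative to careful implementations, which also correct small-sample bias.

    ```python
    import numpy as np

    def hurst_rs(x, min_window=8):
        """Estimate the Hurst exponent by Rescaled Range (R/S) analysis:
        for several window sizes n, average R/S over non-overlapping
        windows, then fit log(R/S) ~ H * log(n)."""
        x = np.asarray(x, float)
        sizes, rs_vals = [], []
        n = min_window
        while n <= len(x) // 2:
            rs = []
            for start in range(0, len(x) - n + 1, n):
                w = x[start:start + n]
                dev = np.cumsum(w - w.mean())   # cumulative deviations
                r = dev.max() - dev.min()       # range of the deviations
                s = w.std()                     # standard deviation
                if s > 0:
                    rs.append(r / s)
            sizes.append(n)
            rs_vals.append(np.mean(rs))
            n *= 2
        H, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
        return H

    rng = np.random.default_rng(0)
    print(hurst_rs(rng.normal(size=4096)))  # ~0.5 for uncorrelated noise
    ```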

  11. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-voltage electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data for 147 zones of the dominant firm, ENEL. Data are available only for 1996, which restricted the study to a cross-sectional OLS analysis. The econometric estimate shows the existence of significant scale economies, which the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered, however, appears quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint.

  12. Spectrum analysis of radiotracer residence time distribution for industrial and environmental applications

    International Nuclear Information System (INIS)

    Kasban, H.; Ashraf Hamid

    2014-01-01

    Radiotracer signal analysis and recognition still present challenges in industrial and environmental applications, especially in residence time distribution (RTD) measurement. This paper presents a development of the RTD signal recognition method based on the power density spectrum (PDS). In this development, features are extracted from the signals and/or from their higher-order statistics (HOS) (bispectrum and trispectrum) instead of from the PDS alone. The HOS are estimated using direct, indirect and parametric estimation. The recognition results are analyzed and compared for the different HOS estimations in order to select the best HOS estimation method for the purpose of RTD signal recognition. Artificial neural networks are used for training and testing of the proposed method. The proposed method is tested using RTD signals obtained from measurements carried out using the radiotracer technique. The simulation results show that parametric estimation of the trispectrum gives the highest recognition rate and is the most reliable for RTD signal recognition. (author)
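
    A PDS-based feature extractor of the kind the development starts from can be sketched with Welch's method; the features and the synthetic RTD curve below are illustrative stand-ins for the paper's HOS-based features.

    ```python
    import numpy as np
    from scipy import signal

    def psd_features(x, fs):
        """Extract simple spectral features from a radiotracer signal via
        the Welch power density spectrum: total power, peak frequency and
        spectral centroid."""
        f, pxx = signal.welch(x, fs=fs, nperseg=256)
        df = f[1] - f[0]
        total = pxx.sum() * df
        peak = f[np.argmax(pxx)]
        centroid = (f * pxx).sum() / pxx.sum()
        return total, peak, centroid

    # hypothetical RTD curve: gamma-shaped tracer response plus noise
    fs = 10.0                                  # samples per second
    t = np.arange(0, 120, 1 / fs)
    x = t * np.exp(-t / 15) + 0.05 * np.random.default_rng(2).normal(size=t.size)
    print(psd_features(x, fs))
    ```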

  13. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede


  14. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs against that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute the coarse-grained (low-resolution) isotopic distributions with comparable accuracy and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
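
    The polynomial method's core idea can be sketched directly: each atom contributes a polynomial whose coefficients are isotope abundances, and the molecular distribution is their product, i.e. a chain of convolutions. This coarse-grained, nominal-mass sketch is far simpler than MIDAs itself.

    ```python
    import numpy as np

    # natural isotope abundances on a 1 Da nominal-mass grid, indexed by
    # extra neutrons above the lightest isotope (rounded standard values)
    ISOTOPES = {
        "C": [0.9893, 0.0107],             # 12C, 13C
        "H": [0.999885, 0.000115],         # 1H, 2H
        "O": [0.99757, 0.00038, 0.00205],  # 16O, 17O, 18O
    }

    def isotopic_distribution(formula):
        """Coarse-grained isotopic distribution by repeated polynomial
        multiplication (convolution) over all atoms in the formula."""
        dist = np.array([1.0])
        for element, count in formula.items():
            pattern = np.array(ISOTOPES[element])
            for _ in range(count):
                dist = np.convolve(dist, pattern)
        return dist / dist.sum()

    # glucose, C6H12O6; index i = i extra neutrons above the monoisotopic peak
    for i, p in enumerate(isotopic_distribution({"C": 6, "H": 12, "O": 6})[:5]):
        print(f"M+{i}: {p:.5f}")
    ```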

  15. Sensitivity Analysis of Dynamic Tariff Method for Congestion Management in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Liu, Zhaoxi

    2015-01-01

    The dynamic tariff (DT) method is designed for the distribution system operator (DSO) to alleviate congestion that might occur in a distribution network with a high penetration of distributed energy resources (DERs). Sensitivity analysis of the DT method is crucial because of its decentralized control manner. The sensitivity analysis can obtain the changes of the optimal energy planning, and thereby the line loading profiles, over infinitely small changes of parameters by differentiating the KKT conditions of the convex quadratic programming problem over which the DT method is formed. Three case...

  16. Testing of Military Towbars

    Science.gov (United States)

    2016-09-28

    Sieve analysis data are recorded using DD Form 1206 and the grain size distribution - aggregate grading chart. DD Form 1207 will be used to present... b. Design load. The amount of load that the towbar was designed for, as predicted by analysis. The design load is to be...test. Performance tests required for a complete towing analysis include the following: (1) Physical Characteristics (TOP 02-2-5004). (2

  17. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rain gauges on their own as input carries great uncertainties regarding runoff estimation, especially when the area is large and the rainfall is measured and recorded at irregularly spaced gauging stations. Hence, spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at ungauged points to serve as input to the rainfall-runoff processes of distributed and semi-distributed numerical models. It is crucial to study and predict the behaviour of rainfall and river runoff in order to reduce flood damage in the affected areas along the Kelantan river. Thus, good knowledge of the rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using the inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods, as well as an average rainfall distribution. Sensitivity analyses for the distance and elevation parameters were conducted to examine the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.
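
    IDW itself is compact enough to sketch; IDEW would additionally weight by an elevation difference term. All coordinates and rainfall values below are hypothetical.

    ```python
    import numpy as np

    def idw(stations_xy, values, grid_xy, power=2.0):
        """Inverse-distance weighting: each grid point receives a weighted
        average of station values with weights 1/d^power, so that closer
        stations dominate the estimate."""
        d = np.linalg.norm(grid_xy[:, None, :] - stations_xy[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)          # avoid division by zero at stations
        w = 1.0 / d**power
        return (w * values).sum(axis=1) / w.sum(axis=1)

    # hypothetical daily rainfall (mm) at four gauges in a unit-square basin
    stations = np.array([[0.1, 0.2], [0.8, 0.3], [0.4, 0.9], [0.6, 0.6]])
    rain = np.array([12.0, 3.5, 22.1, 9.0])
    grid = np.array([[x, y] for x in np.linspace(0, 1, 5)
                             for y in np.linspace(0, 1, 5)])
    print(idw(stations, rain, grid).round(1).reshape(5, 5))
    ```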

  18. Development of multi-dimensional analysis method for porous blockage in fuel subassembly. Numerical simulation for 4 subchannel geometry water test

    International Nuclear Information System (INIS)

    Tanaka, Masa-aki; Kamide, Hideki

    2001-02-01

    This investigation deals with a porous blockage in a wire-spacer-type fuel subassembly of Fast Breeder Reactors (FBRs). A multi-dimensional analysis method for a porous blockage in a fuel subassembly is developed using the standard k-ε turbulence model with typical correlations from handbooks. The purpose of this analysis method is to evaluate the position and magnitude of the maximum temperature, and to investigate the thermo-hydraulic phenomena in the porous blockage. Verification of this analysis method was conducted based on the results of a 4-subchannel geometry water test. It was revealed that the evaluation of the porosity distribution and the particle diameter in a porous blockage is important for predicting the temperature distribution. This analysis method could simulate the spatial characteristics of the velocity and temperature distributions in the blockage and evaluate the pin surface temperature inside the porous blockage. Through this verification, it is shown that the multi-dimensional analysis method is useful for predicting the thermo-hydraulic field and the highest temperature in a porous blockage. (author)

  19. Hypervelocity Impact Test Fragment Modeling: Modifications to the Fragment Rotation Analysis and Lightcurve Code

    Science.gov (United States)

    Gouge, Michael F.

    2011-01-01

    Hypervelocity impact tests on test satellites are performed by members of the orbital debris scientific community in order to understand and typify the on-orbit collision breakup process. By analyzing these test satellite fragments, fragment size and mass distributions are derived and incorporated into various orbital debris models. These same fragments are now being put to new uses with emerging technologies. Digital models of the fragments are created using a laser scanner. A group of computer programs referred to as the Fragment Rotation Analysis and Lightcurve code uses these digital representations in a multitude of ways to describe, measure, and model on-orbit fragments and fragment behavior. The Dynamic Rotation subroutine generates all of the possible reflected intensities from a scanned fragment as if it were observed rotating dynamically in orbit about the Earth. It calls an additional subroutine that graphically displays the intensities, and the resulting frequency of those intensities over a range of solar phase angles, in a Probability Density Function plot. This document reports the additions and modifications to the subroutines of the Fragment Rotation Analysis and Lightcurve code concerned with Dynamic Rotation and Probability Density Function plotting.

  20. Silicon Bipolar Distributed Oscillator Design and Analysis | Aku ...

    African Journals Online (AJOL)

    The design and analysis of a high-frequency silicon bipolar oscillator using a common emitter (CE) configuration with distributed output is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillation were reviewed. Transmission line design was carried out using Butterworth LC ...

  1. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    In this article, process capability and system availability analyses are discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma distribution is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted to compare the different loss functions.
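
    Assuming the inverse Rayleigh density f(x; λ) = (2λ/x³)exp(−λ/x²), a gamma prior on λ is conjugate, so the posterior is available in closed form. The sketch below illustrates the squared-error-loss (posterior mean) estimate under that assumed parameterization; the paper's loss functions and quantities of interest are richer.

    ```python
    import numpy as np

    def posterior(data, a=1.0, b=1.0):
        """With likelihood prod (2*lam/x^3)*exp(-lam/x^2) and a Gamma(a, b)
        prior on lam, the posterior is Gamma(a + n, b + sum(1/x_i^2))."""
        data = np.asarray(data, float)
        a_post = a + len(data)
        b_post = b + np.sum(1.0 / data**2)
        return a_post, b_post

    data = [1.8, 2.3, 0.9, 1.5, 2.0, 1.2, 1.7]  # hypothetical lifetimes
    a_post, b_post = posterior(data)

    # Bayes estimate of lam under squared-error loss = posterior mean a/b
    print("posterior mean of lam:", a_post / b_post)
    ```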

  2. Optimal power flow for distribution networks with distributed generation

    Directory of Open Access Journals (Sweden)

    Radosavljević Jordan

    2015-01-01

    This paper presents a genetic algorithm (GA) based approach for the solution of the optimal power flow (OPF) problem in distribution networks with distributed generation (DG) units, including fuel cells, micro turbines, diesel generators, photovoltaic systems and wind turbines. The OPF is formulated as a nonlinear multi-objective optimization problem with equality and inequality constraints. Due to the stochastic nature of the energy produced from renewable sources, i.e. wind turbines and photovoltaic systems, as well as load uncertainties, a probabilistic algorithm is introduced into the OPF analysis. The Weibull and normal distributions are employed to model the input random variables, namely the wind speed, solar irradiance and load power. The 2m+1 point estimate method and the Gram-Charlier expansion theory are used to obtain the statistical moments and the probability density functions (PDFs) of the OPF results. The proposed approach is examined and tested on a modified IEEE 34-node test feeder with five different integrated DG units. The obtained results prove the efficiency of the proposed approach in solving both deterministic and probabilistic OPF problems for different forms of the multi-objective function. As such, it can serve as a useful decision-making support tool for distribution network operators. [Project of the Ministry of Science of the Republic of Serbia, no. TR33046]
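
    The Weibull wind-speed input can be sketched directly: sample speeds, push them through a turbine power curve, and feed the resulting moments to the point-estimate method. The power-curve parameters below are hypothetical.

    ```python
    import numpy as np

    def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=1.0):
        """Simplified per-unit wind turbine curve: cubic between cut-in and
        rated speed, flat up to cut-out, zero otherwise (hypothetical)."""
        v = np.asarray(v, float)
        p = np.where((v >= v_in) & (v < v_rated),
                     p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0)
        return np.where((v >= v_rated) & (v <= v_out), p_rated, p)

    # Weibull-distributed wind speed, as used for the OPF input variable
    rng = np.random.default_rng(3)
    v = rng.weibull(2.0, size=100_000) * 8.0  # shape k=2, scale c=8 m/s
    p = turbine_power(v)

    # the statistical moments of the injected power feed the 2m+1 method
    print("mean =", p.mean().round(3), " std =", p.std().round(3))
    ```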

  3. Economic analysis of efficient distribution transformer trends

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that accounts for uncertainty in the development of the evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties, as well as statistical distributions, of the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
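
    Transformer TOC evaluation is commonly written as TOC = purchase price + A·(no-load loss) + B·(load loss), with A and B in $/W; treating A and B as distributions rather than point values captures the uncertainty the report addresses. A Monte Carlo sketch under that assumption, with hypothetical numbers:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 100_000
    A = rng.normal(3.0, 0.6, n)    # hypothetical $/W value of no-load loss
    B = rng.normal(1.0, 0.25, n)   # hypothetical $/W value of load loss

    def toc(price, nl_watts, ll_watts):
        """Total owning cost samples under uncertain evaluation factors."""
        return price + A * nl_watts + B * ll_watts

    # compare two hypothetical 25 kVA designs under factor uncertainty
    design1 = toc(700.0, nl_watts=60.0, ll_watts=300.0)
    design2 = toc(850.0, nl_watts=40.0, ll_watts=250.0)
    print("P(design2 cheaper) =", (design2 < design1).mean())
    ```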

  4. Analysis of discrete and continuous distributions of ventilatory time constants from dynamic computed tomography

    International Nuclear Information System (INIS)

    Doebrich, Marcus; Markstaller, Klaus; Karmrodt, Jens; Kauczor, Hans-Ulrich; Eberle, Balthasar; Weiler, Norbert; Thelen, Manfred; Schreiber, Wolfgang G

    2005-01-01

    In this study, an algorithm was developed to measure the distribution of pulmonary time constants (TCs) from dynamic computed tomography (CT) data sets during a sudden airway pressure step-up. Simulations with synthetic data were performed to test the methodology as well as the influence of experimental noise. Furthermore, the algorithm was applied to in vivo data. In five pigs, sudden changes in airway pressure were imposed during dynamic CT acquisition in healthy lungs and in a saline-lavage ARDS model. The fractional gas content in the imaged slice (FGC) was calculated by density measurements for each CT image. Temporal variations of the FGC were analysed assuming a model with a continuous distribution of exponentially decaying time constants. The simulations proved the feasibility of the method, and the influence of experimental noise could be well evaluated. Analysis of the in vivo data showed that ventilation processes in healthy lungs can more likely be characterized by discrete TCs, whereas in ARDS lungs continuous distributions of TCs are observed. The temporal behaviour of lung inflation and deflation can be characterized objectively using the described new methodology. This study indicates that continuous distributions of TCs reflect lung ventilation mechanics more accurately than discrete TCs.
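
    A discretized version of the continuous-TC model can be fitted by non-negative least squares over a grid of candidate time constants. The sketch below recovers a two-compartment synthetic signal; it is a simplified stand-in for the paper's algorithm.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # recover a distribution of time constants from a step response
    # y(t) = sum_j a_j * (1 - exp(-t/tau_j)) by non-negative least squares
    # over a fixed grid of candidate taus
    t = np.linspace(0.05, 10, 200)                  # time, s
    taus = np.logspace(-1.3, 1.0, 40)               # candidate TCs, s
    A = 1.0 - np.exp(-t[:, None] / taus[None, :])   # design matrix

    # synthetic signal: two compartments (tau = 0.3 s and 2.0 s) plus noise
    rng = np.random.default_rng(5)
    y = 0.6 * (1 - np.exp(-t / 0.3)) + 0.4 * (1 - np.exp(-t / 2.0))
    y += 0.005 * rng.normal(size=t.size)

    amp, _ = nnls(A, y)                             # non-negative amplitudes
    for tau, a in zip(taus, amp):
        if a > 0.02:
            print(f"tau = {tau:.2f} s, amplitude = {a:.2f}")
    ```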

  5. Quality parameters analysis of optical imaging systems with enhanced focal depth using the Wigner distribution function

    Science.gov (United States)

    Zalvidea; Colautti; Sicre

    2000-05-01

    An analysis of the Strehl ratio and the optical transfer function as imaging quality parameters of optical elements with enhanced focal depth is carried out by employing the Wigner distribution function. To this end, we use four different pupil functions: a full circular aperture, a hyper-Gaussian aperture, a quartic phase plate, and a logarithmic phase mask. A comparison is performed between the quality parameters and test images formed by these pupil functions at different defocus distances.

  6. MCNP(TM) Release 6.1.1 beta: Creating and Testing the Code Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Cox, Lawrence J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Casswell, Laura [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-06-12

    This report documents the preparations for, and testing of, the production release of MCNP6™ 1.1 beta through RSICC at ORNL. It addresses tests on the supported operating systems (Linux, MacOSX, Windows) with the supported compilers (Intel, Portland Group and gfortran). Verification and validation test results are documented elsewhere. This report does not address the overall packaging of the distribution in detail. Specifically, it does not address the nuclear and atomic data collection, the other included software packages (MCNP5, MCNPX and MCNP6), or the collection of reference documents.

  7. Optimal planning of multiple distributed generation sources in distribution networks: A new approach

    Energy Technology Data Exchange (ETDEWEB)

    AlRashidi, M.R., E-mail: malrash2002@yahoo.com [Department of Electrical Engineering, College of Technological Studies, Public Authority for Applied Education and Training (PAAET) (Kuwait); AlHajri, M.F., E-mail: mfalhajri@yahoo.com [Department of Electrical Engineering, College of Technological Studies, Public Authority for Applied Education and Training (PAAET) (Kuwait)

    2011-10-15

    Highlights: → A new hybrid PSO for optimal DG placement and sizing. → Statistical analysis to fine-tune PSO parameters. → A novel constraint-handling mechanism for different constraint types. - Abstract: An improved particle swarm optimization (PSO) algorithm is presented for the optimal planning of multiple distributed generation (DG) sources. This problem can be divided into two sub-problems: the optimal DG size (continuous optimization) and location (discrete optimization), with the objective of minimizing real power losses. The proposed approach addresses the two sub-problems simultaneously using an enhanced PSO algorithm capable of handling multiple DG planning in a single run. A design of experiments is used to fine-tune the proposed approach via proper analysis of the interaction of the PSO parameters. The proposed algorithm treats the problem constraints differently by adopting a radial power flow algorithm to satisfy the equality constraints, i.e. the power flows in the distribution network, while the inequality constraints are handled by making use of some of the PSO features. The proposed algorithm was tested on a practical 69-bus power distribution system. Different test cases were considered to validate the consistency of the proposed approach in detecting optimal or near-optimal solutions. Results are compared with those of sequential quadratic programming.

  8. Optimal planning of multiple distributed generation sources in distribution networks: A new approach

    International Nuclear Information System (INIS)

    AlRashidi, M.R.; AlHajri, M.F.

    2011-01-01

    Highlights: → A new hybrid PSO for optimal DG placement and sizing. → Statistical analysis to fine-tune PSO parameters. → A novel constraint-handling mechanism for different constraint types. - Abstract: An improved particle swarm optimization (PSO) algorithm is presented for the optimal planning of multiple distributed generation (DG) sources. This problem can be divided into two sub-problems: the optimal DG size (continuous optimization) and location (discrete optimization), with the objective of minimizing real power losses. The proposed approach addresses the two sub-problems simultaneously using an enhanced PSO algorithm capable of handling multiple DG planning in a single run. A design of experiments is used to fine-tune the proposed approach via proper analysis of the interaction of the PSO parameters. The proposed algorithm treats the problem constraints differently by adopting a radial power flow algorithm to satisfy the equality constraints, i.e. the power flows in the distribution network, while the inequality constraints are handled by making use of some of the PSO features. The proposed algorithm was tested on a practical 69-bus power distribution system. Different test cases were considered to validate the consistency of the proposed approach in detecting optimal or near-optimal solutions. Results are compared with those of sequential quadratic programming.
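
    The bare PSO mechanism underlying the paper's hybrid variant fits in a page. The sketch below minimizes a toy stand-in for "losses as a function of DG sizes" and omits the discrete siting sub-problem, the radial power flow and the constraint handling.

    ```python
    import numpy as np

    def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Minimal continuous PSO minimizer: inertia plus attraction toward
        each particle's personal best and the global best."""
        rng = np.random.default_rng(6)
        lo, hi = bounds
        x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_f = np.apply_along_axis(objective, 1, x)
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.apply_along_axis(objective, 1, x)
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            gbest = pbest[pbest_f.argmin()].copy()
        return gbest, pbest_f.min()

    # toy stand-in for "losses as a function of two DG sizes (MW)"
    losses = lambda s: (s[0] - 1.2)**2 + (s[1] - 0.8)**2 + 0.1 * s.sum()
    print(pso(losses, (np.zeros(2), np.full(2, 3.0))))
    ```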

  9. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape-ecological analysis of the distribution of the helminth fauna is provided. As a result of studies of 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  10. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape-ecological analysis of the distribution of the helminth fauna is provided. As a result of studies of 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  11. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure, the Equality Equivalence Test, which discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product, the test yields a two-dimensional index of preference intensity.

  12. Comment on the asymptotics of a distribution-free goodness of fit test statistic.

    Science.gov (United States)

    Browne, Michael W; Shapiro, Alexander

    2015-03-01

    In a recent article, Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness-of-fit test statistic is incomplete, because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra showed how Browne's proof can be completed satisfactorily, but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's test statistic, using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all the complicating issues associated with them.

  13. A statistical design for testing apomictic diversification through linkage analysis.

    Science.gov (United States)

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

    The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels, from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from the two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between the two plant populations or species used as parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utilization and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.
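
    As a toy illustration of the EM step described above (not the authors' design), consider estimating an apomixis rate v from marker counts in a hypothetical Aa × aa testcross, where apomictic offspring are maternal clones (always Aa) and sexual offspring segregate Aa:aa = 1:1; the counts are invented.

```python
import numpy as np

# Invented offspring marker counts from an Aa (maternal) x aa testcross.
n_Aa, n_aa = 130, 70
N = n_Aa + n_aa

v = 0.5                                    # initial guess for apomixis rate
for _ in range(100):
    # E-step: probability that an Aa offspring arose by apomixis,
    # since P(Aa) = v + (1 - v)/2 under this toy model.
    r = v / (v + (1 - v) / 2)
    # M-step: expected fraction of apomicts among all offspring.
    v_new = r * n_Aa / N
    if abs(v_new - v) < 1e-10:
        break
    v = v_new

print(f"estimated apomixis rate: {v:.3f}")  # converges to (n_Aa - n_aa)/N
```

    The iterates converge to the closed-form multinomial MLE, (n_Aa - n_aa)/N; the actual design combines such likelihoods over two reciprocal crosses and tests the difference in rates.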

  14. Residual stress distribution analysis of heat treated APS TBC using image based modelling.

    Science.gov (United States)

    Li, Chun; Zhang, Xun; Chen, Ying; Carr, James; Jacques, Simon; Behnsen, Julia; di Michiel, Marco; Xiao, Ping; Cernik, Robert

    2017-08-01

    We carried out a residual stress distribution analysis in an APS TBC throughout the depth of the coatings. The samples were heat treated at 1150 °C for 190 h, and the data analysis used image-based modelling based on real 3D images measured by computed tomography (CT). The stress distribution in several 2D slices from the 3D model is included in this paper, as well as the stress distribution along several paths shown on the slices. Our analysis can explain the occurrence of the "jump" features near the interface between the top coat and the bond coat. These features in the residual stress distribution trend were measured (as a function of depth) by high-energy synchrotron XRD (as shown in our related research article entitled 'Understanding the Residual Stress Distribution through the Thickness of Atmosphere Plasma Sprayed (APS) Thermal Barrier Coatings (TBCs) by high energy Synchrotron XRD; Digital Image Correlation (DIC) and Image Based Modelling') (Li et al., 2017) [1].

  15. Irradiation data analysis and thermal analysis of the 02M-02K capsule for material irradiation test

    International Nuclear Information System (INIS)

    Choi, Myoung Hwan; Choo, K. N.; Kang, Y. H.; Kim, B. G.; Cho, M. S.; Sohn, J. M.; Shin, Y. T.; Park, S. J.; Kim, Y. J.

    2004-11-01

    In order to evaluate the fracture toughness of RPV materials, a material irradiation test using the instrumented capsule (02M-02K) was carried out in HANARO in August 2003. Based on the user's requirements, the thermal design analysis of the 02M-02K capsule was performed, and the specimens were suitably arranged in each step of the capsule main body. In this report, the temperature data of the specimens measured during the irradiation test and the calculated data from the thermal analysis are compared and evaluated. The temperature profile in each step as a function of HANARO reactor power and helium pressure is also reviewed and evaluated. The effects on specimen temperature of the gap size, both theoretically calculated from thermal expansion during the irradiation test and measured during manufacturing of the capsule, were reviewed. The thermal analysis was performed using the finite element (FE) analysis program ANSYS. A two-dimensional model of a 1/4 section of the capsule was generated, and the γ-heating rate of the capsule materials at the control rod position of 430 mm was used as input data. A thermal analysis using a 3-dimensional model, which closely resembles the actual shape of the capsule, was also conducted to obtain the temperature distribution in the axial direction. The analysis results show that the temperature difference between the top and bottom positions of a specimen is smaller than 13.2 °C. The maximum measured and calculated temperatures in step 3 of the capsule are 256 °C and 264 °C, respectively. The measured temperature data were obtained at a reactor power of 24 MW, a heater power of 0 W, and a helium pressure of 760 torr. Generally, the temperatures obtained by the FE analysis are slightly lower than the measured ones except in step 1 of the capsule. However, the measured and calculated temperatures show good agreement, within 9 percent.

  16. Influence of parafunctional loading and prosthetic connection on stress distribution: a 3D finite element analysis.

    Science.gov (United States)

    Torcato, Leonardo Bueno; Pellizzer, Eduardo Piza; Verri, Fellippo Ramos; Falcón-Antenucci, Rosse Mary; Santiago Júnior, Joel Ferreira; de Faria Almeida, Daniel Augusto

    2015-11-01

    Clinicians should consider parafunctional occlusal load when planning treatment. Prosthetic connections can reduce the stress distribution on an implant-supported prosthesis. The purpose of this 3-dimensional finite element study was to assess the influence of parafunctional loading and prosthetic connections on stress distribution. Computer-aided design software was used to construct 3 models. Each model was composed of a bone and an implant (external hexagon, internal hexagon, or Morse taper) with a crown. Finite element analysis software was used to generate the finite element mesh and establish the loading and boundary conditions. A normal force (200-N axial load and 100-N oblique load) and parafunctional force (1000-N axial and 500-N oblique load) were applied. Results were visualized as the maximum principal stress. A three-way analysis of variance and the Tukey test were performed, and the percentage contribution of each variable to the stress concentration was calculated from a sum-of-squares analysis. Stress was concentrated around the implant at the cortical bone, and models with the external hexagonal implant showed the highest stresses. Published by Elsevier Inc. All rights reserved.

  17. Theoretical basis for transfer of laboratory test results of grain size distribution of coal to real object

    Energy Technology Data Exchange (ETDEWEB)

    Sikora, W; Chodura, J [Politechnika Sladska, Gliwice (Poland). Instytut Mechanizacji Gornictwa

    1989-01-01

    Evaluates a method for forecasting size distribution of black coal mined by shearer loaders in one coal seam. Laboratory tests for determining coal comminution during cutting and haulage along the face are analyzed. Methods for forecasting grain size distribution of coal under operational conditions using formulae developed on the basis of laboratory tests are discussed. Recommendations for design of a test stand and test conditions are discussed. A laboratory stand should accurately model operational conditions of coal cutting, especially dimensions of the individual elements of the shearer loader, geometry of the cutting drum and cutting tools, and strength characteristics of the coal seam. 9 refs.

  18. The Mann-Whitney U: A Test for Assessing Whether Two Independent Samples Come from the Same Distribution

    Directory of Open Access Journals (Sweden)

    Nadim Nachar

    2008-03-01

    Full Text Available It is often difficult, particularly when conducting research in psychology, to have access to large normally distributed samples. Fortunately, there are statistical tests to compare two independent groups that do not require large normally distributed samples. The Mann-Whitney U is one of these tests. In the following work, a summary of this test is presented, along with an explanation of the logic underlying it and its application. Moreover, the strengths and weaknesses of the Mann-Whitney U are discussed. One major limitation of the Mann-Whitney U is that the type I error rate, or alpha (α), is amplified in a situation of heteroscedasticity.
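
    For readers who want to apply the test, SciPy's `mannwhitneyu` implements it directly; the snippet below is an illustrative run on simulated skewed samples and echoes the article's heteroscedasticity caveat.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# Two small, non-normal independent samples (e.g., skewed reaction times)
group_a = rng.exponential(scale=1.0, size=15)
group_b = rng.exponential(scale=1.5, size=12)

u_stat, p_value = mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
# Caveat from the article: under heteroscedasticity the type I error
# rate (alpha) can be inflated, so inspect the variances before relying
# on the nominal significance level.
```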

  19. Analysis of critical operating conditions for LV distribution networks with microgrids

    Science.gov (United States)

    Zehir, M. A.; Batman, A.; Sonmez, M. A.; Font, A.; Tsiamitros, D.; Stimoniaris, D.; Kollatou, T.; Bagriyanik, M.; Ozdemir, A.; Dialynas, E.

    2016-11-01

    Increase in the penetration of Distributed Generation (DG) in distribution networks raises the risk of voltage limit violations while contributing to line losses. Especially in low voltage (LV) distribution networks (secondary distribution networks), the impacts of active power flows on the bus voltages and on the network losses are more dominant. As network operators must meet regulatory limitations, they have to take into account the most critical operating conditions in their systems. This study presents the impact of the worst-case operating conditions of LV distribution networks comprising microgrids. Simulation studies are performed on a field-data-based virtual test bed. The simulations are repeated for several cases consisting of different microgrid points of connection with different network loading and microgrid supply/demand conditions.

  20. A study on thermal ratcheting structure test of 316L test cylinder

    International Nuclear Information System (INIS)

    Lee, H. Y.; Kim, J. B.; Koo, G. H.

    2001-01-01

    In this study, the progressive inelastic deformation, the so-called thermal ratchet phenomenon, which can occur in a high temperature liquid metal reactor, was simulated with the thermal ratchet structural test facility and a 316L stainless steel test cylinder. The inelastic deformation of the reactor baffle cylinder can occur due to the moving temperature distribution along the axial direction as the hot free surface moves up and down under the cyclic heat-up and cool-down of reactor operations. The ratchet deformations were measured with a laser displacement sensor and LVDTs after cooling the structural specimen, which experiences thermal loads up to 550 °C and temperature differences of about 500 °C. During the structural thermal ratchet test, the temperature distribution of the test cylinder along the axial direction was measured from 28 channels of thermocouples, and the temperatures were used for the ratchet analysis. The thermal ratchet deformation analysis was performed with the NONSTA code, whose constitutive model is a nonlinear combined kinematic and isotropic hardening model, and the test results were compared with those of the analysis. The thermal ratchet test was carried out over 9 cycles of thermal loading, and the maximum residual displacement was measured to be 1.8 mm. It was shown that thermal ratchet loading can cause progressive deformation of the reactor structure. The analysis results with the combined hardening model were in reasonable agreement with those of the tests.

  1. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost effective transportation and logistics plays a vital role in the supply chains of the modern world's manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems.

  2. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. It is efficient as it identifies only the necessary and sufficient conditions of system failure in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. The computational efficiency of the algorithm is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized for the generation of system inequalities, which is useful in reliability estimation of capacitated networks.
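
    The minimal-cutset idea can be made concrete with a naive brute-force sketch; the paper's algorithm is far more efficient, and the small source-sink network below is made up. A minimal cutset is a set of links whose removal disconnects source from sink, no proper subset of which does so.

```python
from itertools import combinations

# Made-up directed network: node 1 is the source, node 5 the sink.
edges = {1: [2, 3], 2: [4], 3: [4], 4: [5]}
all_links = [(u, v) for u, vs in edges.items() for v in vs]

def connected(removed, src=1, dst=5):
    """Depth-first search ignoring the removed links."""
    stack, seen = [src], {src}
    while stack:
        u = stack.pop()
        if u == dst:
            return True
        for v in edges.get(u, []):
            if (u, v) not in removed and v not in seen:
                seen.add(v)
                stack.append(v)
    return False

# Enumerate cutsets by increasing size; a candidate is minimal only if
# no smaller cutset already found is a proper subset of it.
cutsets = []
for r in range(1, len(all_links) + 1):
    for combo in combinations(all_links, r):
        s = set(combo)
        if not connected(s) and not any(c < s for c in cutsets):
            cutsets.append(s)

print(cutsets)   # e.g. {(4, 5)} alone disconnects the sink
```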

  3. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is under question. Their purpose is to maintain supply to an area of the distribution network in the event of a failure somewhere else. Two phrases, 'potential customer outages' and 'in the event of failure', identify uncertainty.

  4. Distribution and presentation of Lyme borreliosis in Scotland - analysis of data from a national testing laboratory.

    Science.gov (United States)

    Mavin, S; Watson, E J; Evans, R

    2015-01-01

    This study examines the distribution of laboratory-confirmed cases of Lyme borreliosis in Scotland and the clinical spectrum of presentations within NHS Highland. Methods: General demographic data (age/sex/referring Health Board) from all cases of Lyme borreliosis serologically confirmed by the National Lyme Borreliosis Testing Laboratory from 1 January 2008 to 31 December 2013 were analysed. Clinical features of confirmed cases were ascertained from questionnaires sent to referring clinicians within NHS Highland during the study period. Results: The number of laboratory-confirmed cases of Lyme borreliosis in Scotland peaked at 440 in 2010. From 2008 to 2013 the estimated average annual incidence was 6.8 per 100,000 (44.1 per 100,000 in NHS Highland). Of 594 questionnaires from NHS Highland patients, 76% had clinically confirmed Lyme borreliosis; 48% had erythema migrans, 17% rash, 25% joint, 15% neurological and 1% cardiac symptoms. Only 61% could recall a tick bite. Conclusion: The incidence of Lyme borreliosis may be stabilising in Scotland, but NHS Highland remains an area of high incidence. Lyme borreliosis should be considered in symptomatic patients who have had exposure to ticks, not just those with a definite tick bite.

  5. Choreographer Pre-Testing Code Analysis and Operational Testing.

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, David J. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Harrison, Christopher B. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Perr, C. W. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Hurd, Steven A [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2014-07-01

    Choreographer is a "moving target defense system", designed to protect against attacks aimed at IP addresses without corresponding domain name system (DNS) lookups. It coordinates actions between a DNS server and a Network Address Translation (NAT) device to regularly change which publicly available IP addresses' traffic will be routed to the protected device versus routed to a honeypot. More details about how Choreographer operates can be found in Section 2: Introducing Choreographer. Operational considerations for the successful deployment of Choreographer can be found in Section 3. The Testing & Evaluation (T&E) for Choreographer involved 3 phases: Pre-testing, Code Analysis, and Operational Testing. Pre-testing, described in Section 4, involved installing and configuring an instance of Choreographer and verifying it would operate as expected for a simple use case. Our findings were that it was simple and straightforward to prepare a system for a Choreographer installation as well as to configure Choreographer to work in a representative environment. Code Analysis, described in Section 5, consisted of running a static code analyzer (HP Fortify) and conducting dynamic analysis tests using the Valgrind instrumentation framework. Choreographer performed well, such that only a few errors that might possibly be problematic in a given operating situation were identified. Operational Testing, described in Section 6, involved operating Choreographer in a representative environment created through Emulytics™. Depending upon the amount of server resources dedicated to Choreographer vis-à-vis the amount of client traffic handled, Choreographer had varying degrees of operational success. In an environment with a poorly resourced Choreographer server and as few as 50-100 clients, Choreographer failed to properly route traffic over half the time. Yet, with a well-resourced server, Choreographer handled over 1000 clients without misrouting.

  6. Two-dimensional goodness-of-fit testing in astronomy

    International Nuclear Information System (INIS)

    Peacock, J.A

    1983-01-01

    This paper deals with the techniques available to test for consistency between the empirical distribution of data points on a plane and a hypothetical density law. Two new statistical tests are developed. The first is a two-dimensional version of the Kolmogorov-Smirnov test, for which the distribution of the test statistic is investigated using a Monte Carlo method. This test is found in practice to be very nearly distribution-free, and empirical formulae for the confidence levels are given. Secondly, the method of power-spectrum analysis is extended to deal with cases in which the null hypothesis is not a uniform distribution. These methods are illustrated by application to the distribution of quasar candidates found on an objective-prism plate of the Virgo Cluster. (author)
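
    A hedged sketch of the two-dimensional statistic: the version below is the two-sample quadrant form evaluated only at the observed points (the Fasano-Franceschini simplification of Peacock's statistic, not the paper's full construction), with significance left to Monte Carlo calibration as in the paper.

```python
import numpy as np

def ks2d_statistic(xy1, xy2):
    """Max quadrant-fraction difference between two 2D samples,
    evaluated at the points of both samples."""
    def quadrant_fracs(points, cx, cy):
        x, y = points[:, 0], points[:, 1]
        counts = np.array([
            np.sum((x <= cx) & (y <= cy)),
            np.sum((x <= cx) & (y > cy)),
            np.sum((x > cx) & (y <= cy)),
            np.sum((x > cx) & (y > cy)),
        ])
        return counts / len(points)

    d = 0.0
    for cx, cy in np.vstack([xy1, xy2]):   # each data point defines quadrants
        d = max(d, np.max(np.abs(quadrant_fracs(xy1, cx, cy)
                                 - quadrant_fracs(xy2, cx, cy))))
    return d

rng = np.random.default_rng(1)
sample1 = rng.normal(0.0, 1.0, (200, 2))
sample2 = rng.normal(0.3, 1.0, (200, 2))
print("D =", round(ks2d_statistic(sample1, sample2), 3))
# Confidence levels would be obtained by Monte Carlo, as in the paper.
```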

  7. A distributed microcomputer-controlled system for data acquisition and power spectral analysis of EEG.

    Science.gov (United States)

    Vo, T D; Dwyer, G; Szeto, H H

    1986-04-01

    A relatively powerful and inexpensive microcomputer-based system for the spectral analysis of the EEG is presented. High resolution and speed are achieved with the use of recently available large-scale integrated circuit technology with enhanced functionality (Intel 8087 math co-processor), which can perform transcendental functions rapidly. The versatility of the system is achieved with a hardware organization that has distributed data acquisition capability, performed by a microprocessor-based analog-to-digital converter with large resident memory (Cyborg ISAAC-2000). Compiled BASIC programs and assembly language subroutines perform the fast Fourier transform and spectral analysis of the EEG on-line or off-line, and the results are stored as soft as well as hard copy. Some results obtained from test application of the entire system in animal studies are presented.
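
    A modern equivalent of the described FFT-based spectral pipeline takes a few lines with NumPy/SciPy; the sampling rate and the synthetic 10 Hz alpha rhythm below are assumptions for illustration only.

```python
import numpy as np
from scipy import signal

fs = 256                              # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic EEG: a 10 Hz alpha rhythm buried in noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

# Welch's averaged periodogram estimates the power spectral density
freqs, psd = signal.welch(eeg, fs=fs, nperseg=512)
alpha_power = psd[(freqs >= 8) & (freqs <= 13)].sum()   # unnormalized band sum
print(f"alpha-band power: {alpha_power:.2f}")
```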

  8. Verification of in-core thermal and hydraulic analysis code FLOWNET/TRUMP for the high temperature engineering test reactor (HTTR) at JAERI

    International Nuclear Information System (INIS)

    Maruyama, Soh; Sudo, Yukio; Saito, Shinzo; Kiso, Yoshihiro; Hayakawa, Hitoshi

    1989-01-01

    The FLOWNET/TRUMP code combines a flow network analysis code, FLOWNET, for calculation of the coolant flow distribution and coolant temperature distribution in the core, with a thermal conduction analysis code, TRUMP, for calculation of the temperature distribution in solid structures. The verification of FLOWNET/TRUMP was performed by comparing the analytical results with the results of steady-state experiments on the HENDEL multichannel test rig, T1-M, which consisted of twelve electrically heated simulated fuel rods and eleven hexagonal graphite fuel blocks. T1-M simulated one fuel column of the core. The analytical results agreed well with the results of the experiment in which the HTTR operating conditions were simulated. (orig.)

  9. Accident analysis of HANARO fuel test loop

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Chi, D. Y

    1998-03-01

    A steady-state fuel test loop will be installed in HANARO to support the development and improvement of advanced fuels and materials through irradiation tests. The HANARO fuel test loop was designed to match CANDU and PWR fuel operating conditions. The accident analysis was performed with the RELAP5/MOD3 code based on the FTL system design, and determined the detailed engineering specifications of the in-pile test section and out-pile systems. The accident analysis results for the FTL system can be used by fuel and materials designers to plan irradiation testing programs. (author). 23 refs., 20 tabs., 178 figs.

  10. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled method in a nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the alternatives, and the nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
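
    A minimal sketch of the pooled-resampling idea, under the assumption that pooling the two groups imposes the null hypothesis: both groups are resampled from the pooled data, the t statistic is rebuilt each time, and the observed statistic is compared with that null distribution. This is an illustration of the general approach, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

def pooled_bootstrap_t_test(x, y, n_boot=10_000, seed=0):
    """Two-sided p-value from a nonparametric bootstrap t-test
    with pooled resampling (resampling under H0)."""
    rng = np.random.default_rng(seed)
    t_obs = stats.ttest_ind(x, y, equal_var=False).statistic
    pooled = np.concatenate([x, y])
    t_null = np.empty(n_boot)
    for b in range(n_boot):
        bx = rng.choice(pooled, size=len(x), replace=True)
        by = rng.choice(pooled, size=len(y), replace=True)
        t_null[b] = stats.ttest_ind(bx, by, equal_var=False).statistic
    return np.mean(np.abs(t_null) >= abs(t_obs))

x = np.array([2.1, 3.4, 1.8, 2.9, 3.1])   # invented small samples
y = np.array([4.0, 3.8, 5.1, 4.4, 3.9, 4.7])
print("bootstrap p =", pooled_bootstrap_t_test(x, y))
```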

  11. A Script Analysis of the Distribution of Counterfeit Alcohol Across Two European Jurisdictions

    OpenAIRE

    Lord, Nicholas; Spencer, Jonathan; Bellotti, Elisa; Benson, Katie

    2017-01-01

    This article presents a script analysis of the distribution of counterfeit alcohols across two European jurisdictions. Based on an analysis of case file data from a European regulator and interviews with investigators, the article deconstructs the organisation of the distribution of the alcohol across jurisdictions into five scenes (collection, logistics, delivery, disposal, proceeds/finance) and analyses the actual (or likely permutations of) behaviours within each scene. The analysis also i...

  12. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test

    DEFF Research Database (Denmark)

    Wang, C.; Turnbull, B.W.; Nielsen, Søren Saxmose

    2011-01-01

    A change-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute the posterior estimates of the model parameters that provide the basis for inference concerning the accuracy of the diagnostic procedure. Based on the Bayesian approach, the posterior probability distribution of the change-point onset time can be obtained and used as a criterion for infection diagnosis. An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has...

  13. Temporal Statistical Analysis of Degree Distributions in an Undirected Landline Phone Call Network Graph Series

    Directory of Open Access Journals (Sweden)

    Orgeta Gjermëni

    2017-10-01

    Full Text Available This article aims to provide new results about the intraday degree sequence distribution, considering phone call network graph evolution in time. More specifically, it tackles the following problem: given a large amount of landline phone call data records, what is the best way to summarize the distinct number of calling partners per client per day? In order to answer this question, a series of undirected phone call network graphs is constructed based on data from a local telecommunication source in Albania. All network graphs of the series are simplified. Further, a longitudinal temporal study is made on this network graph series with respect to the degree distributions. Power-law and log-normal distribution fits to the degree sequence are compared for each of the network graphs of the series. The maximum likelihood method is used to estimate the parameters of the distributions, and a Kolmogorov–Smirnov test associated with a p-value is used to identify the plausible models. A direct distribution comparison is made through a Vuong test in the case that both distributions are plausible. Another goal was to describe the shape of the parameters' distributions. A Shapiro-Wilk test is used to test the normality of the data, and measures of shape are used to define the distributions' shape. The study findings suggest that the log-normal distribution better models the intraday degree sequence data of the network graphs. It is not possible to say that the distributions of the log-normal parameters are normal.
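
    A condensed sketch of the fitting pipeline on simulated degrees: maximum likelihood fits for the log-normal and (continuous) power-law models plus a Kolmogorov-Smirnov check. Note that with parameters estimated from the same data, a rigorous p-value would need Monte Carlo calibration, and the rounding to integer degrees is a further approximation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical daily degree sequence: distinct calling partners per client.
degrees = np.round(stats.lognorm(s=0.9, scale=3.0).rvs(size=5000,
                                                       random_state=rng))
degrees = degrees[degrees >= 1]

# Log-normal fit by maximum likelihood (location fixed at 0).
shape, loc, scale = stats.lognorm.fit(degrees, floc=0)

# Continuous power-law MLE above x_min: alpha = 1 + n / sum(log(x / x_min)).
x_min = 1.0
alpha = 1.0 + len(degrees) / np.sum(np.log(degrees / x_min))

# KS distance of the log-normal fit; the p-value is optimistic because the
# parameters were estimated from the same data.
ks_stat, p_value = stats.kstest(degrees, "lognorm", args=(shape, loc, scale))
print(f"KS = {ks_stat:.3f} (p = {p_value:.3f}), power-law alpha = {alpha:.2f}")
```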

  14. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can contrast the different missingness mechanisms with each other, e.g., via likelihood ratio tests. The finite sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missingness mechanisms.
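
    The likelihood ratio comparison can be sketched on simulated dropout data: an MCAR model (constant dropout probability) nested inside a MAR model where dropout depends on the last observed outcome. The crude grid-search fit below is only there to keep the sketch self-contained; it is not the paper's estimation method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 500
y_prev = rng.normal(size=n)                        # last observed outcome
p_drop = 1 / (1 + np.exp(-(-1.0 + 0.8 * y_prev)))  # true MAR mechanism
dropout = rng.random(n) < p_drop

def bernoulli_ll(p, d):
    return np.sum(d * np.log(p) + (1 - d) * np.log(1 - p))

# H0 (MCAR): a single constant dropout probability
p0 = dropout.mean()
ll0 = bernoulli_ll(np.full(n, p0), dropout)

# H1 (MAR): logistic dependence on y_prev, fitted by a crude grid search
best_ll1 = -np.inf
for b0 in np.linspace(-3, 3, 61):
    for b1 in np.linspace(-3, 3, 61):
        p = 1 / (1 + np.exp(-(b0 + b1 * y_prev)))
        best_ll1 = max(best_ll1,
                       bernoulli_ll(np.clip(p, 1e-9, 1 - 1e-9), dropout))

# Asymptotically, 2*(ll1 - ll0) ~ chi-square with 1 degree of freedom;
# the article studies how far finite samples depart from this.
lr = 2 * (best_ll1 - ll0)
print("LR =", round(lr, 2), " p =", round(stats.chi2.sf(lr, df=1), 4))
```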

  15. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    The huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently carried out by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime into component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows them to be easily replaced with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.

  16. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Full Text Available Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals currently function only as parking areas and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The study begins with the identification of important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent the accessibility, cost, time, and environment factors as the important location factors. The result shows that the ranking, from best to worst, is: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  17. Sodium leakage and combustion tests. Measurement and distribution of droplet size using various spray nozzles

    International Nuclear Information System (INIS)

    Nagai, Keiichi; Hirabayashi, Masaru; Onojima, T.; Gunji, Minoru; Ara, Kuniaki; Oki, Yoshihisa

    1999-04-01

    In order to develop a numerical code simulating sodium fires initiated by the dispersion of droplets, measured data on droplet diameter as well as its distribution are needed. In the present experiment, the distribution of droplet diameter was measured using water, oil and sodium. The tests elucidated the influential factors with respect to droplet diameter. In addition, we sought to develop a similarity law between water and sodium. The droplet size distribution of sodium using the large-diameter droplet nozzle (Elnozzle) was predicted. (J.P.N.)

  18. Modeling of the CIGRE Low Voltage Test Distribution Network and the Development of Appropriate Controllers

    DEFF Research Database (Denmark)

    Mustafa, Ghullam; Bak-Jensen, Birgitte; Mahat, Pukar

    2013-01-01

    The fluctuating nature of some Distributed Generation (DG) sources can cause power quality related problems such as power frequency oscillations and voltage fluctuations. In the future, DG penetration is expected to increase, and hence some control actions are required to deal with these power quality issues. The main focus of this paper is the development of controllers for a distribution system with different DGs, especially the development of a Photovoltaic (PV) controller using a Static Compensator (STATCOM) controller, and the modeling of a Battery Storage System (BSS), also based on a STATCOM controller. The control system is tested in the distribution test network set up by CIGRE. The new approach of the PV controller is designed in such a way that it can control the AC and DC voltage of the PV converter during dynamic conditions. The battery controller is also developed in such a way that it can deal with the power quality issues.

  19. Distributed resistance model for the analysis of wire-wrapped rod bundles

    International Nuclear Information System (INIS)

    Ha, K. S.; Jung, H. Y.; Kwon, Y. M.; Jang, W. P.; Lee, Y. B.

    2003-01-01

    A partial flow blockage within a fuel assembly in a liquid metal reactor may result in localized boiling or failure of the fuel cladding. Thus, precise analysis of this phenomenon is required for a safe LMR design. The MATRA-LMR code developed by KAERI models the flow distribution in an assembly using the wire forcing function to consider the effects of wire-wrap spacers, which is important for the analysis of flow blockage. However, the wire forcing function is not capable of analyzing the case where a flow blockage occurs. This model was therefore replaced with a distributed resistance model, and a validation calculation was carried out against the FFM 2A experiment.

  20. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The precision of reliability worth estimates can be greatly affected by the customer interruption cost model used, and the choice of cost model can change system and load point reliability indices. In this study, a cascade correlation neural network is adopted to further develop two cost models, comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  1. Distribution and histologic effects of intravenously administered amorphous nanosilica particles in the testes of mice

    International Nuclear Information System (INIS)

    Morishita, Yuki; Yoshioka, Yasuo; Satoh, Hiroyoshi; Nojiri, Nao; Nagano, Kazuya; Abe, Yasuhiro; Kamada, Haruhiko; Tsunoda, Shin-ichi; Nabeshi, Hiromi; Yoshikawa, Tomoaki; Tsutsumi, Yasuo

    2012-01-01

    Highlights: ► There is rising concern regarding the potential health risks of nanomaterials. ► Few studies have investigated the effect of nanomaterials on the reproductive system. ► Here, we evaluated the intra-testicular distribution of nanosilica particles. ► We showed that nanosilica particles can penetrate the blood-testis barrier. ► These data provide basic information on ways to create safer nanomaterials. -- Abstract: Amorphous nanosilica particles (nSP) are being utilized in an increasing number of applications such as medicine, cosmetics, and foods. The reduction of the particle size to the nanoscale not only provides benefits to diverse scientific fields but also poses potential risks. Several reports have described the in vivo and in vitro toxicity of nSP, but few studies have examined their effects on the male reproductive system. The aim of this study was to evaluate the testicular distribution and histologic effects of systemically administered nSP. Mice were injected intravenously with nSP with diameters of 70 nm (nSP70) or conventional microsilica particles with diameters of 300 nm (nSP300) on two consecutive days. The intratesticular distribution of these particles 24 h after the second injection was analyzed by transmission electron microscopy. nSP70 were detected within Sertoli cells and spermatocytes, including in the nuclei of spermatocytes. No nSP300 were observed in the testis. Next, mice were injected intravenously with 0.4 or 0.8 mg nSP70 every other day for a total of four administrations. Testes were harvested 48 h and 1 week after the last injection and stained with hematoxylin–eosin for histologic analysis. Histologic findings in the testes of nSP70-treated mice did not differ from those of control mice. Taken together, our results suggest that nSP70 can penetrate the blood-testis barrier and the nuclear membranes of spermatocytes without producing apparent testicular injury.

  2. Distribution and histologic effects of intravenously administered amorphous nanosilica particles in the testes of mice

    Energy Technology Data Exchange (ETDEWEB)

    Morishita, Yuki [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Yoshioka, Yasuo, E-mail: yasuo@phs.osaka-u.ac.jp [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Satoh, Hiroyoshi; Nojiri, Nao [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Nagano, Kazuya [Laboratory of Biopharmaceutical Research, National Institute of Biomedical Innovation, 7-6-8 Saitoasagi, Ibaraki, Osaka 567-0085 (Japan); Abe, Yasuhiro [Cancer Biology Research Center, Sanford Research/USD, 2301 E. 60th Street N, Sioux Falls, SD 57104 (United States); Kamada, Haruhiko; Tsunoda, Shin-ichi [Laboratory of Biopharmaceutical Research, National Institute of Biomedical Innovation, 7-6-8 Saitoasagi, Ibaraki, Osaka 567-0085 (Japan); The Center for Advanced Medical Engineering and Informatics, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Nabeshi, Hiromi [Division of Foods, National Institute of Health Sciences, 1-18-1, Kamiyoga, Setagaya-ku, Tokyo 158-8501 (Japan); Yoshikawa, Tomoaki [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Tsutsumi, Yasuo, E-mail: ytsutsumi@phs.osaka-u.ac.jp [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Laboratory of Biopharmaceutical Research, National Institute of Biomedical Innovation, 7-6-8 Saitoasagi, Ibaraki, Osaka 567-0085 (Japan); The Center for Advanced Medical Engineering and Informatics, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2012-04-06

    Highlights: ► There is rising concern regarding the potential health risks of nanomaterials. ► Few studies have investigated the effect of nanomaterials on the reproductive system. ► Here, we evaluated the intra-testicular distribution of nanosilica particles. ► We showed that nanosilica particles can penetrate the blood-testis barrier. ► These data provide basic information on ways to create safer nanomaterials. -- Abstract: Amorphous nanosilica particles (nSP) are being utilized in an increasing number of applications such as medicine, cosmetics, and foods. The reduction of the particle size to the nanoscale not only provides benefits to diverse scientific fields but also poses potential risks. Several reports have described the in vivo and in vitro toxicity of nSP, but few studies have examined their effects on the male reproductive system. The aim of this study was to evaluate the testicular distribution and histologic effects of systemically administered nSP. Mice were injected intravenously with nSP with diameters of 70 nm (nSP70) or conventional microsilica particles with diameters of 300 nm (nSP300) on two consecutive days. The intratesticular distribution of these particles 24 h after the second injection was analyzed by transmission electron microscopy. nSP70 were detected within Sertoli cells and spermatocytes, including in the nuclei of spermatocytes. No nSP300 were observed in the testis. Next, mice were injected intravenously with 0.4 or 0.8 mg nSP70 every other day for a total of four administrations. Testes were harvested 48 h and 1 week after the last injection and stained with hematoxylin-eosin for histologic analysis. Histologic findings in the testes of nSP70-treated mice did not differ from those of control mice. Taken together, our results suggest that nSP70 can penetrate the blood-testis barrier and the nuclear membranes of spermatocytes without producing apparent testicular injury.

  3. TRACG post-test analysis of panthers prototype tests of SBWR passive containment condenser

    International Nuclear Information System (INIS)

    Fitch, J.R.; Billig, P.F.; Abdollahian, D.; Masoni, P.

    1997-01-01

    As part of the validation effort for application of the TRACG code to the Simplified Boiling Water Reactor (SBWR), calculations have been performed for the various test facilities which are part of the SBWR design and technology certification program. These calculations include post-test calculations for tests in the PANTHERS Passive Containment Condenser (PCC) test program. Sixteen tests from the PANTHERS/PCC test matrix were selected for post-test analysis. This set includes three steady-state pure-steam tests, nine steady-state steam-air tests, and four transient tests. The purpose of this paper is to present and discuss the results of the post-test analysis. The paper includes a brief description of the PANTHERS/PCC test facility and test matrix, a description of the PANTHERS/PCC post-test TRACG model and the manner in which the various types of tests in the post-test evaluation were simulated, and a presentation of the results of the TRACG simulation.

  4. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies indeed depend on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  5. Making distributed ALICE analysis simple using the GRID plug-in

    International Nuclear Information System (INIS)

    Gheata, A; Gheata, M

    2012-01-01

    We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end specific parameters from a single interface and to run the same custom user analysis unchanged in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is now used extensively in the ALICE collaboration for both end-user analysis and large-scale productions.

  6. [The relationship between Ridit analysis and rank sum test for one-way ordinal contingency table in medical research].

    Science.gov (United States)

    Wang, Ling; Xia, Jie-lai; Yu, Li-li; Li, Chan-juan; Wang, Su-zhen

    2008-06-01

    To explore several numerical scoring methods for the ordinal variable in a one-way ordinal contingency table and their interrelationships, and to compare the corresponding statistical analysis methods such as Ridit analysis and the rank sum test. Formula deduction was based on five simplified grading approaches: rank_r(i), ridit_r(i), ridit_r(ci), ridit_r(mi), and table scores. A practical data set from clinical practice (testing the effect of Shiwei solution in the treatment of chronic tracheitis) was verified with SAS 8.2. Because of the linear relationship rank_r(i) = N ridit_r(i) + 1/2 = N ridit_r(ci) = (N + 1) ridit_r(mi), the exact chi2 values in Ridit analysis based on ridit_r(i), ridit_r(ci), and ridit_r(mi) were completely the same, and they were equivalent to the Kruskal-Wallis H test. Traditional Ridit analysis is based on ridit_r(i), and its corresponding chi2 value calculated with an approximate variance (1/12) is conservative. The exact chi2 test of Ridit analysis should be used when comparing multiple groups in clinical research because of its special merits, such as the distribution of the mean ridit value on (0,1) and clear graphical expression. The exact chi2 test of Ridit analysis can be output directly by PROC FREQ of SAS 8.2 with the ridit and modridit options (SCORES=). The exact chi2 test of Ridit analysis is equivalent to the Kruskal-Wallis H test and should be used when comparing multiple groups in clinical research.
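
    A small sketch of the scoring relationships the abstract states, on an invented 2 × 4 ordinal table: ridit_r(i) is (count below grade i plus half the count within it) divided by N, mid-ranks satisfy rank_r(i) = N·ridit_r(i) + 1/2, and the resulting test agrees with the Kruskal-Wallis H test.

```python
import numpy as np
from scipy import stats

# Invented 2x4 one-way ordinal table: rows = groups, columns = ordered grades.
table = np.array([[10, 25, 40, 25],
                  [20, 35, 30, 15]])
totals = table.sum(axis=0)
N = totals.sum()

cum_below = np.concatenate([[0], np.cumsum(totals)[:-1]])
ridit = (cum_below + totals / 2) / N          # ridit_r(i) per grade
midrank = N * ridit + 0.5                     # rank_r(i) = N*ridit_r(i) + 1/2
print("ridits:", ridit, " mid-ranks:", midrank)

# Mean ridit per group lies in (0, 1); values far from 0.5 indicate a shift.
mean_ridit = (table * ridit).sum(axis=1) / table.sum(axis=1)
print("mean ridits:", mean_ridit.round(3))

# The equivalence with the Kruskal-Wallis H test (ties use mid-ranks):
g1 = np.repeat(np.arange(4), table[0])        # expand grouped grades to cases
g2 = np.repeat(np.arange(4), table[1])
h, p = stats.kruskal(g1, g2)
print(f"Kruskal-Wallis H = {h:.3f}, p = {p:.4f}")
```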

  7. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Full Text Available Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  8. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  9. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  10. Analysis of crack initiation and growth in the high level vibration test at Tadotsu

    International Nuclear Information System (INIS)

    Kassir, M.K.; Hofmayer, C.H.; Bandyopadhyay, K.K.

    1991-01-01

    A High Level Vibration Test (HLVT) Program was carried out recently on the seismic table at the Tadotsu Engineering Laboratory of Nuclear Power Engineering Center (NUPEC) in Japan. The objective of the study being performed at Brookhaven National Laboratory is to use the HLVT data to assess the accuracy and usefulness of existing methods for predicting crack initiation and growth under complex, large amplitude loading. The work to be performed as part of this effort involves: (1) analysis of the stress/strain distribution in the vicinity of the crack, including the potential for residual stresses due to the weld repair; (2) analysis of the number of load cycles required for crack initiation, including estimates of the impact of the weld repair on the crack initiation behavior; (3) analysis of crack advance as a function of applied loading (classic fatigue versus cyclic tearing) taking into account the variable amplitude loading and the possible influence of the repair; and (4) material property testing to supplement the work performed as part of the HLVT, providing the materials data necessary to perform the analysis efforts. A summary of research progress for FY 1990 is presented. 2 refs

  11. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. Fourteen models known from the literature are thoroughly analysed. Through this analysis, a schematic approach to the categorisation of distribution network design models is presented, which can also serve educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as a manual or recipe for constructing such models.

  12. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone (GDELT) events show the efficiency of the techniques in DISCRN.

  13. Mixture distributions of wind speed in the UAE

    Science.gov (United States)

    Shin, J.; Ouarda, T.; Lee, T. S.

    2013-12-01

    Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when the wind speed distribution presents bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigating the wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions are used, constructed by mixing four probability distributions: Normal, Gamma, Weibull and Extreme value type-one (EV-1). Three parameter estimation methods, the Expectation-Maximization algorithm, the Least Squares method and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for...
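
    A minimal sketch of fitting one candidate mixture, a two-component Weibull mixture, on simulated wind speeds; direct numerical maximum likelihood stands in for the EM/MHML estimators discussed in the abstract, and all data and starting values are assumptions.

```python
import numpy as np
from scipy import stats, optimize

# Simulated bimodal wind speeds (two Weibull regimes, e.g. calm vs. shamal).
rng = np.random.default_rng(5)
wind = np.concatenate([
    stats.weibull_min(1.8, scale=4.0).rvs(3000, random_state=rng),
    stats.weibull_min(3.5, scale=9.0).rvs(2000, random_state=rng),
])

def neg_log_lik(theta):
    w, k1, s1, k2, s2 = theta
    pdf = (w * stats.weibull_min.pdf(wind, k1, scale=s1)
           + (1 - w) * stats.weibull_min.pdf(wind, k2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))   # guard against log(0)

res = optimize.minimize(neg_log_lik, x0=[0.5, 2.0, 3.0, 3.0, 8.0],
                        bounds=[(0.01, 0.99)] + [(0.1, 20)] * 4,
                        method="L-BFGS-B")
w, k1, s1, k2, s2 = res.x
print(f"weight={w:.2f}, k1={k1:.2f}, s1={s1:.2f}, k2={k2:.2f}, s2={s2:.2f}")

# Goodness of fit via a KS test against the fitted mixture CDF.
mix_cdf = lambda x: (w * stats.weibull_min.cdf(x, k1, scale=s1)
                     + (1 - w) * stats.weibull_min.cdf(x, k2, scale=s2))
print(stats.kstest(wind, mix_cdf))
```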

  14. A Cryogenic Test Set-Up for the Qualification of Pre-Series Test Cells for the LHC Cryogenic Distribution Line

    CERN Document Server

    Livran, J; Parente, C; Riddone, G; Rybkowski, D; Veillet, N

    2000-01-01

    Three pre-series Test Cells of the LHC Cryogenic Distribution Line (QRL) [1], manufactured by three European industrial companies, will be tested in the year 2000 to qualify the design chosen and verify the thermal and mechanical performances. A dedicated test stand (170 m x 13 m) has been built for extensive testing and performance assessment of the pre-series units in parallel. They will be fed with saturated liquid helium at 4.2 K supplied by a mobile helium dewar. In addition, LN2 cooled helium will be used for cool-down and thermal shielding. For each of the three pre-series units, a set of end boxes has been designed and manufactured at CERN. This paper presents the layout of the cryogenic system for the pre-series units, the calorimetric methods as well as the results of the thermal calculation of the end box test.

  15. Preliminary test results and CFD analysis for moderator circulation test at Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, H.T. [Korea Atomic Energy Research Inst., Daejeon (Korea, Republic of); Im, S.H.; Sung, H.J. [Korea Advanced Inst. of Science and Tech., Daejeon (Korea, Republic of); Seo, H.; Bang, I.C. [Ulsan National Inst. of Science and Tech., Ulsan (Korea, Republic of)

    2014-07-01

    Korea Atomic Energy Research Institute (KAERI) is carrying out a scaled-down moderator test program to simulate the CANDU6 moderator circulation phenomena during steady-state operation and accident conditions. This research program includes the construction of the Moderator Circulation Test (MCT) facility, production of validation data for self-reliant CFD tools, and development of an optical measurement system using Particle Image Velocimetry (PIV). The MCT facility includes a primary circulation loop (pipe lines, a primary-side pump, a heat exchanger, valves, and flow meters) and a secondary-side loop (pipe lines, a secondary-side pump, and an external cooling tower). In the present work, the loop leakage test and non-heating test were performed, and the PIV technique was used to measure the velocity distributions in the scaled moderator tank of the MCT under isothermal test conditions. The preliminary PIV measurement data were obtained and compared with CFX code predictions. (author)

  16. Preliminary test results and CFD analysis for moderator circulation test at Korea

    International Nuclear Information System (INIS)

    Kim, H.T.; Im, S.H.; Sung, H.J.; Seo, H.; Bang, I.C.

    2014-01-01

    Korea Atomic Energy Research Institute (KAERI) is carrying out a scaled-down moderator test program to simulate the CANDU6 moderator circulation phenomena during steady-state operation and accident conditions. This research program includes the construction of the Moderator Circulation Test (MCT) facility, production of validation data for self-reliant CFD tools, and development of an optical measurement system using Particle Image Velocimetry (PIV). The MCT facility includes a primary circulation loop (pipe lines, a primary-side pump, a heat exchanger, valves, and flow meters) and a secondary-side loop (pipe lines, a secondary-side pump, and an external cooling tower). In the present work, the loop leakage test and non-heating test were performed, and the PIV technique was used to measure the velocity distributions in the scaled moderator tank of the MCT under isothermal test conditions. The preliminary PIV measurement data were obtained and compared with CFX code predictions. (author)

  17. Modeled ground water age distributions

    Science.gov (United States)

    Woolfenden, Linda R.; Ginn, Timothy R.

    2009-01-01

    The age of ground water in any given sample is a distributed quantity representing distributed provenance (in space and time) of the water. Conventional analysis of tracers such as unstable isotopes or anthropogenic chemical species gives discrete or binary measures of the presence of water of a given age. Modeled ground water age distributions provide a continuous measure of contributions from different recharge sources to aquifers. A numerical solution of the ground water age equation of Ginn (1999) was tested both on a hypothetical, simplified one-dimensional flow system and under real-world conditions. Results from these simulations yield the first continuous distributions of ground water age using this model. Complete age distributions as a function of one and two space dimensions were obtained from both numerical experiments. Simulations in the test problem produced mean ages that were consistent with the expected value at the end of the model domain for all dispersivity values tested, although the mean ages for the two highest dispersivity values deviated slightly from the expected value. Mean ages in the dispersionless case also were consistent with the expected mean ages throughout the physical model domain. Simulations under real-world conditions for three dispersivity values resulted in decreasing mean age with increasing dispersivity, likely a consequence of an edge effect. However, simulations for all three dispersivity values tested were mass balanced and stable, demonstrating that the solution of the ground water age equation can provide estimates of water mass density distributions over age under real-world conditions.

  18. A Lego Mindstorms NXT based test bench for multiagent exploratory systems and distributed network partitioning

    Science.gov (United States)

    Patil, Riya Raghuvir

    Networks of communicating agents require distributed algorithms for a variety of tasks in the field of network analysis and control. For applications such as swarms of autonomous vehicles, ad hoc and wireless sensor networks, and military and civilian applications such as exploring and patrolling, a robust autonomous system that uses a distributed algorithm for self-partitioning can be significantly helpful. A single team of autonomous vehicles in a field may need to self-dissemble into multiple teams, conducive to completing multiple control tasks. Moreover, because communicating agents are subject to changes, namely the addition or failure of an agent or link, a distributed or decentralized algorithm is favorable over having a central agent. A framework to help with the study of self-partitioning of such multi-agent systems with the most basic mobility model not only saves time in conception but also gives us a cost-effective prototype without compromising the physical realization of the proposed idea. In this thesis I present my work on the implementation of a flexible and distributed stochastic partitioning algorithm on the Lego™ Mindstorms NXT on a graphical programming platform, using National Instruments' LabVIEW™ to form a team of communicating agents via the NXT-Bee radio module. We single out mobility, communication and self-partitioning as the core elements of the work. The goal is to randomly explore a precinct for reference sites. Agents that have discovered the reference sites announce their target acquisition, forming a network based upon the distance of each agent from the others, wherein self-partitioning begins in order to find an optimal partition. Further, to illustrate the work, an experimental test-bench of five Lego NXT robots is presented.

  19. Combining Static Analysis and Runtime Checking in Security Aspects for Distributed Tuple Spaces

    DEFF Research Database (Denmark)

    Yang, Fan; Aotani, Tomoyuki; Masuhara, Hidehiko

    2011-01-01

    Enforcing security policies on distributed systems is difficult, in particular for a system containing untrusted components. We designed AspectKE*, an aspect-oriented programming language based on distributed tuple spaces, to tackle this issue. One of the key features in AspectKE* is the program analysis predicates and functions that provide information on the future behavior of a program. With a dual value evaluation mechanism that handles results of static analysis and runtime values at the same time, those functions and predicates enable the users to specify security policies in a uniform manner. Our two-staged implementation strategy gathers fundamental static analysis information at load-time, so as to avoid performing all analysis at runtime. We built a compiler for AspectKE*, and successfully implemented security aspects for a distributed chat system and an electronic healthcare record…

  20. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    Science.gov (United States)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in the serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates. Even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from Gaussian program output files. Then, VEDA automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements of each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of an automatic optimization of PED contributions is a unique feature of the VEDA program, absent in any other program performing PED analysis.

  1. Testing species distribution models across space and time: high latitude butterflies and recent warming

    DEFF Research Database (Denmark)

    Eskildsen, Anne; LeRoux, Peter C.; Heikkinen, Risto K.

    2013-01-01

    Aim. To quantify whether species distribution models (SDMs) can reliably forecast species distributions under observed climate change. In particular, to test whether the predictive ability of SDMs depends on species traits or the inclusion of land cover and soil type, and whether distributional changes at expanding range margins can be predicted accurately. Location. Finland. Methods. Using 10-km resolution butterfly atlas data from two periods, 1992–1999 (t1) and 2002–2009 (t2), with a significant between-period temperature increase, we modelled the effects of climatic warming on butterfly distributions under climate change. Model performance was lower with independent compared to non-independent validation and improved when land cover and soil type variables were included, compared to climate-only models. SDMs performed less well for highly mobile species and for species with long…

  2. ALOAD - a code to determine the concentrated forces equivalent with a distributed pressure field for a FEM analysis

    Directory of Open Access Journals (Sweden)

    Nicolae APOSTOLESCU

    2010-12-01

    The main objective of this paper is to describe a code for calculating an equivalent system of concentrated loads for a FEM analysis. The tables from the Aerodynamic Department contain the pressure field for a whole bearing surface, and integrated quantities both for the whole surface and for the fixed and mobile parts. Usually in a FEM analysis the external loads are introduced as concentrated loads equivalent to the distributed pressure field. These concentrated forces can also be used in static tests. Commercial codes provide solutions for this problem, but what we intend to develop is a code adapted to the user's specific needs.

  3. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

    Applying survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius, at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure models. The division at 4 Earth radii separates small exoplanets from the large exoplanets above. When combined with the recently discovered radius gap at 2 Earth radii, it supports the treatment of planets of 2-4 Earth rad…

  4. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    Science.gov (United States)

    Clough, Emily; Bell, Derek

    2016-02-01

    This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education; and we compare before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Community Survey and combine those data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those further away, before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. However, our analysis is consistent with the claim that there is benefit-sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells…

  5. Benford's law first significant digit and distribution distances for testing the reliability of financial reports in developing countries

    Science.gov (United States)

    Shi, Jing; Ausloos, Marcel; Zhu, Tingting

    2018-02-01

    We discuss a common suspicion about reported financial data in 10 industrial sectors of the 6 so-called "main developing countries" over the time interval [2000-2014]. These data are examined through Benford's law first-significant-digit test and through distribution distance tests. It is shown that several visually anomalous data points have to be removed a priori. Thereafter, the distributions follow the first-significant-digit law much better, indicating the usefulness of a Benford's law test from the research starting line. The same holds true for the distance tests. A few outliers are pointed out.
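
    As a rough illustration of the first test, the sketch below (Python, with hypothetical names; not the authors' code) tabulates first significant digits and compares them with the Benford frequencies P(d) = log10(1 + 1/d):

```python
import math
from collections import Counter

def benford_first_digit(values):
    """Empirical first-significant-digit frequencies versus Benford's law."""
    digits = []
    for v in values:
        v = abs(v)
        if v == 0:
            continue  # zero has no first significant digit
        exponent = math.floor(math.log10(v))
        digits.append(int(v / 10 ** exponent))  # scale into [1, 10)
    counts = Counter(digits)
    n = len(digits)
    empirical = {d: counts.get(d, 0) / n for d in range(1, 10)}
    benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}
    return empirical, benford

emp, ben = benford_first_digit([123.4, 0.019, 88, 3021, 47, 950, 1.7, 260])
```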

  6. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Science.gov (United States)

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.

  7. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Directory of Open Access Journals (Sweden)

    Jeff Alstott

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
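
    The package is distributed on PyPI; a minimal usage sketch along the lines of its documented interface (the data here are synthetic, and defaults may vary between versions):

```python
# pip install powerlaw
import random
import powerlaw

# Synthetic heavy-tailed sample stands in for real observations.
data = [random.paretovariate(1.5) for _ in range(5000)]

fit = powerlaw.Fit(data)  # estimates xmin and the scaling exponent alpha
print(fit.power_law.alpha, fit.power_law.xmin)

# Log-likelihood-ratio comparison against a competing heavy-tailed candidate:
R, p = fit.distribution_compare('power_law', 'lognormal')
print(R, p)  # R > 0 favors the power law; p is the significance of the sign
```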

  8. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards the monitoring of IP telephony traffic. IP telephony servers can easily be exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, and also other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  9. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Emma [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McParland, Charles [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Roberts, Ciaran [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. These data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid's increasingly complex loads, which include features such as large volumes of distributed generation. Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; and standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors' sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve…
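
    For context on why voltage phase-angle data are so informative: under the usual lossless-line approximation, the active power transferred between two buses scales with the sine of the angle difference. Distribution feeders have non-negligible resistance, so the relation below is indicative rather than exact:

    $$ P_{12} \approx \frac{|V_1|\,|V_2|}{X}\,\sin(\delta_1 - \delta_2) $$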

  10. A GRID solution for gravitational waves signal analysis from coalescing binaries: performances of test algorithms and further developments

    International Nuclear Information System (INIS)

    Acernese, A; Barone, F; Rosa, R De; Esposito, R; Frasca, S; Mastroserio, P; Milano, L; Palomba, C; Pardi, S; Qipiani, K; Ricci, F; Russo, G

    2004-01-01

    The analysis of data coming from interferometric antennas for gravitational wave detection requires a huge amount of computing power. The usual approach to the detection strategy is to set up computer farms able to perform several tasks in parallel, exchanging data through network links. In this paper, a new computation strategy based on the GRID environment is presented. The GRID environment allows several geographically distributed computing resources to exchange data and programs in a secure way, using standard infrastructures. The computing resources can be geographically distributed even on a large scale. Some preliminary tests were performed using a subnetwork of the GRID infrastructure, producing good results in terms of distribution efficiency and time duration.

  11. Influence of particle size distribution on the analysis of pellets of plant materials by laser-induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    Gustinelli Arantes de Carvalho, Gabriel; Santos Jr, Dário; Silva Gomes, Marcos da; Nunes, Lidiane Cristina; Guerra, Marcelo Braga Bueno; Krug, Francisco José

    2015-01-01

    Pellets of sieved plant materials (150, 106, 75, 53 and 20 μm sieve apertures) were prepared and analyzed by laser-induced breakdown spectroscopy (LIBS), and the results for Ca, K, Mg, P, B and Mn were discussed as a function of particle size distribution. This parameter is of key importance for appropriate test sample presentation in the form of pressed pellets for quantitative analysis by LIBS. Experiments were carried out with a Q-switched Nd:YAG laser at 1064 nm, and a spectrometer with Echelle optics and an intensified charge-coupled device. Results indicated that smaller particles yielded up to 50% enhancement of the emission signal intensities and attained better measurement precision (site-to-site variation). Moreover, matrix effects were reduced by analyzing pellets prepared from < 75 μm sieved fractions (mean particle size = 32 μm; d95 = 102 μm) and by using a 50 J cm⁻² laser fluence (220 mJ per pulse; 750 μm laser spot size). The preparation of pellets from laboratory samples with monomodal particle size distributions, where most particles were smaller than 100 μm, was decisive for improving analyte micro-homogeneity within the test samples and for attaining lower coefficients of variation of measurements, typically lower than 10% (n = 10 sites per pellet; 20 laser pulses per site). - Highlights: • First systematic study on the effects of particle size distribution. • Most appropriate particle sizes for pellet preparation depend on laser fluence. • Data can be used for sampling strategies aiming at LIBS analysis of plant materials

  12. Influence of particle size distribution on the analysis of pellets of plant materials by laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Gustinelli Arantes de Carvalho, Gabriel [NAPTISA Research Support Center “Technology and Innovation for a Sustainable Agriculture”, Center for Nuclear Energy in Agriculture, University of São Paulo, Av. Centenário 303, 13416-000 Piracicaba, SP (Brazil); Santos Jr, Dário [Departamento de Ciências Exatas e da Terra, Universidade Federal de São Paulo, Rua Prof. Artur Riedel, 275, 09972-270 Diadema, SP (Brazil); Silva Gomes, Marcos da; Nunes, Lidiane Cristina; Guerra, Marcelo Braga Bueno [NAPTISA Research Support Center “Technology and Innovation for a Sustainable Agriculture”, Center for Nuclear Energy in Agriculture, University of São Paulo, Av. Centenário 303, 13416-000 Piracicaba, SP (Brazil); Krug, Francisco José, E-mail: fjkrug@cena.usp.br [NAPTISA Research Support Center “Technology and Innovation for a Sustainable Agriculture”, Center for Nuclear Energy in Agriculture, University of São Paulo, Av. Centenário 303, 13416-000 Piracicaba, SP (Brazil)

    2015-03-01

    Pellets of sieved plant materials (150, 106, 75, 53 and 20 μm sieve apertures) were prepared and analyzed by laser-induced breakdown spectroscopy (LIBS), and the results for Ca, K, Mg, P, B and Mn were discussed as a function of particle size distribution. This parameter is of key importance for appropriate test sample presentation in the form of pressed pellets for quantitative analysis by LIBS. Experiments were carried out with a Q-switched Nd:YAG laser at 1064 nm, and a spectrometer with Echelle optics and an intensified charge-coupled device. Results indicated that smaller particles yielded up to 50% enhancement of the emission signal intensities and attained better measurement precision (site-to-site variation). Moreover, matrix effects were reduced by analyzing pellets prepared from < 75 μm sieved fractions (mean particle size = 32 μm; d95 = 102 μm) and by using a 50 J cm⁻² laser fluence (220 mJ per pulse; 750 μm laser spot size). The preparation of pellets from laboratory samples with monomodal particle size distributions, where most particles were smaller than 100 μm, was decisive for improving analyte micro-homogeneity within the test samples and for attaining lower coefficients of variation of measurements, typically lower than 10% (n = 10 sites per pellet; 20 laser pulses per site). - Highlights: • First systematic study on the effects of particle size distribution. • Most appropriate particle sizes for pellet preparation depend on laser fluence. • Data can be used for sampling strategies aiming at LIBS analysis of plant materials.
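
    As a consistency check on the quoted conditions, fluence is pulse energy over spot area; taking the 750 μm spot size as the beam diameter (an assumption, though it reproduces the stated value):

    $$ F = \frac{E}{\pi r^{2}} = \frac{0.220\ \text{J}}{\pi\,(0.0375\ \text{cm})^{2}} \approx 50\ \text{J cm}^{-2} $$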

  13. Risk analysis for decision support in electricity distribution system asset management: methods and frameworks for analysing intangible risks

    Energy Technology Data Exchange (ETDEWEB)

    Nordgaard, Dag Eirik

    2010-04-15

    During the last 10 to 15 years, electricity distribution companies throughout the world have become ever more focused on asset management as the guiding principle for their activities. Within asset management, risk is a key issue for distribution companies, together with the handling of cost and performance. There is now an increased awareness of the need to include risk analyses in the companies' decision-making processes. Much of the work on risk in electricity distribution systems has focused on aspects of reliability. This is understandable, since reliability is surely an important feature of the product delivered by the electricity distribution infrastructure, and it is high on the agenda of regulatory authorities in many countries. However, electricity distribution companies are also concerned with other risks relevant to their decision making. These typically involve intangible risks, such as safety, environmental impacts and company reputation. In contrast to the numerous methodologies developed for reliability risk analysis, there are relatively few applications of structured analyses to support decisions concerning intangible risks, even though they represent an important motivation for decisions taken in electricity distribution companies. The overall objective of this PhD work has been to explore risk analysis methods that can be used to improve and support decision making in electricity distribution system asset management, with an emphasis on the analysis of intangible risks. The main contributions of this thesis can be summarised as: an exploration and testing of quantitative risk analysis (QRA) methods to support decisions concerning intangible risks; the development of a procedure for using life curve models to provide input to QRA models; and the development of a framework for risk-informed decision making where QRA is used to analyse selected problems. In addition, the results contribute to clarifying the basic concepts of risk, and highlight challenges…

  14. Development of Ada language control software for the NASA power management and distribution test bed

    Science.gov (United States)

    Wright, Ted; Mackin, Michael; Gantose, Dave

    1989-01-01

    The Ada language software developed to control the NASA Lewis Research Center's Power Management and Distribution testbed is described. The testbed is a reduced-scale prototype of the electric power system to be used on space station Freedom. It is designed to develop and test hardware and software for a 20-kHz power distribution system. The distributed, multiprocessor, testbed control system has an easy-to-use operator interface with an understandable English-text format. A simple interface for algorithm writers that uses the same commands as the operator interface is provided, encouraging interactive exploration of the system.

  15. Advanced Distribution Network Modelling with Distributed Energy Resources

    Science.gov (United States)

    O'Connell, Alison

    A three-phase optimal power flow method is developed. The formulation has the capability to provide optimal solutions for distribution system control variables, for a chosen objective function, subject to required constraints. It can, therefore, be utilised for numerous technologies and applications. The three-phase optimal power flow is employed to manage various distributed resources, such as photovoltaics and storage, as well as distribution equipment, including tap changers and switches. The flexibility of the methodology allows it to be applied in both an operational and a planning capacity. The three-phase optimal power flow is employed in an operational planning capacity to determine volt-var curves for distributed photovoltaic inverters. The formulation finds optimal reactive power settings for a number of load and solar scenarios and uses these reactive power points to create volt-var curves. Volt-var curves are determined for 10 PV systems on a test feeder, and a universal curve applicable to all inverters is also determined. The curves are validated by testing them in a power flow setting over a 24-hour test period, and are shown to provide advantages to the feeder in terms of reduced voltage deviations and unbalance, with the individual curves proving more effective. It is also shown that adding a new PV system to the feeder only requires analysis for that system. In order to represent the uncertainties that inherently occur on distribution systems, an information gap decision theory method is also proposed and integrated into the three-phase optimal power flow formulation. This allows robust network decisions to be made using only an initial prediction of what the uncertain parameter will be. The work determines tap and switch settings for a test network with demand treated as uncertain, the aim being to keep losses below a predefined acceptable value. The results provide the decision maker with the maximum possible variation in…
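
    A volt-var curve of the kind determined here is essentially a piecewise-linear map from local voltage to inverter reactive power; a minimal sketch, with illustrative breakpoints rather than the thesis' optimized values:

```python
def volt_var(v_pu: float,
             v1: float = 0.95, v2: float = 0.98,
             v3: float = 1.02, v4: float = 1.05,
             q_max: float = 0.44) -> float:
    """Piecewise-linear volt-var curve (illustrative breakpoints, per unit).
    Returns reactive power in per unit of inverter rating: positive = injection
    (capacitive) at low voltage, negative = absorption (inductive) at high
    voltage, with a deadband between v2 and v3."""
    if v_pu <= v1:
        return q_max
    if v_pu < v2:
        return q_max * (v2 - v_pu) / (v2 - v1)
    if v_pu <= v3:
        return 0.0
    if v_pu < v4:
        return -q_max * (v_pu - v3) / (v4 - v3)
    return -q_max

# Example: mild overvoltage triggers partial reactive power absorption
print(volt_var(1.03))  # -> about -0.147
```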

  16. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

    The solution to water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on the Box scheme, McCormack's method and the Diffusive Scheme. Each HM formulation is reviewed together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters and wave speeds, the Courant number being different in each pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). In reviewing the numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, making their application recommendable in the analysis of water hammer in water distribution systems.
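
    The per-pipe Courant number that drives this discretization choice is simply wave speed times time step over reach length; a small helper (a sketch, with hypothetical names):

```python
def courant_number(wave_speed_m_s: float, dt_s: float, dx_m: float) -> float:
    """Courant number Cn = a*dt/dx for one pipe reach.
    Cn == 1 places the MOC characteristics exactly on the grid;
    Cn < 1 requires interpolation, which introduces numerical attenuation."""
    return wave_speed_m_s * dt_s / dx_m

# Example: a = 1000 m/s, dt = 0.01 s, reach length 12 m -> Cn ≈ 0.83
print(courant_number(1000.0, 0.01, 12.0))
```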

  17. Methodology for assessing the impacts of distributed generation interconnection

    Directory of Open Access Journals (Sweden)

    Luis E. Luna

    2011-06-01

    This paper proposes a methodology for identifying and assessing the impact of distributed generation interconnection on distribution systems using Monte Carlo techniques. The methodology consists of two analysis schemes: a technical analysis, which evaluates the reliability conditions of the distribution system, and an economic analysis, which evaluates the financial impacts on the electric utility and its customers according to the system reliability level. The proposed methodology was applied to an IEEE test distribution system, considering different operation schemes for the distributed generation interconnection. Each of these schemes provided significant improvements in reliability and important economic benefits for the electric utility. However, such schemes resulted in negative profitability levels for certain customers; therefore, regulatory measures and bilateral contracts that would solve this kind of problem were proposed.
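
    The technical core of such a Monte Carlo reliability assessment is the sampling of failure histories; a deliberately simplified sketch of the idea (assuming Poisson failures and a fixed repair time, which is not necessarily the paper's model):

```python
import random

def expected_interruption_hours(failures_per_year: float, repair_hours: float,
                                n_years: int = 10_000) -> float:
    """Monte Carlo estimate of expected annual interruption duration for one
    feeder section, assuming Poisson-distributed failures (sampled via
    exponential inter-arrival times) and a fixed repair time per failure."""
    total_hours = 0.0
    for _ in range(n_years):
        t, failures = 0.0, 0
        while True:
            t += random.expovariate(failures_per_year)
            if t > 1.0:  # beyond one simulated year
                break
            failures += 1
        total_hours += failures * repair_hours
    return total_hours / n_years

# e.g. 0.3 failures/year and a 4 h repair time -> about 1.2 h/year expected
print(expected_interruption_hours(0.3, 4.0))
```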

  18. Assessment of the multi-dimensional analysis capacity of MARS using the OECD-SETH PANDA tests

    International Nuclear Information System (INIS)

    Bae, S. W.; Jung, J. J.; Jung, B. D.

    2004-01-01

    The objective of the OECD/NEA PANDA tests is to validate and assess computer codes that analyze non-condensable gas concentrations and mixing phenomena in a reactor containment building. The main issue, in particular, is the multi-dimensional analysis capability involved in the mixing of non-condensable gases, i.e. hydrogen. The main tests consist of a superheated steam flow injection into a large vessel initially filled with air or air/helium mixtures; the temperature and concentration of the non-condensable gases are then measured. A pre-calculation has been performed with MARS for the PANDA tests even though MARS is not a containment analysis code. Three cases among the 25 PANDA tests were selected and modeled to simulate the jet plumes and air mixing in a large vessel. The dimensions of the large vessel are 4 m in diameter and 8 m in height. For the calculation, the cylindrical vessel was simplified to a rectangular geometry. It is revealed that the MARS code has the capability to distinguish the multi-dimensional distributions of the velocity and temperature fields.

  19. Cohesive Zone Model Based Numerical Analysis of Steel-Concrete Composite Structure Push-Out Tests

    Directory of Open Access Journals (Sweden)

    J. P. Lin

    2014-01-01

    Push-out tests are widely used to determine the shear bearing capacity and shear stiffness of shear connectors in steel-concrete composite structures. The finite element method is an efficient alternative to push-out testing. This paper focused on a simulation analysis of the interface between concrete slabs and steel girder flanges as well as the interface between the shear connectors and the surrounding concrete. A cohesive zone model was used to simulate the tangential sliding and normal separation of the interfaces. A zero-thickness cohesive element was then implemented via the user-defined element subroutine UEL in the software ABAQUS, and a multilinear (multiple broken line) mode was used to define the constitutive relations of the cohesive zone. A three-dimensional numerical analysis model was established for push-out testing to analyze the load-displacement curves of the push-out test process, the interface relative displacement, and the interface stress distribution. This method was found to accurately calculate the shear capacity and shear stiffness of shear connectors. The numerical results showed that the multilinear cohesive zone model could describe the nonlinear mechanical behavior of the interface between steel and concrete and that a discontinuous-deformation numerical simulation could be implemented.
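
    For orientation, the simplest member of the broken-line family of cohesive laws is the bilinear traction-separation relation below, with initial stiffness K, damage-initiation separation δ₀ and final-failure separation δ_f (a generic form, not necessarily the exact multilinear law implemented in the paper):

    $$ t(\delta) = \begin{cases} K\,\delta, & \delta \le \delta_0 \\ K\,\delta_0\,\dfrac{\delta_f - \delta}{\delta_f - \delta_0}, & \delta_0 < \delta \le \delta_f \\ 0, & \delta > \delta_f \end{cases} $$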

  20. Generalization of the Lord-Wingersky Algorithm to Computing the Distribution of Summed Test Scores Based on Real-Number Item Scores

    Science.gov (United States)

    Kim, Seonghoon

    2013-01-01

    With known item response theory (IRT) item parameters, Lord and Wingersky provided a recursive algorithm for computing the conditional frequency distribution of number-correct test scores, given proficiency. This article presents a generalized algorithm for computing the conditional distribution of summed test scores involving real-number item…
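
    The integer-score recursion that this article generalizes is compact enough to state directly; a sketch in Python of the standard Lord-Wingersky algorithm (not the article's real-number extension):

```python
def lord_wingersky(p_correct):
    """Distribution of the number-correct score at a fixed proficiency,
    given each item's probability of a correct response at that proficiency."""
    dist = [1.0]  # P(score = 0) before any items are administered
    for p in p_correct:
        q = 1.0 - p
        new = [0.0] * (len(dist) + 1)
        for score, mass in enumerate(dist):
            new[score] += mass * q      # incorrect: score unchanged
            new[score + 1] += mass * p  # correct: score increases by 1
        dist = new
    return dist  # dist[s] = P(summed score = s)

# e.g. three items with P(correct) = 0.8, 0.6, 0.4 at some theta
print(lord_wingersky([0.8, 0.6, 0.4]))
```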

  1. Correlation analysis for forced vibration test of the Hualien large scale seismic test (LSST) program

    International Nuclear Information System (INIS)

    Sugawara, Y.; Sugiyama, T.; Kobayashi, T.; Yamaya, H.; Kitamura, E.

    1995-01-01

    The correlation analysis for a forced vibration test of a 1/4-scale containment SSI test model constructed in Hualien, Taiwan was carried out for the post-backfill condition. Prior to this correlation analysis, the structural properties were revised to adjust the calculated fundamental frequency in the fixed-base condition to that derived from the test results. A correlation analysis was then carried out using the Lattice Model, which is able to estimate soil-structure effects with embedment. The analysis results coincide well with the test results, and it is concluded that the mathematical soil-structure interaction model established by the correlation analysis is efficient in estimating the dynamic soil-structure interaction effect with embedment. This mathematical model will be applied as a basic model for simulation analysis of earthquake observation records. (author). 3 refs., 12 figs., 2 tabs

  2. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is the simplicity of installation, configuring and operation of various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  3. Space station electrical power distribution analysis using a load flow approach

    Science.gov (United States)

    Emanuel, Ervin M.

    1987-01-01

    The space station's electrical power system will evolve and grow in a manner much like present terrestrial electrical power utility systems. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase, at 20 kHz, to grow to a level of 300 kW steady state, and to be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electrical power utility industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
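
    For reference, a load-flow study of this kind solves the standard bus power-balance equations; in polar form (the textbook formulation, independent of the particular tool):

    $$ P_i = \sum_j |V_i||V_j|\,\bigl(G_{ij}\cos\theta_{ij} + B_{ij}\sin\theta_{ij}\bigr), \qquad Q_i = \sum_j |V_i||V_j|\,\bigl(G_{ij}\sin\theta_{ij} - B_{ij}\cos\theta_{ij}\bigr) $$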

  4. Estimation of monthly solar radiation distribution for solar energy system analysis

    International Nuclear Information System (INIS)

    Coskun, C.; Oktay, Z.; Dincer, I.

    2011-01-01

    The concept of probability density frequency, which has been successfully used for analyses of wind speed and outdoor temperature distributions, is now modified and proposed for estimating solar radiation distributions for the design and analysis of solar energy systems. In this study, the global solar radiation distribution is comprehensively analyzed for photovoltaic (PV) panel and thermal collector systems. In this regard, a case study is conducted with actual global solar irradiation data of the last 15 years recorded by the Turkish State Meteorological Service. It is found that the intensity of global solar irradiance greatly affects energy and exergy efficiencies and hence the performance of collectors. -- Research highlights: → The first study to apply the global solar radiation distribution in solar energy system analyses. → The first study showing the global solar radiation distribution as a parameter of solar irradiance intensity. → Time probability intensity frequency and probability power distribution do not have similar distribution patterns for each month. → There is no relation between the distribution of annual time lapse and solar energy with respect to the intensity of solar irradiance.

  5. Power distribution, the environment, and public health. A state-level analysis

    International Nuclear Information System (INIS)

    Boyce, James K.; Klemer, Andrew R.; Templet, Paul H.; Willis, Cleve E.

    1999-01-01

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes

  6. Power distribution, the environment, and public health. A state-level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, James K. [Department of Economics, University of Massachusetts, Amherst, MA 01003 (United States); Klemer, Andrew R. [Department of Biology, University of Minnesota, Duluth, MN (United States); Templet, Paul H. [Institute of Environmental Studies, Louisiana State University, Baton Rouge, LA (United States); Willis, Cleve E. [Department of Resource Economics, University of Massachusetts, Amherst, MA 01003 (United States)

    1999-04-15

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.

  7. Analysis of the international distribution of per capita CO2 emissions using the polarization concept

    International Nuclear Information System (INIS)

    Duro, Juan Antonio; Padilla, Emilio

    2008-01-01

    The concept of polarization is linked to the extent to which a given distribution leads to the formation of homogeneous groups with opposing interests. This concept, which is basically different from the traditional one of inequality, is related to the level of potential conflict inherent in a distribution. The polarization approach has been widely applied in the analysis of income distribution. The extension of this approach to the analysis of the international distribution of CO2 emissions is quite useful, as it gives a potent informative instrument for characterizing the state and evolution of the international distribution of emissions and its possible political consequences in terms of tensions and the probability of achieving agreements. In this paper we analyze the international distribution of per capita CO2 emissions between 1971 and 2001 through the adaptation of the polarization concept and measures. We find that the most interesting grouped description derived from the analysis is a two-group one, the groups broadly coinciding with the Annex B and non-Annex B countries of the Kyoto Protocol, which shows the power of polarization analysis for explaining the formation of groups in the real world. The analysis also shows a significant reduction in the international polarization of per capita CO2 emissions between 1971 and 1995, but not much change since 1995, which might indicate that the polarized distribution of emissions is still one of the important factors leading to difficulties in achieving agreements for reducing global emissions. (author)
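
    The workhorse measure behind such analyses is the Esteban-Ray polarization index; assuming the paper adapts this standard family (π_i are group population shares, y_i group mean per capita emissions, and α the polarization-sensitivity parameter):

    $$ P(\pi, y) = K \sum_{i=1}^{n}\sum_{j=1}^{n} \pi_i^{1+\alpha}\,\pi_j\,\lvert y_i - y_j \rvert, \qquad \alpha \in (0,\,1.6] $$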

  8. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single-neuron model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE), which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of the exponential kernel has enabled the transformation of the non-Markovian model into a Markovian model in an extended state space. For the study of the First Passage Time (FPT) with the exponential delay kernel, the model has been transformed into a system of coupled Stochastic Differential Equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on the Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure with or without damped oscillatory behavior, depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay, with the memory kernel having the form of a Gamma distribution. In contrast to the fast decay of damped oscillations of the ISI distribution for the model with the weak delay kernel, the decay of damped oscillations is found to be slower for the model with the strong delay kernel.
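
    For reference, the gamma-distribution memory kernel named in the title has the standard form below (a generic parameterization; the paper's symbols may differ), with the exponential weak-delay kernel recovered at shape α = 1:

    $$ k(t) = \frac{t^{\alpha-1}\,e^{-t/\eta}}{\Gamma(\alpha)\,\eta^{\alpha}}, \qquad t \ge 0 $$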

  9. Statistical flaw strength distributions for glass fibres: Correlation between bundle test and AFM-derived flaw size density functions

    International Nuclear Information System (INIS)

    Foray, G.; Descamps-Mandine, A.; R’Mili, M.; Lamon, J.

    2012-01-01

    The present paper investigates glass fibre flaw size distributions. Two commercial fibre grades (HP and HD), mainly used in cement-based composite reinforcement, were studied. Glass fibre fractography is a difficult and time-consuming exercise, and thus is seldom carried out. An approach based on tensile tests on multifilament bundles and examination of the fibre surface by atomic force microscopy (AFM) was used. Bundles of more than 500 single filaments each were tested, building up a statistically significant database of failure data for the HP and HD glass fibres. Gaussian flaw distributions were derived from the filament tensile strength data or extracted from the AFM images, and the two distributions were compared. Defect sizes computed from raw AFM images agreed reasonably well with those derived from tensile strength data. Finally, the pertinence of a Gaussian distribution was discussed; the alternative Pareto distribution provided a fair approximation when dealing with AFM flaw sizes.

  10. Current, voltage and temperature distribution modeling of light-emitting diodes based on electrical and thermal circuit analysis

    International Nuclear Information System (INIS)

    Yun, J; Shim, J-I; Shin, D-S

    2013-01-01

    We demonstrate a modeling method based on three-dimensional electrical and thermal circuit analysis to extract the current, voltage and temperature distributions of light-emitting diodes (LEDs). In our model, the electrical circuit analysis is performed first to extract the current and voltage distributions in the LED. Utilizing the results obtained from the electrical circuit analysis as distributed heat sources, the thermal circuit is set up using the duality between Fourier's law and Ohm's law. From the analysis of the thermal circuit, the temperature distribution at each epitaxial film is successfully obtained. Comparisons of experimental and simulation results are made using an InGaN/GaN multiple-quantum-well blue LED. The validity of the electrical circuit analysis is confirmed by comparing the light distribution at the surface. Since the temperature distribution at each epitaxial film cannot be obtained experimentally, the apparent temperature distribution is compared at the surface of the LED chip. Also, the experimentally obtained average junction temperature is compared with the value calculated from the modeling, yielding very good agreement. The analysis method based on circuit modeling has the advantage of taking distributed heat sources as inputs, which is essential for high-power devices with significant self-heating. (paper)
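
    The duality invoked here maps conduction quantities onto circuit quantities one for one: heat flow plays the role of current, temperature difference the role of voltage, and thermal resistance the role of electrical resistance (a standard correspondence, stated generically):

    $$ I = \frac{\Delta V}{R},\quad R = \frac{L}{\sigma A} \qquad\Longleftrightarrow\qquad q = \frac{\Delta T}{R_{\mathrm{th}}},\quad R_{\mathrm{th}} = \frac{L}{k A} $$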

  11. Wind Tunnel Tests for Wind Pressure Distribution on Gable Roof Buildings

    Science.gov (United States)

    2013-01-01

    Gable roofs are widely used in industrial buildings. Based on wind tunnel tests with rigid models, wind pressure distributions on gable roof buildings with different aspect ratios were measured simultaneously. Some characteristics of the measured wind pressure fields on the surfaces of the models were analyzed, including mean wind pressure, fluctuating wind pressure, peak negative wind pressure, and the proper orthogonal decomposition of the measured wind pressure field. The results show that extremely high local suctions often occur at the leading edges of the longitudinal wall and windward roof, the roof corner, and the roof ridge, which are the locations most severely damaged under strong wind. The aspect ratio of the building has a certain effect on the mean wind pressure coefficients, and the effect is related to the wind attack angle. Compared with the experimental results, the region division of roof corner and roof ridge from AIJ2004 is more reasonable than those from CECS102:2002 and MBMA2006. The first several eigenvectors make much the largest contributions to the overall wind pressure distributions. The investigation can offer some basic understanding for estimating wind load distributions on gable roof buildings and facilitate the wind-resistant design of cladding components and their connections considering the wind load path. PMID:24082851
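
    The measured surface pressures in such tests are conventionally reported as pressure coefficients, normalizing each tap pressure by the reference dynamic pressure (the standard definition, included for reference):

    $$ C_p = \frac{p - p_{\infty}}{\tfrac{1}{2}\,\rho\,U_{\infty}^{2}} $$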

  12. A MODEL OF HETEROGENEOUS DISTRIBUTED SYSTEM FOR FOREIGN EXCHANGE PORTFOLIO ANALYSIS

    Directory of Open Access Journals (Sweden)

    Dragutin Kermek

    2006-06-01

    The paper investigates the design of a heterogeneous distributed system for foreign exchange portfolio analysis. The proposed model includes a few separate and dislocated but connected parts joined through distributed mechanisms. Making the system distributed opens new perspectives for performance boosting, in which a software-based load balancer plays a very important role. The desired system should spread over multiple, heterogeneous platforms in order to fulfil the open-platform goal. Building such a model incorporates different patterns, from GOF design patterns, business patterns, J2EE patterns, integration patterns, enterprise patterns and distributed design patterns to Web services patterns. The authors try to find as many appropriate patterns as possible for the planned tasks in order to capture best modelling and programming practices.

  13. MITG post-test analysis and design improvements

    International Nuclear Information System (INIS)

    Schock, A.

    1983-01-01

    The design, performance analysis, and key attributes of the Modular Isotopic Thermoelectric Generator (MITG) were described in a 1981 IECEC paper; and the design, fabrication, and testing of prototypical MITG test assemblies were described in preceding papers in these proceedings. Each test assembly simulated a typical modular slice of the flight generator. The present paper describes a detailed thermal-stress analysis, which identified the causes of stress-related problems observed during the tests. It then describes how additional analyses were used to evaluate design changes to alleviate those problems. Additional design improvements are discussed in the next paper in these proceedings, which also describes revised fabrication procedures and updated performance estimates for the generator

  15. A Study on the construction of 22.9 kV distribution test line of KEPCO

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hi Kyun; Choi, Hung Sik; Hwang, Si Dole; Jung, Young Ho [Korea Electric Power Corp. (KEPCO), Taejon (Korea, Republic of). Research Center; Kim, Dong Hwan; Choi, Chang Huk [Korea Power Engineering Company and Architecture Engineers (Korea, Republic of)

    1995-12-31

    In order to enhance the reliability of power supply and the quality of electricity, a study on the construction of a 22.9 kV distribution test line was performed. The main objective of this study was to establish a construction plan and a basic design so that the construction and the detailed design of the test line could be performed effectively. (author). 21 refs., 45 figs.

  16. CFD Analysis for the Steady State Test of CS28-1 Simulating High Temperature Chemical Reactions in CANDU Fuel Channel

    International Nuclear Information System (INIS)

    Park, Ju Hwan; Kang, Hyung Seok; Rhee, Bo Wook

    2006-05-01

    The establishment of a safety analysis system and technology for CANDU reactors has been carried out at KAERI. As part of this research, a single CANDU fuel bundle has been simulated with CATHENA for the post-blowdown event to consider the complicated geometry and heat transfer in the fuel channel. In the previous LBLOCA analysis methodology adopted for Wolsong 2, 3, 4 licensing, the fuel channel blowdown phase was analyzed by the CANDU system analysis code CATHENA and the post-blowdown phase of the fuel channel was analyzed by the CHAN-IIA code. The use of one computer code in consecutive analyses appeared to be desirable for consistency and simplicity in the safety analysis process. However, the high-temperature post-blowdown fuel channel model in CATHENA must be validated before being used in accident analysis. Experimental data for the 37-element fuel bundle that fuels CANDU-6 are not available. Therefore, benchmark CFD calculations for the 37-element fuel bundle will be compared with the test results of the 28-element fuel bundle in the CS28-1 experiment. A full grid model from the fuel element simulators (FES) to the calandria tube, simulating the test section, was generated; the grid model comprised 4,324,340 cells. The boundary and heat source conditions and property data in the CFD analysis were set according to the test results and reference data. Thermal-hydraulic phenomena in the fuel channel were simulated as a compressible, highly turbulent flow with convection/conduction/radiation heat transfer. The natural convection flow of CO2 due to a large temperature difference in the gap between the pressure and calandria tubes was treated by the Boussinesq buoyancy model. The CFD results showed good agreement with the test results as a whole. The inner/middle/outer FES temperature distributions of the CFD results were overestimated by about 30 °C in the entrance region, but showed good agreement in the outlet region.

  17. Multigroup Moderation Test in Generalized Structured Component Analysis

    Directory of Open Access Journals (Sweden)

    Angga Dwi Mulyanto

    2016-05-01

    Full Text Available Generalized Structured Component Analysis (GSCA) is an alternative method for structural modeling using alternating least squares. GSCA can be used for complex analyses, including multigroup analysis. GSCA can be run with free software called GeSCA, but GeSCA offers no multigroup moderation test to compare effects between groups. In this research we propose using the t test from PLS for testing multigroup moderation in GSCA. The t test requires only the sample size, estimated path coefficient, and standard error of each group, all of which are already available in the GeSCA output, and the formula is simple, so the analysis does not take the user long.
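    For reference, one common parametric form of this two-group t test pools the groups' standard errors (a Keil et al.-style formula); the exact variant used in the paper may differ, so treat this sketch as an assumed form:

```python
import math
from scipy import stats

def multigroup_t(b1, se1, n1, b2, se2, n2):
    """Two-group comparison of path coefficients using a pooled standard
    error (one common parametric form used with PLS; assumed here, not
    quoted from the paper). Returns (t, df, two-sided p)."""
    df = n1 + n2 - 2
    pooled = math.sqrt(((n1 - 1)**2 / df) * se1**2 +
                       ((n2 - 1)**2 / df) * se2**2)
    t = (b1 - b2) / (pooled * math.sqrt(1.0/n1 + 1.0/n2))
    p = 2 * stats.t.sf(abs(t), df)
    return t, df, p

# Example with GeSCA-style outputs: estimates, standard errors, sample sizes.
print(multigroup_t(b1=0.45, se1=0.08, n1=120, b2=0.21, se2=0.10, n2=95))
```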

  18. Determination of Particle Size and Distribution through Image-Based Macroscopic Analysis of the Structure of Biomass Briquettes

    Directory of Open Access Journals (Sweden)

    Veronika Chaloupková

    2018-02-01

    Full Text Available Via image-based macroscopic analysis of the briquettes' surface structure, particle size and distribution were determined to better understand the behavioural pattern of the input material during agglomeration in the pressing chamber of a briquetting machine. The briquettes, made of miscanthus, industrial hemp and pine sawdust, were produced by a hydraulic piston press. Their structure was visualized by a stereomicroscope equipped with a digital camera and software for image analysis and data measurement. In total, 90 images of surface structure were obtained and quantitatively analysed. Using Nikon Instruments Software (NIS-Elements), the length and area of 900 particles were measured and statistically tested to compare the size of the particles at different surface locations. Results showed statistically significant differences in particle size distribution: larger particles were generally on the front side of the briquettes and, vice versa, smaller particles were on the rear side. Likewise, larger particles were concentrated in the middle of cross sections and smaller particles at the bottom of the briquette.
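    A measurement pipeline of this kind can be sketched with open-source tools in place of NIS-Elements; the image below is a random placeholder, and the thresholding and measurement choices are illustrative only:

```python
import numpy as np
from skimage import filters, measure

# Hypothetical sketch: measuring particle length and area from a grayscale
# surface image, analogous to the NIS-Elements measurements described above.
rng = np.random.default_rng(1)
surface_img = rng.random((512, 512))     # placeholder for a real image

binary = surface_img > filters.threshold_otsu(surface_img)
labels = measure.label(binary)           # connected-component labelling
particles = measure.regionprops(labels)  # per-particle geometry

lengths = np.array([p.major_axis_length for p in particles])
areas = np.array([p.area for p in particles])
print(f"{len(particles)} particles, median length {np.median(lengths):.1f} px")
```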

  19. Empirically testing the relationship between income distribution, perceived value of money and pay satisfaction

    Directory of Open Access Journals (Sweden)

    Azman Ismail

    2009-07-01

    Full Text Available Compensation management literature highlights that income has three major features: salary, bonus and allowance. If the level and/or amount of income is distributed to employees based on proper rules, this may increase pay satisfaction. More importantly, a thorough investigation of this area reveals that the effect of income distribution on pay satisfaction is not consistent when perceived value of money is present in organizations. The nature of this relationship is less emphasized in the pay distribution literature. Therefore, this study was conducted to measure the effect of perceived value of money and income distribution on pay satisfaction, using 136 usable questionnaires gathered from employees of a city-based local authority in Sabah, Malaysia (MSLAUTHORITY). Outcomes of hierarchical regression analysis showed that the interaction between perceived value of money and income distribution significantly correlated with pay satisfaction. This result confirms that perceived value of money does act as a moderating variable in the income distribution model of the organizational sample. In addition, the discussion and implications of this study are elaborated.
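    The moderation test described here, a hierarchical regression in which the interaction term is added in a second step, can be sketched as follows; the data are simulated and the variable names merely mirror the study's constructs:

```python
import numpy as np
import statsmodels.api as sm

# Hedged sketch of a moderation test: hierarchical OLS where the
# income-distribution x perceived-value interaction enters in step 2.
rng = np.random.default_rng(7)
n = 136
income_dist = rng.normal(size=n)
value_money = rng.normal(size=n)
pay_sat = (0.4*income_dist + 0.3*value_money
           + 0.25*income_dist*value_money + rng.normal(scale=0.8, size=n))

X1 = sm.add_constant(np.column_stack([income_dist, value_money]))
step1 = sm.OLS(pay_sat, X1).fit()

X2 = sm.add_constant(np.column_stack([income_dist, value_money,
                                      income_dist * value_money]))
step2 = sm.OLS(pay_sat, X2).fit()

# A significant interaction term plus an R^2 increase indicates moderation.
print("interaction b =", step2.params[-1], "p =", step2.pvalues[-1])
print("delta R^2 =", step2.rsquared - step1.rsquared)
```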

  20. Summary of CPAS EDU Testing Analysis Results

    Science.gov (United States)

    Romero, Leah M.; Bledsoe, Kristin J.; Davidson, John.; Engert, Meagan E.; Fraire, Usbaldo, Jr.; Galaviz, Fernando S.; Galvin, Patrick J.; Ray, Eric S.; Varela, Jose

    2015-01-01

    The Orion program's Capsule Parachute Assembly System (CPAS) project is currently conducting its third generation of testing, the Engineering Development Unit (EDU) series. This series utilizes two test articles, a dart-shaped Parachute Compartment Drop Test Vehicle (PCDTV) and a capsule-shaped Parachute Test Vehicle (PTV), both of which include a full-size, flight-like parachute system and require a pallet delivery system for aircraft extraction. To date, 15 tests have been completed, including six with PCDTVs and nine with PTVs. Two of the PTV tests included the Forward Bay Cover (FBC) provided by Lockheed Martin. Advancements in modeling techniques applicable to parachute fly-out, vehicle rate of descent, torque, and load train also occurred during the EDU testing series. An upgrade from a composite to an independent parachute simulation allowed parachute modeling at a higher level of fidelity than during previous generations. The complexity of separating the test vehicles from their pallet delivery systems necessitated the use of the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulator for modeling mated-vehicle aircraft extraction and separation. This paper gives an overview of each EDU test and summarizes the development of CPAS analysis tools and techniques during EDU testing.

  1. Estimation of steady-state and transient power distributions for the RELAP analyses of the 1963 loss-of-flow and loss-of-pressure tests at BR2

    International Nuclear Information System (INIS)

    Dionne, B.; Tzanos, C.P.

    2011-01-01

    To support the safety analyses required for the conversion of the Belgian Reactor 2 (BR2) from highly-enriched uranium (HEU) to low-enriched uranium (LEU) fuel, the simulation of a number of loss-of-flow tests, with or without loss of pressure, has been undertaken. These tests were performed at BR2 in 1963 and used instrumented fuel assemblies (FAs) with thermocouples (TCs) embedded in the cladding, as well as probes to measure the FA powers on the basis of their coolant temperature rise. The availability of experimental data for these tests offers an opportunity to better establish the credibility of the RELAP5-3D model and methodology used in the conversion analysis. Preliminary analyses showed that the conservative power distributions used historically in the BR2 RELAP model resulted in a significant overestimation of the peak cladding temperature during the transient. Therefore, it was concluded that better estimates of the steady-state and decay power distributions were needed to accurately predict the cladding temperatures measured during the tests and to establish the credibility of the RELAP model and methodology. The new approach ('best estimate' methodology) uses the MCNP5, ORIGEN-2 and BERYL codes to obtain steady-state and decay power distributions for the BR2 core during tests A/400/1, C/600/3 and F/400/1. This methodology can easily be extended to simulate any BR2 core configuration. Comparisons with measured peak cladding temperatures showed much better agreement when power distributions obtained with the new methodology are used.

  2. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has been accelerating. The role of logistics has been highlighted, and more and more e-commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current situation of its self-built logistics system, identifies the problems existing in the current Jingdong Mall logistics distribution, and gives appropriate recommendations.

  3. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Grum-Grzhimailo, A.N.; Lucchese, R.R.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    A projection method is developed for extracting the nondipole contribution from the molecular-frame photoelectron angular distributions of linear molecules. A corresponding convenient parametric form for the angular distributions is derived. The analysis was performed for the N 1s photoionization of the NO molecule a few eV above the ionization threshold. No detectable nondipole contribution was found at a photon energy of 412 eV.

  4. CMS distributed analysis infrastructure and operations: experience with the first LHC data

    International Nuclear Information System (INIS)

    Vaandering, E W

    2011-01-01

    The CMS distributed analysis infrastructure represents a heterogeneous pool of resources distributed across several continents. The resources are harnessed using gLite and glidein-based workload management systems (WMS). We describe the operational experience of the analysis workflows using CRAB-based servers interfaced with the underlying WMS. The automated interaction of the server with the WMS provides a successful analysis workflow. We present the operational experience as well as methods used in CMS to analyze the LHC data. The interaction with the CMS Run Registry for run and luminosity block selections via CRAB is discussed. The variations of different workflows during the LHC data-taking period and the lessons drawn from this experience are also outlined.

  5. Innovated feed water distributing system of VVER steam generators

    International Nuclear Information System (INIS)

    Matal, O.; Sousek, P.; Simo, T.; Lehota, M.; Lipka, J.; Slugen, V.

    2000-01-01

    Defects in the feed water distributing system due to corrosion-erosion effects have been observed at many VVER 440 steam generators (SGs). Therefore, an analysis of the defects' origin and, consequently, design development and testing of a new feed water distributing system were performed. In-situ system tests, supported by calculations and by comparison of measured and calculated data, were focused on the demonstration of long-term reliable operation, on the definition and measurement of water flow and water chemistry characteristics at the SG secondary side, and on the study of the dynamic characteristics needed for seismic approval of the innovated feed water distributing system. The innovated feed water distributing system has already been installed in the SGs of two VVER units. (author)

  6. Pi-Sat: A Low Cost Small Satellite and Distributed Spacecraft Mission System Test Platform

    Science.gov (United States)

    Cudmore, Alan

    2015-01-01

    Current technology and budget trends indicate a shift in satellite architectures from large, expensive single-satellite missions to small, low-cost distributed spacecraft missions. At the center of this shift is the SmallSat/Cubesat architecture. The primary goal of the Pi-Sat project is to create a low-cost and easy-to-use Distributed Spacecraft Mission (DSM) test bed to facilitate the research and development of next-generation DSM technologies and concepts. This test bed also serves as a realistic software development platform for Small Satellite and Cubesat architectures. The Pi-Sat is based on the popular $35 Raspberry Pi single-board computer featuring a 700 MHz ARM processor, 512 MB of RAM, a flash memory card, and a wealth of I/O options. The Raspberry Pi runs the Linux operating system and can easily run Code 582's Core Flight System flight software architecture. The low cost and high availability of the Raspberry Pi make it an ideal platform for Distributed Spacecraft Mission and Cubesat software development. The Pi-Sat models currently include a Pi-Sat 1U Cube, a Pi-Sat Wireless Node, and a Pi-Sat Cubesat processor card. The Pi-Sat project takes advantage of many popular trends in the Maker community, including low-cost electronics, 3D printing, and rapid prototyping, in order to provide a realistic platform for flight software testing, training, and technology development. The Pi-Sat has also provided fantastic hands-on training opportunities for NASA summer interns and Pathways students.

  7. Distributed activation energy model for kinetic analysis of multi-stage hydropyrolysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X.; Li, W.; Wang, N.; Li, B. [Chinese Academy of Sciences, Taiyuan (China). Inst. of Coal Chemistry

    2003-07-01

    Based on a new analysis of the distributed activation energy model, a bicentral distribution model was introduced for the analysis of multi-stage hydropyrolysis of coal. The hydropyrolysis under linear temperature programming with and without a holding stage was mathematically described and the corresponding kinetic expressions were derived. Based on these kinetics, the hydropyrolysis (HyPy) and multi-stage hydropyrolysis (MHyPy) of Xundian brown coal were simulated. The results show that both the Mo catalyst and two-stage holding can lower the apparent activation energy of hydropyrolysis and narrow the activation energy distribution. Besides, there exists an optimum Mo loading of 0.2% for HyPy of Xundian lignite. 10 refs.
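    A minimal numerical sketch of a bicentral (two-Gaussian) distributed activation energy model under linear heating follows; all kinetic parameters are assumed values for illustration, not the paper's fitted ones:

```python
import numpy as np

# Bicentral DAEM under a linear temperature ramp, first-order kinetics.
R = 8.314            # J/(mol K)
k0 = 1e13            # 1/s, pre-exponential factor (assumed)
beta = 10.0 / 60.0   # K/s, heating rate of 10 K/min (assumed)

E = np.linspace(150e3, 350e3, 600)     # activation energies, J/mol
dE = E[1] - E[0]
gauss = lambda e, mu, s: np.exp(-(e - mu)**2 / (2*s**2)) / (s*np.sqrt(2*np.pi))
f = 0.6*gauss(E, 220e3, 15e3) + 0.4*gauss(E, 280e3, 20e3)   # bicentral f(E)

T = np.linspace(400.0, 1200.0, 400)    # temperature ramp, K
dT = T[1] - T[0]
# psi[i, j] = integral from T0 to T_i of exp(-E_j / (R T')) dT'
psi = np.cumsum(np.exp(-E[None, :] / (R * T[:, None])), axis=0) * dT
unreacted = (f[None, :] * np.exp(-(k0 / beta) * psi)).sum(axis=1) * dE
conversion = 1.0 - unreacted
print(np.round(conversion[::100], 3))  # conversion along the ramp
```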

  8. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T& D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers 10 to 120 hours of computation when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
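    The sequential nature of QSTS, in which each step's controller state feeds the next, is the core of the challenge. A stub sketch (all class and method names are hypothetical) makes that structure explicit:

```python
# Conceptual QSTS loop with a stub solver; everything here is a placeholder
# illustrating why sequential controller state makes QSTS hard to parallelize.

class StubNetwork:
    """Stands in for a real distribution-feeder model."""
    def initial_state(self):
        return {"tap": 0}
    def apply_loads(self, shapes, t):
        self.load = shapes[t % len(shapes)]
    def solve_power_flow(self):
        # A real QSTS run performs one full power flow here, per time step.
        return {"voltage": 1.0 - 0.05 * self.load}
    def update_controllers(self, sol, state, dt):
        # Regulator/tap logic depends on the previous state -> sequential.
        if sol["voltage"] < 0.97:
            state["tap"] = min(state["tap"] + 1, 16)
        return state

net = StubNetwork()
state = net.initial_state()
for t in range(3600):                    # one hour at 1-second resolution
    net.apply_loads([0.4, 0.8, 1.0], t)
    sol = net.solve_power_flow()
    state = net.update_controllers(sol, state, dt=1.0)
print(state)
# A year at 1-second resolution is ~31.5 million sequential solves,
# which is the computational burden discussed above.
```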

  9. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  10. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review, and analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  11. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    International Nuclear Information System (INIS)

    Iskandar, Ismed; Gondokaryono, Yudi Satria

    2016-01-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analysis is more beneficial than the classical approach in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed by both Bayesian and maximum likelihood analyses. The simulation results show that a change of the true value of one parameter relative to another changes the standard deviation in the opposite direction. With perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
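    The simulation described can be sketched as follows; parameters are illustrative, and the per-cause fit shown here is a simplification (a full competing-risks likelihood would additionally treat failures from the other cause as censored observations):

```python
import numpy as np
from scipy import stats

# Two independent Weibull failure causes; the system fails at the minimum.
rng = np.random.default_rng(42)
n = 200
t1 = stats.weibull_min.rvs(c=1.5, scale=100.0, size=n, random_state=rng)
t2 = stats.weibull_min.rvs(c=2.5, scale=120.0, size=n, random_state=rng)
t_sys = np.minimum(t1, t2)            # competing risks: first failure wins
cause = np.where(t1 < t2, 1, 2)       # which cause produced each failure

# Maximum likelihood fit of each cause from the failures it produced.
for k in (1, 2):
    c_hat, loc, scale_hat = stats.weibull_min.fit(t_sys[cause == k], floc=0)
    print(f"cause {k}: shape={c_hat:.2f}, scale={scale_hat:.1f}")
```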

  12. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize the automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  13. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E. [Alliance for Residential Building Innovation, Davis, CA (United States); Hoeschele, E. [Alliance for Residential Building Innovation, Davis, CA (United States)

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system model, developed in the Transient System Simulation Tool (TRNSYS), has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. In this study, the Building America team built upon previous analysis and modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole-house simulation programs.

  14. Posttest analysis of the FFTF inherent safety tests

    International Nuclear Information System (INIS)

    Padilla, A. Jr.; Claybrook, S.W.

    1987-01-01

    Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.

  15. Distributed Leadership in Drainage Basin Management: A Critical Analysis of ‘River Chief Policy’ from a Distributed Leadership Perspective

    Science.gov (United States)

    Zhang, Liuyi

    2018-02-01

    Water resources management has been more significant than ever since an official document stipulated 'three red lines' to scrupulously control water usage and water pollution, accelerating the promotion of the 'River Chief Policy' throughout China. The policy launches creative approaches that include people from different administrative levels in participation and distributes power to increase drainage basin management efficiency. Its execution resembles features of distributed leadership theory, a widely acknowledged Western leadership theory with an innovative perspective and visions suited to the modern world. This paper analyses the policy from a distributed leadership perspective using Taylor's critical policy analysis framework.

  16. Full scale lightning surge tests of distribution transformers and secondary systems

    International Nuclear Information System (INIS)

    Goedde, G.L.; Dugan, R.C. Sr.; Rowe, L.D.

    1992-01-01

    This paper reports that low-side surges are known to cause failures of distribution transformers. They also subject load devices to overvoltages. A full-scale model of a residential service was set up in a laboratory and subjected to impulses approximating lightning strokes. The tests were made to determine the impulse characteristics of the secondary system and to test the validity of previous analyses. Among the variables investigated were stroke location, the balance of the surges in the service cable, and the effectiveness of arrester protection. Low-side surges were found to consist of two basic components: the natural frequency of the system and the inductive response of the system to the stroke current. The latter component is responsible for transformer failures, while the former may be responsible for discharge spots often found around secondary bushings. Arresters at the service entrance are effective in diverting most of the energy from a lightning strike, but may not protect sensitive loads; additional local protection is also needed. The tests affirmed previous simulations and uncovered additional phenomena as well.

  17. An analysis of the orbital distribution of solid rocket motor slag

    Science.gov (United States)

    Horstman, Matthew F.; Mulrooney, Mark

    2009-01-01

    The contribution by solid rocket motors (SRMs) to the orbital debris environment is potentially significant and insufficiently studied. Design and combustion processes can lead to the emission of enough by-products to warrant assessment of their contribution to orbital debris. These particles are formed during SRM tail-off, or burn termination, by the rapid solidification of molten Al2O3 slag accumulated during the burn. The propensity of SRMs to generate particles larger than 100 μm raises concerns regarding the debris environment. Sizes as large as 1 cm have been witnessed in ground tests, and comparable sizes have been estimated via observations of sub-orbital tail-off events. Utilizing previous research, we have developed more sophisticated size distributions and modeled the time evolution of the resultant orbital populations using a historical database of SRM launches, propellant, and likely location and time of tail-off. This analysis indicates that SRM ejecta is a significant component of the debris environment.

  18. Using GIFTS on the Cray-1 for the large coil test facility test: stand design analysis

    International Nuclear Information System (INIS)

    Baudry, T.V.; Gray, W.H.

    1981-06-01

    The GIFTS finite element program has been used extensively throughout the Large Coil Test Facility (LCTF) test stand design analysis. Effective use has been made of GIFTS both as a preprocessor to other finite element programs and as a complete structural analysis package. The LCTF test stand design involved stress analysis ranging from simple textbook-type problems to very complicated three-dimensional structural problems. Two areas of the design analysis are discussed.

  19. Analysis and optimization of blood-testing procedures.

    NARCIS (Netherlands)

    Bar-Lev, S.K.; Boxma, O.J.; Perry, D.; Vastazos, L.P.

    2017-01-01

    This paper is devoted to the performance analysis and optimization of blood-testing procedures. We present a queueing model of two queues in series, representing the two stages of a blood-testing procedure. Service (testing) in stage 1 is performed in batches, whereas it is done individually in stage 2.

  20. Acceptance Sampling Plans Based on Truncated Life Tests for Sushila Distribution

    Directory of Open Access Journals (Sweden)

    Amer Ibrahim Al-Omari

    2018-03-01

    Full Text Available An acceptance sampling plan problem based on truncated life tests, where the lifetime follows a Sushila distribution, is considered in this paper. For various acceptance numbers, confidence levels and values of the ratio between the fixed experiment time and the specified mean lifetime, the minimum sample sizes required to ascertain a specified mean life were found. The operating characteristic function values of the suggested sampling plans and the producer's risk are presented. Some tables are provided and the results are illustrated by an example with a real data set.
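    The minimum-sample-size search for such plans reduces to a binomial tail condition. The sketch below uses an exponential CDF as a stand-in for the Sushila distribution's F(t), which differs from the paper's actual model:

```python
from scipy import stats

def min_sample_size(cdf_at_t, c, confidence):
    """Smallest n such that accepting with <= c failures in a truncated
    life test still guarantees the stated consumer confidence:
    P(X <= c | n, p) <= 1 - confidence, where p = F(t; mu0)."""
    n = c + 1
    while stats.binom.cdf(c, n, cdf_at_t) > 1.0 - confidence:
        n += 1
    return n

# Illustration: exponential lifetime as a stand-in for the Sushila CDF,
# with test time t equal to half the specified mean life (t/mu0 = 0.5).
p = stats.expon.cdf(0.5)
print(min_sample_size(p, c=2, confidence=0.95))
```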

  1. Equilibrium quality and mass flux distributions in an adiabatic three-subchannel test section

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Maganas, A.

    1993-01-01

    An experiment was designed to measure the fully-developed quality and mass flux distributions in an adiabatic three-subchannel test section. The three subchannels had the geometrical characteristics of the corner, side, and interior subchannels of a BWR-5 rod bundle. Data collected with Refrigerant-144 at pressures ranging from 7 to 14 bar, simulating operation with water in the range 55 to 103 bar, are reported. The average mass flux and quality in the test section were in the ranges 1300 to 1750 kg/m²·s and -0.03 to 0.25, respectively. The data are analyzed and presented in various forms.

  2. Simulation-Based Testing of Distributed Systems

    National Research Council Canada - National Science Library

    Rutherford, Matthew J; Carzaniga, Antonio; Wolf, Alexander L

    2006-01-01

    .... Typically written using an imperative programming language, these simulations capture basic algorithmic functionality at the same time as they focus attention on properties critical to distribution...

  3. A realistic analysis of the phonon growth characteristics in a degenerate semiconductor using a simplified model of Fermi-Dirac distribution

    Science.gov (United States)

    Basu, A.; Das, B.; Middya, T. R.; Bhattacharya, D. P.

    2017-01-01

    The phonon growth characteristics in a degenerate semiconductor have been calculated under low-temperature conditions. If the lattice temperature is high, the energy of the intravalley acoustic phonon is negligibly small compared to the average thermal energy of the electrons. Hence one can traditionally assume the electron-phonon collisions to be elastic and approximate the Bose-Einstein (B.E.) distribution for the phonons by the simple equipartition law. However, in the present analysis at low lattice temperatures, the interaction of the non-equilibrium electrons with the acoustic phonons becomes inelastic and the simple equipartition law for the phonon distribution is not valid. Hence the analysis is made taking into account the inelastic collisions and the complete form of the B.E. distribution. The high-field distribution function of the carriers, given by the Fermi-Dirac (F.D.) function at the field-dependent carrier temperature, has been approximated by a well-tested model that apparently overcomes the intrinsic problem of correctly evaluating the integrals involving products and powers of the Fermi function. Hence the results thus obtained are more reliable than the rough estimates one may obtain by using the exact F.D. function while taking recourse to oversimplified approximations.

  4. Plant management tools tested with a small-scale distributed generation laboratory

    International Nuclear Information System (INIS)

    Ferrari, Mario L.; Traverso, Alberto; Pascenti, Matteo; Massardo, Aristide F.

    2014-01-01

    Highlights: • Innovative thermal grid layouts. • Experimental rig for distributed generation. • Real-time management tool. • Experimental results for plant management. • Comparison with results from a complete optimization software tool. - Abstract: Optimization of power generation with smart grids is an important issue for the extensive, sustainable development of distributed generation. Since an experimental approach is essential for implementing validated optimization software, the TPG research team of the University of Genoa has installed a laboratory facility for carrying out studies on polygeneration grids. The facility consists of two co-generation prime movers based on conventional technology: a 100 kWe gas turbine (mGT) and a 20 kWe internal combustion engine (ICE). The rig's high flexibility allows integration with renewable-source-based devices, such as biomass-fed boilers and solar panels. Special attention was devoted to the design of the thermal distribution grid. To ensure applicability in medium-to-large districts, composed of several buildings that include energy users, generators or both, an innovative layout based on two ring pipes was examined. Thermal storage devices were also included in order to have a complete hardware platform suitable for assessing the performance of different management tools. The test presented in this paper was carried out with both the mGT and the ICE connected to this innovative thermal grid, while users were emulated by means of fan coolers controlled by inverters. During this test the plant is controlled by a real-time model capable of calculating a machine performance ranking, which is necessary in order to split power demands between the prime movers (marginal cost decrease objective). A complete optimization tool devised by TPG (the ECoMP program) was also used in order to obtain theoretical results considering the same machines and load values. The data obtained with ECoMP were compared with the

  5. A Distributed Multi-dimensional SOLAP Model of Remote Sensing Data and Its Application in Drought Analysis

    Directory of Open Access Journals (Sweden)

    LI Jiyuan

    2014-06-01

    Full Text Available SOLAP (Spatial On-Line Analytical Processing) has recently been applied to multi-dimensional analysis of remote sensing data. However, its computational performance faces a considerable challenge from large-scale datasets. A geo-raster cube model extended by Map-Reduce is proposed, which applies Map-Reduce (a data-intensive computing paradigm) to the OLAP field. In this model, the existing methods are modified to suit a distributed environment based on multi-level raster tiles. Multi-dimensional map algebra is then introduced to decompose the SOLAP computation into multiple distributed, parallel map algebra functions on tiles with the support of Map-Reduce. Drought monitoring by remote sensing data is employed as a case study to illustrate the model construction and application. A prototype has been implemented, and performance testing shows the efficiency and scalability of the model.
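    The tile-level map step is embarrassingly parallel, which is what makes the Map-Reduce decomposition attractive. A single-machine sketch with a VCI-like drought index (an assumed example index) on synthetic tiles:

```python
import numpy as np
from multiprocessing import Pool

# Per-tile "map" function: computes a Vegetation Condition Index (VCI),
# used here as an assumed example of a drought index. Each tile is
# independent, which is what makes the Map step parallel.
def vci_tile(args):
    ndvi, ndvi_min, ndvi_max = args
    return (ndvi - ndvi_min) / (ndvi_max - ndvi_min + 1e-9) * 100.0

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    tiles = [(rng.random((256, 256)), 0.1, 0.9) for _ in range(8)]  # synthetic
    with Pool(processes=4) as pool:
        vci_tiles = pool.map(vci_tile, tiles)   # the distributed Map step
    print(len(vci_tiles), vci_tiles[0].shape)
```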

  6. 21 CFR 809.40 - Restrictions on the sale, distribution, and use of OTC test sample collection systems for drugs...

    Science.gov (United States)

    2010-04-01

    Restrictions on the sale, distribution, and use of OTC test sample collection systems for drugs of abuse testing (21 CFR 809.40, Food and Drugs). (a) Over-the-counter (OTC) test sample collection systems for drugs of abuse testing (§ 864.3260...

  7. Data Link Test and Analysis System/ATCRBS Transponder Test System Technical Reference

    Science.gov (United States)

    1990-05-01

    This document references material for personnel using or making software changes to the Data Link Test and Analysis System (DATAS) for Air Traffic Control Radar Beacon System (ATCRBS) transponder testing and data collection. This is one of a se...

  8. Development of neural network for analysis of local power distributions in BWR fuel bundles

    International Nuclear Information System (INIS)

    Tanabe, Akira; Yamamoto, Toru; Shinfuku, Kimihiro; Nakamae, Takuji.

    1993-01-01

    A neural network model has been developed to learn the local power distributions in a BWR fuel bundle. A two-layer neural network with 128 elements in total is used for this model. The neural network learns 33 cases of local power peaking factors of fuel rods with given enrichment distributions as the teacher signals, which were calculated by a fuel bundle nuclear analysis code based on precise physical models. The neural network learned the teacher signals to within 1% error. It is also able to calculate the local power distributions to within a few percent error for enrichment distributions different from the teacher signals, when the average enrichment is close to 2%. This neural network is simple, and the computing speed of this model is 300 times faster than that of the precise nuclear analysis code. The model was applied to survey enrichment distributions meeting a target local power distribution in a fuel bundle, and an enrichment distribution with a flat power shape was obtained within a short computing time. (author)
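    A toy version of such a network, mapping per-rod enrichments to local peaking factors, can be written in a few lines; sizes, data, and the training scheme are illustrative only:

```python
import numpy as np

# Two-layer network: enrichment pattern (per rod) -> peaking factors.
rng = np.random.default_rng(0)
n_rods, n_cases = 16, 33
X = 1.0 + rng.random((n_cases, n_rods))            # enrichment patterns (%)
true_W = rng.normal(scale=0.05, size=(n_rods, n_rods))
Y = 1.0 + X @ true_W                               # synthetic peaking factors

W1 = rng.normal(scale=0.1, size=(n_rods, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, n_rods)); b2 = np.zeros(n_rods)

lr = 0.01
for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)                       # hidden layer
    P = H @ W2 + b2                                # predicted peaking factors
    err = P - Y
    # Backpropagation through both layers.
    gW2 = H.T @ err / n_cases; gb2 = err.mean(0)
    dH = (err @ W2.T) * (1.0 - H**2)
    gW1 = X.T @ dH / n_cases; gb1 = dH.mean(0)
    W2 -= lr*gW2; b2 -= lr*gb2; W1 -= lr*gW1; b1 -= lr*gb1

print("max abs training error:", np.abs(P - Y).max())
```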

  9. Analysis of Mesh Distribution Systems Considering Load Models and Load Growth Impact with Loops on System Performance

    Science.gov (United States)

    Kumar Sharma, A.; Murty, V. V. S. N.

    2014-12-01

    The distribution system is the final link between the bulk power system and the consumer end. A distinctive load flow solution method, based on Kirchhoff's current law (KCL) and Kirchhoff's voltage law (KVL), is used for the load flow analysis of radial and weakly meshed networks. This method has excellent convergence characteristics for both radial and weakly meshed structures and is based on the bus-injection-to-branch-current and branch-current-to-bus-voltage matrices. The main contributions of the paper are: (i) an analysis of a weakly meshed network considering the number of loops added and their impact on the losses, the kW and kVAr requirements from the system, and the voltage profile; (ii) the impact of different load models, a realistic ZIP load model, and load growth on losses, voltage profile, and kVA and kVAr requirements; (iii) the impact of the addition of loops on losses, voltage profile, and kVA and kVAr requirements from the substation; and (iv) a comparison of system performance with the radial distribution system. Voltage stability is a major concern in the planning and operation of power systems. This paper also identifies the critical bus that is most sensitive to voltage collapse in radial distribution networks: the node having the minimum value of the voltage stability index is the most sensitive node. Voltage stability index values are computed for the meshed network with a number of loops added to the system. Results have been obtained for the IEEE 33- and 69-bus test systems, and also for the radial distribution system for comparison.
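    The abstract does not spell out which voltage stability index is used; one widely used index for radial feeders (in the style of Chakravorty and Das) is sketched below with illustrative per-unit data, so treat the paper's exact formula as potentially different:

```python
import numpy as np

# SI(m2) = |V1|^4 - 4*(P*x - Q*r)^2 - 4*(P*r + Q*x)*|V1|^2
# where (P, Q) is the total load fed through branch (r + jx) into node m2
# and V1 is the sending-end voltage. Smallest SI = closest to collapse.
def stability_index(V1, P, Q, r, x):
    return V1**4 - 4.0*(P*x - Q*r)**2 - 4.0*(P*r + Q*x)*V1**2

branches = [  # (V1, P, Q, r, x) per receiving node, illustrative per-unit data
    (1.00, 0.10, 0.06, 0.02, 0.04),
    (0.98, 0.25, 0.12, 0.03, 0.05),
    (0.96, 0.40, 0.20, 0.04, 0.06),
]
si = [stability_index(*b) for b in branches]
print("most sensitive node:", int(np.argmin(si)), np.round(si, 4))
```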

  10. Research on intelligent power distribution system for spacecraft

    Science.gov (United States)

    Xia, Xiaodong; Wu, Jianju

    2017-10-01

    The power distribution system (PDS) realizes the power distribution and management of the electrical loads of the whole spacecraft; it is directly related to the success or failure of the mission and hence is an important part of the spacecraft. In order to improve the reliability and the degree of intelligence of the PDS, and considering the function and composition of a spacecraft power distribution system, this paper systematically expounds the design principles and methods of an intelligent power distribution system based on solid-state power controllers (SSPCs), and additionally provides analysis and verification of the test data.

  11. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane I to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in I according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R2 into homogeneous components. The Poincare summation process, which consists in building au

  12. Decomposition and Projection Methods for Distributed Robustness Analysis of Interconnected Uncertain Systems

    DEFF Research Database (Denmark)

    Pakazad, Sina Khoshfetrat; Hansson, Anders; Andersen, Martin Skovgaard

    2013-01-01

    We consider a class of convex feasibility problems where the constraints that describe the feasible set are loosely coupled. These problems arise in robust stability analysis of large, weakly interconnected uncertain systems. To facilitate distributed implementation of robust stability analysis o...

  13. Flow analysis of HANARO flow simulated test facility

    International Nuclear Information System (INIS)

    Park, Yong-Chul; Cho, Yeong-Garp; Wu, Jong-Sub; Jun, Byung-Jin

    2002-01-01

    The HANARO, a multi-purpose research reactor of 30 MWth open-tank-in-pool type, has been under normal operation since its initial criticality in February 1995. Many experiments should be safely performed to promote the utilization of the HANARO. A flow simulated test facility is being developed for the endurance testing of reactivity control units over extended lifetimes and for the verification of the structural integrity of experimental facilities prior to loading in the HANARO. This test facility is composed of three major parts: a half-core structure assembly, a flow circulation system and a support system. The half-core structure assembly is composed of a plenum, a grid plate, core channels with flow tubes, a chimney and a dummy pool. The flow channels are to be fitted with flow orifices to simulate core channels. This test facility must reproduce flow characteristics similar to those of the HANARO. This paper, therefore, describes an analytical study of the flow behavior of the test facility. A computational flow analysis has been performed to verify the flow structure and similarity of this test facility, assuming that the flow rates and pressure differences of the core channel are constant. The shapes of the flow orifices were determined by trial and error based on the design requirements of the core channel. A computer analysis program with the standard k-ε turbulence model was applied to the three-dimensional analysis. The results of the flow simulation showed flow characteristics similar to those of the HANARO and satisfied the design requirements of this test facility. The shape of the flow orifices used in this numerical simulation can be adapted to manufacturing requirements. The flow rate and the pressure difference through the core channel proven by this simulation can be used as the design requirements of the flow system. The analysis results will be verified against the results of the flow test after construction of the flow system. (author)

  14. CMS distributed data analysis with CRAB3

    Science.gov (United States)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  15. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Lucchese, R.R.; Montuoro, R.; Grum-Grzhimailo, A.N.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    The analysis of molecular-frame photoelectron angular distributions (MFPADs) is discussed within the dipole approximation. The general expressions are reviewed and strategies for extracting the maximum amount of information from different types of experimental measurements are considered. The analysis of the N 1s photoionization of NO is given to illustrate the method.

  16. Nonlinear Analysis and Preliminary Testing Results of a Hybrid Wing Body Center Section Test Article

    Science.gov (United States)

    Przekop, Adam; Jegley, Dawn C.; Rouse, Marshall; Lovejoy, Andrew E.; Wu, Hsi-Yung T.

    2015-01-01

    A large test article was recently designed, analyzed, fabricated, and successfully tested up to the representative design ultimate loads to demonstrate that stiffened composite panels with through-the-thickness reinforcement are a viable option for the next generation large transport category aircraft, including non-conventional configurations such as the hybrid wing body. This paper focuses on finite element analysis and test data correlation of the hybrid wing body center section test article under mechanical, pressure and combined load conditions. Good agreement between predictive nonlinear finite element analysis and test data is found. Results indicate that a geometrically nonlinear analysis is needed to accurately capture the behavior of the non-circular pressurized and highly-stressed structure when the design approach permits local buckling.

  17. Landward Distribution of Wave Overtopping for Rubble Mound Breakwaters

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Burcharth, Hans F.

    2006-01-01

    Overtopping data from seven model test projects have been analysed with respect to the landward spatial distribution of the overtopping discharge. In total, more than 1000 overtopping tests have been analysed and a formula derived for the prediction of the landward distribution of overtopping behind rubble mound structures with a superstructure. The analysis led to the conclusion that although the overtopping discharge, for identical wave heights, decreases with increasing wave steepness, the maximum travel distance increases with increasing wave steepness.
  18. Testing the mutual information expansion of entropy with multivariate Gaussian distributions.

    Science.gov (United States)

    Goethe, Martin; Fita, Ignacio; Rubi, J Miguel

    2017-12-14

    The mutual information expansion (MIE) represents an approximation of the configurational entropy in terms of low-dimensional integrals. It is frequently employed to compute entropies from simulation data of large systems, such as macromolecules, for which brute-force evaluation of the full configurational integral is intractable. Here, we test the validity of MIE for systems consisting of more than m = 100 degrees of freedom (dofs). The dofs are distributed according to multivariate Gaussian distributions which were generated from protein structures using a variant of the anisotropic network model. For the Gaussian distributions, we have semi-analytical access to the configurational entropy as well as to all contributions of MIE. This allows us to accurately assess the validity of MIE for different situations. We find that MIE diverges for systems containing long-range correlations which means that the error of consecutive MIE approximations grows with the truncation order n for all tractable n ≪ m. This fact implies severe limitations on the applicability of MIE, which are discussed in the article. For systems with correlations that decay exponentially with distance, MIE represents an asymptotic expansion of entropy, where the first successive MIE approximations approach the exact entropy, while MIE also diverges for larger orders. In this case, MIE serves as a useful entropy expansion when truncated up to a specific truncation order which depends on the correlation length of the system.
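    For multivariate Gaussians both the exact entropy and every MIE term are analytic, which is what makes this test case tractable. A small sketch of the order-2 MIE against the exact value follows; the covariance is synthetic, not derived from a protein model:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a multivariate Gaussian with covariance cov."""
    m = cov.shape[0]
    return 0.5 * (m * np.log(2*np.pi*np.e) + np.linalg.slogdet(cov)[1])

rng = np.random.default_rng(5)
A = rng.standard_normal((20, 20))
cov = A @ A.T + 20*np.eye(20)          # SPD covariance, 20 dofs

H_exact = gaussian_entropy(cov)

# MIE truncated at order 2: sum of marginals minus pairwise mutual info.
H1 = sum(gaussian_entropy(cov[[i]][:, [i]]) for i in range(20))
I2 = 0.0
for i in range(20):
    for j in range(i+1, 20):
        sub = cov[np.ix_([i, j], [i, j])]
        I2 += (gaussian_entropy(cov[[i]][:, [i]]) +
               gaussian_entropy(cov[[j]][:, [j]]) - gaussian_entropy(sub))

print("exact:", H_exact, " MIE order 2:", H1 - I2)
```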

  19. Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgindy, Tarek [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rossol, Michael N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Vergara, Claudio [MIT; Domingo, Carlos Mateo [IIT Comillas; Postigo, Fernando [IIT Comillas; de Cuadra, Fernando [IIT Comillas; Gomez, Tomas [IIT Comillas; Duenas, Pablo [MIT; Luke, Max [MIT; Li, Vivian [MIT; Vinoth, Mohan [GE Grid Solutions; Kadankodu, Sree [GE Grid Solutions

    2017-08-09

    The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and co-ordination with GRID DATA repository teams to house these datasets for public access will also be discussed.

  20. Adaptive linear rank tests for eQTL studies.

    Science.gov (United States)

    Szymczak, Silke; Scheinhardt, Markus O; Zeller, Tanja; Wild, Philipp S; Blankenberg, Stefan; Ziegler, Andreas

    2013-02-10

    Expression quantitative trait loci (eQTL) studies are performed to identify single-nucleotide polymorphisms that modify average expression values of genes, proteins, or metabolites, depending on the genotype. As expression values are often not normally distributed, statistical methods for eQTL studies should be valid and powerful in these situations. Adaptive tests are promising alternatives to standard approaches, such as the analysis of variance or the Kruskal-Wallis test. In a two-stage procedure, skewness and tail length of the distributions are estimated and used to select one of several linear rank tests. In this study, we compare two adaptive tests that were proposed in the literature using extensive Monte Carlo simulations of a wide range of different symmetric and skewed distributions. We derive a new adaptive test that combines the advantages of both literature-based approaches. The new test does not require the user to specify a distribution. It is slightly less powerful than the locally most powerful rank test for the correct distribution and at least as powerful as the maximin efficiency robust rank test. We illustrate the application of all tests using two examples from different eQTL studies. Copyright © 2012 John Wiley & Sons, Ltd.
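    A simplified two-stage selector in the spirit of such adaptive tests is sketched below; the selector statistics and cut-offs are illustrative, not those proposed in the paper:

```python
import numpy as np
from scipy import stats

def adaptive_two_sample(x, y, skew_cut=1.0, kurt_cut=4.0):
    """Stage 1: estimate shape of the pooled sample. Stage 2: pick a test.
    Simplified stand-in for the adaptive schemes described above."""
    pooled = np.concatenate([x, y])
    skew = stats.skew(pooled)
    kurt = stats.kurtosis(pooled, fisher=False)
    if kurt > kurt_cut:                  # long tails -> median-type scores
        _, p, _, _ = stats.median_test(x, y)
        return "median", p
    if abs(skew) > skew_cut:             # skewed, short tails
        _, p = stats.ranksums(x, y)
        return "ranksums (skewed)", p
    _, p = stats.ranksums(x, y)          # near-symmetric, short tails
    return "ranksums", p

rng = np.random.default_rng(11)
print(adaptive_two_sample(rng.lognormal(size=50), rng.lognormal(0.5, size=50)))
```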

  1. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to service and network latencies, remote data access and a heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last two years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system must still be improved substantially to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.

  2. A computational approach to discovering the functions of bacterial phytochromes by analysis of homolog distributions

    Directory of Open Access Journals (Sweden)

    Lamparter Tilman

    2006-03-01

    Full Text Available Abstract Background Phytochromes are photoreceptors, discovered in plants, that control a wide variety of developmental processes. They have also been found in bacteria and fungi, but for many species their biological role remains obscure. This work concentrates on the phytochrome system of Agrobacterium tumefaciens, a non-photosynthetic soil bacterium with two phytochromes. To identify proteins that might share common functions with phytochromes, a co-distribution analysis was performed on the basis of protein sequences from 138 bacteria. Results A database of protein sequences from 138 bacteria was generated. Each sequence was BLASTed against the entire database. The homolog distribution of each query protein was then compared with the homolog distribution of every other protein (target protein of the same species, and the target proteins were sorted according to their probability of co-distribution under random conditions. As query proteins, phytochromes from Agrobacterium tumefaciens, Pseudomonas aeruginosa, Deinococcus radiodurans and Synechocystis PCC 6803 were chosen along with several phytochrome-related proteins from A. tumefaciens. The Synechocystis photosynthesis protein D1 was selected as a control. In the D1 analyses, the ratio between photosynthesis-related proteins and those not related to photosynthesis among the top 150 in the co-distribution tables was > 3:1, showing that the method is appropriate for finding partner proteins with common functions. The co-distribution of phytochromes with other histidine kinases was remarkably high, although most co-distributed histidine kinases were not direct BLAST homologs of the query protein. This finding implies that phytochromes and other histidine kinases share common functions as parts of signalling networks. All phytochromes tested, with one exception, also revealed a remarkably high co-distribution with glutamate synthase and methionine synthase. This result implies a general role of
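
    As a rough illustration of the record's co-distribution idea, the sketch below scores two homolog presence/absence profiles across 138 genomes with a hypergeometric tail probability. This is one plausible reading of "probability of co-distribution under random conditions"; the paper's exact statistic may differ.

        import numpy as np
        from scipy.stats import hypergeom

        def codistribution_pvalue(query_profile, target_profile):
            """Profiles are boolean arrays: homolog present in each genome."""
            n = len(query_profile)                  # e.g. 138 genomes
            k_query = int(query_profile.sum())      # genomes with a query homolog
            k_target = int(target_profile.sum())    # genomes with a target homolog
            overlap = int((query_profile & target_profile).sum())
            # P(overlap >= observed) if the two profiles were independent
            return hypergeom.sf(overlap - 1, n, k_query, k_target)

        rng = np.random.default_rng(0)
        query = rng.random(138) < 0.4
        target = query ^ (rng.random(138) < 0.1)    # mostly co-distributed
        print(codistribution_pvalue(query, target))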

  3. 78 FR 72534 - Policy Statement on the Principles for Development and Distribution of Annual Stress Test Scenarios

    Science.gov (United States)

    2013-12-03

    ... FEDERAL DEPOSIT INSURANCE CORPORATION 12 CFR Part 325 Policy Statement on the Principles for... stress test horizon. The variables specified for each scenario generally address economic activity, asset..., 2012, that articulated the principles the FDIC will apply to develop and distribute the stress test...

  4. Distributed temperature sensor testing in liquid sodium

    Energy Technology Data Exchange (ETDEWEB)

    Gerardi, Craig, E-mail: cgerardi@anl.gov; Bremer, Nathan; Lisowski, Darius; Lomperski, Stephen

    2017-02-15

    Highlights: • Distributed temperature sensors measured high-resolution liquid-sodium temperatures. • DTSs worked well up to 400 °C. • A single DTS simultaneously detected sodium level and temperature. - Abstract: Rayleigh-backscatter-based distributed fiber optic sensors were immersed in sodium to obtain high-resolution liquid-sodium temperature measurements. Distributed temperature sensors (DTSs) functioned well up to 400 °C in a liquid sodium environment. The DTSs measured sodium column temperature and the temperature of a complex geometrical pattern that leveraged the flexibility of fiber optics. A single Ø 360 μm OD sensor registered dozens of temperatures along a length of over one meter at 100 Hz. We also demonstrated the capability to use a single DTS to simultaneously detect thermal interfaces (e.g. sodium level) and measure temperature.

  5. Advances in the analysis of pressure interference tests

    Energy Technology Data Exchange (ETDEWEB)

    Martinez R, N. [Petroleos Mexicanos, PEMEX, Mexico City (Mexico); Samaniego V, F. [Univ. Nacional Autonoma de Mexico (Mexico)

    2010-12-15

    This paper presented an extension for radial, linear, and spherical flow conditions of the El-Khatib method for analyzing pressure interference tests through utilization of the pressure derivative. Conventional analysis of interference tests considers only radial flow, but some reservoirs have physical field conditions in which linear or spherical flow conditions prevail. The INTERFERAN system, a friendly computer code for the automatic analysis of pressure interference tests, was also discussed and demonstrated by way of 2 field cases. INTERFERAN relies on the principle of superposition in time and space to interpret a test of several wells with variable histories of production or injection or both. The first field case addressed interference tests conducted in the naturally fractured geothermal field of Klamath Falls, and the second field case was conducted in a river-formed bed in which linear flow conditions are dominant. The analysis was deemed to be reliable. 13 refs., 1 tab., 7 figs.

  6. High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM

    Science.gov (United States)

    Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.

    System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.

  7. Real-Time Analysis and Forecasting of Multisite River Flow Using a Distributed Hydrological Model

    Directory of Open Access Journals (Sweden)

    Mingdong Sun

    2014-01-01

    Full Text Available A spatially distributed hydrological forecasting system was developed to promote the analysis of river flow dynamics in a large basin. The research presents the real-time analysis and forecasting of multisite river flow in the Nakdong River Basin using a distributed hydrological model with radar rainfall forecast data. A real-time calibration algorithm for the distributed hydrological model was proposed to investigate the particular relationship between water storage and basin discharge. The approach simulates multisite river flow using a distributed hydrological model coupled with real-time calibration, and forecasts multisite river flow with radar rainfall forecast data. The hydrographs and results show that the calibrated flow simulations closely match the flow observations at all sites, and that the accuracy of the forecast flow gradually decreases as lead times extend from 1 hr to 3 hrs. The flow forecasts are lower than the flow observations, which is likely caused by underestimation in the radar rainfall forecasts. The research demonstrates that the distributed hydrological model is readily applicable to multisite real-time river flow analysis and forecasting in a large basin.

  8. NASA Langley Distributed Propulsion VTOL Tilt-Wing Aircraft Testing, Modeling, Simulation, Control, and Flight Test Development

    Science.gov (United States)

    Rothhaar, Paul M.; Murphy, Patrick C.; Bacon, Barton J.; Gregory, Irene M.; Grauer, Jared A.; Busan, Ronald C.; Croom, Mark A.

    2014-01-01

    Control of complex Vertical Take-Off and Landing (VTOL) aircraft traversing from hovering to wing-borne flight mode and back poses notoriously difficult modeling, simulation, control, and flight-testing challenges. This paper provides an overview of the techniques and advances required to develop the GL-10 tilt-wing, tilt-tail, long endurance, VTOL aircraft control system. The GL-10 prototype's unusual and complex configuration requires application of state-of-the-art techniques and some significant advances in wind tunnel infrastructure automation, efficient Design Of Experiments (DOE) tunnel test techniques, modeling, multi-body equations of motion, multi-body actuator models, simulation, control algorithm design, and flight test avionics, testing, and analysis. The following compendium surveys the key disciplines required to develop an effective control system for this challenging vehicle in this ongoing effort.

  9. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure utilizes a metric distance between cumulative distribution functions (cdfs). The measure is evaluated for two cases: one where the cdf is given by a known analytical distribution, and the other where it is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases, and the results are compared with those of three existing methods. The present approach is a useful measure of uncertainty importance because it is based on cdfs: it is simple, and uncertainty importance can be calculated without any complex process. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance.
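
    A minimal sketch of such a cdf-based importance measure, assuming the Kolmogorov (sup-norm) distance as the metric; the metric choice and the toy Monte Carlo outputs are illustrative assumptions, not the paper's exact formulation.

        import numpy as np

        def cdf_distance(base_outputs, conditional_outputs):
            """Sup-norm distance between two empirical CDFs."""
            grid = np.sort(np.concatenate([base_outputs, conditional_outputs]))
            f_base = np.searchsorted(np.sort(base_outputs), grid,
                                     side="right") / len(base_outputs)
            f_cond = np.searchsorted(np.sort(conditional_outputs), grid,
                                     side="right") / len(conditional_outputs)
            return np.max(np.abs(f_base - f_cond))

        rng = np.random.default_rng(1)
        base = rng.lognormal(0.0, 1.0, 10_000)      # crude Monte Carlo base case
        fixed_x1 = rng.lognormal(0.3, 0.8, 10_000)  # outputs with input X1 fixed
        print("importance of X1:", cdf_distance(base, fixed_x1))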

  10. Testing of the derivative method and Kruskal-Wallis technique for sensitivity analysis of SYVAC

    International Nuclear Information System (INIS)

    Prust, J.O.; Edwards, H.H.

    1985-04-01

    The Kruskal-Wallis method of one-way analysis of variance by ranks has proved successful in identifying input parameters which have an important influence on dose. This technique was extended to test for first order interactions between parameters. In view of a number of practical difficulties and the computing resources required to carry out a large number of runs, this test is not recommended for detecting interactions between parameters. The derivative method of sensitivity analysis examines the partial derivative values of each input parameter with dose at various points across the parameter range. Important input parameters are associated with high derivatives and the results agreed well with previous sensitivity studies. The derivative values also provided information on the data generation distributions to be used for the input parameters in order to concentrate sampling in the high dose region of the parameter space to improve the sampling efficiency. Furthermore, the derivative values provided information on parameter interactions, the feasibility of developing a high dose algorithm and formed the basis for developing a regression equation. (author)
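
    The Kruskal-Wallis screening step can be illustrated in a few lines: bin one sampled input parameter into equal-count groups and test whether the dose distributions differ across the bins. The variable names and the toy dose model below are illustrative assumptions, not SYVAC's.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        param = rng.uniform(0.0, 1.0, 1000)                    # sampled input parameter
        dose = np.exp(2.0 * param) + rng.normal(0, 0.5, 1000)  # toy dose response

        edges = np.quantile(param, [0.25, 0.5, 0.75])
        labels = np.digitize(param, edges)                     # four equal-count bins
        groups = [dose[labels == b] for b in range(4)]
        h, p = stats.kruskal(*groups)
        print(f"H = {h:.1f}, p = {p:.2e}")                     # small p: influential input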

  11. Linear-rank testing of a non-binary, responder-analysis, efficacy score to evaluate pharmacotherapies for substance use disorders.

    Science.gov (United States)

    Holmes, Tyson H; Li, Shou-Hua; McCann, David J

    2016-11-23

    The design of pharmacological trials for management of substance use disorders is shifting toward outcomes of successful individual-level behavior (abstinence or no heavy use). While binary success/failure analyses are common, McCann and Li (CNS Neurosci Ther 2012; 18: 414-418) introduced "number of beyond-threshold weeks of success" (NOBWOS) scores to avoid dichotomized outcomes. NOBWOS scoring employs an efficacy "hurdle," with values reflecting duration of success. Here, we evaluate NOBWOS scores rigorously. Formal analysis of the mathematical structure of NOBWOS scores is followed by simulation studies spanning diverse conditions to assess the operating characteristics of five linear-rank tests on NOBWOS scores. The simulations include an assessment of Fisher's exact test applied to the hurdle component. On average, statistical power was approximately equal for the five linear-rank tests. Under none of the conditions examined did Fisher's exact test exhibit greater statistical power than any of the linear-rank tests. These linear-rank tests provide good Type I and Type II error control for comparing distributions of NOBWOS scores between groups (e.g., active vs. placebo). All methods were applied in re-analyses of data from four clinical trials of differing lengths and substances of abuse. The linear-rank tests agreed across all trials in rejecting (or not) their null hypothesis (equality of distributions) at the 0.05 level. © The Author(s) 2016.
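
    An illustrative sketch of a NOBWOS-style analysis: subjects failing the efficacy hurdle score 0, others score the number of beyond-threshold weeks of success, and the groups are compared with a linear rank test. The hurdle rule below is an assumption for demonstration only, not McCann and Li's exact definition.

        import numpy as np
        from scipy import stats

        def nobwos_scores(weekly_success, hurdle_weeks=2):
            """weekly_success: (subjects x weeks) boolean array of weekly success."""
            passed = weekly_success[:, -hurdle_weeks:].all(axis=1)  # toy hurdle rule
            weeks = weekly_success.sum(axis=1)
            return np.where(passed, weeks, 0)

        rng = np.random.default_rng(3)
        active = nobwos_scores(rng.random((60, 12)) < 0.55)
        placebo = nobwos_scores(rng.random((60, 12)) < 0.35)
        u, p = stats.mannwhitneyu(active, placebo, alternative="greater")
        print(f"Wilcoxon-Mann-Whitney p = {p:.4f}")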

  12. Simplified distributed parameters BWR dynamic model for transient and stability analysis

    International Nuclear Information System (INIS)

    Espinosa-Paredes, Gilberto; Nunez-Carrera, Alejandro; Vazquez-Rodriguez, Alejandro

    2006-01-01

    This paper describes a simplified model for performing transient and linear stability analysis of a typical boiling water reactor (BWR). The simplified transient model is based on lumped- and distributed-parameter approximations and includes the vessel dome and downcomer, recirculation loops, neutron kinetics, fuel pin temperature distribution, lower and upper plenums, reactor core, and pressure and level controls. Stability was determined by studying the linearized versions of the equations representing the BWR system in the frequency domain. Numerical examples are used to illustrate the wide application of the simplified BWR model. We conclude that this simplified model properly describes the dynamics of a BWR and can be used for safety analysis or as a first approach in the design of an advanced BWR.

  13. Spherical Harmonic Analysis of Particle Velocity Distribution Function: Comparison of Moments and Anisotropies using Cluster Data

    Science.gov (United States)

    Gurgiolo, Chris; Vinas, Adolfo F.

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument, to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produces a robust physical representation of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained, from which the moments (up to the heat flux), anisotropies, and asymmetries of the velocity distribution function were calculated. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonics method. A comparison among the various methods shows that both the SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
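
    The expansion itself can be sketched as a least-squares fit of spherical harmonic coefficients to a measured angular distribution; with degrees l <= 2 this yields exactly nine basis functions, matching the minimum set mentioned above. The toy distribution and the use of real/imaginary parts as a real basis are illustrative assumptions.

        import numpy as np
        from scipy.special import sph_harm  # sph_harm(m, l, azimuth, polar)

        rng = np.random.default_rng(4)
        theta = rng.uniform(0, 2 * np.pi, 500)    # azimuthal angles of samples
        phi = np.arccos(rng.uniform(-1, 1, 500))  # polar angles of samples
        f = 1.0 + 0.5 * np.cos(phi) + 0.1 * rng.normal(size=500)  # toy distribution

        cols = []  # design matrix of real-valued harmonics up to l = 2 (9 columns)
        for l in range(3):
            for m in range(-l, l + 1):
                y = sph_harm(m, l, theta, phi)
                cols.append(y.real if m >= 0 else y.imag)
        design = np.column_stack(cols)
        coeffs, *_ = np.linalg.lstsq(design, f, rcond=None)
        print(coeffs.round(3))                    # nine spectral coefficients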

  14. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  15. Synchronous Design and Test of Distributed Passive Radar Systems Based on Digital Broadcasting and Television

    Directory of Open Access Journals (Sweden)

    Wan Xianrong

    2017-02-01

    Full Text Available Digital broadcasting and television are important classes of illuminators of opportunity for passive radars. Distributed and multistatic structures are the development trend for passive radars. Most modern digital broadcasting and television systems work on a network, which not only provides a natural setting for distributed passive radar but also places higher requirements on the design of passive radar systems. Among those requirements, precise synchronization among the receivers and transmitters, as well as among multiple receiving stations (mainly frequency and time synchronization), is the first to be solved. To satisfy the synchronization requirements of distributed passive radars, a synchronization scheme based on GPS is presented in this paper. Moreover, an effective scheme based on the China Mobile Multimedia Broadcasting signal is proposed to test the system synchronization performance. Finally, the reliability of the synchronization design is verified via distributed multistatic passive radar experiments.

  16. Size distribution measurements and chemical analysis of aerosol components

    Energy Technology Data Exchange (ETDEWEB)

    Pakkanen, T.A.

    1995-12-31

    The principal aims of this work were to improve the existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing the measured size distributions of various aerosol components. A sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis was found to result in low blank values and good recoveries for several elements in atmospheric fine particle size fractions below 2 μm of equivalent aerodynamic particle diameter (EAD). Furthermore, it turned out that a substantial fraction of the analytes associated with insoluble material could be recovered, since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated bimodal size distributions for most components measured. The existence of the fine particle mode suggests that a substantial fraction of the elements with bimodal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine particle modes and one or two coarse particle modes. Atmospheric relative humidity values higher than 80% resulted in a significant increase of the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions determined, and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse particle sea salt, but reactions and/or adsorption of nitric acid with soil-derived particles also occurred. Chloride was depleted when acidic species reacted

  17. Bell Test over Extremely High-Loss Channels: Towards Distributing Entangled Photon Pairs between Earth and the Moon

    Science.gov (United States)

    Cao, Yuan; Li, Yu-Huai; Zou, Wen-Jie; Li, Zheng-Ping; Shen, Qi; Liao, Sheng-Kai; Ren, Ji-Gang; Yin, Juan; Chen, Yu-Ao; Peng, Cheng-Zhi; Pan, Jian-Wei

    2018-04-01

    Quantum entanglement was termed "spooky action at a distance" in the well-known paper by Einstein, Podolsky, and Rosen. Entanglement is expected to be distributed over longer and longer distances in both practical applications and fundamental research into the principles of nature. Here, we present a proposal for distributing entangled photon pairs between Earth and the Moon using a Lagrangian point at a distance of 1.28 light seconds. One of the most fascinating features of this long-distance distribution of entanglement is as follows: one can perform the Bell test with humans supplying the random measurement settings and recording the results while still maintaining spacelike intervals. To realize a proof-of-principle experiment, we developed an entangled photon source with a 1 GHz generation rate, about 2 orders of magnitude higher than previous results. Violation of Bell's inequality was observed under a total simulated loss of 103 dB, with measurement settings chosen by two experimenters. This demonstrates the feasibility of such a long-distance Bell test over extremely high-loss channels, paving the way for one of the ultimate tests of the foundations of quantum mechanics.
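
    For reference, the violation being tested has a compact form. In the CHSH version of Bell's inequality, the quantum prediction for the singlet state is E(a, b) = -cos(a - b), and the canonical settings below give |S| = 2*sqrt(2) > 2; the sketch simply evaluates this textbook arithmetic.

        import numpy as np

        def E(a, b):
            return -np.cos(a - b)          # singlet-state correlation

        a1, a2 = 0.0, np.pi / 2            # Alice's two measurement settings
        b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement settings
        S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
        print(abs(S), "vs classical bound 2")  # ~2.828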

  18. Data synthesis and display programs for wave distribution function analysis

    Science.gov (United States)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  19. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i) programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii) they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i) and (ii) for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.

  20. Westinghouse-Gothic comparisons with passive containment cooling tests using a one-to-ten-scale test facility

    International Nuclear Information System (INIS)

    Kennedy, M.D.; Woodcock, J.; Wright, R.F.; Gresham, J.A.

    1996-01-01

    The Heavy Water Reactor Facility is equipped with a passive cooling system to provide long-term decay heat removal during postulated beyond-design-basis accidents. The passive containment cooling system (PCCS) consists of an annular space between the steel containment vessel and the concrete shield building, with optimized inlet and chimney designs. The design, analysis, and regulatory acceptance of a plant with a PCCS require an understanding of the external convective and radiative heat transfer phenomena, as well as the internal distributions of noncondensable gases. The internal distribution of noncondensable gases has a strong effect on the resistance to condensation heat transfer and therefore affects the wall temperature distribution applied to the external channel. To evaluate these phenomena, a test facility having a scale of approximately one to ten, known as the large-scale test, was constructed, and several series of tests were performed. The test results have been used to validate the Westinghouse-GOTHIC (WGOTHIC) computer code. A comparison of WGOTHIC predictions and test results has been completed. This paper shows that mixed-convection models applied to the interior and exterior surfaces, together with a heat and mass transfer analogy for internal condensation, provide good agreement with the test results. An axial distribution of noncondensables within the test vessel is also predicted.

  1. Multivariate data analysis as a semi-quantitative tool for interpretive evaluation of comparability or equivalence of aerodynamic particle size distribution profiles.

    Science.gov (United States)

    Shi, Shuai; Hickey, Anthony J

    2009-01-01

    The purpose of this article is to investigate the performance of multivariate data analysis, especially orthogonal partial least square (OPLS) analysis, as a semi-quantitative tool to evaluate the comparability or equivalence of aerodynamic particle size distribution (APSD) profiles of orally inhaled and nasal drug products (OINDP). Monte Carlo simulation was employed to reconstitute APSD profiles based on 55 realistic scenarios proposed by the Product Quality Research Institute (PQRI) working group. OPLS analyses with different data pretreatment methods were performed on each of the reconstituted profiles. Compared to unit-variance scaling, equivalence determined based on OPLS analysis with Pareto scaling was shown to be more consistent with the working group assessment. A chi-square statistic was employed to compare the performance of OPLS analysis (Pareto scaling) with that of the combination test (i.e., chi-square ratio statistics and the population bioequivalence test for impactor-sized mass) in terms of achieving greater consistency with the working group evaluation. A p value of 0.036 suggested that OPLS analysis with Pareto scaling may be more predictive than the combination test with respect to consistency. Furthermore, OPLS analysis may also be employed to analyze the part of the APSD profiles that contributes to the calculation of the mass median aerodynamic diameter. Our results show that OPLS analysis performed on partial deposition sites does not interfere with the performance on all deposition sites.
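
    The scaling step at the heart of the comparison is simple to state. Below is a minimal sketch of Pareto scaling (mean-centre, divide by the square root of the standard deviation) against unit-variance scaling, applied to a toy matrix of APSD profiles (rows = profiles, columns = deposition sites); the data are invented for illustration.

        import numpy as np

        def pareto_scale(X):
            return (X - X.mean(axis=0)) / np.sqrt(X.std(axis=0, ddof=1))

        def unit_variance_scale(X):
            return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

        rng = np.random.default_rng(5)
        profiles = np.abs(rng.normal(10, 3, size=(24, 8)))  # toy deposition data
        # after Pareto scaling each column keeps sd = sqrt(original sd), so
        # high-variance sites are down-weighted less severely than with
        # unit-variance scaling
        print(pareto_scale(profiles).std(axis=0, ddof=1).round(2))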

  2. Non-parametric comparison of histogrammed two-dimensional data distributions using the Energy Test

    International Nuclear Information System (INIS)

    Reid, Ivan D; Lopes, Raul H C; Hobson, Peter R

    2012-01-01

    When monitoring complex experiments, comparison is often made between regularly acquired histograms of data and reference histograms which represent the ideal state of the equipment. With the larger HEP experiments now ramping up, there is a need for automation of this task since the volume of comparisons could overwhelm human operators. However, the two-dimensional histogram comparison tools available in ROOT have been noted in the past to exhibit shortcomings. We discuss a newer comparison test for two-dimensional histograms, based on the Energy Test of Aslan and Zech, which provides more conclusive discrimination between histograms of data coming from different distributions than methods provided in a recent ROOT release.
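
    A sketch of the two-sample energy statistic in the Aslan-Zech spirit, using the logarithmic potential R(r) = -ln(r) with a small cutoff to guard r -> 0; histogram bins with non-zero content can be treated as (weighted) points. Normalisation conventions vary, so this is illustrative rather than a drop-in replacement for the published test.

        import numpy as np
        from scipy.spatial.distance import cdist, pdist

        def energy_statistic(A, B, eps=1e-9):
            """A, B: (n, 2) arrays of 2D points drawn from the two histograms."""
            phi_aa = -np.log(pdist(A) + eps).sum() / (len(A) ** 2)
            phi_bb = -np.log(pdist(B) + eps).sum() / (len(B) ** 2)
            phi_ab = -np.log(cdist(A, B) + eps).sum() / (len(A) * len(B))
            return phi_aa + phi_bb - phi_ab   # minimal when the samples agree

        rng = np.random.default_rng(6)
        same = energy_statistic(rng.normal(size=(500, 2)),
                                rng.normal(size=(500, 2)))
        diff = energy_statistic(rng.normal(size=(500, 2)),
                                rng.normal(0.5, 1.0, size=(500, 2)))
        print(f"same: {same:.4f}  different: {diff:.4f}")  # larger when different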

  3. Equilibrium quality and mass flux distributions in an adiabatic three-subchannel test section

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Maganas, A.

    1995-01-01

    An experiment was designed to measure the fully developed quality and mass flux distributions in an adiabatic three-subchannel test section. The three subchannels had the geometrical characteristics of the corner, side, and interior subchannels of a boiling water reactor (BWR-5) rod bundle. Data collected with Refrigerant-114 at pressures ranging from 7 to 14 bars, simulating operation with water in the range 55 to 103 bars, are reported. The average mass flux and quality in the test section were in the ranges 1,300 to 1,750 kg/m²·s and -0.03 to 0.25, respectively. The data are analyzed and presented in various forms.

  4. Leakage localisation method in a water distribution system based on sensitivity matrix: methodology and real test

    OpenAIRE

    Pascual Pañach, Josep

    2010-01-01

    Leaks are present in all water distribution systems. In this paper, a method for leakage detection and localisation is presented. It uses pressure measurements and simulation models. The leakage localisation methodology is based on a pressure sensitivity matrix. The sensitivity is normalised and binarised using a common threshold for all nodes, so a signature matrix is obtained. An optimal pressure-sensor placement methodology is also developed, but it is not used in the real test. To validate this...
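
    A toy sketch of the signature-matching idea: binarise the normalised sensitivity matrix with a common threshold, binarise the observed pressure residuals the same way, and pick the candidate node whose signature agrees best. The matrices, threshold, and matching rule are assumptions for illustration.

        import numpy as np

        def localise_leak(sensitivity, residuals, threshold=0.5):
            sig = sensitivity / np.abs(sensitivity).max(axis=0) > threshold
            obs = residuals / np.abs(residuals).max() > threshold
            matches = (sig == obs[:, None]).sum(axis=0)  # agreement per node
            return int(np.argmax(matches))               # best-matching node

        rng = np.random.default_rng(7)
        S = rng.random((20, 50))       # 20 pressure sensors, 50 candidate nodes
        leak_node = 17
        r = S[:, leak_node] + rng.normal(0, 0.05, 20)  # measured minus simulated
        print("localised at node", localise_leak(S, r))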

  5. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation with sampled, directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. A known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, those methods can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method of the cross sections with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  6. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation with sampled, directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. A known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, those methods can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method of the cross sections with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
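
    The lognormal sampling idea admits a compact moment-matched sketch: the lognormal parameters are chosen so that the sampled mean and standard deviation reproduce the evaluated cross section and its uncertainty, and by construction no negative values can be drawn. The numbers below are invented for illustration.

        import numpy as np

        def sample_cross_section(mean, std, size, rng):
            sigma2 = np.log(1.0 + (std / mean) ** 2)   # matches the variance
            mu = np.log(mean) - 0.5 * sigma2           # matches the mean
            return rng.lognormal(mu, np.sqrt(sigma2), size)

        rng = np.random.default_rng(8)
        samples = sample_cross_section(mean=2.0, std=0.8, size=100_000, rng=rng)
        print(samples.mean(), samples.std(), samples.min() > 0.0)  # ~2.0, ~0.8, True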

  7. Experimental evaluation of wall Mach number distributions of the octagonal test section proposed for NASA Lewis Research Center's altitude wind tunnel

    Science.gov (United States)

    Harrington, Douglas E.; Burley, Richard R.; Corban, Robert R.

    1986-01-01

    Wall Mach number distributions were determined over a range of test-section free-stream Mach numbers from 0.2 to 0.92. The test section was slotted and had a nominal porosity of 11 percent. Reentry flaps located at the test-section exit were varied from 0 (fully closed) to 9 (fully open) degrees. Flow was bled through the test-section slots by means of a plenum evacuation system (PES) and varied from 0 to 3 percent of tunnel flow. Variations in reentry flap angle or PES flow rate had little or no effect on the Mach number distributions in the first 70 percent of the test section. However, in the aft region of the test section, flap angle and PES flow rate had a major impact on the Mach number distributions. Optimum PES flow rates were nominally 2 to 2.5 percent with the flaps fully closed and less than 1 percent when the flaps were fully open. The standard deviation of the test-section wall Mach numbers at the optimum PES flow rates was 0.003 or less.

  8. Performance Analysis of the Consensus-Based Distributed LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Gonzalo Mateos

    2009-01-01

    Full Text Available Low-cost estimation of stationary signals and reduced-complexity tracking of nonstationary processes are well-motivated tasks that can be accomplished using ad hoc wireless sensor networks (WSNs). To this end, a fully distributed least mean-square (D-LMS) algorithm is developed in this paper, in which sensors exchange messages with single-hop neighbors to reach consensus on the network-wide estimates adaptively. The novel approach does not require a Hamiltonian cycle or a special bridge subset of sensors, while communications among sensors are allowed to be noisy. A mean-square error (MSE) performance analysis of D-LMS is conducted in the presence of a time-varying parameter vector, which adheres to a first-order autoregressive model. For sensor observations that are related to the parameter vector of interest via a linear Gaussian model, and after adopting simplifying independence assumptions, exact closed-form expressions are derived for the global and sensor-level MSE evolution as well as its steady-state (s.s.) values. Mean and MSE-sense stability of D-LMS are also established. Interestingly, extensive numerical tests demonstrate that for small step-sizes the results accurately extend to the pragmatic setting whereby sensors acquire temporally correlated, not necessarily Gaussian data.
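
    A toy sketch of a consensus-based distributed LMS update: each sensor takes a local LMS step on its own data and is additionally pulled toward its single-hop neighbours' estimates. The ring topology, step size, and consensus gain are illustrative assumptions rather than the paper's tuned values.

        import numpy as np

        rng = np.random.default_rng(9)
        p, n_sensors, mu, c = 4, 10, 0.02, 0.5
        w_true = rng.normal(size=p)
        w = np.zeros((n_sensors, p))                  # per-sensor estimates
        neighbours = [((k - 1) % n_sensors, (k + 1) % n_sensors)
                      for k in range(n_sensors)]      # ring topology

        for t in range(2000):
            for k in range(n_sensors):
                h = rng.normal(size=p)                # local regressor
                x = h @ w_true + 0.1 * rng.normal()   # noisy local observation
                err = x - h @ w[k]
                consensus = sum(w[j] - w[k] for j in neighbours[k])
                w[k] = w[k] + mu * err * h + mu * c * consensus

        print(np.linalg.norm(w - w_true, axis=1).round(3))  # per-sensor error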

  9. A shift from significance test to hypothesis test through power analysis in medical research.

    Science.gov (United States)

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or nonsignificant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and conclude in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
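
    The power-analysis step that links the two schools is routine to compute. A short sketch with statsmodels: fixing alpha (type I error) and the desired power (one minus the type II error) yields the required sample size for a two-sample t-test; the effect size is an assumed value.

        from statsmodels.stats.power import TTestIndPower

        analysis = TTestIndPower()
        n_per_group = analysis.solve_power(effect_size=0.5,  # Cohen's d (assumed)
                                           alpha=0.05, power=0.8,
                                           alternative="two-sided")
        print(f"required sample size per group: {n_per_group:.0f}")  # ~64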

  10. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    Science.gov (United States)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. The ET0 process is affected by several meteorological parameters, such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting the annual ET0 and its four effective parameters. The RMSE results showed that the abilities of the PPCC test and the L-moment method for regional analysis of reference evapotranspiration and its effective parameters were similar. The results also showed that the distribution types of the parameters which affect ET0 values can affect the distribution of reference evapotranspiration.
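
    A minimal sketch of a PPCC-style check: fit the Pearson type III distribution to an annual ET0 series and take the probability-plot correlation coefficient as the goodness-of-fit score. This is one plausible reading of the test above; the synthetic series is an assumption.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        et0 = stats.pearson3.rvs(skew=0.8, loc=1200, scale=150, size=55,
                                 random_state=rng)  # toy annual ET0 series (mm)

        skew_hat, loc_hat, scale_hat = stats.pearson3.fit(et0)
        _, (slope, intercept, ppcc) = stats.probplot(
            et0, dist=stats.pearson3(skew_hat))
        print(f"PPCC for the Pearson III fit: {ppcc:.4f}")  # near 1: good fit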

  11. Quantitative analysis of the fission product distribution in a damaged fuel assembly using gamma-spectrometry and computed tomography for the Phébus FPT3 test

    International Nuclear Information System (INIS)

    Biard, B.

    2013-01-01

    (locate and identify the materials and estimate their density with the X-ray tomograms, locate the FP distribution inside the bundle with the gamma emission tomograms) and to automate the processing of the gamma spectra acquired. The specificities of these gamma spectra (high count rate, number of gamma rays, number of measurements, etc.) required in particular to analyse key lines only and needed an original counting loss correction. The method was validated over the pre-test examination of the fuel bundle, through a comparison with the classical gamma analysis method used at the laboratory for objects of known geometry. The final results, given with acceptable uncertainties, gave for all FPs identified (mainly ¹³⁷Cs, ¹³¹I, ¹³²Te, ¹⁴⁰Ba, ⁹⁵Zr, ¹⁰³Ru, etc.) their quantitative activity profile along the bundle, their retained and released fractions in the bundle, and also some information about their relocation inside the bundle. The results are in very good agreement with other Phébus FPT3 measurements and inventory calculations.

  12. Quantitative analysis of the fission product distribution in a damaged fuel assembly using gamma-spectrometry and computed tomography for the Phébus FPT3 test

    Energy Technology Data Exchange (ETDEWEB)

    Biard, B., E-mail: bruno.biard@irsn.fr

    2013-09-15

    (locate and identify the materials and estimate their density with the X-ray tomograms, locate the FP distribution inside the bundle with the gamma emission tomograms) and to automate the processing of the gamma spectra acquired. The specificities of these gamma spectra (high count rate, number of gamma rays, number of measurements, etc.) required in particular to analyse key lines only and needed an original counting loss correction. The method was validated over the pre-test examination of the fuel bundle, through a comparison with the classical gamma analysis method used at the laboratory for objects of known geometry. The final results, given with acceptable uncertainties, gave for all FPs identified (mainly ¹³⁷Cs, ¹³¹I, ¹³²Te, ¹⁴⁰Ba, ⁹⁵Zr, ¹⁰³Ru, etc.) their quantitative activity profile along the bundle, their retained and released fractions in the bundle, and also some information about their relocation inside the bundle. The results are in very good agreement with other Phébus FPT3 measurements and inventory calculations.

  13. AspectKE*: Security Aspects with Program Analysis for Distributed Systems

    DEFF Research Database (Denmark)

    2010-01-01

    AspectKE* is the first distributed AOP language based on a tuple space system. It is designed to enforce security policies on applications containing untrusted processes. One of the key features is the high-level predicates that extract results of static program analysis. These predicates provide

  14. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  15. 3D DEM simulation and analysis of void fraction distribution in a pebble bed high temperature reactor

    International Nuclear Information System (INIS)

    Yang, Xingtuan; Gui, Nan; Tu, Jiyuan; Jiang, Shengyao

    2014-01-01

    Highlights: • We show a detailed analysis of void fraction (VF) in HTR-10 of China using DEM. • Radial distribution (RD) of VF is uniform in the core and oscillated near the wall. • Axial distribution (AD) is linearly varied along height due to effect of gravity. • Steady RD of VF in the conical base is Gaussian-like, larger than packing bed. • Joint linear and normal distribution of VF is analyzed and explained. - Abstract: The current work analyzes the radial and axial distributions of void fraction of a pebble bed high temperature reactor. A three-dimensional pebble bed corresponding to our test facility of pebble bed type gas-cooled high temperature reactor (HTR-10) in Tsinghua University is simulated via discrete element method, and the radial and axial void fraction profiles are calculated. It validates the oscillating characteristics of radial void fraction near the wall. Detailed calculations show the differences of void fraction profiles between the stationary packing bed and the dynamically discharging bed. Based on the vertically and circumferentially averaged radial distribution and horizontally averaged axial distribution of void fraction, a fully three-dimensional analytical distribution of void fraction throughout the bed is established. The results show the combined effects of gravity and void variation in the pebble bed caused by the pebble discharging. It indicates the linearly increased packing effect caused by gravity in the vertical (axial) direction and the normal distribution of void in the horizontal (radial) direction by pebble drainage. These two effects coexist in the conical base of the bed whereas only the former effect exists in the cylindrical volume of the bed.

  16. Best Statistical Distribution of flood variables for Johor River in Malaysia

    Science.gov (United States)

    Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.

    2012-12-01

    A complex flood event is always characterized by a few characteristics, such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distributions of peakflow, flood duration and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years. The data were analysed based on the water year (July - June). Five distributions, namely Log Normal, Generalized Pareto, Log Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distribution of all three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distributions of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution was found to be the most suitable model when tested with the Anderson-Darling test, while the Kolmogorov-Smirnov test suggested that GEV is best for peakflow. The results of this research can be used to improve flood frequency analysis. A figure compares the Generalized Extreme Value, Generalized Pareto and Log Pearson distributions with the cumulative distribution function of peakflow.
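
    The model-comparison step can be sketched directly: fit each candidate distribution to the peakflow series and rank the fits with the Kolmogorov-Smirnov statistic. scipy's anderson() covers only a few families, so KS is used for all candidates here; Log Pearson would be handled by fitting Pearson III to log-flows and is omitted for brevity. The synthetic series is an assumption.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        peakflow = stats.genextreme.rvs(c=-0.1, loc=300, scale=80, size=45,
                                        random_state=rng)  # toy annual peaks (m3/s)

        candidates = {"GEV": stats.genextreme, "GPD": stats.genpareto,
                      "LogNormal": stats.lognorm, "Normal": stats.norm}
        for name, dist in candidates.items():
            params = dist.fit(peakflow)
            d, p = stats.kstest(peakflow, dist.cdf, args=params)
            print(f"{name:9s} KS D = {d:.3f}, p = {p:.2f}")  # smaller D: better fit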

  17. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  18. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  19. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    Science.gov (United States)

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated, and its significance is used to detect outliers. A contamination event alarm is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
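
    A hedged sketch of the detection pipeline described above: coiflet DWT features, a PCA approximation, and an outlier score from Hotelling's T². The wavelet level, component count, and toy spectra are illustrative assumptions; the sequential Bayesian alarm stage is omitted.

        import numpy as np
        import pywt
        from sklearn.decomposition import PCA

        def dwt_features(spectra, wavelet="coif1", level=3):
            return np.asarray([np.concatenate(pywt.wavedec(s, wavelet, level=level))
                               for s in spectra])

        def hotelling_t2(train_spectra, test_spectra, n_components=5):
            pca = PCA(n_components=n_components).fit(dwt_features(train_spectra))
            scores = pca.transform(dwt_features(test_spectra))
            return np.sum(scores ** 2 / pca.explained_variance_, axis=1)

        rng = np.random.default_rng(12)
        clean = rng.normal(0.2, 0.01, size=(200, 256))   # baseline UV spectra
        polluted = clean[:5] + 0.05 * np.sin(np.linspace(0, 6, 256))
        print(hotelling_t2(clean, polluted))             # large T2 values: alarm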

  20. A more powerful test based on ratio distribution for retention noninferiority hypothesis.

    Science.gov (United States)

    Deng, Ling; Chen, Gang

    2013-03-11

    Rothmann et al. (2003) proposed a method for the statistical inference of the fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect versus the control effect in the context of a time-to-event endpoint. One of the major concerns in using this method in the design of an NI trial is that, with a limited sample size, the power of the study is usually very low. This can make an NI trial impracticable, particularly when using a time-to-event endpoint. To improve power, Wang et al. (2006) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under the null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption used in Rothmann's test, that the observed control effect is always positive, that is, that the observed hazard ratio for placebo over the control is greater than 1, is no longer necessary. Without assuming equal variance under the null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced by using the proposed ratio test for a fraction retention NI hypothesis.

  1. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    Science.gov (United States)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows (billion rows) of PanSTARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.

  2. Field tests applying multi-agent technology for distributed control. Virtual power plants and wind energy

    Energy Technology Data Exchange (ETDEWEB)

    Schaeffer, G.J.; Warmer, C.J.; Hommelberg, M.P.F.; Kamphuis, I.G.; Kok, J.K. [Energy in the Built Environment and Networks, Petten (Netherlands)

    2007-01-15

    Multi-agent technology is state-of-the-art ICT. It is not yet widely applied in power control systems. However, it has a large potential for bottom-up, distributed control of a network with large-scale renewable energy sources (RES) and distributed energy resources (DER) in future power systems. At least two major European R and D projects (MicroGrids and CRISP) have investigated its potential. Both grid-related and market-related applications have been studied. This paper focuses on two field tests, performed in the Netherlands, applying multi-agent control by means of the PowerMatcher concept. The first field test focuses on the application of multi-agent technology in a commercial setting, i.e. reducing the need for balancing power in the case of intermittent energy sources such as wind energy. In this case, the flexibility of demand and supply of industrial and residential consumers and producers is used. Imbalance reduction rates of over 40% have been achieved applying the PowerMatcher, and with a proper portfolio even larger rates are expected. In the second field test, the multi-agent technology is used in the design and implementation of a virtual power plant (VPP). This VPP digitally connects a number of micro-CHP units, installed in residential dwellings, into a cluster that is controlled to reduce the local peak demand of the common low-voltage grid segment the micro-CHP units are connected to. In this way, the VPP supports the local distribution system operator (DSO) in deferring reinforcements in the grid infrastructure (substations and cables).

  3. Field tests applying multi-agent technology for distributed control. Virtual power plants and wind energy

    International Nuclear Information System (INIS)

    Schaeffer, G.J.; Warmer, C.J.; Hommelberg, M.P.F.; Kamphuis, I.G.; Kok, J.K.

    2007-01-01

    Multi-agent technology is state-of-the-art ICT. It is not yet widely applied in power control systems, but it has a large potential for bottom-up, distributed control of networks with large-scale renewable energy sources (RES) and distributed energy resources (DER) in future power systems. At least two major European R&D projects (MicroGrids and CRISP) have investigated its potential, covering both grid-related and market-related applications. This paper focuses on two field tests, performed in the Netherlands, applying multi-agent control by means of the PowerMatcher concept. The first field test concerns the application of multi-agent technology in a commercial setting, i.e. reducing the need for balancing power in the presence of intermittent energy sources such as wind energy. In this case, the flexibility of demand and supply of industrial and residential consumers and producers is used. Imbalance reduction rates of over 40% have been achieved with the PowerMatcher, and even larger rates are expected with a proper portfolio. In the second field test, the multi-agent technology is used in the design and implementation of a virtual power plant (VPP). This VPP digitally connects a number of micro-CHP units, installed in residential dwellings, into a cluster that is controlled to reduce the local peak demand of the common low-voltage grid segment to which the micro-CHP units are connected. In this way the VPP supports the local distribution system operator (DSO) in deferring reinforcements of the grid infrastructure (substations and cables).
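    As an illustration of the PowerMatcher concept described in the two records above (our toy reconstruction of the general idea, not the actual PowerMatcher implementation), device agents can be thought of as submitting bid curves, power as a function of price, which an auctioneer aggregates and clears:

```python
# Toy sketch of market-based multi-agent coordination: each device agent
# submits a bid curve; the auctioneer sums the bids and picks the price
# where the cluster's net power is closest to zero. All curves and
# numbers are invented for illustration.
import numpy as np

prices = np.linspace(0.0, 1.0, 101)            # normalized price axis

def heatpump_bid(p):       # flexible load: demands less as price rises
    return np.clip(2.0 - 3.0 * p, 0.0, 2.0)    # kW demanded

def chp_bid(p):            # micro-CHP: generates more as price rises
    return -np.clip(4.0 * p - 1.0, 0.0, 3.0)   # negative = supply

net = heatpump_bid(prices) + chp_bid(prices)
clearing = prices[np.argmin(np.abs(net))]      # price balancing the cluster
print(f"clearing price ~ {clearing:.2f}")
```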

  4. Shell model test of the Porter-Thomas distribution

    International Nuclear Information System (INIS)

    Grimes, S.M.; Bloom, S.D.

    1981-01-01

    Eigenvectors have been calculated for the A=18, 19, 20, 21, and 26 nuclei in an sd-shell basis. The decomposition of these states into their shell model components shows, in agreement with other recent work, that the distribution of components is not a single Gaussian. We find that the largest amplitudes are distributed approximately in a Gaussian fashion; thus, many experimental measurements should be consistent with the Porter-Thomas predictions. We argue that the non-Gaussian form of the complete distribution can be simply related to the structure of the Hamiltonian.
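    For readers unfamiliar with the prediction being tested: if amplitudes a are Gaussian, the normalized widths y = a^2/<a^2> follow the Porter-Thomas distribution, i.e. a chi-squared distribution with one degree of freedom. A quick numerical check of this relation (ours, not the paper's):

```python
# Illustrative check: Gaussian amplitudes imply Porter-Thomas
# (chi-squared, 1 degree of freedom) distributed normalized widths.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(size=100_000)       # Gaussian amplitudes
y = a**2 / np.mean(a**2)           # normalized widths
# Kolmogorov-Smirnov test against chi2 with 1 dof; large p-value expected
print(stats.kstest(y, stats.chi2(df=1).cdf))
```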

  5. Nonlinear analysis of field distribution in electric motor with periodicity conditions

    Energy Technology Data Exchange (ETDEWEB)

    Stabrowski, M M; Sikora, J

    1981-01-01

    Numerical analysis of the electromagnetic field distribution in a linear-motion tubular electric motor has been performed with the aid of the finite element method. Two Fortran programmes for the solution of DBBF and BF large linear symmetric equation systems have been developed for the purposes of this analysis. A new iterative algorithm, taking into account iron nonlinearity and periodicity conditions, has been introduced. The final results of the analysis, in the form of induction diagrams and the motor driving force, are directly useful to motor designers.
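    The record does not spell out the iterative algorithm; as a hedged stand-in, successive substitution with under-relaxation is a common way to handle iron nonlinearity in magnetostatic FEM, sketched here on a single-unknown toy problem (the paper's Fortran DBBF/BF solvers and FE assembly are not reproduced):

```python
# Toy nonlinear iteration: solve k(x) * x = f, where the coefficient k
# grows with the solution (a crude stand-in for iron saturation).
def solve_nonlinear(f, k_of_x, x0=0.0, relax=0.5, tol=1e-10, max_iter=200):
    x = x0
    for _ in range(max_iter):
        x_lin = f / k_of_x(x)                  # solve with frozen coefficient
        if abs(x_lin - x) < tol:
            return x_lin
        x = (1.0 - relax) * x + relax * x_lin  # under-relaxation step
    return x

# Solve (1 + 0.1*|x|) * x = 10; exact root is about 6.180
print(solve_nonlinear(10.0, lambda x: 1.0 + 0.1 * abs(x)))
```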

  6. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Background: The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results: To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions: Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating the propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.
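    Conceptually, FDA compares time-averaged pairwise force networks between states; the following toy (our illustration with invented arrays, not the Gromacs-FDA code or its file formats) shows the kind of bookkeeping involved:

```python
# Generic sketch of the FDA idea: average pairwise forces per residue
# pair over a trajectory, then inspect the difference between two states
# (e.g. with and without a bound ligand) to see where strain propagates.
import numpy as np

def mean_pairwise_forces(traj_forces):
    # traj_forces: (frames, residues, residues) array of scalar pairwise
    # force magnitudes; returns the time-averaged force network.
    return traj_forces.mean(axis=0)

rng = np.random.default_rng(2)
relaxed = mean_pairwise_forces(rng.normal(0.0, 1.0, (100, 20, 20)))
strained = mean_pairwise_forces(rng.normal(0.3, 1.0, (100, 20, 20)))
delta = strained - relaxed          # change in the force network
# Residue pair with the largest force change
print(np.unravel_index(np.abs(delta).argmax(), delta.shape))
```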

  7. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management on existing protocol solutions, while taking into account

  8. Analysis of distribution systems with a high penetration of distributed generation

    DEFF Research Database (Denmark)

    Lund, Torsten

    Since the mid eighties, a large number of wind turbines and distributed combined heat and power plants (CHPs) have been connected to the Danish power system. Especially in the Western part, comprising Jutland and Funen, the penetration is high compared to the load demand. In some periods the wind...... power alone can cover the entire load demand. The objective of the work is to investigate the influence of wind power and distributed combined heat and power production on the operation of the distribution systems. Where other projects have focused on the modeling and control of the generators and prime...... movers, the focus of this project is on the operation of an entire distribution system with several wind farms and CHPs. Firstly, the subject of allocation of power system losses in a distribution system with distributed generation is treated. A new approach to loss allocation based on current injections...

  9. Full integrated system of real-time monitoring based on distributed architecture for the high temperature engineering test reactor (HTTR)

    International Nuclear Information System (INIS)

    Subekti, Muhammad; Ohno, Tomio; Kudo, Kazuhiko; Takamatsu, Kuniyoshi; Nabeshima, Kunihiko

    2005-01-01

    A new monitoring system scheme based on a distributed architecture is proposed for the High Temperature Engineering Test Reactor (HTTR) to assure consistency of the real-time processing of the expanded system. Distributing the monitoring tasks across client PCs, as an alternative architecture, maximizes the throughput and capabilities of the system even if the monitoring tasks suffer a shortage of bandwidth. The prototype of the on-line monitoring system has been developed successfully and will be tested at the actual HTTR site. (author)

  10. Finite Element Analysis and Test Results Comparison for the Hybrid Wing Body Center Section Test Article

    Science.gov (United States)

    Przekop, Adam; Jegley, Dawn C.; Rouse, Marshall; Lovejoy, Andrew E.

    2016-01-01

    This report documents the comparison of test measurements and predictive finite element analysis results for a hybrid wing body center section test article. The testing and analysis efforts were part of the Airframe Technology subproject within the NASA Environmentally Responsible Aviation project. Test results include full-field displacement measurements obtained from digital image correlation systems and discrete strain measurements obtained using both unidirectional and rosette resistive gauges. The most significant results are presented for the five critical load cases exercised during the test. The final test to failure, after inflicting severe damage on the test article, is also documented. Overall, good agreement between the predicted and actual behavior of the test article is found.

  11. The PUMA test program and data analysis

    International Nuclear Information System (INIS)

    Han, J.T.; Morrison, D.L.

    1997-01-01

    The PUMA test program is sponsored by the U.S. Nuclear Regulatory Commission to provide data relevant to various Boiling Water Reactor phenomena. The authors briefly describe the PUMA test program and facility, present the objectives of the program, provide data analysis for a large-break loss-of-coolant accident test, and compare the data with a RELAP5/MOD 3.1.2 calculation.

  12. A framework for establishing the technical efficiency of Electricity Distribution Counties (EDCs) using Data Envelopment Analysis

    International Nuclear Information System (INIS)

    Mullarkey, Shane; Caulfield, Brian; McCormack, Sarah; Basu, Biswajit

    2015-01-01

    Highlights:
    • Six models are employed to establish the technical efficiency of Electricity Distribution Counties.
    • A diagnostic parameter is incorporated to account for differences across Electricity Distribution Counties.
    • The amalgamation of Electricity Distribution Counties leads to improved efficiency in the production of energy.
    Abstract: European energy market liberalization has entailed the restructuring of electricity power markets through the unbundling of electricity generation, transmission, distribution and supply activities, and the introduction of competition into electricity generation. Under these new electricity market regimes, it is important to have an evaluation tool capable of examining the impacts of these market changes. The adoption of Data Envelopment Analysis as a form of benchmarking for electricity distribution regulation is one method to conduct this analysis. This paper applies a Data Envelopment Analysis framework to the electricity distribution network in Ireland to explore the merits of this approach and to determine the technical efficiency and the potential scope for efficiency improvements through reorganizing and amalgamating the distribution network. The results presented show that overall grid efficiency is improved through this restructuring. A diagnostic parameter is defined and pursued to account for aberrations across Electricity Distribution Counties, as opposed to the traditionally employed environmental variables; its adoption leads to a more intuitive understanding of Electricity Distribution Counties.
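    For orientation, a minimal input-oriented CCR DEA envelopment program, of the general family used in such benchmarking studies (a sketch with toy data, not the paper's six models), can be written as a small linear program:

```python
# Input-oriented CCR DEA: score one decision-making unit (here, one
# hypothetical Electricity Distribution County) against its peers by
# minimizing theta s.t. a convex-cone peer combination uses at most
# theta times its inputs while producing at least its outputs.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """X: (m inputs x n units), Y: (s outputs x n units); score unit k."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # minimize theta
    A_in = np.hstack([-X[:, [k]], X])          # sum(lam*x) <= theta * x_k
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # sum(lam*y) >= y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]                            # efficiency score in (0, 1]

# Toy data: 2 inputs (network km, staff), 1 output (GWh delivered), 4 units
X = np.array([[100.0, 120, 90, 150], [20, 25, 18, 30]])
Y = np.array([[500.0, 520, 480, 510]])
print([round(dea_efficiency(X, Y, k), 3) for k in range(4)])
```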

  13. Distributed Resource Energy Analysis and Management System (DREAMS) Development for Real-time Grid Operations

    Energy Technology Data Exchange (ETDEWEB)

    Nakafuji, Dora [Hawaiian Electric Company, Honululu, HI (United States); Gouveia, Lauren [Hawaiian Electric Company, Honululu, HI (United States)

    2016-10-24

    This project supports development of the next-generation, integrated energy management system (EMS) infrastructure, able to incorporate advanced visualization of behind-the-meter distributed resource information and probabilistic renewable energy generation forecasts to inform real-time operational decisions. The project involves end-users and active feedback from a Utility Advisory Team (UAT) to help inform how information can be used to enhance operational functions (e.g. unit commitment, load forecasting, Automatic Generation Control (AGC) reserve monitoring, ramp alerts) within two major EMS platforms. Objectives include: engaging utility operations personnel to develop user input on displays, set expectations, test and review; developing ease-of-use and timeliness metrics for measuring enhancements; developing prototype integrated capabilities within two operational EMS environments; demonstrating an integrated decision analysis platform with real-time wind and solar forecasting information and timely distributed resource information; seamlessly integrating new 4-dimensional information into operations without increasing workload and complexity; developing sufficient analytics to inform and confidently transform and adopt new operating practices and procedures; disseminating project lessons learned through industry-sponsored workshops and conferences; and building on the collaborative utility-vendor partnership and industry capabilities.

  14. Methods of assessing grain-size distribution during grain growth

    DEFF Research Database (Denmark)

    Tweed, Cherry J.; Hansen, Niels; Ralph, Brian

    1985-01-01

    This paper considers methods of obtaining grain-size distributions and ways of describing them. In order to collect statistically useful amounts of data, an automatic image analyzer is used, and the resulting data are subjected to a series of tests that evaluate the differences between two related...... distributions (before and after grain growth). The distributions are measured from two-dimensional sections, and both the data and the corresponding true three-dimensional grain-size distributions (obtained by stereological analysis) are collected. The techniques described here are illustrated by reference...
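    The record does not name the specific tests; as one hedged example of evaluating the difference between two related size distributions before and after grain growth, here is a two-sample Kolmogorov-Smirnov test on synthetic lognormal grain diameters:

```python
# Illustrative two-sample comparison of grain-size distributions
# (synthetic data; the paper's exact battery of tests is not specified).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
before = rng.lognormal(mean=1.0, sigma=0.40, size=500)  # grain diameters
after = rng.lognormal(mean=1.3, sigma=0.45, size=500)   # after grain growth
# Small p-value indicates the two distributions differ significantly
print(stats.ks_2samp(before, after))
```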

  15. Development of a Test Facility to Simulate the Reactor Flow Distribution of APR+

    International Nuclear Information System (INIS)

    Euh, D. J.; Cho, S.; Youn, Y. J.; Kim, J. T.; Kang, H. S.; Kwon, T. S.

    2011-01-01

    Recently, a new reactor design, the APR+, has been developed as an advanced version of the APR1400. In order to analyze the thermal margin and hydraulic characteristics of the APR+, quantification tests of the flow and pressure distribution that conserve the flow geometry are necessary. Hetsroni (1967) proposed four principal parameters for a hydraulic model representing a nuclear reactor prototype: geometry, relative roughness, Reynolds number, and Euler number. He concluded that the Euler number should be similar in the prototype and model under preservation of the aspect ratio of the flow path. At high Reynolds numbers, the effect of the Reynolds number on the Euler number is rather small, since the dependency of the form and frictional loss coefficients on the Reynolds number is small there. ABB-CE has carried out several reactor flow model test programs, mostly for its prototype reactors; a series of tests was conducted using a 3/16-scale reactor model (see Lee et al., 2001). Lee et al. (1991) performed experimental studies using a 1/5.03-scale reactor flow model of Yonggwang nuclear units 3 and 4, and showed that the measured data met the acceptance criteria and were suitable for their intended use in performance and safety analyses. The design of the current test facility is based on conservation of the Euler number, the ratio of pressure drop to dynamic pressure, in a sufficiently turbulent region with a high Reynolds number. Following the previous studies, the APR+ design is linearly reduced at a 1/5 length scale with a 1/2 velocity scale, which yields a Reynolds number scaling ratio of 1/39.7. In the present study, the design features of the resulting facility, named 'ACOP', for investigating the flow and pressure distribution are described.
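    The quoted 1/39.7 does not follow from the 1/5 length and 1/2 velocity scales alone (those give 1/10); our hedged reading, not stated explicitly in the record, is that the ratio also folds in the kinematic viscosities of the hot prototype coolant and the cold model fluid:

```latex
% Hedged reconstruction of the scaling arithmetic: with Re = V L / \nu,
\[
\frac{Re_m}{Re_p}
  = \frac{V_m}{V_p}\cdot\frac{L_m}{L_p}\cdot\frac{\nu_p}{\nu_m}
  = \frac{1}{2}\cdot\frac{1}{5}\cdot\frac{\nu_p}{\nu_m}
  = \frac{1}{39.7}
\quad\Longrightarrow\quad
\frac{\nu_p}{\nu_m}\approx\frac{1}{3.97}.
\]
```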

  16. Supercritical water oxidation benchscale testing metallurgical analysis report

    International Nuclear Information System (INIS)

    Norby, B.C.

    1993-02-01

    This report describes the metallurgical evaluation of witness wires from a series of tests using supercritical water oxidation (SCWO) to process cutting oil containing a simulated radionuclide. The goal of the tests was to evaluate the technology's ability to process a highly chlorinated waste representative of many mixed waste streams generated in the DOE complex. The testing was conducted with a bench-scale SCWO system developed by the Modell Development Corporation. Significant test objectives included process optimization for adequate destruction efficiency, tracking of the radionuclide simulant and certain metals in the effluent streams, and assessment of reactor material degradation resulting from processing a highly chlorinated waste. The metallurgical evaluation described herein includes results of metallographic analysis and Scanning Electron Microscopy analysis of witness wires exposed to the SCWO environment for one test series.

  17. U.S.: proposed federal legislation to allow condom distribution and HIV testing in prison.

    Science.gov (United States)

    Dolinsky, Anna

    2007-05-01

    Representative Barbara Lee (D-CA) is reintroducing legislation in the U.S. House of Representatives that would require federal correctional facilities to allow community organizations to distribute condoms and provide voluntary counselling and testing for HIV and STDs for inmates. The bill has been referred to the House Judiciary Committee's Subcommittee on Crime, Terrorism, and Homeland Security.

  18. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    overrepresentation score (SOS) and the geographic node divergence (GND) score, which together combine ecological and evolutionary patterns into a single framework and avoid many of the problems that characterize community phylogenetic methods in current use. This approach goes through each node in the phylogeny...... with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...... of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented...

  19. Growing axons analysis by using Granulometric Size Distribution

    International Nuclear Information System (INIS)

    Gonzalez, Mariela A; Ballarin, Virginia L; Rapacioli, Melina; CelIn, A R; Sanchez, V; Flores, V

    2011-01-01

    Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analyses of growing neurites are usually difficult because their thinness and low contrast usually prevent clear observation of their shape, number, length and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a relevant parameter indicative of the axonal growth spatial orientation, namely the angle of deviation of the growing direction. The developed algorithms automatically quantify this orientation, facilitating the analysis of these images, which is important given the large number of images that need to be processed in this type of study.
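    A morphological granulometry of the general kind the paper relies on can be sketched as follows (our minimal version using openings with growing structuring elements; the authors' exact pipeline and parameters are not reproduced):

```python
# Granulometry (pattern spectrum): open the binary image with growing
# structuring elements; the foreground area removed at each size step
# gives the granulometric size distribution.
import numpy as np
from scipy import ndimage as ndi

def granulometry(binary_img, max_radius=10):
    areas = [binary_img.sum()]
    for r in range(1, max_radius + 1):
        se = np.ones((2 * r + 1, 2 * r + 1), dtype=bool)  # square element
        areas.append(ndi.binary_opening(binary_img, structure=se).sum())
    # Pattern spectrum: normalized area removed at each size step
    spectrum = -np.diff(np.array(areas, dtype=float))
    return spectrum / areas[0]

# Toy image: two disks of radius 4 and 8; peaks appear at those scales
yy, xx = np.mgrid[:64, :64]
img = ((xx - 16) ** 2 + (yy - 16) ** 2 < 4 ** 2) | \
      ((xx - 44) ** 2 + (yy - 44) ** 2 < 8 ** 2)
print(np.round(granulometry(img), 3))
```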

  20. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines, based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon "passivated emitter and rear cell" (PERC) process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by requiring only two common measurements of finished cells. The presented approaches can be especially helpful when ramping up production, but can also be applied to enhance established manufacturing.
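    The metamodel-plus-Monte-Carlo idea can be sketched in a few lines (with an invented response surface standing in for the fitted metamodel; the parameters, distributions and coefficients are illustrative only, not fitted to any real PERC process):

```python
# Draw process parameters from assumed production distributions, push
# them through a toy metamodel of the cell simulation, and inspect the
# resulting efficiency distribution.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Hypothetical process parameters: emitter sheet resistance, bulk lifetime
r_sheet = rng.normal(90.0, 5.0, n)                 # ohm/sq
tau_bulk = rng.lognormal(np.log(200.0), 0.2, n)    # microseconds

def metamodel(r, tau):
    # Invented response surface standing in for the fitted metamodel
    return 17.6 + 0.004 * (r - 90.0) + 0.6 * np.log(tau / 200.0)

eta = metamodel(r_sheet, tau_bulk)                 # cell efficiencies, %
print(f"mean = {eta.mean():.2f}%, std = {eta.std():.2f}%")
```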