WorldWideScience

Sample records for distributed analysis test

  1. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress-testing service used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  2. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress-testing service used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  3. Similarity Analysis for Reactor Flow Distribution Test and Its Validation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Soon Joon; Ha, Jung Hui [Heungdeok IT Valley, Yongin (Korea, Republic of)]; Lee, Taehoo; Han, Ji Woong [KAERI, Daejeon (Korea, Republic of)]

    2015-05-15

    facility. It was clearly found in Hong et al. In this study, the feasibility of the similarity analysis of Hong et al. was examined. The similarity analysis was applied to an SFR that has been designed at KAERI (Korea Atomic Energy Research Institute), in order to design the reactor flow distribution test. The length scale was assumed to be 1/5 and the velocity scale 1/2, which bounds the square root of the length scale (1/√5). CFX calculations for both the prototype and the model were carried out and the flow fields were compared.
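    The scaling choice quoted above can be checked with a few lines of arithmetic. A minimal sketch, assuming Froude-type similarity is the intended criterion; the flow-rate and time scales are illustrative additions, not values from the paper:

```python
import math

length_scale = 1 / 5    # model-to-prototype length ratio
velocity_scale = 1 / 2  # chosen model-to-prototype velocity ratio

# Froude similarity would require a velocity scale equal to the square
# root of the length scale; the chosen 1/2 bounds (exceeds) that value.
froude_velocity = math.sqrt(length_scale)  # 1/sqrt(5), about 0.447
assert velocity_scale > froude_velocity

# Derived scale ratios (hypothetical similarity bookkeeping)
flow_scale = velocity_scale * length_scale ** 2  # volumetric flow ratio
time_scale = length_scale / velocity_scale       # transit-time ratio
print(froude_velocity, flow_scale, time_scale)
```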

  4. A subchannel and CFD analysis of void distribution for the BWR fuel bundle test benchmark

    International Nuclear Information System (INIS)

    In, Wang-Kee; Hwang, Dae-Hyun; Jeong, Jae Jun

    2013-01-01

    Highlights: ► We analyzed subchannel void distributions using subchannel, system and CFD codes. ► The mean error and standard deviation at steady states were compared. ► The deviation of the CFD simulation was greater than those of the others. ► The large deviation of the CFD prediction is due to interface model uncertainties. -- Abstract: The subchannel-grade and microscopic void distributions in the NUPEC (Nuclear Power Engineering Corporation) BFBT (BWR Full-Size Fine-Mesh Bundle Tests) facility have been evaluated with the subchannel analysis code MATRA, the system code MARS and the CFD code CFX-10. Sixteen test series from five different test bundles were selected for the analysis of the steady-state subchannel void distributions. Four test cases for a high-burnup 8 × 8 fuel bundle with a single water rod were simulated using CFX-10 for the microscopic void distribution benchmark. Two transient cases, a turbine trip without bypass as a typical power transient and a re-circulation pump trip as a flow transient, were also chosen for this analysis. It was found that the steady-state void distributions calculated by both the MATRA and MARS codes coincided well with the measured data in the range of thermodynamic qualities from 5 to 25%. The results of the transient calculations were also similar to each other and very reasonable. The CFD simulation reproduced the overall radial void distribution trend, which produces less vapor in the central part of the bundle and more vapor in the periphery. However, the predicted variation of the void distribution inside the subchannels is small, while the measured one is large, showing a very high concentration in the center of the subchannels. The variations of the void distribution between the center of the subchannels and the subchannel gap are estimated to be about 5–10% for the CFD prediction and more than 20% for the experiment.

  5. An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests

    Science.gov (United States)

    Attali, Yigal

    2010-01-01

    Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…
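    The variance-component estimation described above can be illustrated for a fully crossed person × item design with one (log) response time per cell, using standard expected-mean-square relations. A minimal sketch on synthetic data; the design sizes, true variances, and variable names are assumptions for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_p, n_i = 200, 20                     # persons, items
sig_p, sig_i, sig_pi = 1.0, 0.5, 0.8   # true effect std devs

person = rng.normal(0, sig_p, (n_p, 1))
item = rng.normal(0, sig_i, (1, n_i))
resid = rng.normal(0, sig_pi, (n_p, n_i))
t = person + item + resid              # p x i table of (log) response times

grand = t.mean()
row = t.mean(axis=1, keepdims=True)    # person means
col = t.mean(axis=0, keepdims=True)    # item means

# mean squares for persons, items, and the person-by-item interaction
ms_p = n_i * ((row - grand) ** 2).sum() / (n_p - 1)
ms_i = n_p * ((col - grand) ** 2).sum() / (n_i - 1)
ms_pi = ((t - row - col + grand) ** 2).sum() / ((n_p - 1) * (n_i - 1))

# variance components via expected mean squares (one observation per cell,
# so interaction and residual are confounded)
var_pi = ms_pi
var_p = (ms_p - ms_pi) / n_i
var_i = (ms_i - ms_pi) / n_p
print(var_p, var_i, var_pi)
```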

  6. Distributed analysis functional testing using GangaRobot in the ATLAS experiment

    Science.gov (United States)

    Legger, Federica; ATLAS Collaboration

    2011-12-01

    Automated distributed analysis tests are necessary to ensure smooth operations of the ATLAS grid resources. The HammerCloud framework allows for easy definition, submission and monitoring of grid test applications. Both functional and stress test applications can be defined in HammerCloud. Stress tests are large-scale tests meant to verify the behaviour of sites under heavy load. Functional tests are light user applications running at each site with high frequency, to ensure that the site functionalities are available at all times. Success or failure rates of these test jobs are individually monitored. Test definitions and results are stored in a database and made available to users and site administrators through a web interface. In this work we present the recent developments of the GangaRobot framework. GangaRobot monitors the outcome of functional tests, creates a blacklist of sites failing the tests, and exports the results to the ATLAS Site Status Board (SSB) and to the Service Availability Monitor (SAM), providing on the one hand a fast way to identify systematic or temporary site failures, and on the other hand allowing for an effective distribution of the workload on the available resources.
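    The blacklisting step described above amounts to aggregating recent functional-test outcomes per site and flagging sites that fail too often. A minimal sketch; the function name, thresholds, and input format are assumptions for illustration, not the actual GangaRobot interface:

```python
from collections import Counter

def blacklist_sites(results, max_failure_rate=0.5, min_jobs=3):
    """Return sites whose recent functional-test jobs fail too often.

    results: iterable of (site, succeeded) pairs, one per test job.
    Sites with fewer than min_jobs results are not judged.
    """
    total, failed = Counter(), Counter()
    for site, succeeded in results:
        total[site] += 1
        if not succeeded:
            failed[site] += 1
    return sorted(
        site
        for site in total
        if total[site] >= min_jobs
        and failed[site] / total[site] > max_failure_rate
    )

# Example: site B fails 4 of 5 jobs; site C has too few results to judge
jobs = ([("A", True)] * 5 + [("A", False)]
        + [("B", False)] * 4 + [("B", True)]
        + [("C", False)] * 2)
print(blacklist_sites(jobs))  # ['B']
```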

  7. The Analysis of process optimization during the loading distribution test for steam turbine

    International Nuclear Information System (INIS)

    Li Jiangwei; Cao Yuhua; Li Dawei

    2014-01-01

    The loading distribution of a steam turbine needs to be completed six times in total: the first is completed when the turbine cylinder is buckled, and the rest must be completed in order during the installation of the GVP pipe. Completing the five loading distribution tests and the GVP pipe installation usually takes around 90 days at most nuclear plants, while Unit 1 of Fuqing Nuclear Power Station compressed this to about 45 days by optimizing the installation process. This article describes the successful experience of how Unit 1 of Fuqing Nuclear Power Station finished the five loading distribution tests and the GVP pipe installation in 45 days by optimizing the process. It also analyses the advantages and disadvantages by comparing this process with the one provided by the suppliers, and offers some rationalization proposals for the installation work on the follow-up units of the plant. (authors)

  8. Local flow distribution analysis inside the reactor pools of KALIMER-600 and PDRC performance test facility

    International Nuclear Information System (INIS)

    Jeong, Ji Hwan; Hwang, Seong Won; Choi, Kyeong Sik

    2010-05-01

    In this study, a 3-dimensional thermal hydraulic analysis was carried out focusing on the thermal hydraulic behavior inside the reactor pools of both KALIMER-600 and a one-fifth scale-down test facility. STAR-CD, one of the commercial CFD codes, was used to analyze the 3-dimensional incompressible steady-state thermal hydraulic behavior of both the KALIMER-600 design and the scale-down test facility. In the KALIMER-600 CFD analysis, the pressure drops in the core and IHX agreed to within a 1% error range. It was found that the porous media model was appropriate for analyzing the pressure distribution inside the reactor core and IHX. A validation analysis also showed that the pressure drop through the porous media under the condition of 80% flow rate and thermal power was calculated to be 64% less than in the 100% condition, a physically reasonable result. Since the temperatures in the hot-side pool and cold-side pool were estimated to be very close to the design values of 540°C and 390°C respectively, the CFD models of the heat source and sink were confirmed. Through the study, the methodology of 3-dimensional CFD analysis of KALIMER-600 has been established and proven. Using this methodology, analysis data such as flow velocity, temperature, and pressure distributions were compared by normalizing the data for the actual-size model and the scale-down model. As a result, the characteristics of the thermal hydraulic behavior were almost identical for the actual-size model and the scale-down model, and the similarity scaling law used by KAERI in the design of the sodium test facility was found to be correct.

  9. Wavelet analysis to decompose a vibration simulation signal to improve pre-distribution testing of packaging

    Science.gov (United States)

    Griffiths, K. R.; Hicks, B. J.; Keogh, P. S.; Shires, D.

    2016-08-01

    In general, vehicle vibration is non-stationary and has a non-Gaussian probability distribution; yet existing testing methods for packaging design employ Gaussian distributions to represent vibration induced by road profiles. This frequently results in over-testing and/or over-design of the packaging to meet a specification, and correspondingly leads to wasteful packaging and product waste, which represent $15bn per year in the USA and €3bn per year in the EU. The purpose of the paper is to enable a measured non-stationary acceleration signal to be replaced by a constructed signal that includes as far as possible any non-stationary characteristics of the original signal. The constructed signal consists of a concatenation of decomposed shorter-duration signals, each having its own kurtosis level. Wavelet analysis is used for the decomposition process into inner and outlier signal components. The constructed signal has a similar PSD to the original signal, without incurring excessive acceleration levels. This allows an improved and more representative simulated input signal to be generated that can be used on the current generation of shaker tables. The wavelet decomposition method is also demonstrated experimentally through two correlation studies. It is shown that significant improvements over current international standards for packaging testing are achievable; hence the potential for more efficient packaging system design is possible.
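    The inner/outlier decomposition can be illustrated in miniature with a single-level Haar wavelet transform: large-magnitude detail coefficients are assigned to the "outlier" component and the rest to the "inner" component, so the two components sum back to the original signal. This is a simplified stand-in for the paper's method; the threshold rule and names are assumptions:

```python
import numpy as np

def haar_level(x):
    # one level of the orthonormal Haar wavelet transform
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail coefficients
    return a, d

def inv_haar_level(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def split_inner_outlier(x, k=3.0):
    # details beyond k robust sigmas are treated as the non-Gaussian
    # "outlier" component; the remainder forms the "inner" component
    a, d = haar_level(x)
    thr = k * np.median(np.abs(d)) / 0.6745  # robust sigma estimate
    d_out = np.where(np.abs(d) > thr, d, 0.0)
    inner = inv_haar_level(a, d - d_out)
    outlier = inv_haar_level(np.zeros_like(a), d_out)
    return inner, outlier

rng = np.random.default_rng(7)
sig = rng.normal(0, 1, 1024)      # stationary Gaussian background
sig[500:510:2] += 10.0            # an asymmetric transient burst
inner, outlier = split_inner_outlier(sig)
# by linearity of the transform, inner + outlier == sig exactly
```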

  10. Approximations to the distribution of a test statistic in covariance structure analysis: A comprehensive study.

    Science.gov (United States)

    Wu, Hao

    2018-05-01

    In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.

  11. Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis

    Science.gov (United States)

    Das, Samiran

    2018-04-01

    The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited by the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness of fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unknown and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramér-von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered and its performance is assessed against the other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression equation form to show the dependence on the shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values, and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
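    The Monte Carlo approximation of critical values can be sketched for a simpler case. A minimal sketch in which the GNO distribution and L-moment estimation of the paper are replaced by a normal distribution fitted by moments, purely to illustrate the procedure of simulating the null distribution of the statistic and reading off a quantile:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def ad_statistic(sample):
    # Anderson-Darling statistic against a normal distribution whose
    # parameters are estimated from the sample itself (the
    # "parameters unknown" case discussed in the abstract)
    x = np.sort(sample)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    u = np.array([norm_cdf(v) for v in z])
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(u) + np.log(1.0 - u[::-1])))

# Monte Carlo approximation of the 5% critical value for sample size n
rng = np.random.default_rng(42)
n, reps = 50, 2000
null_stats = np.array([ad_statistic(rng.normal(size=n)) for _ in range(reps)])
crit_05 = np.quantile(null_stats, 0.95)
print(crit_05)  # should land near the tabulated normal-case value
```

For the real problem one would replace the normal fit with an L-moment fit of the GNO and repeat the simulation over a grid of shape parameters and sample sizes, as the study does.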

  12. Modification of Kolmogorov-Smirnov test for DNA content data analysis through distribution alignment.

    Science.gov (United States)

    Huang, Shuguang; Yeo, Adeline A; Li, Shuyu Dan

    2007-10-01

    The Kolmogorov-Smirnov (K-S) test is a statistical method often used for comparing two distributions. In high-throughput screening (HTS) studies, such distributions usually arise from the phenotypes of independent cell populations. However, the K-S test has been criticized for being overly sensitive in applications, often detecting statistically significant differences that are not biologically meaningful. One major reason is a common phenomenon in HTS studies: systematic drift exists among the distributions due to instrument variation, plate edge effects, accidental differences in sample handling, etc. In particular, in high-content cellular imaging experiments, the location shift can be dramatic since some compounds are themselves fluorescent. This oversensitivity of the K-S test is particularly pronounced in cellular assays, where the sample sizes are very large (usually several thousand). In this paper, a modified K-S test is proposed to deal with the nonspecific location-shift problem in HTS studies. Specifically, we propose that the distributions be "normalized" by density curve alignment before the K-S test is conducted. In applications to simulation data and real experimental data, the results show that the proposed method has improved specificity.
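    The alignment idea can be illustrated with a location-only "normalization" before computing the two-sample K-S statistic. Median-centering below is a simplified stand-in for the density-curve alignment proposed in the paper; the function names and parameters are illustrative:

```python
import numpy as np

def ks_stat(x, y):
    # two-sample Kolmogorov-Smirnov statistic: sup |F_x - F_y|
    grid = np.sort(np.concatenate([x, y]))
    fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return np.max(np.abs(fx - fy))

def aligned_ks_stat(x, y):
    # remove the nonspecific location shift (e.g. plate or reagent
    # drift) by median-centering both samples before comparing shapes
    return ks_stat(x - np.median(x), y - np.median(y))

rng = np.random.default_rng(1)
control = rng.normal(0.0, 1.0, 5000)
treated = rng.normal(0.4, 1.0, 5000)  # pure location drift, same shape

raw = ks_stat(control, treated)              # inflated by the shift
aligned = aligned_ks_stat(control, treated)  # small: the shapes agree
print(raw, aligned)
```

With large HTS sample sizes the raw statistic flags the drift as highly significant, while the aligned statistic is near the null level, which is the improved specificity the paper targets.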

  13. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    This paper applies parametric and non-parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as outputs. The data cover 307 (out of 553) ...

  14. Distributed Energy Resources Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NREL's Distributed Energy Resources Test Facility (DERTF) is a working laboratory for interconnection and systems integration testing. This state-of-the-art facility...

  15. Transient dynamic finite element analysis of hydrogen distribution test chamber structure for hydrogen combustion loads

    International Nuclear Information System (INIS)

    Singh, R.K.; Redlinger, R.; Breitung, W.

    2005-09-01

    Design and analysis of blast-resistant structures is an important area of safety research in the nuclear, aerospace, chemical process and vehicle industries. The Institute for Nuclear and Energy Technologies (IKET) of Research Centre Karlsruhe (Forschungszentrum Karlsruhe, FZK) in Germany is pursuing active research on the entire spectrum of safety evaluation for efficient hydrogen management in case of postulated design-basis and beyond-design-basis severe accidents for nuclear and non-nuclear applications. This report concentrates on the consequence analysis of hydrogen combustion accidents with emphasis on structural safety assessment. The transient finite element simulation results obtained for the 2 g, 4 g, 8 g and 16 g hydrogen combustion experiments concluded recently on the test-cell structure are described. The frequencies and damping of the test-cell observed during the hammer tests and the combustion experiments are used for qualification of the present three-dimensional finite element model. For the numerical transient dynamic evaluation of the test-cell structure, the pressure time history data computed with the CFD code COM-3D are used for the four combustion experiments. Detailed comparisons of the present numerical results for the four combustion experiments with the observed time signals are carried out to evaluate the structural connection behavior. For all the combustion experiments, excellent agreement is noted for the computed accelerations and displacements at the standard transducer locations, where the measurements were made during the different combustion tests. In addition, inelastic analysis is also presented for the test-cell structure to evaluate the limiting impulsive and quasi-static pressure loads. These results are used to evaluate the response of the test-cell structure for the postulated overpressurization of the test-cell due to the blast load generated in case of 64 g hydrogen ignition, for which additional sets of computations were

  16. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10,000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  17. Distribution and presentation of Lyme borreliosis in Scotland - analysis of data from a national testing laboratory.

    Science.gov (United States)

    Mavin, S; Watson, E J; Evans, R

    2015-01-01

    This study examines the distribution of laboratory-confirmed cases of Lyme borreliosis in Scotland and the clinical spectrum of presentations within NHS Highland. Methods: General demographic data (age/sex/referring Health Board) from all cases of Lyme borreliosis serologically confirmed by the National Lyme Borreliosis Testing Laboratory from 1 January 2008 to 31 December 2013 were analysed. Clinical features of confirmed cases were ascertained from questionnaires sent to referring clinicians within NHS Highland during the study period. Results: The number of laboratory-confirmed cases of Lyme borreliosis in Scotland peaked at 440 in 2010. From 2008 to 2013 the estimated average annual incidence was 6.8 per 100,000 (44.1 per 100,000 in NHS Highland). Of 594 questionnaires from NHS Highland patients, 76% had clinically confirmed Lyme borreliosis; 48% had erythema migrans; and 17% had rash, 25% joint, 15% neurological and 1% cardiac symptoms. Only 61% could recall a tick bite. Conclusion: The incidence of Lyme borreliosis may be stabilising in Scotland, but NHS Highland remains an area of high incidence. Lyme borreliosis should be considered in symptomatic patients who have had exposure to ticks, not just those with a definite tick bite.

  18. Distribution of ground rigidity and ground model for seismic response analysis in Hualian project of large scale seismic test

    International Nuclear Information System (INIS)

    Kokusho, T.; Nishi, K.; Okamoto, T.; Tanaka, Y.; Ueshima, T.; Kudo, K.; Kataoka, T.; Ikemi, M.; Kawai, T.; Sawada, Y.; Suzuki, K.; Yajima, K.; Higashi, S.

    1997-01-01

    An international joint research program called HLSST is in progress. HLSST is a large-scale seismic test (LSST) to investigate soil-structure interaction (SSI) during large earthquakes in the field at Hualien, a highly seismic region of Taiwan. A 1/4-scale model building was constructed on the gravelly soil at this site, and backfill material of crushed stone was placed around the model plant after the excavation for the construction. The model building and the foundation ground were extensively instrumented to monitor the structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed during the construction process, namely before/after base excavation, after structure construction and after backfilling. The distributions of the mechanical properties of the gravelly soil and the backfill were measured after the completion of the construction by penetration tests, PS-logging, etc. This paper describes the distribution and the change of the shear wave velocity (Vs) measured by the field tests. Discussion is made on the effect of overburden pressure during the construction process on Vs in the neighbouring soil and, further, on the numerical soil model for SSI analysis. (orig.)

  19. Analysis of Radial Plutonium Isotope Distribution in Irradiated Test MOX Fuel Rods

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae Yong; Lee, Byung Ho; Koo, Yang Hyun; Kim, Han Soo

    2009-01-15

    After Rods 3 and 6 (KAERI MOX) were irradiated in the Halden reactor, their post-irradiation examinations are now being carried out. In this report, the PLUTON code was employed to analyze Rods 3 and 6 (KAERI MOX). In both rods, the ratio of the maximum burnup to the average burnup in the radial distribution was 1.3, and the content of {sup 239}Pu tended to increase as the radial position approached the periphery of the fuel pellet. The detailed radial distributions of {sup 239}Pu and {sup 240}Pu, however, were somewhat different. To find the reason for this difference, the contents of the Pu isotopes were investigated as the burnup increased. The content of {sup 239}Pu decreased with burnup. The content of {sup 240}Pu increased with burnup up to 20 GWd/tM but decreased beyond 20 GWd/tM. The local burnup of Rod 3 is higher than that of Rod 6 due to the hole penetrating through the fuel rod. The content of {sup 239}Pu decreased more rapidly than that of {sup 240}Pu in Rod 6 with increasing burnup, resulting in radial distributions of {sup 239}Pu and {sup 240}Pu similar to those of Rod 3. The ratio of Xe to Kr is a parameter for finding where the fissions occur in the nuclear fuel. In both Rod 3 and Rod 6 it was 18.3 over the whole fuel rod cross section, which showed that the fissions occurred in the plutonium.

  20. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centres for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  1. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  2. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers and modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed-sample-size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  3. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  4. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  5. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  6. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  7. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of the homogeneity and scale hypotheses when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed in detail. The photoemulsion experiment analysis and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed
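The likelihood-ratio setup in the abstract can be illustrated for the simplest special case it mentions, exponential observations, where the maximum-likelihood estimates have closed forms. This is a minimal sketch with invented function names and sample data, not the paper's exact derivation (the paper's contribution is the exact, non-asymptotic distribution of the statistic):

```python
import math

def exp_loglik(data, scale):
    """Log-likelihood of an exponential sample with the given scale (mean)."""
    n = len(data)
    return -n * math.log(scale) - sum(data) / scale

def lrt_equal_scale(x, y):
    """Likelihood-ratio statistic for H0: both exponential samples share one scale.

    Returns -2 log(Lambda), which is asymptotically chi-square with 1 df
    under H0; the exponential MLE of the scale is simply the sample mean."""
    pooled = x + y
    mle_pooled = sum(pooled) / len(pooled)   # MLE under H0 (common scale)
    mle_x = sum(x) / len(x)                  # MLEs under H1 (separate scales)
    mle_y = sum(y) / len(y)
    ll0 = exp_loglik(pooled, mle_pooled)
    ll1 = exp_loglik(x, mle_x) + exp_loglik(y, mle_y)
    return -2.0 * (ll0 - ll1)

# Illustrative samples with nearly equal scales, so the statistic is small.
x = [1.2, 0.7, 2.1, 0.9, 1.5]
y = [1.1, 0.8, 1.9, 1.0, 1.4]
stat = lrt_equal_scale(x, y)
```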

  8. Auxiliary Heat Exchanger Flow Distribution Test

    International Nuclear Information System (INIS)

    Kaufman, J.S.; Bressler, M.M.

    1983-01-01

    The Auxiliary Heat Exchanger Flow Distribution Test was the first part of a test program to develop a water-cooled (tube-side), compact heat exchanger for removing heat from the circulating gas in a high-temperature gas-cooled reactor (HTGR). Measurements of velocity and pressure were made with various shell side inlet and outlet configurations. A flow configuration was developed which provides acceptable velocity distribution throughout the heat exchanger without adding excessive pressure drop

  9. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data-perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
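The flipping trick that makes the (right-censoring) Kaplan-Meier estimator applicable to left-censored nondetect data can be sketched in a few lines. This is a generic illustration, not the authors' S-language software; the function name, flip constant, and data are hypothetical:

```python
def kaplan_meier(times, events):
    """Right-censored Kaplan-Meier survival estimate.

    times: observed times; events: 1 = observed event, 0 = censored.
    Returns a list of (time, S(t)) at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        c = sum(1 for tt, e in data if tt == t)             # leaving risk set at t
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= c
        i += c
    return out

# Left-censored concentrations: flag 0 means "< value" (nondetect).
# Flipping about a constant M larger than every value turns the
# left-censored sample into a right-censored one that K-M can handle.
conc = [0.5, 1.0, 1.0, 2.0, 3.0, 5.0]
detected = [0, 1, 1, 0, 1, 1]
M = 10.0
flipped = [M - v for v in conc]
km = kaplan_meier(flipped, detected)
```

Quantiles estimated on the flipped scale are mapped back by subtracting from M, and only within the observed data range, consistent with the no-extrapolation property of K-M methods.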

  10. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  11. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure the performance of user jobs.

  12. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  13. Model-Driven Test Generation of Distributed Systems

    Science.gov (United States)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  14. Multipath interference test method for distributed amplifiers

    Science.gov (United States)

    Okada, Takahiro; Aida, Kazuo

    2005-12-01

    A method for testing distributed amplifiers is presented; the multipath interference (MPI) is detected as a beat spectrum between the multipath signal and the direct signal using a binary frequency-shift keying (FSK) test signal. The lightwave source is composed of a DFB-LD that is directly modulated by a pulse stream passing through an equalizer, and emits an FSK signal with a frequency deviation of about 430 MHz at a repetition rate of 80-100 kHz. The receiver consists of a photodiode and an electrical spectrum analyzer (ESA). The baseband power spectrum peak appearing at the FSK frequency deviation can be converted to an amount of MPI using a calibration chart. The test method has improved the minimum detectable MPI to as low as -70 dB, compared to -50 dB for the conventional test method. The detailed design and performance of the proposed method are discussed, including the MPI simulator for the calibration procedure, computer simulations evaluating the error caused by the FSK repetition rate and the fiber length under test, and experiments on single-mode fibers and a distributed Raman amplifier.

  15. Sodium flow distribution in test fuel assembly P-23B

    International Nuclear Information System (INIS)

    Taylor, J.P.S.

    1978-08-01

    Relatively large cladding diametral increases in the exterior fuel pins of HEDL's test fuel subassembly P-23B were successfully explained by a thermal-hydraulic/solid mechanics analysis. This analysis indicates that while at power, the subassembly flow was less than planned and that the fuel pins were considerably displaced and bowed from their nominal position. In accomplishing this analysis, a method was developed to estimate the sodium flow distribution and pin distortions in a fuel subassembly at power

  16. Accelerated life testing design using geometric process for pareto distribution

    OpenAIRE

    Mustafa Kamal; Shazia Zarrin; Arif Ul Islam

    2013-01-01

    In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for Pareto Distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained by using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution using Fisher information matrix are also obtained. The statistical properties of the parameters ...
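For Pareto data at a single stress level, the maximum likelihood estimates mentioned in the abstract have a well-known closed form for complete data, sketched below. The sample values are invented, and this is only the Pareto MLE building block, not the paper's geometric-process machinery:

```python
import math

def pareto_mle(sample):
    """Closed-form MLEs for a Pareto(scale x_m, shape alpha) sample.

    The scale MLE is the sample minimum; the shape MLE is
    alpha_hat = n / sum(log(x_i / x_m_hat))."""
    xm = min(sample)
    n = len(sample)
    alpha = n / sum(math.log(x / xm) for x in sample)
    return xm, alpha

# Illustrative complete (uncensored) lifetimes at one stress level.
data = [2.0, 2.5, 3.1, 2.2, 4.0, 2.8, 2.1, 3.5]
xm_hat, alpha_hat = pareto_mle(data)
```

Under the geometric-process model of the paper, lifetimes at successive stress levels are scaled by a common ratio, so a joint likelihood over all levels is maximized instead; the single-level estimates above are the ingredients of that computation.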

  17. Quantitative analysis of the fission product distribution in a damaged fuel assembly using gamma-spectrometry and computed tomography for the Phébus FPT3 test

    International Nuclear Information System (INIS)

    Biard, B.

    2013-01-01

    (locate and identify the materials and estimate their density with the X-ray tomograms, locate the FP distribution inside the bundle with the gamma emission tomograms) and to automate the processing of the gamma spectra acquired. The specificities of these gamma spectra (high count rate, number of gamma rays, number of measurements, etc.) required in particular to analyse key lines only and needed an original counting loss correction. The method was validated over the pre-test examination of the fuel bundle, through a comparison with the classical gamma analysis method used at the laboratory for objects of known geometry. The final results, given with acceptable uncertainties, gave for all FPs identified (mainly 137 Cs, 131 I, 132 Te, 140 Ba, 95 Zr, 103 Ru, etc.) their quantitative activity profile along the bundle, their retained and released fractions in the bundle, and also some information about their relocation inside the bundle. The results are in very good agreement with other Phébus FPT3 measurements and inventory calculations

  18. Quantitative analysis of the fission product distribution in a damaged fuel assembly using gamma-spectrometry and computed tomography for the Phébus FPT3 test

    Energy Technology Data Exchange (ETDEWEB)

    Biard, B., E-mail: bruno.biard@irsn.fr

    2013-09-15

    (locate and identify the materials and estimate their density with the X-ray tomograms, locate the FP distribution inside the bundle with the gamma emission tomograms) and to automate the processing of the gamma spectra acquired. The specificities of these gamma spectra (high count rate, number of gamma rays, number of measurements, etc.) required in particular to analyse key lines only and needed an original counting loss correction. The method was validated over the pre-test examination of the fuel bundle, through a comparison with the classical gamma analysis method used at the laboratory for objects of known geometry. The final results, given with acceptable uncertainties, gave for all FPs identified (mainly {sup 137}Cs, {sup 131}I, {sup 132}Te, {sup 140}Ba, {sup 95}Zr, {sup 103}Ru, etc.) their quantitative activity profile along the bundle, their retained and released fractions in the bundle, and also some information about their relocation inside the bundle. The results are in very good agreement with other Phébus FPT3 measurements and inventory calculations.

  19. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    Science.gov (United States)

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  20. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  1. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management on existing protocol solutions, while taking into account

  2. 10 CFR 431.198 - Enforcement testing for distribution transformers.

    Science.gov (United States)

    2010-01-01

    ... 10 CFR Part 431, Commercial and Industrial Equipment, Distribution Transformers Compliance and Enforcement, § 431.198 Enforcement testing for distribution transformers. (a) Test notice. Upon receiving information in writing...

  3. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The demands on resource management are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to ensure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences with distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration and interaction of GANGA with the ATLAS data management system DQ2 is a key functionality. Intelligent job brokering is set up by using the job-splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2-supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports, among other things, tasks of user analysis with reconstructed data and small-scale production of Monte Carlo data.

  4. Simulation-Based Testing of Distributed Systems

    National Research Council Canada - National Science Library

    Rutherford, Matthew J; Carzaniga, Antonio; Wolf, Alexander L

    2006-01-01

    .... Typically written using an imperative programming language, these simulations capture basic algorithmic functionality at the same time as they focus attention on properties critical to distribution...

  5. Moisture distribution in sludges based on different testing methods

    Institute of Scientific and Technical Information of China (English)

    Wenyi Deng; Xiaodong Li; Jianhua Yan; Fei Wang; Yong Chi; Kefa Cen

    2011-01-01

    Moisture distributions in municipal sewage sludge, printing and dyeing sludge, and paper mill sludge were experimentally studied using four different methods: the drying test, the thermogravimetric-differential thermal analysis (TG-DTA) test, the thermogravimetric-differential scanning calorimetry (TG-DSC) test, and the water activity test. The results indicated that the moisture in the mechanically dewatered sludges consisted of interstitial water, surface water and bound water. The interstitial water accounted for more than 50% wet basis (wb) of the total moisture content. The bond strength of sludge moisture increased with decreasing moisture content, especially when the moisture content was lower than 50% wb. Furthermore, a comparison among the four testing methods is presented. The drying test had the advantage of being able to quantify free water, interstitial water, surface water and bound water, while the TG-DSC, TG-DTA and water activity tests were capable of determining the bond strength of moisture in sludge. The results from the TG-DSC and TG-DTA tests were found to be more persuasive than those from the water activity test.

  6. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution, which approaches different distributions in limiting cases.
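The idea of one flexible family approaching different hazard behaviors can be made concrete with the Weibull distribution, whose hazard is constant, increasing, or decreasing depending on the shape parameter. This is a generic sketch with arbitrary parameter values, not the paper's simulation setup:

```python
def weibull_hazard(t, shape, scale=1.0):
    """Weibull hazard rate h(t) = (k/lam) * (t/lam)**(k - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

ts = [0.5, 1.0, 2.0]
const = [weibull_hazard(t, 1.0) for t in ts]    # k = 1: exponential, flat hazard
rising = [weibull_hazard(t, 2.0) for t in ts]   # k > 1: wear-out failures
falling = [weibull_hazard(t, 0.5) for t in ts]  # k < 1: infant mortality
```

Sweeping the shape parameter in a simulation like this is one way to compare how candidate hazard models diverge over time for the same nominal failure data.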

  7. Semen Analysis Test

    Science.gov (United States)

    Also known as: sperm analysis, sperm count, seminal fluid analysis. Formal name: semen analysis. Measured properties include viscosity (consistency or thickness of the semen), sperm count (total number of sperm), and sperm concentration (density): number ...

  8. TTCN-3 for distributed testing embedded systems

    NARCIS (Netherlands)

    Blom, S.C.C.; Deiß, T.; Ioustinova, N.; Kontio, A.; Pol, van de J.C.; Rennoch, A.; Sidorova, N.; Virbitskaite, I.; Voronkov, A.

    2007-01-01

    TTCN-3 is a standardized language for specifying and executing test suites that is particularly popular for testing embedded systems. Prior to testing embedded software in a target environment, the software is usually tested in the host environment. Executing in the host environment often

  9. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load-test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web system modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.

  10. Distributed temperature sensor testing in liquid sodium

    Energy Technology Data Exchange (ETDEWEB)

    Gerardi, Craig, E-mail: cgerardi@anl.gov; Bremer, Nathan; Lisowski, Darius; Lomperski, Stephen

    2017-02-15

    Highlights: • Distributed temperature sensors measured high-resolution liquid-sodium temperatures. • DTSs worked well up to 400 °C. • A single DTS simultaneously detected sodium level and temperature. Abstract: Rayleigh-backscatter-based distributed fiber optic sensors were immersed in sodium to obtain high-resolution liquid-sodium temperature measurements. Distributed temperature sensors (DTSs) functioned well up to 400 °C in a liquid sodium environment. The DTSs measured sodium column temperature and the temperature of a complex geometrical pattern that leveraged the flexibility of fiber optics. A single Ø 360 μm OD sensor registered dozens of temperatures along a length of over one meter at 100 Hz. We also demonstrated the capability to use a single DTS to simultaneously detect thermal interfaces (e.g. sodium level) and measure temperature.

  11. Goodness-of-fit tests for a heavy tailed distribution

    NARCIS (Netherlands)

    A.J. Koning (Alex); L. Peng (Liang)

    2005-01-01

    For testing whether a distribution function is heavy tailed, we study the Kolmogorov test, the Berk-Jones test, the score test and their integrated versions. A comparison is conducted via Bahadur efficiency and simulations. The score test and the integrated score test show the best
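The one-sample Kolmogorov statistic studied in the abstract is simply the largest gap between the empirical CDF and the null CDF. Below is a generic sketch against a hypothetical unit-Pareto null with tail index 1; the sample values are illustrative and this is not the authors' code:

```python
def ks_statistic(sample, cdf):
    """One-sample Kolmogorov statistic D_n = sup |F_n(x) - F(x)|.

    The supremum is attained just before or at an order statistic,
    so it suffices to check i/n and (i+1)/n against F at each point."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

# Hypothetical heavy-tailed null: unit Pareto, F(x) = 1 - 1/x for x >= 1.
pareto_cdf = lambda x: 1.0 - 1.0 / x
sample = [1.2, 1.5, 2.0, 3.0, 6.0, 11.0]   # illustrative tail exceedances
d = ks_statistic(sample, pareto_cdf)
```

In the heavy-tail setting the null CDF usually depends on an estimated tail index, which changes the statistic's null distribution; the efficiency comparisons in the paper address exactly such refinements.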

  12. Distributed temperature sensor testing in liquid sodium

    Energy Technology Data Exchange (ETDEWEB)

    Gerardi, Craig; Bremer, Nathan; Lisowski, Darius; Lomperski, Stephen

    2017-02-01

    Rayleigh-backscatter-based distributed fiber optic sensors were immersed in sodium to obtain high-resolution liquid-sodium temperature measurements. Distributed temperature sensors (DTSs) functioned well up to 400 °C in a liquid sodium environment. The DTSs measured sodium column temperature and the temperature of a complex geometrical pattern that leveraged the flexibility of fiber optics. A single Ø 360 μm OD sensor registered dozens of temperatures along a length of over one meter at 100 Hz. We also demonstrated the capability to use a single DTS to simultaneously detect thermal interfaces (e.g. sodium level) and measure temperature.

  13. Factors affecting daughters distribution among progeny testing Holstein bulls

    Directory of Open Access Journals (Sweden)

    Martino Cassandro

    2012-01-01

    The aim of this study was to investigate factors influencing the number of daughters of Holstein bulls during progeny testing, using data provided by the Italian Holstein Friesian Cattle Breeders Association. The hypothesis is that there are no differences among artificial insemination studs (AIS) in the distribution of daughters among progeny-testing bulls. For each bull and beginning from 21 months of age, the distribution of daughters over the progeny testing period was calculated. Data were available on 1973 bulls born between 1986 and 2004, progeny tested in Italy and with at least 4 paternal half-sibs. On average, bulls exited the genetic centre at 11.3±1.1 months and reached their first official genetic proof at 58.0±3.1 months of age. An analysis of variance was performed on the cumulative frequency of daughters at 24, 36, 48, and 60 months. The generalized linear model included the fixed effects of year of birth of the bull (18 levels), artificial insemination stud (4 levels) and sire of bull (137 levels). All effects significantly affected the variability of the studied traits. Artificial insemination stud was the most important source of variation, followed by year of birth and sire of bull. Significant differences among AI studs exist, probably reflecting different strategies adopted during progeny testing.

  14. Test report light duty utility arm power distribution system (PDS)

    International Nuclear Information System (INIS)

    Clark, D.A.

    1996-01-01

    The Light Duty Utility Arm (LDUA) Power Distribution System has completed vendor and post-delivery acceptance testing. The Power Distribution System has been found to be acceptable and is now ready for integration with the overall LDUA system

  15. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems, which, combined with automation, underpin the emerging concept of the "smart grid". The book supports theoretical concepts with real-world applications and MATLAB exercises.

  16. Goodness-of-fit tests for the Gompertz distribution

    DEFF Research Database (Denmark)

    Lenart, Adam; Missov, Trifon

    The Gompertz distribution is often fitted to lifespan data; however, testing whether the fit satisfies theoretical criteria has been neglected. Here five goodness-of-fit measures are discussed: the Anderson-Darling statistic, the Kullback-Leibler discrimination information, the correlation coefficient test, a test for the mean of the sample hazard, and a nested test against the generalized extreme value distributions. Along with an application to laboratory rat data, critical values calculated from the empirical distribution of the test statistics are also presented.
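The Anderson-Darling statistic listed among the five measures can be computed for a fully specified Gompertz null from the standard order-statistic formula. This is a sketch with hypothetical parameters and data; in practice the parameters are estimated, and the paper's critical values come from the empirical distribution of the statistic:

```python
import math

def gompertz_cdf(t, b, c):
    """Gompertz CDF with rate b > 0 and shape c > 0:
    F(t) = 1 - exp(-(b/c) * (exp(c*t) - 1))."""
    return 1.0 - math.exp(-(b / c) * (math.exp(c * t) - 1.0))

def anderson_darling(sample, cdf):
    """Anderson-Darling statistic A^2 for a fully specified null CDF:
    A^2 = -n - (1/n) * sum_{i=1..n} (2i-1) * [ln F(x_(i)) + ln(1 - F(x_(n+1-i)))]."""
    xs = sorted(sample)
    n = len(xs)
    s = 0.0
    for i in range(n):
        u_i = cdf(xs[i])
        u_rev = cdf(xs[n - 1 - i])
        s += (2 * i + 1) * (math.log(u_i) + math.log(1.0 - u_rev))
    return -n - s / n

# Hypothetical lifespans and Gompertz parameters; large A^2 values
# indicate a poor fit relative to the calibrated critical values.
sample = [0.8, 1.1, 1.4, 1.6, 1.9, 2.3]
a2 = anderson_darling(sample, lambda t: gompertz_cdf(t, b=0.1, c=1.5))
```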

  17. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network being modeled in Matlab/Simulink takes into account detailed dynamic models of the generators. Fault simulations at various locations are

  18. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  19. Statistical test for the distribution of galaxies on plates

    International Nuclear Information System (INIS)

    Garcia Lambas, D.

    1985-01-01

    A statistical test for the distribution of galaxies on plates is presented. We apply the test to synthetic astronomical plates obtained by means of numerical simulation (Garcia Lambas and Sersic 1983) with three different models for the 3-dimensional distribution, comparison with an observational plate, suggest the presence of filamentary structure. (author)

  20. 21 CFR 211.165 - Testing and release for distribution.

    Science.gov (United States)

    2010-04-01

    ... (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Laboratory Controls § 211.165 Testing and release for distribution. (a) For each batch of drug product, there shall be... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Testing and release for distribution. 211.165...

  1. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

The Parallel ROOT Facility (PROOF) is a distributed analysis system that exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective with distributed local storage systems such as Xrootd, where data are distributed over the computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  2. Distributed analysis with PROOF in ATLAS collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S [Brookhaven National Laboratory, Upton, NY 11973 (United States); Benjamin, D [Duke University, Durham, NC 27708 (United States); Montoya, G Carillo; Guan, W; Mellado, B; Xu, N [University of Wisconsin-Madison, Madison, WI 53706 (United States); Cranmer, K; Shibata, A [New York University, New York, NY 10003 (United States)

    2010-04-01

The Parallel ROOT Facility (PROOF) is a distributed analysis system that exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective with distributed local storage systems such as Xrootd, where data are distributed over the computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  3. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

... conditional modes are applied to this problem. The Kalman filter is described as a powerful tool for modelling two-dimensional data. Motivated by the development of the reduced update Kalman filter, we propose a reduced update Kalman smoother which offers considerable computational savings. Kriging ... on hybridisation analysis, which comprises matching a grid to an arrayed set of DNA clones spotted onto a hybridisation filter. The line process has proven to perform satisfactory modelling of shifted fields (subgrids) in the hybridisation grid, and a two-staged hierarchical grid matching scheme which ...

  4. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

Determination of distribution facilities is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk; thus, new problems arise, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta. This brand is included in the category of the top five fast-food restaurant chains based on retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center. The network analysis results show a more efficient process, referring to a shorter distance in the distribution process.
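The record above does not say which clustering algorithm was used; as a generic illustration of clustering candidate distribution-center locations, the sketch below runs a plain Lloyd's k-means over hypothetical store coordinates. The data, the number of centers, and the coordinate range are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical store coordinates (km offsets within a metro area)
stores = rng.uniform(0.0, 50.0, size=(120, 2))

def kmeans(points, k, iters=100, rng=None):
    """Plain Lloyd's algorithm: each candidate distribution-center site is
    the centroid of the stores it would serve."""
    rng = rng or np.random.default_rng()
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # assign each store to its nearest current center
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the centroid of its assigned stores
        new_centers = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

centers, labels = kmeans(stores, k=3, rng=rng)
```

A subsequent network-analysis stage, as in the record, would then evaluate travel distances from each centroid to the stores it serves.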

  5. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available that can help us select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to rank the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions, and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
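The workflow the record describes, fitting several candidate distributions and ranking them by a goodness-of-fit distance, can be sketched as follows. The data set and candidate families are illustrative assumptions; the Kolmogorov-Smirnov statistic is used as the distance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.gamma(shape=3.0, scale=2.0, size=500)   # hypothetical skewed sample

# candidate families, each fitted by maximum likelihood
candidates = {"norm": stats.norm, "gamma": stats.gamma, "expon": stats.expon}

results = {}
for name, dist in candidates.items():
    params = dist.fit(data)
    # KS statistic: maximum distance between the empirical CDF
    # and the fitted CDF -- smaller means a better fit
    results[name] = stats.kstest(data, name, args=params).statistic

best = min(results, key=results.get)
```

One caveat worth noting: KS p-values are not valid when the parameters are estimated from the same data (the Lilliefors effect), but the statistics themselves can still be used to rank the fitted models, which is the use the record suggests.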

  6. Distributed Rocket Engine Testing Health Monitoring System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The on-ground and Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) provides a system architecture and software tools for performing diagnostics...

  7. Distributed Rocket Engine Testing Health Monitoring System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Leveraging the Phase I achievements of the Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) including its software toolsets and system building...

  8. Statistical tests for frequency distribution of mean gravity anomalies

    African Journals Online (AJOL)

    ES Obe

    1980-03-01

Mar 1, 1980 ... Kaula [1,2] discussed the method of applying statistical techniques in the ... mathematical foundation of physical ...

  9. Numerical distribution functions of fractional unit root and cointegration tests

    DEFF Research Database (Denmark)

    MacKinnon, James G.; Nielsen, Morten Ørregaard

    We calculate numerically the asymptotic distribution functions of likelihood ratio tests for fractional unit roots and cointegration rank. Because these distributions depend on a real-valued parameter, b, which must be estimated, simple tabulation is not feasible. Partly due to the presence...

  10. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

The statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimations. The age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
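A truncated normal time-to-failure model of the kind the record describes can be sketched directly with `scipy.stats.truncnorm`. The mean, standard deviation, and truncation point below are illustrative assumptions; truncating at zero simply rules out negative lifetimes.

```python
import numpy as np
from scipy import stats

# hypothetical equipment life model: mean 1000 h, sd 200 h,
# truncated at t = 0 so that negative lifetimes are impossible
mu, sigma = 1000.0, 200.0
a, b = (0.0 - mu) / sigma, np.inf      # truncation bounds in standard units
life = stats.truncnorm(a, b, loc=mu, scale=sigma)

def reliability(t):
    # probability that a unit survives beyond time t
    return life.sf(t)

def hazard(t):
    # age-dependent failure rate h(t) = f(t) / R(t)
    return life.pdf(t) / life.sf(t)

r500 = reliability(500.0)              # survival probability at 500 h
```

The increasing hazard function is the "age-dependent characteristic" that makes this family useful for reliability-test planning: older units are predictably more failure-prone than new ones.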

  11. Statistical Tests for Frequency Distribution of Mean Gravity Anomalies

    African Journals Online (AJOL)

The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ² and the unit normal deviate tests. However, the 50 equal-area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...

  12. A brief overview of the distribution test grids with a distributed generation inclusion case study

    Directory of Open Access Journals (Sweden)

    Stanisavljević Aleksandar M.

    2018-01-01

The paper presents an overview of the electric distribution test grids issued by different technical institutions. They are used for testing different scenarios in the operation of a grid for research, benchmarking, comparison, and other purposes. Their types, main characteristics, and features, as well as application possibilities, are shown. Recently, these grids have been modified to include distributed generation. An example of modification and application of the IEEE 13-bus grid for testing the effects of faults, in cases without and with a distributed generator connected to the grid, is presented. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. III 042004: Smart Electricity Distribution Grids Based on Distribution Management System and Distributed Generation]

  13. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

Separating two probability distributions from a mixture model that is made up of a combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback-Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen-Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then, DSM's linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
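The linear combination assumption at the heart of the record above, a mixture M = lam*S + (1-lam)*R with a known seed S, can be illustrated with a toy term distribution. This is a generic sketch of the linear-separation idea, not the published DSM algorithm; the vocabulary, probabilities, and mixing weight are all invented for illustration.

```python
import numpy as np

# toy term distributions over a 6-word vocabulary (all values illustrative)
relevance   = np.array([0.30, 0.25, 0.20, 0.15, 0.07, 0.03])
irrelevance = np.array([0.05, 0.05, 0.10, 0.20, 0.30, 0.30])
lam = 0.4                                        # true (unknown) mixing weight
mixture = lam * irrelevance + (1.0 - lam) * relevance

def separate(mixture, seed, eps=1e-12):
    """Linear separation under the assumption M = lam*S + (1-lam)*R, R >= 0.
    The largest feasible lam is the one that keeps every component of the
    recovered R non-negative; it upper-bounds the true mixing weight."""
    lam_hat = float(np.min(mixture / np.maximum(seed, eps)))
    lam_hat = min(lam_hat, 1.0 - eps)
    r = (mixture - lam_hat * seed) / (1.0 - lam_hat)
    r = np.clip(r, 0.0, None)                    # guard against rounding
    return r / r.sum(), lam_hat

r_hat, lam_hat = separate(mixture, irrelevance)
```

Because the true relevance distribution here has positive mass everywhere, the feasibility bound slightly overestimates lam; the recovered distribution is an approximation that down-weights the seed's dominant terms, which is the qualitative behavior one wants from a separation step.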

  14. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Elllen B.

    2015-01-01

Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend not only to give answers as to how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirically Testing Thematic Analysis (ETTA), a step-by-step approach to thematic text analysis, and to discuss its strengths and weaknesses, so that others might assess its potential as an approach that they might utilize/develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.

  15. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review, analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  16. Materials Science Research Rack-1 Fire Suppressant Distribution Test Report

    Science.gov (United States)

    Wieland, P. O.

    2002-01-01

    Fire suppressant distribution testing was performed on the Materials Science Research Rack-1 (MSRR-1), a furnace facility payload that will be installed in the U.S. Lab module of the International Space Station. Unlike racks that were tested previously, the MSRR-1 uses the Active Rack Isolation System (ARIS) to reduce vibration on experiments, so the effects of ARIS on fire suppressant distribution were unknown. Two tests were performed to map the distribution of CO2 fire suppressant throughout a mockup of the MSRR-1 designed to have the same component volumes and flowpath restrictions as the flight rack. For the first test, the average maximum CO2 concentration for the rack was 60 percent, achieved within 45 s of discharge initiation, meeting the requirement to reach 50 percent throughout the rack within 1 min. For the second test, one of the experiment mockups was removed to provide a worst-case configuration, and the average maximum CO2 concentration for the rack was 58 percent. Comparing the results of this testing with results from previous testing leads to several general conclusions that can be used to evaluate future racks. The MSRR-1 will meet the requirements for fire suppressant distribution. Primary factors that affect the ability to meet the CO2 distribution requirements are the free air volume in the rack and the total area and distribution of openings in the rack shell. The length of the suppressant flowpath and degree of tortuousness has little correlation with CO2 concentration. The total area of holes in the rack shell could be significantly increased. The free air volume could be significantly increased. To ensure the highest maximum CO2 concentration, the PFE nozzle should be inserted to the stop on the nozzle.

  17. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption, and communication overhead. Our results show that distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  18. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...

  19. Independent test assessment using the extreme value distribution theory.

    Science.gov (United States)

    Almeida, Marcio; Blondell, Lucy; Peralta, Juan M; Kent, Jack W; Jun, Goo; Teslovich, Tanya M; Fuchsberger, Christian; Wood, Andrew R; Manning, Alisa K; Frayling, Timothy M; Cingolani, Pablo E; Sladek, Robert; Dyer, Thomas D; Abecasis, Goncalo; Duggirala, Ravindranath; Blangero, John

    2016-01-01

The new generation of whole genome sequencing platforms offers great possibilities and challenges for dissecting the genetic basis of complex traits. With a very high number of sequence variants, a naïve multiple hypothesis threshold correction hinders the identification of reliable associations by the overreduction of statistical power. In this report, we examine 2 alternative approaches to improve the statistical power of a whole genome association study to detect reliable genetic associations. The approaches were tested using the Genetic Analysis Workshop 19 (GAW19) whole genome sequencing data. The first tested method estimates the real number of effective independent tests actually being performed in a whole genome association project by the use of an extreme value distribution and a set of phenotype simulations. Given the familial nature of the GAW19 data and the finite number of pedigree founders in the sample, the number of correlations between genotypes is greater than in a set of unrelated samples. Using our procedure, we estimate that the effective number represents only 15 % of the total number of independent tests performed. However, even using this corrected significance threshold, no genome-wide significant association could be detected for systolic and diastolic blood pressure traits. The second approach implements a biological relevance-driven hypothesis tested by exploiting prior computational predictions on the effect of nonsynonymous genetic variants detected in a whole genome sequencing association study. This guided testing approach was able to identify 2 promising single-nucleotide polymorphisms (SNPs), 1 for each trait, targeting biologically relevant genes that could help shed light on the genesis of human hypertension. The first gene, PHF14, associated with systolic blood pressure, interacts directly with genes involved in calcium-channel formation and the second gene, MAP4, encodes a microtubule-associated protein and had already
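The core idea in the record's first approach, that correlated tests behave like a smaller number of independent ones, can be demonstrated on simulated data. The record fits an extreme value distribution to phenotype simulations; the simpler sketch below instead inverts the median of the minimum p-value across simulated correlated null statistics, which illustrates the same effective-test-count concept. The factor-correlation structure, test counts, and correlation level are all invented for illustration.

```python
import numpy as np
from math import erfc

rng = np.random.default_rng(3)

def two_sided_p(z):
    # two-sided normal p-value via the complementary error function
    flat = np.array([erfc(abs(v) / np.sqrt(2.0)) for v in z.ravel()])
    return flat.reshape(z.shape)

n_tests, n_sims, rho = 200, 500, 0.8
# correlated null test statistics: one shared factor plus idiosyncratic noise
shared = rng.normal(size=(n_sims, 1))
z = np.sqrt(rho) * shared + np.sqrt(1.0 - rho) * rng.normal(size=(n_sims, n_tests))
p = two_sided_p(z)
min_p = p.min(axis=1)

# the minimum of M independent uniform p-values has median 1 - 0.5**(1/M);
# inverting at the observed median gives an effective number of tests
med = np.median(min_p)
m_eff = np.log(0.5) / np.log1p(-med)
```

With strongly correlated statistics, `m_eff` comes out well below the nominal 200 tests, which is exactly the kind of reduction (to 15% in the record's data) that justifies a less punishing significance threshold.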

  20. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  1. A multivariate rank test for comparing mass size distributions

    KAUST Repository

    Lombard, F.

    2012-04-01

Particle size analyses of a raw material are commonplace in the mineral processing industry. Knowledge of particle size distributions is crucial in planning milling operations to enable an optimum degree of liberation of valuable mineral phases, to minimize plant losses due to an excess of oversize or undersize material, or to attain a size distribution that fits a contractual specification. The problem addressed in the present paper is how to test the equality of two or more underlying size distributions. A distinguishing feature of these size distributions is that they are not based on counts of individual particles. Rather, they are mass size distributions giving the fractions of the total mass of a sampled material lying in each of a number of size intervals. As such, the data are compositional in nature, using the terminology of Aitchison [1]; that is, multivariate vectors the components of which add to 100%. In the literature, various versions of Hotelling's T2 have been used to compare matched pairs of such compositional data. In this paper, we propose a robust test procedure based on ranks as a competitor to Hotelling's T2. In contrast to the latter statistic, the power of the rank test is not unduly affected by the presence of outliers or of zeros among the data. © 2012 Copyright Taylor and Francis Group, LLC.
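The baseline the record's rank test competes with, a paired Hotelling's T² on compositional data, can be sketched on synthetic mass-size compositions. The data, interval counts, and Dirichlet parameters below are illustrative assumptions, and one size interval is dropped because a closed composition's covariance matrix is singular; the rank-based alternative itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

def closure(x):
    # rescale each row to a composition summing to 100%
    return 100.0 * x / x.sum(axis=1, keepdims=True)

# hypothetical matched pairs of mass-size compositions:
# % of total mass in each of 4 size intervals, before/after some treatment
a = closure(rng.dirichlet([8.0, 5.0, 4.0, 3.0], size=30))
b = closure(rng.dirichlet([8.0, 5.0, 4.0, 3.0], size=30))

# drop one part to avoid the singular covariance of closed compositions
d = (a - b)[:, :-1]
n = len(d)
dbar = d.mean(axis=0)
S = np.cov(d, rowvar=False)
t2 = float(n * dbar @ np.linalg.solve(S, dbar))   # paired Hotelling's T^2
```

Large values of `t2` indicate that the mean paired difference between the two size distributions is inconsistent with zero; the record's point is that a rank statistic computed on the same differences is less sensitive to outliers and zero entries.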

  2. Economic analysis of efficient distribution transformer trends

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
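The TOC approach with uncertain evaluation factors that the record describes can be illustrated with a small Monte Carlo comparison of two designs. The standard form TOC = bid price + A x no-load loss + B x load loss is assumed; all prices, loss figures, and parameter distributions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# TOC = bid price + A * no-load loss + B * load loss, with the $/W
# loss-evaluation factors A and B treated as uncertain
designs = {
    "low-cost": {"bid": 9000.0,  "nl_w": 250.0, "ll_w": 1800.0},
    "low-loss": {"bid": 10000.0, "nl_w": 150.0, "ll_w": 1200.0},
}
n = 10_000
A = rng.normal(3.5, 0.6, size=n)   # uncertain $/W value of no-load loss
B = rng.normal(1.2, 0.3, size=n)   # uncertain $/W value of load loss

toc = {name: d["bid"] + A * d["nl_w"] + B * d["ll_w"]
       for name, d in designs.items()}
# probability that the low-loss design has the lower total owning cost
p_lowloss_wins = float(np.mean(toc["low-loss"] < toc["low-cost"]))
```

Instead of a single point comparison, the result is a probability that one design beats the other over the joint distribution of the evaluation factors, which is the kind of uncertainty-aware ranking the record argues for.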

  3. 242A Distributed Control System Year 2000 Acceptance Test Report

    Energy Technology Data Exchange (ETDEWEB)

    TEATS, M.C.

    1999-08-31

This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. This report documents the test results obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure will document the initial testing and evaluation of the potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic such that it may be performed on any D/3 (trademark of GSE Process Solutions, Inc.) distributed control system that runs with the VMS (trademark of Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.

  4. 242A Distributed Control System Year 2000 Acceptance Test Report

    International Nuclear Information System (INIS)

    TEATS, M.C.

    1999-01-01

This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. This report documents the test results obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure will document the initial testing and evaluation of the potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic such that it may be performed on any D/3 (trademark of GSE Process Solutions, Inc.) distributed control system that runs with the VMS (trademark of Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.

  5. Distributed Sensor Network Software Development Testing through Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Sean M. [Univ. of New Mexico, Albuquerque, NM (United States)

    2003-12-01

The distributed sensor network (DSN) presents a novel and highly complex computing platform with difficulties and opportunities that are just beginning to be explored. The potential of sensor networks extends from monitoring for threat reduction, to conducting instant and remote inventories, to ecological surveys. Developing and testing for robust and scalable applications is currently practiced almost exclusively in hardware. The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for DSNs independent of hardware constraints. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness and scaling issues, to explore arbitrary algorithms for distributed sensors, and to defeat those algorithms through simulated failure. The user specifies the topology, the environment, the application, and any number of arbitrary failures; DSS provides the virtual environmental embedding.

  6. Nevada test site radionuclide inventory and distribution: project operations plan

    International Nuclear Information System (INIS)

    Kordas, J.F.; Anspaugh, L.R.

    1982-01-01

    This document is the operational plan for conducting the Radionuclide Inventory and Distribution Program (RIDP) at the Nevada Test Site (NTS). The basic objective of this program is to inventory the significant radionuclides of NTS origin in NTS surface soil. The expected duration of the program is five years. This plan includes the program objectives, methods, organization, and schedules

  7. Sodium flow distribution test of the air cooler tubes

    International Nuclear Information System (INIS)

    Uchida, Hiroyuki; Ohta, Hidehisa; Shimazu, Hisashi

    1980-01-01

    In the heat transfer tubes of the air cooler which is installed in the auxiliary core cooling system of the fast breeder prototype plant reactor ''Monju'', sodium freezing may be caused by undercooling the sodium induced by an extremely unbalanced sodium flow in the tubes. Thus, the sodium flow distribution test of the air cooler tubes was performed to examine the flow distribution of the tubes and to estimate the possibility of sodium freezing in the tubes. This test was performed by using a one fourth air cooler model installed in the water flow test facility. As the test results show, the flow distribution from the inlet header to each tube is almost equal at any operating condition, that is, the velocity deviation from normalized mean velocity is less than 6% and sodium freezing does not occur up to 250% air velocity deviation at stand-by condition. It was clear that the proposed air cooler design for the ''Monju'' will have a good sodium flow distribution at any operating condition. (author)

  8. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

Applying survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure models. The division at 4 Earth radii separates small exoplanets from large exoplanets above. When combined with the recently discovered radius gap at 2 Earth radii, it supports the treatment of planets 2-4 Earth rad...

  9. Objective Bayesian Analysis of Skew-t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  10. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (the response) is determined mainly by secondary particles, such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SEDs) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SEDs is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SEDs are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SEDs. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SEDs are presently nonexistent; therefore, methods are described that allow rough error estimates based on integral SED sensitivities and estimated SED uncertainties.

  11. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects which are captured by the gravitational field of the Earth. We have analyzed the mass distribution of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region, while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing the observed scaling exponents to exponents observed in laboratory experiments and discuss the possibility that one can derive insight into the original shapes of the meteoroids.

  12. A Test Generation Framework for Distributed Fault-Tolerant Algorithms

    Science.gov (United States)

    Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.

    2009-01-01

    Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.

  13. LEDA RF distribution system design and component test results

    International Nuclear Information System (INIS)

    Roybal, W.T.; Rees, D.E.; Borchert, H.L.; McCarthy, M.; Toole, L.

    1998-01-01

    The 350 MHz and 700 MHz RF distribution systems for the Low Energy Demonstration Accelerator (LEDA) have been designed and are currently being installed at Los Alamos National Laboratory. Since 350 MHz is a familiar frequency used at other accelerator facilities, most of the major high-power components were available. The 700 MHz, 1.0 MW, CW RF delivery system designed for LEDA is a new development; therefore, high-power circulators, waterloads, phase shifters, switches, and harmonic filters had to be designed and built for this application. The final Accelerator Production of Tritium (APT) RF distribution system design will be based on much of the same technology as the LEDA systems and will incorporate many of the RF components tested for LEDA. The low-power and high-power tests performed on various components of these LEDA systems, and their results, are presented here.

  14. Testing the anisotropy in the angular distribution of Fermi/GBM gamma-ray bursts

    Science.gov (United States)

    Tarnopolski, M.

    2017-12-01

    Gamma-ray bursts (GRBs) were confirmed to be of extragalactic origin due to their isotropic angular distribution, combined with the fact that they exhibited an intensity distribution that deviated strongly from the -3/2 power law. This finding was later confirmed with the first redshift, equal to at least z = 0.835, measured for GRB970508. Despite this result, the data from CGRO/BATSE and Swift/BAT indicate that long GRBs are indeed distributed isotropically, but the distribution of short GRBs is anisotropic. Fermi/GBM has detected 1669 GRBs to date, and their sky distribution is examined in this paper. A number of statistical tests are applied: nearest neighbour analysis, fractal dimension, dipole and quadrupole moments of the distribution function decomposed into spherical harmonics, a binomial test and the two-point angular correlation function. Monte Carlo benchmark testing of each test is performed in order to evaluate its reliability. It is found that short GRBs are distributed anisotropically in the sky, and long ones have an isotropic distribution. The probability that these results are not a chance occurrence is equal to at least 99.98 per cent and 30.68 per cent for short and long GRBs, respectively. The cosmological context of this finding and its relation to large-scale structures is discussed.
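
One of the listed tests can be sketched minimally as below: the dipole moment of the sky distribution, with a Monte Carlo p-value against isotropy. The statistic and the Monte Carlo setup are simplified stand-ins for the paper's full battery of tests, and equatorial coordinates in degrees are assumed.

```python
import numpy as np

def dipole_amplitude(ra, dec):
    """Magnitude of the mean unit vector of points on the sphere (0 for perfect isotropy)."""
    ra, dec = np.radians(ra), np.radians(dec)
    v = np.stack([np.cos(dec) * np.cos(ra),
                  np.cos(dec) * np.sin(ra),
                  np.sin(dec)], axis=1)
    return np.linalg.norm(v.mean(axis=0))

def isotropy_pvalue(ra, dec, n_mc=2000, seed=0):
    """Monte Carlo p-value: fraction of isotropic skies with a dipole at least as large."""
    rng = np.random.default_rng(seed)
    n = len(ra)
    observed = dipole_amplitude(ra, dec)
    exceed = 0
    for _ in range(n_mc):
        mc_ra = rng.uniform(0.0, 360.0, n)
        mc_dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n)))  # uniform on the sphere
        if dipole_amplitude(mc_ra, mc_dec) >= observed:
            exceed += 1
    return exceed / n_mc
```

A small p-value indicates anisotropy; the quadrupole and higher multipoles would be handled analogously via a spherical-harmonic decomposition.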

  15. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log-linear function of a (possibly transformed) stress. Two levels of stress higher than the use-condition stress, high and low, are used. Sampling plans with equal expected test times at the high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated.
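
The failure-censoring scheme described above can be sketched as follows, assuming a Weibull lifetime whose scale parameter is log-linear in the stress. The regression coefficients, stress levels and plan parameters are hypothetical, chosen only to illustrate the sampling mechanism.

```python
import numpy as np

def simulate_type2_censored(beta0, beta1, stress, shape, n, r, rng):
    """Draw n Weibull lifetimes with scale = exp(beta0 + beta1 * stress) and
    censor at the r-th failure (failure / type II censoring).

    Returns the r observed failure times (sorted) and the censoring time."""
    scale = np.exp(beta0 + beta1 * stress)
    lifetimes = scale * rng.weibull(shape, size=n)
    failures = np.sort(lifetimes)[:r]
    return failures, failures[-1]

rng = np.random.default_rng(1)
# Hypothetical plan: beta1 < 0, so higher stress shortens life
fail_hi, t_hi = simulate_type2_censored(beta0=5.0, beta1=-0.02, stress=120.0,
                                        shape=1.5, n=40, r=20, rng=rng)
fail_lo, t_lo = simulate_type2_censored(beta0=5.0, beta1=-0.02, stress=80.0,
                                        shape=1.5, n=40, r=20, rng=rng)
# Equal *expected* test times at the two stresses would be arranged by tuning n and r.
```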

  16. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and a heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last two years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with ''sensors'' collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better-performing ALICE analysis.

  17. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics in the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performing a SWOT analysis of the current situation of its self-built logistics system, identifying the problems existing in the current Jingdong Mall logistics distribution and giving appropriate recommendations.

  18. Cryogenic distribution system for ITER proto-type cryoline test

    International Nuclear Information System (INIS)

    Bhattacharya, R.; Shah, N.; Badgujar, S.; Sarkar, B.

    2012-01-01

    Design validation of the ITER cryoline will be carried out by a prototype cryoline test. The major objectives of the test are to verify the mechanical integrity, reliability, thermal stress and heat load, as well as to check assembly and fabrication procedures. The cryogenic system has to satisfy the functional operating scenario of the cryoline. A cryoplant and a distribution box (DB), including a liquid helium (LHe) tank, constitute the cryogenic system for the test. A conceptual system architecture is proposed with a commercially available refrigerator/liquefier and a custom-designed DB housing a cold compressor, a cold circulator and a phase separator with a submerged heat exchanger. System-level optimization, mainly of the DB and LHe tank options, has been studied to minimize the cold power required for the system. Aspen HYSYS is used for process simulation. The paper describes the system architecture and the optimized design, as well as the process simulation with associated results. (author)

  19. Real time testing of intelligent relays for synchronous distributed generation islanding detection

    Science.gov (United States)

    Zhuang, Davy

    As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders, gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be islanded for safety reasons when disconnected or isolated from the main feeder as distributed generator islanding may create hazards to utility and third-party personnel, and possibly damage the distribution system infrastructure, including the distributed generators. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay, against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays on a real time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay, by running a large number of tests, reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms frequency, voltage and rate of change of frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.

  20. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, its computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents caused by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD) of the bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inversion employed by commonly used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution, which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
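
The final THD step can be sketched directly from its definition, assuming the per-order bus-voltage phasors have already been obtained from the harmonic propagation solution; the example harmonic magnitudes are hypothetical.

```python
import numpy as np

def voltage_thd(v_harmonics):
    """Total harmonic distortion of a bus voltage.

    v_harmonics: dict mapping harmonic order h -> complex phasor (or magnitude) V_h,
    with h = 1 the fundamental.  THD = sqrt(sum_{h>=2} |V_h|^2) / |V_1|.
    """
    v1 = abs(v_harmonics[1])
    distortion = sum(abs(v) ** 2 for h, v in v_harmonics.items() if h >= 2)
    return np.sqrt(distortion) / v1

# Example: 5% fifth and 3% seventh harmonic on a 1.0 p.u. fundamental
thd = voltage_thd({1: 1.0 + 0j, 5: 0.05, 7: 0.03})  # sqrt(0.05**2 + 0.03**2) ≈ 0.0583
```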

  1. Gene set analysis using variance component tests.

    Science.gov (United States)

    Huang, Yen-Tsung; Lin, Xihong

    2013-06-28

    Gene set analyses have become increasingly important in genomic research, as many complex diseases are caused jointly by alterations of numerous genes. Genes often coordinate together as a functional repertoire, e.g., a biological pathway/network, and are highly correlated. However, most existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to exploit this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects that assumes a common distribution for the regression coefficients in the multivariate linear regression model, and calculate the p-values using permutation and a scaled chi-square approximation. We show using simulations that the type I error is protected under different choices of working covariance matrix and that power improves as the working covariance approaches the true covariance. The global test is a special case of TEGS in which the correlation among genes in a gene set is ignored. Using both simulated data and a published diabetes dataset, we show that our test outperforms the commonly used approaches, the global test and gene set enrichment analysis (GSEA). We thus develop a gene set analysis method (TEGS) under the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and the global test, in both simulation and the diabetes microarray data.
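
The permutation part of the p-value calculation can be sketched as below. The `mean_shift_statistic` is a simplified stand-in for the actual TEGS variance component statistic, used only to show how exposure labels are permuted to build the null distribution.

```python
import numpy as np

def permutation_pvalue(X, y, statistic, n_perm=1000, seed=0):
    """Permutation p-value for a gene-set statistic.

    X: (n_samples, n_genes) expression matrix; y: binary exposure labels (0/1).
    statistic: callable (X, y) -> float, larger = stronger association.
    """
    rng = np.random.default_rng(seed)
    observed = statistic(X, y)
    exceed = 0
    for _ in range(n_perm):
        if statistic(X, rng.permutation(y)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)   # add-one correction avoids p = 0

def mean_shift_statistic(X, y):
    """Simplified stand-in statistic: squared norm of the group mean difference."""
    diff = X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)
    return float(diff @ diff)
```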

  2. CMS distributed data analysis with CRAB3

    Science.gov (United States)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  3. Testing the Pareto against the lognormal distributions with the uniformly most powerful unbiased test applied to the distribution of cities.

    Science.gov (United States)

    Malevergne, Yannick; Pisarenko, Vladilen; Sornette, Didier

    2011-03-01

    Fat-tail distributions of sizes abound in natural, physical, economic, and social systems. The lognormal and the power laws have historically competed for recognition with sometimes closely related generating processes and hard-to-distinguish tail properties. This state of affairs is illustrated with the debate between Eeckhout [Amer. Econ. Rev. 94, 1429 (2004)] and Levy [Amer. Econ. Rev. 99, 1672 (2009)] on the validity of Zipf's law for US city sizes. By using a uniformly most powerful unbiased (UMPU) test between the lognormal and the power laws, we show that conclusive results can be achieved to end this debate. We advocate the UMPU test as a systematic tool to address similar controversies in the literature of many disciplines involving power laws, scaling, "fat" or "heavy" tails. In order to demonstrate that our procedure works for data sets other than the US city size distribution, we also briefly present the results obtained for the power-law tail of the distribution of personal identity (ID) losses, which constitute one of the major emergent risks at the interface between cyberspace and reality.
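
As a hedged sketch of the underlying model comparison, the snippet below fits both candidate distributions by maximum likelihood and compares their log-likelihoods. This simple comparison is not the exact UMPU construction of the paper; it only illustrates why the two families are hard to distinguish and how a formal test statistic would be built.

```python
import numpy as np

def pareto_loglik(x, xmin):
    """Maximized log-likelihood of a Pareto tail fitted to x >= xmin (Hill MLE)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    alpha = n / np.sum(np.log(x / xmin))   # MLE of the tail exponent
    # log f(x) = log(alpha) + alpha*log(xmin) - (alpha+1)*log(x)
    return n * np.log(alpha) + n * alpha * np.log(xmin) - (alpha + 1) * np.sum(np.log(x))

def lognormal_loglik(x):
    """Maximized log-likelihood of a lognormal fitted to x by MLE."""
    x = np.asarray(x, dtype=float)
    logx = np.log(x)
    mu, sigma = logx.mean(), logx.std()
    return float(np.sum(-np.log(x * sigma * np.sqrt(2 * np.pi))
                        - (logx - mu) ** 2 / (2 * sigma ** 2)))
```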

  4. Buffered Communication Analysis in Distributed Multiparty Sessions

    Science.gov (United States)

    Deniélou, Pierre-Malo; Yoshida, Nobuko

    Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow without bound over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes, and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally, our work is applied to overcome a buffer overflow problem in the multi-buffering algorithm.

  5. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    The overrepresentation score (SOS) and the geographic node divergence (GND) score together combine ecological and evolutionary patterns into a single framework and avoid many of the problems that characterize community phylogenetic methods in current use. The approach goes through each node in the phylogeny. We illustrate it with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented...

  6. Test Protocol for Room-to-Room Distribution of Outside Air by Residential Ventilation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Barley, C. D.; Anderson, R.; Hendron, B.; Hancock, E.

    2007-12-01

    This test and analysis protocol has been developed as a practical approach for measuring outside air distribution in homes. It has been used successfully in field tests and has led to significant insights on ventilation design issues. Performance advantages of more sophisticated ventilation systems over simpler, less-costly designs have been verified, and specific problems, such as airflow short-circuiting, have been identified.

  7. A practical test for the choice of mixing distribution in discrete choice models

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Bierlaire, Michel

    2007-01-01

    The choice of a specific distribution for the random parameters of discrete choice models is a critical issue in transportation analysis. Indeed, various pieces of research have demonstrated that an inappropriate choice of the distribution may lead to serious bias in model forecasts and in the estimated means of the random parameters. In this paper, we propose a practical test based on seminonparametric techniques. The test is analyzed on both synthetic and real data, and is shown to be simple and powerful. (c) 2007 Elsevier Ltd. All rights reserved.

  8. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are the major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. The results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from in-service flaw distributions.
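
A minimal Monte Carlo sketch of the idea, assuming Paris-law fatigue crack growth (da/dN = C ΔK^m) with hypothetical material constants and an assumed lognormal initial flaw distribution. Because growth is monotone in crack size, cycling preserves the rank order of flaws, which is why the in-service distribution mirrors the initial one.

```python
import numpy as np

def grow_cracks(a0, C, m, delta_sigma, n_cycles, Y=1.0, block=1000):
    """Propagate initial crack depths a0 (m) through constant-amplitude cycling
    using the Paris law da/dN = C * dK**m with dK = Y * delta_sigma * sqrt(pi*a).

    Units assumed: delta_sigma in MPa, a in m, C in m/cycle per (MPa*sqrt(m))**m.
    Integration is by Euler steps of `block` cycles."""
    a = np.array(a0, dtype=float)
    for _ in range(n_cycles // block):
        dk = Y * delta_sigma * np.sqrt(np.pi * a)
        a += C * dk ** m * block
    return a

rng = np.random.default_rng(2)
initial = rng.lognormal(mean=np.log(1e-3), sigma=0.3, size=5000)   # initial flaw depths (m)
final = grow_cracks(initial, C=1e-11, m=3.0, delta_sigma=100.0, n_cycles=100_000)
# Monotone growth keeps the ordering of flaw sizes, so the in-service distribution
# is shaped by the initial flaw distribution, as the abstract reports.
```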

  9. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    Full Text Available The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner, with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called the Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical-user-interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the DG optimal size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained were evaluated against an existing similar package cited in the literature; the results are impressive and computationally efficient.

  10. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect of modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utilities and cost them tens of millions in repairs and losses. To address reliability concerns, the power utilities and interested parties have spent extensive amounts of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of distributed generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution software package developed by General Reliability Co.
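
The reliability indices named above have simple definitions that can be sketched directly; the event data below are hypothetical.

```python
def saifi(interruptions, total_customers):
    """SAIFI = total customer interruptions / total customers served."""
    return sum(n for n, _ in interruptions) / total_customers

def saidi(interruptions, total_customers):
    """SAIDI = total customer interruption duration (hours) / total customers served."""
    return sum(n * hours for n, hours in interruptions) / total_customers

# Each event: (customers interrupted, outage duration in hours)
events = [(500, 2.0), (120, 0.5), (2000, 1.5)]
print(saifi(events, 10000))   # 0.262 interruptions per customer per period
print(saidi(events, 10000))   # 0.406 hours of interruption per customer per period
```

Placing automatic switches or DGs so that fewer customers see each fault (smaller `n` per event) or faults are cleared faster (smaller `hours`) directly lowers both indices.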

  11. Distributed training, testing, and decision aids within one solution

    Science.gov (United States)

    Strini, Robert A.; Strini, Keith

    2002-07-01

    Military air operations in the European theater require U.S. and NATO participants to send various mission experts to 10 Combined Air Operations Centers (CAOCs). Little or no training occurs prior to their arrival for tours of duty ranging from 90 days to 3 years. When training does occur, there is little assessment of its effectiveness in raising CAOC mission readiness. A comprehensive training management system has been developed that utilizes traditional and web-based distance-learning methods for providing instruction and task practice, as well as distributed simulation to provide mission rehearsal training opportunities on demand for the C2 warrior. This system incorporates new technologies, such as voice interaction and virtual tutors, and a Learning Management System (LMS) that tracks trainee progress from academic learning through procedural practice and mission training exercises. Supervisors can monitor their subordinates' progress through synchronous or asynchronous methods. Embedded within this system are virtual tutors, which provide automated performance measurement as well as tutoring. The training system offers true time-management savings for current instructors and training providers, who today must perform On-the-Job Training (OJT) duties before, during and after each event. Many units do not have the resources to support OJT and are forced to maintain an overlap of several days to minimally maintain unit readiness. One CAOC Commander affected by this paradigm has advocated supporting a beta version of this system to test its ability to offer training on demand and track the progress of its personnel and unit readiness. If successful, aircrew simulation devices can be connected through either Distributed Interactive Simulation or High Level Architecture methods to provide a DMT-C2 air operations training environment in Europe. This paper presents an approach to establishing a training, testing and decision aid capability and means to assess

  12. Attitudes towards genetic testing: analysis of contradictions

    DEFF Research Database (Denmark)

    Jallinoja, P; Hakonen, A; Aro, A R

    1998-01-01

    A survey study was conducted among 1169 people to evaluate attitudes towards genetic testing in Finland. Here we present an analysis of the contradictions detected in people's attitudes towards genetic testing. This analysis focuses on the approval of genetic testing as an individual choice and on the confidence in control of the process of genetic testing and its implications. Our analysis indicated that some of the respondents have contradictory attitudes towards genetic testing. It is proposed that contradictory attitudes towards genetic testing should be given greater significance both in scientific studies on attitudes towards genetic testing and in the health care context, e.g. in genetic counselling.

  13. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  14. Development of pair distribution function analysis

    International Nuclear Information System (INIS)

    Vondreele, R.; Billinge, S.; Kwei, G.; Lawson, A.

    1996-01-01

    This is the final report of a 3-year LDRD project at LANL. It has become more and more evident that structural coherence in the CuO2 planes of high-Tc superconducting materials over some intermediate length scale (nm range) is important to superconductivity. In recent years, the pair distribution function (PDF) analysis of powder diffraction data has been developed for extracting structural information on these length scales. This project sought to expand and develop this technique, use it to analyze neutron powder diffraction data, and apply it to problems. Our particular interest is in high-Tc superconductors, although we planned to extend the study to the closely related perovskite ferroelectric materials and to other materials where the local structure affects the properties and where detailed knowledge of the local and intermediate-range structure is important. In addition, we planned to carry out single crystal experiments to look for diffuse scattering. This information augments the information from the PDF.

  15. Corroded scale analysis from water distribution pipes

    Directory of Open Access Journals (Sweden)

    Rajaković-Ognjanović Vladana N.

    2011-01-01

    The subject of this study was the steel pipes that are part of Belgrade's drinking water supply network. In order to investigate the mutual effects of corrosion and water quality, the corrosion scales on the pipes were analyzed. The aim was to improve control of corrosion processes and to prevent the impact of corrosion on water quality degradation. The instrumental methods used for characterization of the corrosion scales were: scanning electron microscopy (SEM) for the investigation of the sample surfaces and the microstructural analysis of the corroded scales, X-ray diffraction (XRD) for the analysis of the solid phases present inside the scales, and the BET adsorption isotherm for surface area determination. Depending on the composition of the water next to the pipe surface, corrosion of iron results in the formation of different compounds and solid phases. The composition and structure of the iron scales in drinking water distribution pipes depend on the type of the metal and the composition of the aqueous phase. Their formation is probably governed by several factors, including water quality parameters such as pH, alkalinity, buffer intensity, natural organic matter (NOM) concentration, and dissolved oxygen (DO) concentration. Factors such as water flow patterns, seasonal fluctuations in temperature, and microbiological activity, as well as water treatment practices such as the application of corrosion inhibitors, can also influence corrosion scale formation and growth. Therefore, the corrosion scales found in iron and steel pipes are expected to have unique features for each site. Compounds found in iron corrosion scales often include goethite, lepidocrocite, magnetite, hematite, ferrous oxide, siderite, ferrous hydroxide, ferric hydroxide, ferrihydrite, calcium carbonate and green rusts.
Iron scales have characteristic features that include: corroded floor, porous core that contains

  16. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Power distribution systems are basic parts of power systems, and their reliability is at present a key issue for power engineering development, requiring special attention. Operation of distribution systems is accompanied by a number of factors that produce random events and a large number of unplanned interruptions. Research has shown that the predominant factors with a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behavior and presents reliability estimates for predominantly rural electrical distribution systems.

  17. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    The study of image content is an important topic in which reasonable and accurate analysis of images is required. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life; these image-laden media have made image analysis a highlighted research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean for both color and grey images, and the final step compares the results obtained in the different cases of the test phase. The statistical parameters are used to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean value with the implemented system. The major issue addressed in this work is brightness distribution, studied via statistical measures under different types of lighting.
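
    The per-channel mean and standard deviation measures described above can be sketched in a few lines of stdlib Python (the function name and the toy 4-pixel "image" are illustrative, not from the paper):

```python
import statistics

def brightness_stats(pixels):
    """Per-channel mean and standard deviation of pixel intensities.

    `pixels` is a list of (R, G, B) tuples; a grey image can be passed
    as single-value tuples.  Names here are illustrative assumptions.
    """
    channels = list(zip(*pixels))  # transpose to one sequence per channel
    return [(statistics.mean(c), statistics.pstdev(c)) for c in channels]

# A tiny synthetic "image": constant red channel, varying green and blue.
img = [(100, 0, 255), (100, 50, 205), (100, 100, 155), (100, 150, 105)]
stats = brightness_stats(img)
print(stats[0])  # constant red channel: mean 100, stdev 0
```

    A real system would extract the pixel tuples from decoded image data; the statistics themselves are computed the same way.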

  18. Effect of distributive mass of spring on power flow in engineering test

    Science.gov (United States)

    Sheng, Meiping; Wang, Ting; Wang, Minqing; Wang, Xiao; Zhao, Xuan

    2018-06-01

    The mass of a spring is usually neglected in theoretical and simulation analyses, but it may be significant in practical engineering. This paper is concerned with the distributive mass of a steel spring used as an isolator to simulate the isolation performance of a water pipe in a heating system. A theoretical derivation of the effect of the spring's distributive mass on vibration is presented, and multiple eigenfrequencies are obtained, showing that distributive mass results in extra modes and complex impedance properties. Furthermore, numerical simulation visually shows several anti-resonances of the steel spring corresponding to the impedance and power flow curves. When anti-resonances emerge, the spring stores large amounts of energy, which may cause damage and unexpected consequences in practical engineering and needs to be avoided. Finally, experimental tests are conducted and the results are consistent with the simulation of the spring with distributive mass.

  19. Experimental test of nuclear magnetization distribution and nuclear structure models

    International Nuclear Information System (INIS)

    Beiersdorfer, P; Crespo-Lopez-Urrutia, J R; Utter, S B.

    1999-01-01

    Models exist that ascribe nuclear magnetic fields to the presence of a single nucleon whose spin is not neutralized by pairing with that of another nucleon; other models assume that the generation of the magnetic field is shared among some or all nucleons throughout the nucleus. All models predict the same magnetic field external to the nucleus, since this is an anchor provided by experiments. The models differ, however, in their predictions of the magnetic field arrangement within the nucleus, for which no data exist. The only way to distinguish which model gives the correct description of the nucleus would be to use a probe inserted into the nucleus. The goal of our project was to develop exactly such a probe and to use it to measure fundamental nuclear quantities that have eluded experimental scrutiny. The need for accurately knowing such quantities extends far beyond nuclear physics and has ramifications in parity violation experiments on atomic traps and in testing the standard model in elementary particle physics. Unlike scattering experiments that employ streams of free particles, our technique to probe the internal magnetic field distribution of the nucleus rests on using a single bound electron. Quantum mechanics shows that an electron in the innermost orbital surrounding the nucleus constantly dives into the nucleus and thus samples the fields that exist inside. This sampling of the nucleus usually results in only minute shifts in the electron's average orbital, which would be difficult to detect. By studying two particular energy states of the electron, we can, however, dramatically enhance the effects of the distribution of the magnetic fields in the nucleus.
In fact, about 2% of the energy difference between the two states, dubbed the hyperfine splitting, is determined by effects related to the distribution of magnetic fields in the nucleus. A precise measurement of this energy difference (better than 0.01%) would then allow us to place

  20. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.
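
    The dynamic-PDA intuition can be illustrated with a Monte Carlo sketch (all rates, efficiencies and names below are assumptions for illustration; the paper's actual PDA machinery is more involved): a molecule hops between two FRET states, and the apparent efficiency is the time average over one observation bin. When exchange is fast relative to the bin time, the histogram collapses to a single averaged peak.

```python
import random

def simulate_fret_bins(e1, e2, k12, k21, bin_time, n_mol, dt=1e-5, seed=1):
    """Time-average a two-state FRET trajectory over one bin per molecule.

    e1, e2 are the state efficiencies; k12, k21 are hopping rates (s^-1).
    All numerical values are illustrative, not taken from the paper.
    """
    random.seed(seed)
    out = []
    for _ in range(n_mol):
        # start in a state drawn from the equilibrium occupancies
        state = 1 if random.random() < k21 / (k12 + k21) else 2
        acc, t = 0.0, 0.0
        while t < bin_time:
            acc += (e1 if state == 1 else e2) * dt
            k = k12 if state == 1 else k21
            if random.random() < k * dt:   # hop with probability k*dt
                state = 3 - state
            t += dt
        out.append(acc / bin_time)
    return out

# Fast exchange (hopping much faster than 1/bin_time): one peak near the
# occupancy-weighted mean efficiency, here (0.2 + 0.8) / 2 = 0.5.
fast = simulate_fret_bins(0.2, 0.8, 5e4, 5e4, 1e-3, 200)
mean_fast = sum(fast) / len(fast)
```

    Slowing the rates well below 1/bin_time would instead leave two separate peaks at e1 and e2, which is the regime PDA-style shape fitting distinguishes.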

  1. Preliminary Calculations of Bypass Flow Distribution in a Multi-Block Air Test

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Tak, Nam Il

    2011-01-01

    The development of a methodology for bypass flow assessment in a prismatic VHTR (Very High Temperature Reactor) core has been conducted at KAERI. A preliminary estimation of the variation of the local bypass flow gap size between graphite blocks in the NHDD core was carried out. With the predicted gap sizes, their influence on the bypass flow distribution and the core hot spot was assessed. Due to the complexity of gap distributions, a system thermo-fluid analysis code is suggested as a tool for the core thermo-fluid analysis, whose model and correlations should be validated. In order to generate data for validating the bypass flow analysis model, an experimental facility for a multi-block air test was constructed at Seoul National University (SNU). This study focuses on the preliminary evaluation of the flow distribution in the test section, to understand how the flow is distributed and to help select experimental cases. A commercial CFD code, ANSYS CFX, is used for the analyses.

  2. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes; however, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis, and the entropy-based derivation provides a new way to perform frequency analysis of hydrometeorological extremes.
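
    A toy version of the fit-and-compare workflow, with a method-of-moments fit standing in for the entropy-based estimation and a plain two-parameter gamma standing in for the GG/GB2/Halphen family (all names and data are illustrative):

```python
import math
import random

def gamma_mom(data):
    """Method-of-moments fit of a two-parameter gamma distribution:
    shape = mean^2 / variance, scale = variance / mean."""
    n = len(data)
    m = sum(data) / n
    v = sum((x - m) ** 2 for x in data) / n
    return m * m / v, v / m

def gamma_aic(data, shape, scale):
    """AIC = 2k - 2 ln L for the fitted gamma (k = 2 parameters)."""
    ll = sum((shape - 1) * math.log(x) - x / scale
             - math.lgamma(shape) - shape * math.log(scale) for x in data)
    return 2 * 2 - 2 * ll

# Synthetic "extreme rainfall" sample drawn from gamma(shape=2, scale=3).
random.seed(0)
sample = [random.gammavariate(2.0, 3.0) for _ in range(2000)]
shape, scale = gamma_mom(sample)
aic = gamma_aic(sample, shape, scale)
```

    Fitting several candidate distributions this way and ranking them by AIC is the selection step the abstract describes; with more candidates, the lowest AIC wins.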

  3. silicon bipolar distributed oscillator design and analysis

    African Journals Online (AJOL)

    digital and analogue market, wired or wireless is making it necessary to operate ... is generally high; this additional power is supplied by the external dc source. ... distributed oscillator consists of a pair of transmission lines with characteristic ...

  4. Distributed energy store railguns experiment and analysis

    International Nuclear Information System (INIS)

    Holland, L.D.

    1984-01-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. The distributed energy store railgun used multiple current sources connected to the rails of a railgun at points distributed along the bore. These current sources (energy stores) are turned on in sequence as the projectile moves down the bore so that current is fed to the railgun from behind the armature. In this system the length of the rails that carry the full armature current is less than the total length of the railgun. If a sufficient number of energy stores is used, this removes the limitation on the length of a railgun. An additional feature of distributed energy store type railguns is that they can be designed to maintain a constant pressure on the projectile being accelerated. A distributed energy store railgun was constructed and successfully operated. In addition to this first demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed

  5. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  6. Population distribution around the Nevada Test Site, 1984

    International Nuclear Information System (INIS)

    Smith, D.D.; Coogan, J.S.

    1984-08-01

    The Environmental Monitoring Systems Laboratory (EMSL-LV) conducts an offsite radiological safety program outside the boundaries of the Nevada Test Site. As part of this program, the EMSL-LV maintains a comprehensive and current listing of all rural offsite residents and dairy animals within the controllable sectors (areas where the EMSL-LV could implement protective or remedial actions that would assure public safety). This report was produced to give a brief overview of the population distribution and information on the activities within the controllable sectors. Naturally, the number of people in a sector changes with the season of the year and with such diverse factors as the price of minerals, which governs the opening and closing of mining operations. Currently, the controllable sectors out to 200 kilometers from the Control Point on the NTS are considered to be the entire northeast, north-northeast, north, north-northwest, and west-northwest sectors and portions of the east and east-northeast sectors. The west-southwest and south-southwest sectors are considered controllable out to 40 to 80 kilometers. No major population centers or dairy farms lie within these sectors. 7 references, 5 figures, 2 tables

  7. Distribution of base rock depth estimated from Rayleigh wave measurement by forced vibration tests

    International Nuclear Information System (INIS)

    Hiroshi Hibino; Toshiro Maeda; Chiaki Yoshimura; Yasuo Uchiyama

    2005-01-01

    This paper shows an application of Rayleigh wave methods to a real site, performed to determine the spatial distribution of the base rock depth from the ground surface. At a certain site in the Sagami Plain in Japan, the base rock depth from the surface is assumed to range up to 10 m according to boring investigations. An accurate picture of the base rock depth distribution was needed for pile design and construction. In order to measure the Rayleigh wave phase velocity, forced vibration tests were conducted with a 500 N vertical shaker and linear arrays of three vertical sensors situated at several points in two zones around the edges of the site. Then, inversion analysis of the soil profile was carried out by a genetic algorithm, matching the measured Rayleigh wave phase velocity with its computed counterpart. The distribution of the base rock depth obtained from the inversion was consistent with the roughly estimated inclination of the base rock from the boring tests: the base rock is shallow at the edge of the site and gradually deepens towards its center. By the inversion analysis, the depth of the base rock was determined as 5 m to 6 m at the edge of the site and 10 m at the center. The base rock depth distribution determined by this method showed good agreement at most of the points where boring investigations were performed. As a result, it was confirmed that forced vibration tests on the ground with Rayleigh wave methods can be a useful practical technique for estimating surface soil profiles to a depth of up to 10 m. (authors)
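
    A minimal sketch of the inversion idea, with a toy one-parameter dispersion model standing in for a real Rayleigh-wave forward solver (the model, parameter bounds and GA settings are all assumptions for illustration):

```python
import random

def forward(depth, freqs, v_soil=200.0, v_rock=700.0):
    """Toy dispersion model (illustrative only): phase velocity falls from
    rock-like at low frequency to soil-like at high frequency, with the
    transition controlled by the soil layer depth in metres."""
    return [v_soil + (v_rock - v_soil) / (1.0 + depth * f / 40.0) for f in freqs]

def ga_invert(freqs, observed, gens=60, pop_n=30, seed=3):
    """Minimal genetic algorithm: evolve candidate depths to match the
    observed phase-velocity curve (misfit = sum of squared residuals)."""
    random.seed(seed)

    def misfit(d):
        return sum((m - o) ** 2 for m, o in zip(forward(d, freqs), observed))

    pop = [random.uniform(1.0, 20.0) for _ in range(pop_n)]
    for _ in range(gens):
        pop.sort(key=misfit)                      # elitist truncation selection
        parents = pop[: pop_n // 2]
        children = [max(0.5, random.choice(parents) + random.gauss(0, 0.5))
                    for _ in range(pop_n - len(parents))]
        pop = parents + children
    return min(pop, key=misfit)

freqs = [5, 10, 20, 40]
observed = forward(8.0, freqs)       # synthetic "measurement" for an 8 m layer
depth = ga_invert(freqs, observed)   # GA should recover a depth near 8 m
```

    The real study inverts a multi-layer soil profile against measured phase velocities; the selection-mutation loop is the same in spirit.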

  8. Minigenerator - Analysis, Design and Tests

    Directory of Open Access Journals (Sweden)

    Pavel Fiala

    2006-01-01

    The paper presents results of the analysis of a vibrational generator. The paper deals with the design of a vibrational generator that is used as a power supply for independent electric circuits. The vibrational generator can be used in various areas, e.g. traffic, electronics, special-purpose machines, and robotics. The proposed design employs magnetic damping of the core movement. It was numerically evaluated, and it was shown that it is possible to obtain a significantly larger output voltage and output power than in the experimental settings used previously [1].

  9. Analysis of mixed mode microwave distribution manifolds

    International Nuclear Information System (INIS)

    White, T.L.

    1982-09-01

    The 28-GHz microwave distribution manifold used in the ELMO Bumpy Torus-Scale (EBT-S) experiments consists of a toroidal metallic cavity, whose dimensions are much greater than a wavelength, fed by a source of microwave power. Equalization of the mixed mode power distribution to the 24 cavities of EBT-S is accomplished by empirically adjusting the coupling irises which are equally spaced around the manifold. The performance of the manifold to date has been very good, yet no analytical models exist for optimizing manifold transmission efficiency or for scaling this technology to the EBT-P manifold design. The present report develops a general model for mixed mode microwave distribution manifolds based on isotropic plane wave sources of varying amplitudes that are distributed toroidally around the manifold. The calculated manifold transmission efficiency for the most recent EBT-S coupling iris modification is 90%, which agrees with the average measured transmission efficiency. The model also predicts the coupling iris areas required to balance the distribution of microwave power while maximizing transmission efficiency, and losses in the waveguide feeds connecting the irises to the cavities of EBT are calculated using an approach similar to the calculation of manifold losses. The model will be used to evaluate EBT-P manifold designs.

  10. Flight test trajectory control analysis

    Science.gov (United States)

    Walker, R.; Gupta, N.

    1983-01-01

    Recent extensions to optimal control theory applied to meaningful linear models with sufficiently flexible software tools provide powerful techniques for designing flight test trajectory controllers (FTTCs). This report describes the principal steps for systematic development of flight trajectory controllers, which can be summarized as planning, modeling, designing, and validating a trajectory controller. The techniques have been kept as general as possible and should apply to a wide range of problems where quantities must be computed and displayed to a pilot to improve pilot effectiveness and to reduce workload and fatigue. To illustrate the approach, a detailed trajectory guidance law is developed and demonstrated for the F-15 aircraft flying the zoom-and-pushover maneuver.

  11. Distributed energy store railguns experiment and analysis

    Science.gov (United States)

    Holland, L. D.

    1984-02-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. A distributed energy storage railgun was constructed and successfully operated. In addition to this demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed. A simple simulation of the railgun system based on this model, but ignoring frictional drag, was compared with the experimental results. During the process of comparing results from the simulation and the experiment, the effect of significant frictional drag of the projectile on the sidewalls of the bore was observed.

  12. Field distribution analysis in deflecting structures

    Energy Technology Data Exchange (ETDEWEB)

    Paramonov, V.V. [Joint Inst. for Nuclear Research, Moscow (Russian Federation)

    2013-02-15

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics, and to increase the luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE1 and HM1. The criteria for selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. The results of the study are confirmed by comparison with results of numerical simulations.

  13. Field distribution analysis in deflecting structures

    International Nuclear Information System (INIS)

    Paramonov, V.V.

    2013-02-01

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics, and to increase the luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE1 and HM1. The criteria for selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. The results of the study are confirmed by comparison with results of numerical simulations.

  14. Statistical analysis of partial reduced width distributions

    International Nuclear Information System (INIS)

    Tran Quoc Thuong.

    1973-01-01

    The aim of this study was to develop rigorous methods for analysing experimental event distributions according to a chi-squared (χ²) law and to check whether the number of degrees of freedom ν is compatible with the value 1 for the reduced neutron width distribution. Two statistical methods were used (the maximum-likelihood method and the method of moments); it was shown, in a few particular cases, that ν is compatible with 1. The difference between ν and 1, if it exists, should not exceed 3%. These results confirm the validity of the compound nucleus model. [fr]
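
    The method-of-moments side of such an analysis can be sketched simply: for a χ²_ν distribution normalised to unit mean, the variance equals 2/ν, so ν can be estimated from the first two sample moments (a simplified stand-in for the paper's estimators; the data below are synthetic):

```python
import random

def estimate_dof(widths):
    """Moment estimate of the chi-squared degrees of freedom nu.

    For reduced widths following a chi^2_nu law normalised to unit mean,
    var = 2 / nu, hence nu ~ 2 * mean^2 / variance.
    """
    n = len(widths)
    m = sum(widths) / n
    v = sum((w - m) ** 2 for w in widths) / n
    return 2.0 * m * m / v

# Porter-Thomas case: widths proportional to the square of a unit Gaussian,
# i.e. chi-squared with nu = 1, so the estimate should come out near 1.
random.seed(7)
widths = [random.gauss(0, 1) ** 2 for _ in range(5000)]
nu = estimate_dof(widths)
```

    A maximum-likelihood estimate, as also used in the study, would maximise the χ²_ν log-likelihood in ν; the moment estimate above is the quick cross-check.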

  15. Item Analysis in Introductory Economics Testing.

    Science.gov (United States)

    Tinari, Frank D.

    1979-01-01

    Computerized analysis of multiple choice test items is explained. Examples of item analysis applications in the introductory economics course are discussed with respect to three objectives: to evaluate learning; to improve test items; and to help improve classroom instruction. Problems, costs and benefits of the procedures are identified. (JMD)
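
    Classical item analysis of the kind described computes, per question, a difficulty index (fraction answering correctly) and a discrimination index (difference in success rate between high- and low-scoring groups). A generic sketch follows; the 27% grouping rule is a common convention, and all names and data are illustrative, not taken from the article:

```python
def item_analysis(responses):
    """Difficulty and discrimination indices for multiple-choice items.

    `responses` is a list of per-student lists of 0/1 item scores.
    Returns, per item, (p, d): p = fraction correct over all students,
    d = p(upper 27% by total score) - p(lower 27%).
    """
    n = len(responses)
    k = max(1, round(0.27 * n))
    ranked = sorted(responses, key=sum, reverse=True)
    upper, lower = ranked[:k], ranked[-k:]
    items = []
    for i in range(len(responses[0])):
        p = sum(s[i] for s in responses) / n
        d = (sum(s[i] for s in upper) - sum(s[i] for s in lower)) / k
        items.append((p, d))
    return items

# Item 0 separates strong from weak students; item 1 is answered by everyone
# and therefore discriminates nothing.
scores = [[1, 1, 1], [1, 1, 0], [1, 1, 1], [0, 1, 0], [0, 1, 0], [0, 1, 1]]
report = item_analysis(scores)
```

    Items with very high or very low p, or with low d, are the ones flagged for revision, which is how such output feeds back into improving test items and instruction.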

  16. Distributed analysis using GANGA on the EGEE/LCG infrastructure

    International Nuclear Information System (INIS)

    Elmsheuser, J; Brochu, F; Harrison, K; Egede, U; Gaidioz, B; Liko, D; Maier, A; Moscicki, J; Muraru, A; Lee, H-C; Romanovsky, V; Soroko, A; Tan, C L

    2008-01-01

    The distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to facilitate access to the resources is very high: in every experiment, up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without much expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences with distributed data analysis using GANGA within the ATLAS experiment and the EGEE/LCG infrastructure. The integration of the ATLAS data management system DQ2 into GANGA is a key functionality: in combination with the job splitting mechanism, large numbers of jobs can be sent to the locations of the data, following the ATLAS computing model. GANGA supports user analysis tasks with reconstructed data and small-scale production of Monte Carlo data.

  17. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples to gather information about the dynamic behaviour of the malware, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily detected and thus flagged as malicious. I invented new approaches to detect these sandboxes. I developed a tool which can collect a lot of interesting information from these sandboxes, in order to create statistics on how the current technologies work. After analysing these results I will demonstrate tricks to detect sandboxes. These tricks can't be easily flagged as malicious. Some sandboxes do not interact with the Internet in order to block data extraction, but with some DNS-fu the information can be extracted from these appliances as well.

  18. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
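
    The wrapped-normal construction is easiest to see in its 1D circle analogue: sample from a normal on the real line and wrap modulo 2π (a sketch of the idea only; the SO(3) case maps an R^3 normal onto rotations, which is more involved):

```python
import math
import random

def wrapped_normal_samples(mu, sigma, n, seed=0):
    """Monte Carlo sampling of a wrapped normal on the circle: draw from
    N(mu, sigma) in R and wrap modulo 2*pi.  1D analogue of the SO(3)
    construction; all parameter values here are illustrative."""
    random.seed(seed)
    return [math.fmod(random.gauss(mu, sigma), 2 * math.pi) for _ in range(n)]

def circular_mean(angles):
    """Direction of the resultant vector: the natural mean on the circle,
    insensitive to how individual angles were wrapped."""
    s = sum(math.sin(a) for a in angles)
    c = sum(math.cos(a) for a in angles)
    return math.atan2(s, c)

angles = wrapped_normal_samples(1.0, 0.3, 4000)
m = circular_mean(angles)   # should recover a direction near mu = 1.0
```

    On SO(3) the same Monte Carlo pattern applies, with rotation composition in place of addition and a group-appropriate mean in place of the circular mean.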

  19. Distributed crack analysis of ceramic inlays

    NARCIS (Netherlands)

    Peters, M.C.R.B.; Vree, de J.H.P.; Brekelmans, W.A.M.

    1993-01-01

    In all-ceramic restorations, crack formation and propagation phenomena are of major concern, since they may result in intra-oral fracture. The objective of this study was calculation of damage in porcelain MOD inlays by utilization of a finite-element (FE) implementation of the distributed crack

  20. Objective Bayesian Analysis of Skew-t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA; GENTON, MARC G.; LISEO, BRUNERO

    2012-01-01

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric

  1. Analysis of refrigerant mal-distribution

    DEFF Research Database (Denmark)

    Kærn, Martin Ryhl; Elmegaard, Brian

    2009-01-01

    to be two straight tubes. The refrigerant maldistribution is then induced in the evaporator by varying the vapor quality at the inlet to each tube and the air flow across each tube. Finally it is shown that maldistribution can be compensated by an intelligent distributor, which ensures equal superheat...

  2. Accident analysis of HANARO fuel test loop

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Chi, D. Y

    1998-03-01

    A steady-state fuel test loop will be installed in HANARO to support the development and improvement of advanced fuel and materials through irradiation tests. The HANARO fuel test loop was designed to match CANDU and PWR fuel operating conditions. The accident analysis was performed with the RELAP5/MOD3 code based on the FTL system designs and determined the detailed engineering specification of the in-pile test section and out-pile systems. The accident analysis results for the FTL system can be used by fuel and materials designers to plan their irradiation testing programs. (author). 23 refs., 20 tabs., 178 figs.

  3. Component evaluation testing and analysis algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    Hart, Darren M.; Merchant, Bion John

    2011-10-01

    The Ground-Based Monitoring R&E Component Evaluation project performs testing on the hardware components that make up Seismic and Infrasound monitoring systems. The majority of the testing is focused on the Digital Waveform Recorder (DWR), Seismic Sensor, and Infrasound Sensor. In order to guarantee consistency, traceability, and visibility into the results of the testing process, it is necessary to document the test and analysis procedures that are in place. Other reports document the testing procedures that are in place (Kromer, 2007). This document serves to provide a comprehensive overview of the analysis and the algorithms that are applied to the Component Evaluation testing. A brief summary of each test is included to provide the context for the analysis that is to be performed.

  4. Semen analysis and sperm function tests: How much to test?

    Directory of Open Access Journals (Sweden)

    S S Vasan

    2011-01-01

    Full Text Available Semen analysis as an integral part of infertility investigations is taken as a surrogate measure for male fecundity in clinical andrology, male fertility, and pregnancy risk assessments. Clearly, laboratory seminology is still very much in its infancy. In as much as the creation of a conventional semen profile will always represent the foundations of male fertility evaluation, the 5th edition of the World Health Organization (WHO manual is a definitive statement on how such assessments should be carried out and how the quality should be controlled. A major advance in this new edition of the WHO manual, resolving the most salient critique of previous editions, is the development of the first well-defined reference ranges for semen analysis based on the analysis of over 1900 recent fathers. The methodology used in the assessment of the usual variables in semen analysis is described, as are many of the less common, but very valuable, sperm function tests. Sperm function testing is used to determine if the sperm have the biologic capacity to perform the tasks necessary to reach and fertilize ova and ultimately result in live births. A variety of tests are available to evaluate different aspects of these functions. To accurately use these functional assays, the clinician must understand what the tests measure, what the indications are for the assays, and how to interpret the results to direct further testing or patient management.

  5. Modeling and analysis of solar distributed generation

    Science.gov (United States)

    Ortiz Rivera, Eduardo Ivan

    Recent changes in the global economy are having a big impact on our daily life. The price of oil is increasing and reserves are shrinking every day. Also, dramatic demographic changes are affecting the viability of the electric infrastructure and ultimately the economic future of the industry. These are some of the reasons that many countries are looking to alternative energy sources to produce electric energy. The most common form of green energy in our daily life is solar energy. Converting solar energy into electrical energy requires solar panels, dc-dc converters, power control, sensors, and inverters. In this work, a photovoltaic module (PVM) model using the electrical characteristics provided by the manufacturer's data sheet is presented for power system applications. Experimental results from testing are shown, verifying the proposed PVM model. Also in this work, three maximum power point tracker (MPPT) algorithms are presented to obtain the maximum power from a PVM. The first MPPT algorithm is based on Rolle's and Lagrange's theorems and can provide at least an approximate answer for a family of transcendental functions that cannot be solved using differential calculus. The second MPPT algorithm is based on approximating the proposed PVM model using fractional polynomials, where the shape, boundary conditions and performance of the proposed PVM model are satisfied. The third MPPT algorithm is based on determining the optimal duty cycle for a dc-dc converter given prior knowledge of the load or load-matching conditions. Also, four algorithms to calculate the effective irradiance level and temperature over a photovoltaic module are presented in this work. The main reasons to develop these algorithms are monitoring climate conditions, eliminating temperature and solar irradiance sensors, reducing cost for a photovoltaic inverter system, and developing new algorithms to be integrated with maximum
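    As a numerical illustration of the maximum-power-point idea discussed above (using a generic exponential I-V shape with invented parameters, not the paper's data-sheet-based PVM model), the MPP can be located by scanning the power curve:

    ```python
    import numpy as np

    def pv_current(v, isc=5.0, voc=21.0, b=0.08):
        """Illustrative exponential PV module I-V curve (hypothetical values):
        I(0) = isc (short circuit) and I(voc) = 0 (open circuit)."""
        return isc * (1.0 - np.exp(v / (b * voc) - 1.0 / b)) / (1.0 - np.exp(-1.0 / b))

    v = np.linspace(0.0, 21.0, 2001)     # sweep the operating voltage
    p = v * pv_current(v)                # output power P = V * I
    v_mpp = v[np.argmax(p)]              # voltage at the maximum power point
    ```

    In practice the operating point is driven toward v_mpp by adjusting the dc-dc converter duty cycle, which is the quantity the third algorithm above optimizes directly.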

  6. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) will eventually reshape the future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent the power system...... operation will be altered. In this paper, quantitative results in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles are presented. The analysis is based on the conditions for both a radial and a meshed distribution...... network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden....

  7. Nonlinear Progressive Collapse Analysis Including Distributed Plasticity

    Directory of Open Access Journals (Sweden)

    Mohamed Osama Ahmed

    2016-01-01

    Full Text Available This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel-framed regular building structures. Emphasis in this paper is on the deformation response under the notionally removed column in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria – Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case-study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as in the interior of the three-dimensional structural system. It is concluded that the use of moment-resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response and that it is conservative to ignore the effects of distributed plasticity in determining peak displacement response under the notionally removed column.

  8. Harmonic Analysis of Electric Vehicle Loadings on Distribution System

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Yijun A [University of Southern California, Department of Electrical Engineering; Xu, Yunshan [University of Southern California, Department of Electrical Engineering; Chen, Zimin [University of Southern California, Department of Electrical Engineering; Peng, Fei [University of Southern California, Department of Electrical Engineering; Beshir, Mohammed [University of Southern California, Department of Electrical Engineering

    2014-12-01

    With the increasing number of Electric Vehicles (EVs), the power system is facing huge challenges from high penetration rates of EV charging stations. Therefore, a technical study of the impact of EV charging on the distribution system is required. This paper uses PSCAD software and aims to analyze the Total Harmonic Distortion (THD) introduced by electric vehicle charging stations in power systems. The paper starts by choosing the IEEE 34-node test feeder as the distribution system, building an electric vehicle level-two charging battery model, and setting up four different testing scenarios: overhead transmission line and underground cable, industrial area, transformer, and photovoltaic (PV) system. Statistical methods are then used to analyze the characteristics of THD during the plug-in transient, plug-out transient and steady-state charging conditions associated with these four scenarios. Finally, the factors influencing THD in the different scenarios are identified. The results lead to constructive suggestions for both electric vehicle charging station construction and customers' charging habits.
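    As a rough sketch of the THD quantity analyzed above (not the PSCAD model itself), THD can be estimated from an FFT as the RMS of the harmonic amplitudes divided by the fundamental amplitude; the waveform and harmonic levels below are invented for the example:

    ```python
    import numpy as np

    def thd(x, fs, f0, n_harm=10):
        """Estimate THD from an FFT: RMS of harmonic amplitudes (2*f0 .. n_harm*f0)
        divided by the amplitude of the fundamental at f0."""
        n = len(x)
        spec = np.abs(np.fft.rfft(x)) / n
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        amp = lambda f: spec[np.argmin(np.abs(freqs - f))]  # nearest-bin amplitude
        harm = np.sqrt(sum(amp(k * f0) ** 2 for k in range(2, n_harm + 1)))
        return harm / amp(f0)

    fs, f0 = 10_000, 50                       # 50 Hz fundamental, 10 kHz sampling
    t = np.arange(0, 1, 1 / fs)               # one full second -> 1 Hz resolution
    current = (np.sin(2 * np.pi * f0 * t)
               + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)   # 5% third harmonic
               + 0.03 * np.sin(2 * np.pi * 5 * f0 * t))  # 3% fifth harmonic
    d = thd(current, fs, f0)                  # close to sqrt(0.05^2 + 0.03^2)
    ```

    Sampling an integer number of fundamental cycles keeps the harmonic energy in single FFT bins, so no windowing is needed for this toy case.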

  9. Shell model test of the Porter-Thomas distribution

    International Nuclear Information System (INIS)

    Grimes, S.M.; Bloom, S.D.

    1981-01-01

    Eigenvectors have been calculated for the A=18, 19, 20, 21, and 26 nuclei in an sd shell basis. The decomposition of these states into their shell model components shows, in agreement with other recent work, that this distribution is not a single Gaussian. We find that the largest amplitudes are distributed approximately in a Gaussian fashion. Thus, many experimental measurements should be consistent with the Porter-Thomas predictions. We argue that the non-Gaussian form of the complete distribution can be simply related to the structure of the Hamiltonian
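    The Porter-Thomas prediction invoked above, that normalized squared Gaussian amplitudes follow a chi-square distribution with one degree of freedom, can be checked with a quick numerical sketch (illustrative synthetic amplitudes, not shell-model eigenvectors):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    amps = rng.normal(size=20_000)            # Gaussian amplitudes (PT assumption)
    widths = amps**2 / np.mean(amps**2)       # normalized widths y = x^2 / <x^2>

    # Porter-Thomas: y should follow a chi-square distribution with one dof
    ks = stats.kstest(widths, stats.chi2(df=1).cdf)
    ```

    A small Kolmogorov-Smirnov statistic here confirms the chi-square(1) form; amplitudes drawn from a non-Gaussian mixture would fail the same test, mirroring the distinction the abstract draws.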

  10. Comparisons of uniform and discrete source distributions for use in bioassay laboratory performance testing

    International Nuclear Information System (INIS)

    Scherpelz, R.I.; MacLellan, J.A.

    1987-09-01

    The Pacific Northwest Laboratory (PNL) is sending a torso phantom with radioactive material uniformly distributed in the lungs to in vivo bioassay laboratories for analysis. Although the radionuclides ultimately chosen for the studies had relatively long half-lives, future accreditation testing will require repeated tests with short half-life test nuclides. Computer modeling was used to simulate the major components of the phantom. Radiation transport calculations were then performed using the computer models to calculate dose rates either 15 cm from the chest or at its surface. For 144Ce and 60Co, three configurations were used for the lung comparison tests. Calculations show that, for most detector positions, a single plug containing 40K located in the back of the heart provides a good approximation to a uniform distribution of 40K. The approximation would lead, however, to a positive bias for the detector reading if the detector were located at the chest surface near the center. Loading the 40K in a uniform layer inside the chest wall is not a good approximation of the uniform distribution in the lungs, because most of the radionuclides would be situated close to the detector location and the only shielding would be the thickness of the chest wall. The calculated dose rates for 60Co and 144Ce were similar at all calculated reference points. 3 refs., 5 figs., 10 tabs.

  11. Performance and life time test on a 5 kW SOFC system for distributed cogeneration

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, Rosa; De Biase, Sabrina; Ginocchio, Stefano [Edison S.p.A, Via Giorgio La Pira, 2, 10028 Trofarello (Italy); Bedogni, Stefano; Montelatici, Lorenzo [Edison S.p.A, Foro Bonaparte 31, 20121 Milano (Italy)

    2008-06-15

    The Edison R&D Centre is committed to testing a wide range of commercial and prototype fuel cell systems. The activities aim to evaluate the available state of the art of these technologies and their maturity for the relevant market. The laboratory is equipped with ad hoc test benches designed to study single cells, stacks and systems. The characterization of commercial and new-generation PEMFC, including high-temperature operation (160 °C), together with the analysis of the behaviour of SOFC, represents the core activity of the laboratory. In January 2007 a new 5 kW SOFC system supplied by Acumentrics was installed. The claimed electrical power output is 5 kW and the thermal power is 3 kW. The aim of the test is a technical and economical assessment for future applications of small SOFC plants in distributed cogeneration. Performance and lifetime tests of the system are shown. (author)

  12. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Van der Ster, Daniel

    2012-01-01

    The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data, the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users submitted jobs in 2011, and a total of more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper reviews the state of the DA tools and services, summarizes the past year of distributed analysis activity, and presents the directions for future improvements to the system.

  13. A multivariate rank test for comparing mass size distributions

    KAUST Repository

    Lombard, F.; Potgieter, C. J.

    2012-01-01

    Particle size analyses of a raw material are commonplace in the mineral processing industry. Knowledge of particle size distributions is crucial in planning milling operations to enable an optimum degree of liberation of valuable mineral phases

  14. Experimental tests of charge symmetry violation in parton distributions

    International Nuclear Information System (INIS)

    Londergan, J.T.; Murdock, D.P.; Thomas, A.W.

    2005-01-01

    Recently, a global phenomenological fit to high energy data has included charge symmetry breaking terms, leading to limits on the allowed magnitude of such effects. We discuss two possible experiments that could search for isospin violation in valence parton distributions. We show that, given the magnitude of charge symmetry violation consistent with existing global data, such experiments might expect to see effects at a level of several percent. Alternatively, such experiments could significantly decrease the upper limits on isospin violation in parton distributions

  15. Final comparison report on ISP-35: Nupec hydrogen mixing and distribution test (Test M-7-1)

    International Nuclear Information System (INIS)

    1994-12-01

    This final comparison report summarizes the results of the OECD/CSNI-sponsored ISP-35 exercise, which was based on NUPEC's Hydrogen Mixing and Distribution Test M-7-1. 12 organizations from 10 different countries took part in the exercise. For the ISP-35 test, a steam/light gas (helium) mixture was released into the lower region of a simplified model of a PWR containment. At the same time, the dome cooling spray was also activated. The transient time histories for gas temperature and concentrations were recorded for each of the 25 compartments of the model containment. The wall temperatures as well as the dome pressure were also recorded. The ISP-35 participants simulated the test conditions and attempted to predict the time histories using their accident analysis codes. Results of these analyses are presented, and comparisons are made between the experimental data and the calculated data. In general, predictions for pressure, helium concentration and gas distribution patterns were achieved with acceptable accuracy.

  16. Post-test analysis of PANDA test P4

    International Nuclear Information System (INIS)

    Hart, J.; Woudstra, A.; Koning, H.

    1999-01-01

    The results of a post-test analysis of the integral system test P4, which was executed in the PANDA facility at PSI in Switzerland within the framework of Work Package 2 of the TEPSS project, are presented. The post-test analysis comprises an evaluation of the PANDA test P4 and a comparison of the test results with the results of simulations using the RELAP5/MOD3.2, TRAC-BF1, and MELCOR 1.8.4 codes. The PANDA test P4 has provided adequate data about how trapped air released from the drywell later in the transient affects PCCS performance. The well-defined measurements can serve as an important database for the assessment of thermal-hydraulic system analysis codes, especially for conditions that could be met in passively operated advanced reactors, i.e. low pressure and small driving forces. Based on the analysis of the test data, the test acceptance criteria have been met. The test P4 has been successfully completed and the instrument readings were within the permitted ranges. The PCCs showed a favorable and robust performance and a wide margin for decay heat removal from the containment. The PANDA P4 test demonstrated that trapped air, released from the drywell later in the transient, only temporarily and only slightly affected the performance of the passive containment cooling system. The analysis of the results of the RELAP5 code showed that the overall behaviour of the test has been calculated quite well with regard to pressure, mass flow rates, and pool boil-down. This holds for both the pre-test and the post-test simulations. However, due to the one-dimensional, stacked-volume modeling of the PANDA DW, WW, and GDCS vessels, 3D effects such as in-vessel mixing and recirculation could not be calculated. The post-test MELCOR simulation showed an overall behaviour that is comparable to RELAP5. However, MELCOR calculated almost no air trapping in the PCC tubes that could hinder the steam condensation rate. This resulted in lower calculated

  17. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
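    The pair-correlation idea described above, evaluating the probability of finding two atoms within a given distance, starts from the set of interatomic pair distances of a finite cluster. A toy sketch for a small fcc gold cluster (the lattice parameter is the standard gold value; the cluster is illustrative, not one of the paper's decahedral models):

    ```python
    import numpy as np

    def pair_distances(coords):
        """All interatomic pair distances of a finite cluster; their histogram is
        the starting point of a nanoparticle PDF / Debye-type analysis."""
        diff = coords[:, None, :] - coords[None, :, :]
        r = np.sqrt((diff ** 2).sum(axis=-1))
        iu = np.triu_indices(len(coords), k=1)   # each pair counted once
        return r[iu]

    a = 4.08                                  # gold fcc lattice parameter (angstrom)
    basis = np.array([[0, 0, 0], [0, .5, .5], [.5, 0, .5], [.5, .5, 0]])
    cells = np.array([[i, j, k] for i in range(2) for j in range(2) for k in range(2)])
    coords = a * (cells[:, None, :] + basis[None, :, :]).reshape(-1, 3)  # 32 atoms
    r = pair_distances(coords)
    # the shortest distance is the fcc nearest-neighbour spacing a / sqrt(2)
    ```

    Binning these distances (and weighting by scattering factors) yields the model PDF that can be compared against total scattering data, as done for the decahedral models in the paper.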

  18. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  19. Analysis of Peach Bottom turbine trip tests

    International Nuclear Information System (INIS)

    Cheng, H.S.; Lu, M.S.; Hsu, C.J.; Shier, W.G.; Diamond, D.J.; Levine, M.M.; Odar, F.

    1979-01-01

    Current interest in the analysis of turbine trip transients has been generated by the recent tests performed at the Peach Bottom (Unit 2) reactor. Three tests, simulating turbine trip transients, were performed at different initial power and coolant flow conditions. The data from these tests provide considerable information to aid qualification of computer codes that are currently used in BWR design analysis. The results of an analysis of a turbine trip transient using the RELAP-3B and BNL-TWIGL computer codes are presented. Specific results are provided comparing the calculated reactor power and system pressures with the test data. Excellent agreement for all three test transients is evident from the comparisons.

  20. An Analysis of Rocket Propulsion Testing Costs

    Science.gov (United States)

    Ramirez, Carmen; Rahman, Shamim

    2010-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is commonly characterized as one of two types: production testing for certification and acceptance of engine hardware, and developmental testing for prototype evaluation or research and development (R&D) purposes. For programmatic reasons there is a continuing need to assess and evaluate the test costs for the various types of test campaigns that involve liquid rocket propellant test articles. Presently, in fact, there is a critical need to provide guidance on what represents a best value for testing and to provide some key economic insights for decision-makers within NASA and the test customers outside the Agency. Hence, selected rocket propulsion test databases and references have been evaluated and analyzed with the intent to discover correlations of technical information and test costs that could help produce more reliable and accurate cost projections in the future. The process of searching, collecting, and validating propulsion test cost information presented some unique obstacles, which led to a set of recommendations for improvement in order to facilitate future cost information gathering and analysis. In summary, this historical account and evaluation of rocket propulsion test cost information will enhance understanding of the various kinds of project cost information and identify certain trends of interest to the aerospace testing community.

  1. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    Science.gov (United States)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

    In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test, or its single-block version, may find applications in many areas, such as psychology, education, medicine and genetics, and is important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypothesis of independence of groups of variables and the hypothesis of equicorrelation and equivariance, we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.

  2. Reliability demonstration test planning using bayesian analysis

    International Nuclear Information System (INIS)

    Chandran, Senthil Kumar; Arul, John A.

    2003-01-01

    In nuclear power plants, the reliability of all the safety systems is very critical from the safety viewpoint, and it is essential that the required reliability requirements be met while satisfying the design constraints. From practical experience, it is found that the reliability of complex systems such as the Safety Rod Drive Mechanism is of the order of 10^-4 with an uncertainty factor of 10. Demonstrating the reliability of such systems is prohibitive in terms of cost and time, as the number of tests needed is very large. The purpose of this paper is to develop a Bayesian reliability demonstration testing procedure for exponentially distributed failure times with a gamma prior distribution on the failure rate, which can be easily and effectively used to demonstrate component/subsystem/system reliability conformance to stated requirements. The important questions addressed in this paper are: with zero failures, how long should one perform the tests, and how many components are required to conclude, with a given degree of confidence, that the component under test meets the reliability requirement. The procedure is explained with an example and can also be extended to cases with more failures. The approach presented is applicable for deriving test plans for demonstrating component failure rates of nuclear power plants, as failure data for similar components are becoming available in existing plants elsewhere. The advantages of this procedure are that the criterion upon which it is based is simple and pertinent, that the fitting of the prior distribution is an integral part of the procedure and is based on information regarding two percentiles of this distribution, and that the procedure is straightforward and easy to apply in practice. (author)
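    The zero-failure question posed above has a standard computational form: with exponential failure times and a Gamma(a, b) prior on the failure rate, the posterior after total test time T with no failures is Gamma(a, b + T), and the required T follows from the posterior CDF. A minimal sketch with illustrative prior parameters and requirement (not the paper's numbers):

    ```python
    from scipy import stats
    from scipy.optimize import brentq

    def zero_failure_test_time(lam_req, conf, a, b):
        """Smallest total time on test T (with zero failures) such that the
        Gamma(a, b) prior on the exponential failure rate yields
        P(lambda <= lam_req | 0 failures in T) >= conf.
        By conjugacy, the posterior after T hours with no failures is Gamma(a, b + T)."""
        def gap(T):
            return stats.gamma.cdf(lam_req, a, scale=1.0 / (b + T)) - conf
        return brentq(gap, 0.0, 1e9)

    # e.g. demonstrate lambda <= 1e-4 per hour at 90% confidence, weak prior
    T = zero_failure_test_time(lam_req=1e-4, conf=0.90, a=1.0, b=100.0)
    ```

    Splitting T across m identical components tested simultaneously (T = m*t) shows directly how the number of components trades off against calendar test time, which is the second question the abstract raises.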

  3. Advanced CMOS Radiation Effects Testing and Analysis

    Science.gov (United States)

    Pellish, J. A.; Marshall, P. W.; Rodbell, K. P.; Gordon, M. S.; LaBel, K. A.; Schwank, J. R.; Dodds, N. A.; Castaneda, C. M.; Berg, M. D.; Kim, H. S.

    2014-01-01

    Presentation at the annual NASA Electronic Parts and Packaging (NEPP) Program Electronic Technology Workshop (ETW). The material includes an update of progress in this NEPP task area over the past year, which includes testing, evaluation, and analysis of radiation effects data on the IBM 32 nm silicon-on-insulator (SOI) complementary metal oxide semiconductor (CMOS) process. The testing was conducted using test vehicles supplied directly by IBM.

  4. The PUMA test program and data analysis

    International Nuclear Information System (INIS)

    Han, J.T.; Morrison, D.L.

    1997-01-01

    The PUMA test program is sponsored by the U.S. Nuclear Regulatory Commission to provide data that are relevant to various Boiling Water Reactor phenomena. The author briefly describes the PUMA test program and facility, presents the objective of the program, provides data analysis for a large-break loss-of-coolant accident test, and compares the data with a RELAP5/MOD 3.1.2 calculation

  5. Assessment of the Nevada Test Site as a Site for Distributed Resource Testing and Project Plan: March 2002

    Energy Technology Data Exchange (ETDEWEB)

    Horgan, S.; Iannucci, J.; Whitaker, C.; Cibulka, L.; Erdman, W.

    2002-05-01

    The objective of this project was to evaluate the Nevada Test Site (NTS) as a location for performing dedicated, in-depth testing of distributed resources (DR) integrated with the electric distribution system. In this large scale testing, it is desired to operate multiple DRs and loads in an actual operating environment, in a series of controlled tests to concentrate on issues of interest to the DR community. This report includes an inventory of existing facilities at NTS, an assessment of site attributes in relation to DR testing requirements, and an evaluation of the feasibility and cost of upgrades to the site that would make it a fully qualified DR testing facility.

  6. Improved Testing of Distributed Lag Model in Presence of ...

    African Journals Online (AJOL)

    The finite distributed lag models (DLM) are often used in econometrics and statistics. Application of the ordinary least square (OLS) directly on the DLM for estimation may have serious problems. To overcome these problems, some alternative estimation procedures are available in the literature. One popular method to ...

  7. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation...... and load. This fact increases the number of stochastic inputs and dependence structures between them need to be considered. The deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  8. 10 CFR 431.193 - Test procedures for measuring energy consumption of distribution transformers.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 Test procedures for measuring energy consumption of distribution transformers. Section 431.193 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY... § 431.193 Test procedures for measuring energy consumption of distribution transformers. The test...

  9. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibration patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally occurring distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
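    The envelope-spectrum comparison described above can be sketched as follows: take the Hilbert-transform envelope of the vibration signal and inspect its spectrum, where a localized fault shows up as a discrete peak at the fault frequency. The synthetic signal, carrier, and 97 Hz modulation frequency below are invented for the example:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def envelope_spectrum(x, fs):
        """Magnitude spectrum of the Hilbert-transform envelope; bearing fault
        frequencies appear here as discrete peaks for localized faults."""
        env = np.abs(hilbert(x))
        env -= env.mean()                        # drop the DC component
        spec = np.abs(np.fft.rfft(env)) / len(env)
        freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
        return freqs, spec

    # synthetic localized-fault signal: a 3 kHz resonance amplitude-modulated
    # at an assumed 97 Hz fault frequency
    fs = 20_000
    t = np.arange(0, 2, 1 / fs)
    x = (1 + 0.8 * np.cos(2 * np.pi * 97 * t)) * np.sin(2 * np.pi * 3000 * t)
    freqs, spec = envelope_spectrum(x, fs)
    peak_hz = freqs[np.argmax(spec)]             # close to 97 Hz
    ```

    A distributed fault, by contrast, spreads energy over a band of the envelope spectrum rather than a sharp line, which is the separability the paper exploits.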

  10. Summary of CPAS EDU Testing Analysis Results

    Science.gov (United States)

    Romero, Leah M.; Bledsoe, Kristin J.; Davidson, John.; Engert, Meagan E.; Fraire, Usbaldo, Jr.; Galaviz, Fernando S.; Galvin, Patrick J.; Ray, Eric S.; Varela, Jose

    2015-01-01

    The Orion program's Capsule Parachute Assembly System (CPAS) project is currently conducting its third generation of testing, the Engineering Development Unit (EDU) series. This series utilizes two test articles, a dart-shaped Parachute Compartment Drop Test Vehicle (PCDTV) and a capsule-shaped Parachute Test Vehicle (PTV), both of which include a full-size, flight-like parachute system and require a pallet delivery system for aircraft extraction. To date, 15 tests have been completed, including six with PCDTVs and nine with PTVs. Two of the PTV tests included the Forward Bay Cover (FBC) provided by Lockheed Martin. Advancements in modeling techniques applicable to parachute fly-out, vehicle rate of descent, torque, and load train also occurred during the EDU testing series. An upgrade from a composite to an independent parachute simulation allowed parachute modeling at a higher level of fidelity than during previous generations. The complexity of separating the test vehicles from their pallet delivery systems necessitated the use of the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulator for modeling mated-vehicle aircraft extraction and separation. This paper gives an overview of each EDU test and summarizes the development of CPAS analysis tools and techniques during EDU testing.

  11. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  12. Overview of Current Turbine Aerodynamic Analysis and Testing at MSFC

    Science.gov (United States)

    Griffin, Lisa W.; Hudson, Susan T.; Zoladz, Thomas F.

    1999-01-01

    An overview of the current turbine aerodynamic analysis and testing activities at NASA/Marshall Space Flight Center (MSFC) is presented. The presentation is divided into three areas. The first area is the three-dimensional (3D), unsteady Computational Fluid Dynamics (CFD) analysis of the Fastrac turbine. Results from a coupled nozzle, blade, and exit guide vane analysis and from an uncoupled nozzle and coupled blade and exit guide vane will be presented. Unsteady pressure distributions, frequencies, and exit profiles from each analysis will be compared and contrasted. The second area is the testing and analysis of the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP) turbine with instrumented first stage blades. The SSME HPFTP turbine was tested in air at the MSFC Turbine Test Equipment (TTE). Pressure transducers were mounted on the first stage blades. Unsteady, 3D CFD analysis was performed for this geometry and flow conditions. A sampling of the results will be shown. The third area is a status of the Turbine Performance Optimization task. The objective of this task is to improve the efficiency of a turbine for potential use on a next generation launch vehicle. This task includes global optimization for the preliminary design, detailed optimization for blade shapes and spacing, and application of advanced CFD analysis. The final design will be tested in the MSFC TTE.

  13. Simulating distributed reinforcement effects in concrete analysis

    International Nuclear Information System (INIS)

    Marchertas, A.H.

    1985-01-01

    The effect of the bond slip is brought into the TEMP-STRESS finite element code by relaxing the equal strain condition between concrete and reinforcement. This is done for the elements adjacent to the element which is cracked. A parabolic differential strain variation is assumed along the reinforcement from the crack, which is taken to be at the centroid of the cracked element, to the point where perfect bonding exists. This strain relationship is used to increase the strain of the reinforcement in the as yet uncracked elements located adjacent to a crack. By the same token the corresponding concrete strain is decreased. This estimate is made assuming preservation of strain energy in the element. The effectiveness of the model is shown by examples. Comparison of analytical results is made with structural test data. The influence of the bonding model on cracking is portrayed pictorially. 5 refs., 6 figs

  14. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    AKOREDE et al: TOOL FOR POWER FLOW ANALYSIS AND DISTRIBUTED GENERATION OPTIMISATION. 23 ... greenhouse gas emissions and the current deregulation of electric energy ..... Visual composition and temporal behaviour of GUI.

  15. Formability analysis of sheet metals by cruciform testing

    Science.gov (United States)

    Güler, B.; Alkan, K.; Efe, M.

    2017-09-01

    Cruciform biaxial tests are becoming increasingly popular for testing the formability of sheet metals as they achieve frictionless, in-plane, multi-axial stress states with a single sample geometry. However, premature fracture of the samples during testing prevents the large-strain deformation necessary for formability analysis. In this work, we introduce a miniature cruciform sample design (few-mm test region) and a test setup to achieve centre fracture and large uniform strains. With its excellent surface finish and optimized geometry, the sample deforms with diagonal strain bands intersecting at the test region. These bands prevent local necking and concentrate the strains at the sample centre. Imaging and strain analysis during testing confirm that uniform strain distributions and centre fracture are possible for various strain paths ranging from plane-strain to equibiaxial tension. Moreover, the sample deforms without deviating from the predetermined strain ratio at all test conditions, allowing formability analysis under large strains. We demonstrate these features of the cruciform test for three sample materials: Aluminium 6061-T6 alloy, DC-04 steel and Magnesium AZ31 alloy, and investigate their formability at both the millimetre scale and the microstructure scale.

  16. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by the stimulus-response methods contains all the methods for data processing for radiotracer experiments.
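
    As an illustration of the kind of data treatment such a package performs, the moment analysis of an impulse-response curve can be sketched as below. The ideal stirred-tank tracer curve and all numbers are assumptions for illustration, not the package's actual methods.

```python
import numpy as np

def trap(y, x):
    """Trapezoidal integration (hand-rolled to stay version-independent)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def rtd_moments(t, c):
    """Mean residence time and variance from a tracer response curve."""
    area = trap(c, t)
    e = c / area                          # normalize to unit area: E(t)
    t_mean = trap(t * e, t)               # first moment: mean residence time
    variance = trap((t - t_mean) ** 2 * e, t)
    return t_mean, variance

tau = 10.0                                # assumed true mean residence time, s
t = np.linspace(0.0, 100.0, 1001)         # measurement time grid, s
c = (1.0 / tau) * np.exp(-t / tau)        # ideal stirred-tank impulse response
t_mean, variance = rtd_moments(t, c)
```

    For the ideal stirred tank the recovered mean is tau and the variance is tau squared; deviations of measured moments from such model predictions are what the diagnosis methods interpret.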

  17. A new technique for testing distribution of knowledge and to estimate sampling sufficiency in ethnobiology studies.

    Science.gov (United States)

    Araújo, Thiago Antonio Sousa; Almeida, Alyson Luiz Santos; Melo, Joabe Gomes; Medeiros, Maria Franco Trindade; Ramos, Marcelo Alves; Silva, Rafael Ricardo Vasconcelos; Almeida, Cecília Fátima Castelo Branco Rangel; Albuquerque, Ulysses Paulino

    2012-03-15

    We propose a new quantitative measure that enables the researcher to make decisions and test hypotheses about the distribution of knowledge in a community and estimate the richness and sharing of information among informants. In our study, this measure has two levels of analysis: intracultural and intrafamily. Using data collected in northeastern Brazil, we evaluated how these new estimators of richness and sharing behave for different categories of use. We observed trends in the distribution of the characteristics of informants. We were also able to evaluate how outliers interfere with these analyses and how other analyses may be conducted using these indices, such as determining the distance between the knowledge of a community and that of experts, as well as demonstrating the importance of these individuals' knowledge of biological resources to the community. One of the primary applications of these indices is to supply the researcher with an objective tool to evaluate the scope and behavior of the collected data.

  18. An Analysis of Rocket Propulsion Testing Costs

    Science.gov (United States)

    Ramirez-Pagan, Carmen P.; Rahman, Shamim A.

    2009-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is generally performed within two arenas: (1) production testing for certification and acceptance, and (2) developmental testing for prototype or experimental purposes. The customer base consists of NASA programs, DOD programs, and commercial programs. Resources in place to perform on-site testing include both civil servants and contractor personnel, hardware and software including data acquisition and control, and 6 test stands with a total of 14 test positions/cells. For several business reasons there is the need to augment understanding of the test costs for all the various types of test campaigns. Historical propulsion test data was evaluated and analyzed in many different ways with the intent to find any correlation or statistics that could help produce more reliable and accurate cost estimates and projections. The analytical efforts included timeline trends, statistical curve fitting, average cost per test, cost per test second, test cost timeline, and test cost envelopes. Further, the analytical effort includes examining the test cost from the perspective of thrust level and test article characteristics. Some of the analytical approaches did not produce evidence strong enough for further analysis. Some other analytical approaches yield promising results and are candidates for further development and focused study. Information was organized into its elements: a Project Profile, Test Cost Timeline, and Cost Envelope. The Project Profile is a snapshot of the project life cycle in timeline fashion, which includes various statistical analyses. The Test Cost Timeline shows the cumulative average test cost, for each project, at each month where there was test activity. The Test Cost Envelope shows a range of cost for a given number of test(s). The supporting information upon which this study was performed came from diverse sources and thus it was necessary to

  19. Post-test analysis for the APR1400 LBLOCA DVI performance test using MARS

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Lee, Y. J.; Kim, H. C.; Bae, Y. Y.; Park, J. K.; Lee, W.

    2002-03-01

    Post-test analyses using a multi-dimensional best-estimate analysis code, MARS, are performed for the APR1400 LBLOCA DVI (Direct Vessel Injection) performance tests. This report describes the code evaluation results for the test data of the various void height tests and direct bypass tests that have been performed at the MIDAS test facility. MIDAS is a scaled test facility of APR1400 with the objective of identifying multi-dimensional thermal-hydraulic phenomena in the downcomer during the reflood conditions of a large break LOCA. A modified linear scale ratio was applied in its construction and test conditions. The major thermal-hydraulic parameters such as ECC bypass fraction, steam condensation fraction, and temperature distributions in the downcomer are compared and evaluated. The evaluation results of the MARS code for the various test cases show that: (a) the MARS code has an advanced modeling capability, predicting well the major multi-dimensional thermal-hydraulic phenomena occurring in the downcomer; (b) the MARS code under-predicts the steam condensation rates, which in turn causes it to over-predict the ECC bypass rates. However, the trend of decreasing steam condensation rate and increasing ECC bypass rate with increasing steam flow rate, and the calculation results of the ECC bypass rates under the EM analysis conditions, generally agree with the test data

  20. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramic and glass depends on the specimen size and on the size distribution of flaws in these materials. The distribution of strength for a ductile material is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical criteria calculations supporting the strength analysis of silicon nitride material were done utilizing MATLAB. (author)
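
    A minimal sketch of this kind of Weibull strength analysis, in Python rather than MATLAB: the two-parameter form F(s) = 1 - exp(-(s/s0)^m), the median-rank estimator, and all numbers are assumptions for illustration, not the authors' data or code.

```python
import numpy as np

# Synthetic fracture-strength sample drawn from an assumed Weibull law.
rng = np.random.default_rng(0)
m_true, s0_true = 10.0, 500.0                      # modulus, scale (MPa)
strengths = np.sort(s0_true * rng.weibull(m_true, size=200))

# Median-rank estimate of the cumulative failure probability F(s_i).
n = len(strengths)
ranks = np.arange(1, n + 1)
F = (ranks - 0.3) / (n + 0.4)

# Linearize: ln(-ln(1 - F)) = m*ln(s) - m*ln(s0), then least-squares fit.
x = np.log(strengths)
y = np.log(-np.log(1.0 - F))
m_est, b = np.polyfit(x, y, 1)                     # slope = Weibull modulus
s0_est = np.exp(-b / m_est)                        # characteristic strength

# Reliability (survival probability) at an applied stress of 400 MPa.
reliability_at_400 = float(np.exp(-(400.0 / s0_est) ** m_est))
```

    Plotting y against x (the Weibull plot) and reading the slope is the graphical counterpart of this fit; maximum-likelihood estimation is a common, somewhat more accurate alternative.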

  1. Development and Field Test of Voltage VAR Optimization in the Korean Smart Distribution Management System

    Directory of Open Access Journals (Sweden)

    Sang-Yun Yun

    2014-02-01

    Full Text Available This paper is a summary of the development and demonstration of an optimization program, voltage VAR optimization (VVO, in the Korean Smart Distribution Management System (KSDMS. KSDMS was developed to address the lack of receptivity of distributed generators (DGs, standardization and compatibility, and manual failure recovery in the existing Korean automated distribution system. Focusing on the lack of receptivity of DGs, we developed a real-time system analysis and control program. The KSDMS VVO enhances manual system operation of the existing distribution system and provides a solution with all control equipment operated at a system level. The developed VVO is an optimal power flow (OPF method that resolves violations, minimizes switching costs, and minimizes loss, and its function can vary depending on the operator’s command. The sequential mixed integer linear programming (SMILP method was adopted to find the solution of the OPF. We tested the precision of the proposed VVO on selected simulated systems and its applicability to actual systems at two substations on Jeju Island. Running the KSDMS VVO on a regular basis improved system stability, and it also raised no issues regarding its applicability to actual systems.

  2. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  3. Choreographer Pre-Testing Code Analysis and Operational Testing.

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, David J. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Harrison, Christopher B. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Perr, C. W. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Hurd, Steven A [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2014-07-01

    Choreographer is a "moving target defense system", designed to protect against attacks aimed at IP addresses without corresponding domain name system (DNS) lookups. It coordinates actions between a DNS server and a Network Address Translation (NAT) device to regularly change which publicly available IP addresses' traffic will be routed to the protected device versus routed to a honeypot. More details about how Choreographer operates can be found in Section 2: Introducing Choreographer. Operational considerations for the successful deployment of Choreographer can be found in Section 3. The Testing & Evaluation (T&E) for Choreographer involved 3 phases: Pre-testing, Code Analysis, and Operational Testing. Pre-testing, described in Section 4, involved installing and configuring an instance of Choreographer and verifying it would operate as expected for a simple use case. Our findings were that it was simple and straightforward to prepare a system for a Choreographer installation as well as configure Choreographer to work in a representative environment. Code Analysis, described in Section 5, consisted of running a static code analyzer (HP Fortify) and conducting dynamic analysis tests using the Valgrind instrumentation framework. Choreographer performed well, such that only a few errors that might possibly be problematic in a given operating situation were identified. Operational Testing, described in Section 6, involved operating Choreographer in a representative environment created through Emulytics™. Depending upon the amount of server resources dedicated to Choreographer vis-à-vis the amount of client traffic handled, Choreographer had varying degrees of operational success. In an environment with a poorly resourced Choreographer server and as few as 50-100 clients, Choreographer failed to properly route traffic over half the time. Yet, with a well-resourced server, Choreographer handled over 1000 clients without misrouting. Choreographer

  4. Uncovering Bugs in Distributed Storage Systems during Testing (not in Production!)

    OpenAIRE

    Deligiannis, P; McCutchen, M; Thomson, P; Chen, S; Donaldson, AF; Erickson, J; Huang, C; Lal, A; Mudduluru, R; Qadeer, S; Schulte, W

    2016-01-01

    Testing distributed systems is challenging due to multiple sources of nondeterminism. Conventional testing techniques, such as unit, integration and stress testing, are ineffective in preventing serious but subtle bugs from reaching production. Formal techniques, such as TLA+, can only verify high-level specifications of systems at the level of logic-based models, and fall short of checking the actual executable code. In this paper, we present a new methodology for testing distributed systems...

  5. Project W-320 acceptance test report for AY-farm electrical distribution

    International Nuclear Information System (INIS)

    Bevins, R.R.

    1998-01-01

    This Acceptance Test Procedure (ATP) has been prepared to demonstrate that the AY-Farm Electrical Distribution System functions as required by the design criteria. This test is divided into three parts to support the planned construction schedule: Section 8 tests Mini-Power Panel AY102-PPI and the EES; Section 9 tests the SSS support systems; Section 10 tests the SSS and the Multi-Pak Group Control Panel. This test does not include the operation of end-use components (loads) supplied from the distribution system. Tests of the end-use components (loads) will be performed by other W-320 ATPs

  6. Real-time flight test data distribution and display

    Science.gov (United States)

    Nesel, Michael C.; Hammons, Kevin R.

    1988-01-01

    Enhancements to the real-time processing and display systems of the NASA Western Aeronautical Test Range are described. Display processing has been moved out of the telemetry and radar acquisition processing systems super-minicomputers into user/client interactive graphic workstations. Real-time data is provided to the workstations by way of Ethernet. Future enhancement plans include use of fiber optic cable to replace the Ethernet.

  7. TESTING THE GRAIN-SIZE DISTRIBUTION DETERMINED BY LASER DIFFRACTOMETRY FOR SICILIAN SOILS

    Directory of Open Access Journals (Sweden)

    Costanza Di Stefano

    2012-06-01

    Full Text Available In this paper the soil grain-size distribution determined by the Laser Diffraction method (LDM) is tested against the Sieve-Hydrometer method (SHM) applied to 747 soil samples of different texture classes, sampled in Sicily. The analysis showed that the sand content measured by SHM can be assumed equal to that determined by LDM. An underestimation of the clay fraction measured by LDM was obtained with respect to SHM, and a set of equations useful to refer laser diffraction measurements to SHM was calibrated using the measurements carried out for 635 soil samples. Finally, the proposed equations were tested using independent measurements carried out by LDM and SHM for 112 soil samples with a different texture classification.
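
    A calibration of the kind described, referring LDM clay measurements to the SHM reference, can be sketched as a least-squares fit. All data below are synthetic, and the 0.6/2.0 "underestimation" model is an assumption for illustration, not the paper's calibrated equations.

```python
import numpy as np

# Synthetic paired measurements: SHM is taken as the reference method,
# and LDM is simulated with a systematic clay underestimation plus noise.
rng = np.random.default_rng(1)
shm_clay = rng.uniform(5.0, 45.0, size=60)            # % clay, reference
ldm_clay = 0.6 * shm_clay + 2.0 + rng.normal(0.0, 1.0, size=60)

# Least-squares correction equation: SHM ~= a * LDM + b.
a, b = np.polyfit(ldm_clay, shm_clay, 1)
predicted = a * ldm_clay + b
rmse = float(np.sqrt(np.mean((predicted - shm_clay) ** 2)))
```

    In practice one such equation would be fitted per texture class on the 635 calibration samples and then validated on the 112 independent samples, as the paper does.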

  8. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance is bounded at zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.
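
    The boundary effect described above can be illustrated with a small simulation. Under standard asymptotics, a likelihood-ratio statistic for a variance component estimated on its zero boundary follows a 1/2:1/2 mixture of a point mass at zero and chi-square(1); the sketch below uses synthetic draws of that null, not the Workshop data or the authors' code.

```python
import math
import numpy as np

# Simulate the boundary null: half the mass folds to zero, the other
# half is chi-square(1). LOD = LRT / (2 ln 10).
rng = np.random.default_rng(2)
z = rng.standard_normal(200_000)
lrt = np.where(z > 0.0, z ** 2, 0.0)
lod = lrt / (2.0 * math.log(10.0))

def mixture_pvalue(lod_obs):
    """P(LOD >= lod_obs) under the 1/2:1/2 mixture null."""
    lrt_obs = lod_obs * 2.0 * math.log(10.0)
    # chi-square(1) tail is erfc(sqrt(x/2)); halve it for the mixture.
    return 0.5 * math.erfc(math.sqrt(lrt_obs / 2.0))

empirical = float(np.mean(lod >= 1.0))    # simulated tail probability
theoretical = mixture_pvalue(1.0)         # mixture-null tail probability
```

    Treating the statistic as a plain chi-square(1) would double these p-values, i.e. make the linkage test conservative, which is the kind of distortion the empirical null distribution in the paper is meant to expose.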

  9. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution network and transmission models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing and it is important that gas distributors design new distribution systems to support this growth, considering the financial constraints of the company, as well as local legislation and regulation. In this study some steps of developing a flexible system that attends to those needs will be described. The analysis of distribution requires geographically referenced data for the models as well as accurate connectivity and equipment attributes. GIS systems are often used as a repository that holds the majority of this information. GIS systems are constantly updated as distribution network equipment is modified. The distribution network model gathered from this system ensures that the model represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the distribution network. This architecture ensures that the models are continually reflecting the reality of the distribution network. (author)

  10. Asymptotically Distribution-Free Goodness-of-Fit Testing for Copulas

    NARCIS (Netherlands)

    Can, S.U.; Einmahl, John; Laeven, R.J.A.

    2017-01-01

    Consider a random sample from a continuous multivariate distribution function F with copula C. In order to test the null hypothesis that C belongs to a certain parametric family, we construct a process that is asymptotically distribution-free under H0 and serves as a test generator. The process is a

  11. Distributed storage and cloud computing: a test case

    International Nuclear Information System (INIS)

    Piano, S; Della Ricca, G

    2014-01-01

    Since 2003 the computing farm hosted by the INFN Tier3 facility in Trieste has supported the activities of many scientific communities. Hundreds of jobs from 45 different VOs, including those of the LHC experiments, are processed simultaneously. Given that normally the requirements of the different computational communities are not synchronized, the probability that at any given time the resources owned by one of the participants are not fully utilized is quite high. A balanced compensation should in principle allocate the free resources to other users, but there are limits to this mechanism. In fact, the Trieste site may not hold the amount of data needed to attract enough analysis jobs, and even in that case there could be a lack of bandwidth for their access. The Trieste ALICE and CMS computing groups, in collaboration with other Italian groups, aim to overcome the limitations of existing solutions using two approaches: sharing the data among all the participants taking full advantage of GARR-X wide area networks (10 Gb/s) and integrating the resources dedicated to batch analysis with the ones reserved for dynamic interactive analysis, through modern solutions such as cloud computing.

  12. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  13. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  14. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  15. Thermal test and analysis of a spent fuel storage cask

    International Nuclear Information System (INIS)

    Yamakawa, H.; Gomi, Y.; Ozaki, S.; Kosaki, A.

    1993-01-01

    A thermal test simulating normal storage was performed with a full-scale cask model to verify the cask's capability to store spent fuel. The maximum temperature at each point in the test was lower than the allowable temperature, and the integrity of the cask was maintained. It was observed that the safety of the containment system was also maintained, according to checks of the seal before and after the thermal test. It was therefore shown that, with the present technology, it is possible to store spent fuel safely in a dry-type cask. Moreover, because of the good agreement between analysis and experimental results, it was shown that the analysis model was successfully established to estimate the temperature distribution of the fuel cladding and the seal portion. (J.P.N.)

  16. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. In the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being a manual or recipe for constructing such a model.

  17. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented into clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with Head and Neck (H and N) or prostate tumours. Local quality assurance (QA) acceptance criteria based on the 'gamma distribution' for approving IMRT plans were developed and implemented in early 2007. A retrospective analysis of these criteria over 194 clinical cases is presented. The RHH IMRT criteria were established on the assumption that the gamma distribution, obtained through inter-comparison of 2D dose maps between planned and delivered dose, is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck is well under the normal distribution, particularly for prostate cases. The applied pass/fail criteria are not overly sensitive in identifying 'false fails'; they can be tightened further for smaller fields, while for the larger fields found in both H and N and prostate cases the criteria were correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are the two major factors contributing to variation in gamma distribution among clinical cases. Criteria derived from clinical statistics in this way are superior to and more accurate than single-valued criteria for the IMRT QA acceptance procedure. (author)
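
The gamma evaluation referred to above can be sketched in code. The following is a minimal, illustrative implementation of the per-point gamma index (the combined dose-difference / distance-to-agreement metric) for 2D dose maps; the 3%/3 mm tolerances and the brute-force search are assumptions for illustration, not RHH's or MapCheck's actual criteria.

```python
import numpy as np

def gamma_index(ref, meas, spacing, dose_tol=0.03, dist_tol=3.0):
    """Per-point gamma (Low et al. style) for 2D dose maps.

    ref, meas : 2D arrays of dose, same shape; spacing : pixel size in mm;
    dose_tol : fraction of the max reference dose; dist_tol : mm.
    Brute-force search over all reference points (fine for small maps).
    """
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    ry = yy.ravel() * spacing
    rx = xx.ravel() * spacing
    rd = ref.ravel()
    dd_norm = dose_tol * ref.max()
    gamma = np.empty(ref.shape, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((ry - iy * spacing) ** 2 +
                     (rx - ix * spacing) ** 2) / dist_tol ** 2
            dose2 = (rd - meas[iy, ix]) ** 2 / dd_norm ** 2
            gamma[iy, ix] = np.sqrt(np.min(dist2 + dose2))
    return gamma
```

A plan would then be judged by the pass rate, i.e. the fraction of points with gamma <= 1; identical maps give gamma = 0 everywhere.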

  18. Thermal Analysis of Bending Under Tension Test

    DEFF Research Database (Denmark)

    Ceron, Ermanno; Martins, Paulo A.F.; Bay, Niels

    2014-01-01

    during testing is similar to the one in the production tool. A universal sheet tribo-tester has been developed, which can run multiple tests automatically from coil. This allows emulating the temperature increase as in production. The present work performs finite element analysis of the evolution......The tribological conditions in deep drawing can be simulated in the Bending Under Tension test to evaluate the performance of new lubricants, tool materials, etc. Deep drawing production with automatic handling runs normally at high rate. This implies considerable heating of the tools, which...... sometimes can cause lubricant film breakdown and galling. In order to replicate the production conditions in bending under tension testing it is thus important to control the tool/workpiece interface temperature. This can be done by pre-heating the tool, but it is essential that the interface temperature...

  19. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost effective transportation and logistics plays a vital role in the supply chains of the modern world’s manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate...... a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems....

  20. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  1. A CLASS OF DISTRIBUTION-FREE TESTS FOR INDEPENDENCE AGAINST POSITIVE QUADRANT DEPENDENCE

    Directory of Open Access Journals (Sweden)

    Parameshwar V Pandit

    2014-02-01

    Full Text Available A class of distribution-free tests based on a convex combination of two U-statistics is considered for testing independence against positive quadrant dependence. The class of tests proposed by Kochar and Gupta (1987) and Kendall's test are members of the proposed class. The performance of the proposed class is evaluated in terms of Pitman asymptotic relative efficiency for the Block-Basu (1974) model and the Woodworth family of distributions. It has been observed that some members of the class perform better than the existing tests in the literature. Unbiasedness and consistency of the proposed class of tests have been established.
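
Kendall's test, one member of the class discussed above, is easy to sketch. The following is an illustrative pure-Python implementation of Kendall's tau with a one-sided normal-approximation p-value for testing independence against positive dependence; it is not the authors' proposed combined U-statistic.

```python
import math
from itertools import combinations

def kendall_tau_test(x, y):
    """Kendall's tau and a one-sided (upper-tail) normal-approximation
    p-value, testing independence against positive dependence."""
    n = len(x)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    tau = (concordant - discordant) / (n * (n - 1) / 2)
    # Null variance of tau for continuous data (no ties)
    var = 2.0 * (2 * n + 5) / (9.0 * n * (n - 1))
    z = tau / math.sqrt(var)
    p = 0.5 * math.erfc(z / math.sqrt(2.0))
    return tau, p
```

Perfectly concordant data give tau = 1 with a tiny p-value; perfectly discordant data give tau = -1.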

  2. Extension of the pseudo dynamic method to test structures with distributed mass

    International Nuclear Information System (INIS)

    Renda, V.; Papa, L.; Bellorini, S.

    1993-01-01

    The PsD method is a mixed numerical and experimental procedure. At each time step the dynamic deformation of the structure, computed by solving the equation of motion for a given input signal, is reproduced in the laboratory by means of actuators attached to the sample at specific points. The reaction forces at those points are measured and used to compute the deformation for the next time step. Because the reaction forces are measured, knowledge of the stiffness of the structure is not needed, so the method remains effective for deformations leading to strongly nonlinear behaviour of the structure. On the other hand, the mass matrix and the applied forces must be well known. For this reason the PsD method can be applied without approximation when the masses can be considered as lumped at the testing points of the sample. The present work investigates the possibility of extending the PsD method to test structures with distributed mass. A standard procedure is proposed to provide an equivalent mass matrix and force vector reduced to the testing points and to verify the reliability of the model. The verification is obtained by comparing the results of a multi-degree-of-freedom dynamic analysis, performed with a Finite Element (FE) program, with a simulation of the PsD method based on the reduced-degrees-of-freedom mass matrix and external forces, assuming in place of the experimental reactions those computed with the general FE model. The method has been applied to a numerical simulation of the behaviour of a realistic and complex structure with distributed mass, consisting of a two-storey masonry building. The FE model consists of about two thousand degrees of freedom, and the condensation has been made for four testing points. A dynamic analysis has been performed with the general FE model, and the reactions of the structure have been recorded in a file and used as input for the PsD simulation with the four-degree-of-freedom model. The comparison between
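
The reduction of the mass matrix to the testing points resembles static (Guyan) condensation. The sketch below illustrates that idea for small matrices; it is an assumed, generic condensation technique, not necessarily the authors' exact procedure.

```python
import numpy as np

def guyan_reduce(K, M, kept):
    """Condense stiffness and mass matrices to the 'kept' (testing) DOFs.

    Static (Guyan) condensation: the interior ('slave') DOFs are assumed
    to follow the kept DOFs statically, T = [I; -Kss^-1 Ksm], and both
    matrices are projected: K_r = T' K T, M_r = T' M T.
    """
    n = K.shape[0]
    kept = np.asarray(kept)
    slaves = np.setdiff1d(np.arange(n), kept)
    Kss = K[np.ix_(slaves, slaves)]
    Ksm = K[np.ix_(slaves, kept)]
    T = np.zeros((n, len(kept)))
    T[kept, np.arange(len(kept))] = 1.0
    T[np.ix_(slaves, np.arange(len(kept)))] = -np.linalg.solve(Kss, Ksm)
    return T.T @ K @ T, T.T @ M @ T
```

For a 3-DOF spring-mass chain reduced to its two end masses, the lowest natural frequency of the condensed model stays close to that of the full model, which is the kind of check the paper's verification step performs at larger scale.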

  3. Lognormal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of α-Particle Track Autoradiography

    Science.gov (United States)

    Neti, Prasad V.S.V.; Howell, Roger W.

    2010-01-01

    Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086

  4. A statistical test for outlier identification in data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Morteza Khodabin

    2010-09-01

    Full Text Available In the use of peer group data to assess individual, typical, or best-practice performance, effective detection of outliers is critical for achieving useful results. For these ''deterministic'' frontier models, statistical theory is now mostly available. This paper deals with the statistical pared-sample method and its capability of detecting outliers in data envelopment analysis. In the presented method, each observation is deleted from the sample in turn and the resulting linear program is solved, leading to a distribution of efficiency estimates. Based on the obtained distribution, a pared test is designed to identify the potential outlier(s). We illustrate the method through a real data set. The method could be used as a first step, as an exploratory data analysis, before using any frontier estimation.

  5. Performance Analysis of the Consensus-Based Distributed LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Gonzalo Mateos

    2009-01-01

    Full Text Available Low-cost estimation of stationary signals and reduced-complexity tracking of nonstationary processes are well motivated tasks that can be accomplished using ad hoc wireless sensor networks (WSNs). To this end, a fully distributed least mean-square (D-LMS) algorithm is developed in this paper, in which sensors exchange messages with single-hop neighbors to consent on the network-wide estimates adaptively. The novel approach does not require a Hamiltonian cycle or a special bridge subset of sensors, while communications among sensors are allowed to be noisy. A mean-square error (MSE) performance analysis of D-LMS is conducted in the presence of a time-varying parameter vector, which adheres to a first-order autoregressive model. For sensor observations that are related to the parameter vector of interest via a linear Gaussian model, and after adopting simplifying independence assumptions, exact closed-form expressions are derived for the global and sensor-level MSE evolution as well as its steady-state (s.s.) values. Mean- and MSE-sense stability of D-LMS are also established. Interestingly, extensive numerical tests demonstrate that for small step-sizes the results accurately extend to the pragmatic setting whereby sensors acquire temporally correlated, not necessarily Gaussian data.
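
The consensus-based D-LMS idea, local LMS updates combined with averaging over single-hop neighbours, can be illustrated with a small simulation. The ring topology, step size, and consensus weight below are illustrative assumptions, not the paper's exact recursion.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, dim, steps = 10, 3, 2000
w_true = np.array([1.0, -2.0, 0.5])   # parameter vector to estimate

# Ring topology: each sensor exchanges with its two single-hop neighbours.
neighbors = [((i - 1) % n_sensors, (i + 1) % n_sensors)
             for i in range(n_sensors)]

w = np.zeros((n_sensors, dim))        # local estimates
mu, eta = 0.02, 0.5                   # LMS step size, consensus weight

for _ in range(steps):
    # 1) local LMS update from a fresh noisy linear observation
    H = rng.standard_normal((n_sensors, dim))
    y = H @ w_true + 0.1 * rng.standard_normal(n_sensors)
    err = y - np.einsum('ij,ij->i', H, w)
    w = w + mu * err[:, None] * H
    # 2) consensus step: mix with single-hop neighbours' estimates
    w_new = np.empty_like(w)
    for i, (a, b) in enumerate(neighbors):
        w_new[i] = (1 - eta) * w[i] + eta * 0.5 * (w[a] + w[b])
    w = w_new

mse = np.mean((w - w_true) ** 2)      # network-wide steady-state MSE
```

After convergence the sensors agree closely with one another and with the true parameter vector; the consensus step also averages out per-sensor gradient noise.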

  6. A well test analysis method accounting for pre-test operations

    International Nuclear Information System (INIS)

    Silin, D.B.; Tsang, C.-F.

    2003-01-01

    We propose to use regular monitoring data from a production or injection well for estimating the formation hydraulic properties in the vicinity of the wellbore without interrupting operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from analysis of these data. A distinctive feature of the proposed approach, distinguishing it from conventional methods, is the introduction of an additional parameter, an effective pre-test pumping rate. This additional parameter is derived from a rigorous asymptotic analysis of the flow model. We thus account for the non-uniform pressure distribution at the beginning of the testing time interval caused by pre-test operations at the well. With synthetic and field examples, we demonstrate that the deviation of the matching curve from the data that is usually attributed to skin and wellbore storage effects can also be interpreted through this new parameter. Moreover, with our method the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for searching for the best-fitting parameters. We have further enhanced the analysis with a procedure for estimating ambient reservoir pressure and dimensionless wellbore radius. The methods reported here have been implemented in the code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties.

  7. Experimental investigation of localized stress-induced leakage current distribution in gate dielectrics using array test circuit

    Science.gov (United States)

    Park, Hyeonwoo; Teramoto, Akinobu; Kuroda, Rihito; Suwa, Tomoyuki; Sugawa, Shigetoshi

    2018-04-01

    Localized stress-induced leakage current (SILC) has become a major problem in the reliability of flash memories. To reduce it, clarifying the SILC mechanism is important, and statistical measurement and analysis have to be carried out. In this study, we applied an array test circuit that can measure the SILC distribution of more than 80,000 nMOSFETs with various gate areas at high speed (within 80 s) and high accuracy (on the order of 10^-17 A). The results clarified that the distributions of localized SILC for different gate areas follow a universal distribution assuming the same SILC defect density distribution per unit area, and that the current of localized SILC defects does not scale down with the gate area. Moreover, the distribution of SILC defect density and its dependence on the oxide field during measurement (E_OX-Measure) were experimentally determined for the fabricated devices.

  8. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    This presentation reviewed the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. Finally, the expected evolution and future plans are outlined.

  9. HERBE- Analysis of test operation results

    International Nuclear Information System (INIS)

    Pesic, M. et al.

    1991-01-01

    This document is part of the safety analyses performed for RB reactor operation with the coupled fast-thermal system HERBE, and forms part of the final safety report together with the 'Report on test operation of HERBE for the period Dec. 15 1989 - May 15 1990'. This report covers the following main topics: determination of reactivity variations dependent on variations of the moderator critical level; determination of reactivity for the flooded neutron converter; and the accident analysis of neutron converter flooding.

  10. VETA-I x ray test analysis

    Science.gov (United States)

    Brissenden, R. J. V.; Chartas, G.; Freeman, M. D.; Hughes, J. P.; Kellogg, E. M.; Podgorski, W. A.; Schwartz, D. A.; Zhao, P.

    1992-01-01

    This interim report presents some definitive results from our analysis of the VETA-I x-ray testing data. It also provides a description of the hardware and software used in the conduct of the VETA-I x-ray test program performed at the MSFC x-ray Calibration Facility (XRCF). These test results also serve to supply data and information to include in the TRW final report required by DPD 692, DR XC04. To provide an authoritative compendium of results, we have taken nine papers as published in the SPIE Symposium, 'Grazing Incidence X-ray/EUV Optics for Astronomy and Projection Lithography' and have reproduced them as the content of this report.

  11. Cross wavelet analysis: significance testing and pitfalls

    Directory of Open Access Journals (Sweden)

    D. Maraun

    2004-01-01

    Full Text Available In this paper, we present a detailed evaluation of cross wavelet analysis of bivariate time series. We develop a statistical test for zero wavelet coherency based on Monte Carlo simulations. If at least one of the two processes considered is Gaussian white noise, an approximate formula for the critical value can be utilized. In the second part, typical pitfalls of wavelet cross spectra and wavelet coherency are discussed. The wavelet cross spectrum appears to be unsuitable for significance testing of the interrelation between two processes; instead, one should rather apply wavelet coherency. Furthermore, we investigate problems due to multiple testing. Based on these results, we show that coherency between ENSO and NAO is an artefact for most of the time from 1900 to 1995. However, during a distinct period from around 1920 to 1940, significant coherency between the two phenomena occurs.

  12. Testing iOS apps with HadoopUnit rapid distributed GUI testing

    CERN Document Server

    Tilley, Scott

    2014-01-01

    Smartphone users have come to expect high-quality apps. This has increased the importance of software testing in mobile software development. Unfortunately, testing apps, particularly the GUI, can be very time-consuming. Exercising every user interface element and verifying transitions between different views of the app under test quickly becomes problematic. For example, execution of iOS GUI test suites using Apple's UI Automation framework can take an hour or more if the app's interface is complicated. The longer it takes to run a test, the less frequently the test can be run, which in turn re

  13. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs against that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute the coarse-grained (low-resolution) isotopic distributions with comparable accuracy and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier-transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
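
The polynomial-based approach mentioned above amounts to convolving per-atom isotope patterns. The following minimal sketch computes a coarse-grained (nominal-mass) isotopic distribution this way; the repeated pairwise convolution is a naive O(n^2) stand-in for MIDAs's actual algorithm, and only C, H, and O are included.

```python
import numpy as np

# Nominal-mass isotope patterns: index = extra neutrons vs lightest isotope.
ISOTOPES = {
    'C': np.array([0.9893, 0.0107]),             # 12C, 13C
    'H': np.array([0.999885, 0.000115]),         # 1H, 2H
    'O': np.array([0.99757, 0.00038, 0.00205]),  # 16O, 17O, 18O
}

def isotopic_distribution(formula):
    """Coarse-grained isotopic distribution by repeated convolution.

    formula: dict like {'C': 6, 'H': 12, 'O': 6}. Returns abundances
    indexed by nominal mass shift above the monoisotopic peak.
    """
    dist = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):
            dist = np.convolve(dist, ISOTOPES[element])
    return dist

glucose = isotopic_distribution({'C': 6, 'H': 12, 'O': 6})
```

For glucose the M+1 peak is roughly 7% of the monoisotopic peak, dominated by the six chances of a 13C substitution.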

  14. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is related not only to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of minorities, and larger family size.

  15. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Full Text Available Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals function only as parking areas and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the number of freight flows within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The analysis begins with the identification of the important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is then determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent accessibility, cost, time, and environment as the important location factors. The result shows that the ranking from best to worst is: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  16. A Lego Mindstorms NXT based test bench for multiagent exploratory systems and distributed network partitioning

    Science.gov (United States)

    Patil, Riya Raghuvir

    Networks of communicating agents require distributed algorithms for a variety of tasks in the field of network analysis and control. For applications such as swarms of autonomous vehicles, ad hoc and wireless sensor networks, and military and civilian applications such as exploring and patrolling, a robust autonomous system that uses a distributed algorithm for self-partitioning can be significantly helpful. A single team of autonomous vehicles in a field may need to self-dissemble into multiple teams, conducive to completing multiple control tasks. Moreover, because communicating agents are subject to changes, namely the addition or failure of an agent or link, a distributed or decentralized algorithm is preferable to having a central agent. A framework for studying the self-partitioning of such multi-agent systems with the most basic mobility model not only saves time in conception but also gives a cost-effective prototype without compromising the physical realization of the proposed idea. In this thesis I present my work on the implementation of a flexible and distributed stochastic partitioning algorithm on the Lego Mindstorms NXT, on a graphical programming platform using National Instruments' LabVIEW, forming a team of communicating agents via the NXT-Bee radio module. We single out mobility, communication, and self-partitioning as the core elements of the work. The goal is to randomly explore a precinct for reference sites. Agents that have discovered reference sites announce their target acquisition, forming a network based upon the distance of each agent from the others, within which self-partitioning begins in order to find an optimal partition. Further, to illustrate the work, an experimental test bench of five Lego NXT robots is presented.

  17. Silicon Bipolar Distributed Oscillator Design and Analysis | Aku ...

    African Journals Online (AJOL)

    The design and analysis of a high-frequency silicon bipolar oscillator using a common emitter (CE) configuration with distributed output is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillation were reviewed. Transmission line design was carried out using Butterworth LC ...

  18. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  19. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape-ecological analysis of the distribution of the helminth fauna is provided. As a result of studies on 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species, and 9 nematode species.

  20. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...

  1. Data synthesis and display programs for wave distribution function analysis

    Science.gov (United States)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  2. Thermomechanical analysis of the DFLL test blanket module for ITER

    International Nuclear Information System (INIS)

    Chen Hongli; Wu Yican; Bai Yunqing

    2006-01-01

    A finite element code is used to simulate two blanket design structures, the SLL (quasi-static lithium lead) and DLL (dual-cooled lithium lead) blanket concepts, for the Dual Functional Lithium Lead Test Blanket Module (DFLL-TBM) submitted to the ITER test blanket working group. The temperature and stress distributions are presented for the two blanket structures on the basis of the structural design, thermal-hydraulic design, and neutronics analysis. The mechanical performance of the high-temperature components of the blanket structure is also assessed according to the ITER Structural Design Criteria (ISDC). The rationality and feasibility of the two DFLL-TBM blanket structure designs have been analyzed on the basis of the above results, which also serve as the theoretical basis for further optimization analysis. (authors)

  3. Basic distribution free identification tests for small size samples of environmental data

    International Nuclear Information System (INIS)

    Federico, A.G.; Musmeci, F.

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples contain a small number of data points, and the assumption of normal distributions is often not realistic. On the other hand, the spread of today's powerful personal computers opens new opportunities based on massive use of CPU resources. The paper reviews the problem and introduces two feasible non-parametric approaches based on the intrinsic equiprobability properties of the data samples. The first is based on full re-sampling, while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given, based on the Chernobyl children contamination data.
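
A full re-sampling test of the kind described can be sketched as a two-sample permutation test; the difference-of-means statistic and the permutation count below are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def permutation_test(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference of means.

    Under H0 (same population) the group labels are exchangeable, so we
    re-assign them at random and count how often the permuted statistic
    is at least as extreme as the observed one. No normality assumption
    is needed, which suits small environmental samples.
    """
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    pooled = np.concatenate([x, y])
    observed = abs(x.mean() - y.mean())
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        count += abs(pooled[:len(x)].mean() - pooled[len(x):].mean()) >= observed
    return (count + 1) / (n_perm + 1)   # add-one smoothing
```

Two copies of the same sample give a p-value near 1, while a clearly shifted sample gives a small p-value.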

  4. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle of a product or system, from birth to death. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data are the various data that describe the reliability of a system or component during its operation; they may take the form of numbers, graphics, symbols, text, and curves. Quantitative reliability assessment is the task of reliability data analysis: it provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds along the various stages of the product life cycle and the associated reliability activities. Reliability data of Systems, Structures and Components (SSCs) in nuclear power plants are a key input to probabilistic safety assessment (PSA), reliability-centered maintenance, and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis, and industrial engineering, for example to represent manufacturing and delivery times. It is commonly used to model time to failure, time to repair, and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of SSCs in nuclear power plants, and an example is given to present the results of the new method. The Weibull distribution fits mechanical-equipment reliability data in nuclear power plants very well and is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions; through comparison and analysis, the three-parameter Weibull distribution fits the data better, reflects the reliability characteristics of the equipment, and is more realistic. (author)
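
A common way to fit the two-parameter Weibull distribution to failure-time data is median-rank regression, sketched below; this is a standard textbook technique, not the improved distribution introduced in the paper.

```python
import numpy as np

def weibull_fit(times):
    """Two-parameter Weibull fit via median-rank regression.

    Linearises F(t) = 1 - exp(-(t/eta)^beta) as
    ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta) and fits by least squares,
    using Bernard's approximation for the median ranks.
    """
    t = np.sort(np.asarray(times, float))
    n = len(t)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)
    return beta, eta   # shape, scale
```

Sampling from a known Weibull via the inverse CDF, t = eta*(-ln U)^(1/beta), and refitting recovers the parameters to within sampling error.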

  5. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the
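
The Box-Cox step referred to above can be sketched with a simple grid search over the transformation parameter, choosing lambda by the profile log-likelihood; the grid and the sample-skewness check below are illustrative simplifications of the paper's procedure.

```python
import numpy as np

def boxcox_loglik(x, lam):
    """Profile log-likelihood of the Box-Cox parameter lam (requires x > 0)."""
    z = np.log(x) if abs(lam) < 1e-12 else (x ** lam - 1.0) / lam
    n = len(x)
    return -0.5 * n * np.log(z.var()) + (lam - 1.0) * np.log(x).sum()

def boxcox_transform(x, lams=np.linspace(-2.0, 2.0, 81)):
    """Grid-search Box-Cox: return the transformed data and the chosen lam."""
    best = max(lams, key=lambda lam: boxcox_loglik(x, lam))
    z = np.log(x) if abs(best) < 1e-12 else (x ** best - 1.0) / best
    return z, best

def skewness(x):
    """Sample skewness (biased moment estimator)."""
    d = x - x.mean()
    return (d ** 3).mean() / d.std() ** 3
```

On right-skewed (e.g. lognormal) data the search picks a lambda near 0, i.e. close to a log transform, and the transformed sample is much less skewed.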

  6. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.
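    The paper derives exact bounds; as a rough illustration of what a confidence bound on a ratio of two coefficients of variation looks like, here is a percentile-bootstrap sketch on synthetic normal samples (not the authors' method, and all sample parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def cv(x):
    """Sample coefficient of variation: std / mean."""
    return x.std(ddof=1) / x.mean()

# Two hypothetical normally distributed samples.
a = rng.normal(loc=50.0, scale=5.0, size=80)   # true CV = 0.10
b = rng.normal(loc=40.0, scale=6.0, size=80)   # true CV = 0.15

# Percentile-bootstrap confidence bounds on the ratio cv(a)/cv(b),
# a simple stand-in for the exact bounds derived in the paper.
ratios = np.array([
    cv(rng.choice(a, a.size, replace=True)) / cv(rng.choice(b, b.size, replace=True))
    for _ in range(5000)
])
lo, hi = np.percentile(ratios, [2.5, 97.5])
print(f"cv ratio ~ {cv(a) / cv(b):.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```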

  7. Stress analysis of shear/compression test

    International Nuclear Information System (INIS)

    Nishijima, S.; Okada, T.; Ueno, S.

    1997-01-01

    A stress analysis has been made of glass fiber reinforced plastics (GFRP) subjected to combined shear and compression stresses by means of the finite element method. Two types of experimental set-up were analyzed, the parallel and series methods, in which the specimen is compressed by tilted jigs that apply the combined stresses to the specimen. A modified Tsai-Hill criterion was employed to judge failure under the combined stresses, that is, the shear strength under compressive stress. Different failure envelopes were obtained for the two set-ups: in the parallel system the shear strength first increased with compressive stress and then decreased, whereas in the series system the shear strength decreased monotonically with compressive stress. The difference is caused by the different stress distributions arising from the different constraint conditions. The basic parameters that control failure under combined stresses are discussed
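    The abstract does not give the form of the modified criterion, so the sketch below uses the classical Tsai-Hill failure index for an orthotropic lamina under combined in-plane stresses; the GFRP strength values and the stress state are made up for illustration:

```python
def tsai_hill_index(sigma1, sigma2, tau12, X, Y, S):
    """Classical Tsai-Hill failure index for an orthotropic lamina.

    Failure is predicted when the index reaches 1.0.
    sigma1, sigma2 : in-plane normal stresses (MPa)
    tau12          : in-plane shear stress (MPa)
    X, Y, S        : longitudinal, transverse and shear strengths (MPa)
    """
    return (sigma1 / X) ** 2 - (sigma1 * sigma2) / X ** 2 \
        + (sigma2 / Y) ** 2 + (tau12 / S) ** 2

# Illustrative (invented) GFRP strengths and a combined shear/compression state.
X, Y, S = 1000.0, 40.0, 60.0
idx = tsai_hill_index(sigma1=-200.0, sigma2=-10.0, tau12=45.0, X=X, Y=Y, S=S)
print(idx)  # index below 1.0: no failure predicted for this state
```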

  8. Impact of peak electricity demand in distribution grids: a stress test

    NARCIS (Netherlands)

    Hoogsteen, Gerwin; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria; Schuring, Friso; Kootstra, Ben

    2015-01-01

    The number of (hybrid) electric vehicles is growing, leading to a higher demand for electricity in distribution grids. To investigate the effects of the expected peak demand on distribution grids, a stress test with 15 electric vehicles in a single street is conducted and described in this paper.

  9. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE) which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of an exponential kernel has enabled the transformation of the non-Markovian model to a Markovian model in an extended state space. For the study of First Passage Time (FPT) with the exponential delay kernel, the model has been transformed into a system of coupled Stochastic Differential Equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on the Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure, with or without damped oscillatory behavior depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay, with a memory kernel having the form of the Gamma distribution. In contrast to the fast decay of the damped oscillations of the ISI distribution for the model with the weak delay kernel, the decay of the damped oscillations is found to be slower for the model with the strong delay kernel.
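    A Jensen-Shannon-based model comparison of this kind can be illustrated on histogrammed ISI samples. The two samples below are synthetic stand-ins for the competing models, and the kernel parameters are invented:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(1)

# Hypothetical ISI samples (seconds) from two competing models.
isi_model_a = rng.gamma(shape=2.0, scale=0.05, size=2000)   # distributed-delay-like
isi_model_b = rng.exponential(scale=0.10, size=2000)        # LIF-like

# Common histogram support, normalised to probability vectors.
edges = np.linspace(0.0, 0.6, 40)
p, _ = np.histogram(isi_model_a, bins=edges)
q, _ = np.histogram(isi_model_b, bins=edges)
p = p / p.sum()
q = q / q.sum()

# jensenshannon returns the JS *distance* (square root of the divergence).
js_div = jensenshannon(p, q) ** 2
print(f"Jensen-Shannon divergence ~ {js_div:.4f}")
```

    The divergence is bounded by ln 2 (in nats), so it gives a normalised score for choosing between the two candidate ISI distributions.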

  10. Analysis of Static Load Test of a Masonry Arch Bridge

    Science.gov (United States)

    Shi, Jing-xian; Fang, Tian-tian; Luo, Sheng

    2018-03-01

    In order to determine whether the carrying capacity of a masonry arch bridge, built in the 1980s over the shipping channel entering and leaving the factory of a cement company, can meet the current requirements of highway Level II loading, this paper conducted a load test with the test vehicles positioned to produce an equivalent load distribution according to current design specifications, evaluated the bearing capacity of the in-service stone arch bridge, and carried out a theoretical analysis with Midas Civil. The results showed that under the most unfavorable load conditions the measured strains and deflections of the test sections were less than the calculated values and the bridge remained in the elastic stage under the design load; the structural strength and stiffness of the bridge retained a certain reserve, and under the current conditions of highway Level II loading the bridge structure was in a safe state.

  11. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium and low tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which restricted the study to a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered, however, seems quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint

  12. Preliminary investigation on determination of radionuclide distribution in field tracing test site

    International Nuclear Information System (INIS)

    Tanaka, Tadao; Mukai, Masayuki; Takebe, Shinichi; Guo Zede; Li Shushen; Kamiyama, Hideo.

    1993-12-01

    Field tracer tests of radionuclide migration have been conducted using 3H, 60Co, 85Sr and 134Cs in the natural unsaturated loess zone at the field test site of the China Institute for Radiation Protection. Reliable distribution data for the radionuclides in the test site are needed in order to evaluate exactly the migration behavior of the radionuclides in situ. A suitable method to determine the distribution is proposed on the basis of a preliminary discussion of the method for sampling soils from the test site and of the analytical method for radioactivity in the soils. (author)

  13. RELAP5 analysis of PACTEL injection tests

    International Nuclear Information System (INIS)

    Kimber, G.R.; Lillington, J.N.

    2000-01-01

    A characteristic feature of advanced reactor designs is their reliance on passive safety systems. It is important to assess both the operation of such systems and the ability of systems codes, such as RELAP5, to model them. In Finland, VTT Energy, together with Lappeenranta University of Technology, is using the PACTEL facility for the investigation of passive core cooling systems. In particular, a core make-up tank (CMT) has been installed in the rig to operate in a similar manner to those in many advanced PWR designs. Three small break tests in the PACTEL facility, GDE-24, GDE-34 and GDE-43, were chosen for modelling with RELAP5. The objective of GDE-24 was to investigate CMT behaviour, in particular the effects of condensation in the CMT. The second test, GDE-34, was similar except that it had a smaller CMT and, at the start of the test, the water in the CMT and connecting pipework was at an elevated temperature. Test GDE-43 focused on conditions in which the driving force for flow through the passive safety injection system (PSIS) slowly disappears. Analysis of all tests reported here was carried out with RELAP5/MOD 3.2.1.2. The paper summarises the conclusions of all the tests. A critical part of the study revolved around modelling of the CMT; a model was developed to allow its detailed behaviour to be investigated more easily. This enabled recommendations for improving the condensation modelling in RELAP5 to be made. Apart from the wall condensation modelling issue, the implication of the work is that RELAP5/MOD 3.2.1.2 (a comparatively recent version of the code) is broadly adequate for these applications. (author)

  14. Dynamics of railway bridges, analysis and verification by field tests

    Directory of Open Access Journals (Sweden)

    Andersson Andreas

    2015-01-01

    Full Text Available The following paper discusses different aspects of railway bridge dynamics, comprising analysis, modelling procedures and experimental testing. The importance of realistic models is discussed, especially regarding boundary conditions, load distribution and soil-structure interaction. Two theoretical case studies are presented, involving both deterministic and probabilistic assessment of a large number of railway bridges using simplified and computationally efficient models. A total of four experimental case studies are also introduced, illustrating different aspects and phenomena in bridge dynamics. The excitation consists of ambient vibrations, train-induced vibrations, free vibrations after train passages and controlled forced excitation.

  15. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  16. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane Π to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in Π according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R² into homogeneous components. The Poincaré summation process, which consists in building au

  17. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: Describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools and the Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Two combinations with deviation values greater than 1 mm were identified. In addition, when the method developed was compared with the one previously studied, the data obtained were very close, with a maximum percentage deviation of 32.5 %, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)

  18. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès

    2003-06-01

    We analyze the volume distribution of natural rockfalls on different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10²-10¹⁰ m³), regardless of the geological settings and of the preexisting geometry of fracture patterns that are drastically different on the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes to possibly control the rockfall volumes. This way neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent values of rockfall volume on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types.
This change of exponents can be driven by the material strength, which
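    A power-law exponent for volume data of this kind is commonly estimated by maximum likelihood; the sketch below applies the standard Pareto-type (Aki/Clauset-style) estimator to synthetic volumes, not to the authors' catalogs:

```python
import numpy as np

rng = np.random.default_rng(7)

def powerlaw_mle(volumes, v_min):
    """MLE of the exponent b in N(>=V) ~ V^-b for volumes above a
    completeness threshold v_min, with its standard error b/sqrt(n)."""
    v = np.asarray(volumes)
    v = v[v >= v_min]
    b = len(v) / np.sum(np.log(v / v_min))
    return b, b / np.sqrt(len(v))

# Synthetic rockfall volumes drawn from a power law with true exponent 0.5,
# generated by inverse-CDF sampling over many decades of volume.
b_true, v_min = 0.5, 1.0
volumes = v_min * (1.0 - rng.uniform(size=5000)) ** (-1.0 / b_true)

b_hat, se = powerlaw_mle(volumes, v_min)
print(f"b ~ {b_hat:.3f} +/- {se:.3f}")
```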

  19. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlations between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with street centrality, and that the correlations are higher for private and small hospitals than for public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  20. Testing and Analysis of Sensor Ports

    Science.gov (United States)

    Zhang, M.; Frendi, A.; Thompson, W.; Casiano, M. J.

    2016-01-01

    This Technical Publication summarizes the work focused on the testing and analysis of sensor ports. The tasks under this contract were divided into three areas: (1) Development of an Analytical Model, (2) Conducting a Set of Experiments, and (3) Obtaining Computational Solutions. Results from the experiment using both short and long sensor ports were obtained using harmonic, random, and frequency sweep plane acoustic waves. An amplification factor of the pressure signal between the port inlet and the back of the port is obtained and compared to models. Comparisons of model and experimental results showed very good agreement.

  1. WES: A well test analysis expert system

    International Nuclear Information System (INIS)

    Mensch, A.

    1988-06-01

    This report describes part of the development of an expert system in the domain of well-test analysis. This work was done during my final internship, completed at the Lawrence Berkeley Laboratory. The report is divided into three parts: the first gives a description of the state of the project at the time I began to work on it and raises some problems that had to be solved; the second presents the results that have been reached; and the last draws conclusions from these results and proposes extensions that would be useful in the future

  2. Test and Analysis of Metallurgical Converter Equipment

    Directory of Open Access Journals (Sweden)

    Shan Pang

    2013-05-01

    Full Text Available The oxygen top-blown converter is the main piece of equipment in steelmaking, and its reliability determines the security and economy of steel production. How to design and test a converter has therefore been an important subject of industrial research. Geometric modelling and structural analysis of the converter tilting device were carried out with the Pro/E program, and the design principle and basic structure were analyzed in detail. Computer simulation software for metallurgical converter equipment, developed with VC++, is introduced along with its use. The barycentre positions and moment curves of converters No. 3 and No. 4 are calculated, and the converter's accelerated down-dip can be resolved by comparing the moment curve with the barycentre curve.

  3. Renewable Distributed Generation Models in Three-Phase Load Flow Analysis for Smart Grid

    Directory of Open Access Journals (Sweden)

    K. M. Nor

    2013-11-01

    Full Text Available The paper presents renewable distributed generation (RDG) models as three-phase resources in load flow computation and analyzes their effect when they are connected in composite networks. The RDG models considered comprise photovoltaic (PV) and wind turbine generation (WTG). The voltage-controlled node and the complex power injection node are used in the models. These improved models are suitable for smart grid power system analysis. Combined IEEE transmission and distribution data were used to test and analyze the algorithm in solving balanced/unbalanced active systems, and the combination of IEEE transmission data and IEEE test feeders was used to test the algorithm on balanced and unbalanced multi-phase distribution system problems. The simulation results show that increasing the number and size of RDG units improves the voltage profile and reduces system losses.

  4. In-core flow rate distribution measurement test of the JOYO irradiation core

    International Nuclear Information System (INIS)

    Suzuki, Toshihiro; Isozaki, Kazunori; Suzuki, Soju

    1996-01-01

    A flow rate distribution measurement test was carried out on the JOYO irradiation core (the MK-II core) after the 29th duty cycle operation. The main objective of the test was to confirm a proper flow rate distribution in the final phase of the MK-II core. The flow rate at the outlet of each subassembly was measured with a permanent-magnet flowmeter inserted through the fuel exchange hole in the rotating plug. This is the third such test in the MK-II core, conducted ten years after the previous test (1985). A total of 550 subassemblies had been exchanged and the accumulated reactor operation time had reached 38,000 hours since the previous test. In conclusion, the test confirmed that the flow rate distribution has remained suitable in the final phase of the MK-II core. (author)

  5. The Application of Hardware in the Loop Testing for Distributed Engine Control

    Science.gov (United States)

    Thomas, George L.; Culley, Dennis E.; Brand, Alex

    2016-01-01

    The essence of a distributed control system is the modular partitioning of control function across a hardware implementation. This type of control architecture requires embedding electronics in a multitude of control element nodes for the execution of those functions, and their integration as a unified system. As the field of distributed aeropropulsion control moves toward reality, questions about building and validating these systems remain. This paper focuses on the development of hardware-in-the-loop (HIL) test techniques for distributed aero engine control, and the application of HIL testing as it pertains to potential advanced engine control applications that may now be possible due to the intelligent capability embedded in the nodes.

  6. Wind tunnel test IA300 analysis and results, volume 1

    Science.gov (United States)

    Kelley, P. B.; Beaufait, W. B.; Kitchens, L. L.; Pace, J. P.

    1987-01-01

    The analysis and interpretation of wind tunnel pressure data from the Space Shuttle wind tunnel test IA300 are presented. The primary objective of the test was to determine the effects of the Space Shuttle Main Engine (SSME) and Solid Rocket Booster (SRB) plumes on the integrated vehicle forebody pressure distributions, the elevon hinge moments, and wing loads. The results of this test will be combined with flight test results to form a new data base to be employed in the IVBC-3 airloads analysis. A secondary objective was to obtain solid plume data for correlation with the results of gaseous plume tests. Data from the power level portion were used in conjunction with flight base pressures to evaluate nominal power levels to be used during the investigation of changes in model attitude, elevon deflection, and nozzle gimbal angle. The plume-induced aerodynamic loads were developed for the Space Shuttle bases and forebody areas. A computer code was developed to integrate the pressure data. Using simplified geometrical models of the Space Shuttle elements and components, the pressure data were integrated to develop plume-induced force and moment coefficients that can be combined with a power-off data base to develop a power-on data base.

  7. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. It is efficient because it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. Its computational efficiency is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be used to generate system inequalities, which is useful in reliability estimation of capacitated networks
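    The idea of enumerating minimal cutsets can be sketched by brute force on a toy network; this is not the authors' efficient algorithm, only a reference implementation that is workable for small graphs:

```python
from itertools import combinations

def minimal_cutsets(edges, source, sink):
    """Brute-force enumeration of minimal edge cutsets that disconnect
    `sink` from `source` in an undirected network (small networks only)."""

    def connected(kept):
        # Depth-first search over the surviving edges.
        adj = {}
        for u, v in kept:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        seen, frontier = {source}, [source]
        while frontier:
            node = frontier.pop()
            for nxt in adj.get(node, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return sink in seen

    cutsets = []
    for k in range(1, len(edges) + 1):
        for cut in combinations(edges, k):
            kept = [e for e in edges if e not in cut]
            if not connected(kept):
                cut_set = set(cut)
                # Keep only minimal cutsets: no proper subset is itself a cut.
                if not any(c < cut_set for c in cutsets):
                    cutsets.append(cut_set)
    return cutsets

# A small looped distribution network: source S feeds T over two paths.
edges = [("S", "A"), ("A", "T"), ("S", "B"), ("B", "T")]
cuts = minimal_cutsets(edges, "S", "T")
print(cuts)  # four minimal cutsets, one edge from each of the two paths
```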

  8. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities make space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  9. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    Science.gov (United States)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
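    One way to see the advantage of the distributed approach is that spreading points over many conditions supports richer models than repeating a few conditions. A minimal sketch (not the paper's simulations; the plans and model degree are invented for illustration):

```python
import numpy as np

def design_rank(x, degree=3):
    """Rank of the polynomial model matrix a test plan can support."""
    X = np.vander(x, degree + 1)     # columns x^3, x^2, x, 1
    return np.linalg.matrix_rank(X)

# Clustered plan: many repeated points at only three test conditions.
clustered = np.repeat([0.0, 0.5, 1.0], 8)

# Distributed plan: one point at each of 24 different conditions.
distributed = np.linspace(0.0, 1.0, 24)

# A cubic response model needs four independent conditions; the clustered
# plan cannot estimate it (rank 3), while the distributed plan can (rank 4).
print(design_rank(clustered), design_rank(distributed))
```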

  10. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    Science.gov (United States)

    Jamróz, Michał H.

    2013-10-01

    The principles of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, are described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N−6 linearly independent local mode coordinates; already for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from Gaussian program output files. VEDA then automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements in each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent from any other program performing PED analysis.

  11. Analysis and Testing of Mobile Wireless Networks

    Science.gov (United States)

    Alena, Richard; Evenson, Darin; Rundquist, Victor; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Wireless networks are being used to connect mobile computing elements in more applications as the technology matures. There are now many products (such as 802.11 and 802.11b) which run in the ISM frequency band and comply with wireless network standards. They are being used increasingly to link mobile intranets into wired networks. Standard methods of analyzing and testing their performance and compatibility are needed to determine the limits of the technology. This paper presents analytical and experimental methods of determining network throughput, range and coverage, and interference sources. Both radio frequency (RF) domain and network domain analysis have been applied to determine wireless network throughput and range in the outdoor environment. Comparison of field test data taken under optimal conditions with performance predicted from RF analysis yielded quantitative results applicable to future designs. Layering multiple wireless networks can increase performance: wireless network components can be set to different radio frequency-hopping sequences or spreading functions, allowing more than one network to coexist. Therefore, we ran multiple 802.11-compliant systems concurrently in the same geographical area to determine interference effects and scalability. The results can be used to design more robust networks which have multiple layers of wireless data communication paths and provide increased throughput overall.

  12. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
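    The core step, finding the value at a specified average exceedance probability under normal and log-normal fits, can be sketched with SciPy. The rainfall record below is synthetic and its parameters are invented; this is not the paper's data or its full procedure (the χ²- and ζ-tests are omitted):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical annual rainfall record (mm), roughly log-normal.
rain = rng.lognormal(mean=7.0, sigma=0.3, size=60)

# Fit both candidate distributions by maximum likelihood.
mu, sd = stats.norm.fit(rain)
shape, loc, scale = stats.lognorm.fit(rain, floc=0.0)

# Value with a 1-in-100 average exceedance probability under each fit.
p_exceed = 0.01
v_norm = stats.norm.ppf(1.0 - p_exceed, mu, sd)
v_logn = stats.lognorm.ppf(1.0 - p_exceed, shape, loc, scale)
print(f"normal: {v_norm:.0f} mm, log-normal: {v_logn:.0f} mm")
```

    The heavier log-normal tail typically yields the larger design value, which is why the choice between the two fits matters for exceedance estimates.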

  13. Nanoscale Test Strips for Multiplexed Blood Analysis

    Science.gov (United States)

    Chan, Eugene

    2015-01-01

    A critical component of the DNA Medicine Institute's Reusable Handheld Electrolyte and Lab Technology for Humans (rHEALTH) sensor are nanoscale test strips, or nanostrips, that enable multiplexed blood analysis. Nanostrips are conceptually similar to the standard urinalysis test strip, but the strips are shrunk down a billionfold to the microscale. Each nanostrip can have several sensor pads that fluoresce in response to different targets in a sample. The strips carry identification tags that permit differentiation of a specific panel from hundreds of other nanostrip panels during a single measurement session. In Phase I of the project, the company fabricated, tested, and demonstrated functional parathyroid hormone and vitamin D nanostrips for bone metabolism, and thrombin aptamer and immunoglobulin G antibody nanostrips. In Phase II, numerous nanostrips were developed to address key space flight-based medical needs: assessment of bone metabolism, immune response, cardiac status, liver metabolism, and lipid profiles. This unique approach holds genuine promise for space-based portable biodiagnostics and for point-of-care (POC) health monitoring and diagnostics here on Earth.

  14. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is under question. Their purpose is to maintain supply to an area in the distribution network in the event of a failure somewhere else. Two phrases, "potential customer outages" and "in the event of failure", identify uncertainty

  15. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rainfall gauges on their own as input carries great uncertainty in runoff estimation, especially when the area is large and rainfall is measured and recorded at irregularly spaced gauging stations. Hence, spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at unknown points as input to the rainfall-runoff processes for distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected areas along the Kelantan river. Thus, good knowledge of rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using inverse-distance weighting (IDW), inverse-distance and elevation weighting (IDEW) methods and average rainfall distribution. Sensitivity analyses for the distance and elevation parameters were conducted to see the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.
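The IDW step described above can be sketched in a few lines. This is a generic textbook implementation with illustrative coordinates, not the authors' code; IDEW would additionally weight gauges by elevation similarity:

```python
import numpy as np

def idw_interpolate(xy_gauges, values, xy_targets, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of gauge rainfall onto
    target points. A gauge closer than `eps` to a target returns the
    gauge value exactly (avoids division by zero)."""
    xy_gauges = np.asarray(xy_gauges, float)
    values = np.asarray(values, float)
    out = np.empty(len(xy_targets))
    for k, pt in enumerate(np.asarray(xy_targets, float)):
        d = np.hypot(*(xy_gauges - pt).T)   # distances gauge -> target
        if d.min() < eps:                   # target coincides with a gauge
            out[k] = values[d.argmin()]
            continue
        w = 1.0 / d ** power                # weights fall off with distance
        out[k] = (w * values).sum() / w.sum()
    return out

# Three hypothetical gauges (unit coordinates) and their rainfall totals.
gauges = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
rain = [10.0, 20.0, 30.0]
grid = idw_interpolate(gauges, rain, [(0.0, 0.0), (0.5, 0.5)])
```

The cross-validation assessment mentioned in the abstract amounts to leaving each gauge out in turn, interpolating at its location from the remaining gauges, and comparing with the withheld value.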

  16. Size distribution measurements and chemical analysis of aerosol components

    Energy Technology Data Exchange (ETDEWEB)

    Pakkanen, T.A.

    1995-12-31

    The principal aims of this work were to improve the existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing the measured size distributions of various aerosol components. A sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis was found to result in low blank values and good recoveries for several elements in atmospheric fine-particle size fractions below 2 μm of equivalent aerodynamic particle diameter (EAD). Furthermore, it turned out that a substantial amount of analytes associated with insoluble material could be recovered, since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated bimodal size distributions for most components measured. The existence of the fine-particle mode suggests that a substantial fraction of such elements with bimodal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine-particle modes and one or two coarse-particle modes. Atmospheric relative humidity values higher than 80% resulted in a significant increase of the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions determined, and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse-particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse-particle sea salt, but reactions and/or adsorption of nitric acid with soil-derived particles also occurred. Chloride was depleted when acidic species reacted

  17. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Economists have long been interested in measuring the distributional impacts of policy interventions. Since environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.
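One standard way of quantifying distributional equity of this kind is an inequality index over the outcome distribution. The Gini coefficient below is offered purely as a familiar illustration of such a summary measure, not as a method endorsed by the paper:

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative outcome distribution
    (0 = perfect equality, -> 1 = maximal concentration)."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    # Sorted-rank identity for the mean-absolute-difference definition.
    return (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

# Hypothetical exposure burdens across 100 communities.
equal = gini(np.full(100, 5.0))                    # identical burdens
skewed = gini(np.r_[np.full(99, 1.0), 100.0])      # one community bears most
```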

  18. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is the simplicity of installation, configuring and operation of various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  19. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies indeed depend...... on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC...... technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  20. On the asymptotic distribution of a unit root test against ESTAR alternatives

    NARCIS (Netherlands)

    Hanck, Christoph

    We derive the null distribution of the nonlinear unit root test proposed in Kapetanios et al. [Kapetanios, G., Shin, Y., Snell, A., 2003. Testing for a unit root in the nonlinear STAR framework. Journal of Econometrics 112, 359-379] when nonzero means or both means and deterministic trends are

  1. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve P. Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...
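A common asymptotic test of the equality of k coefficients of variation is the Feltz-Miller χ² statistic, sketched below for illustration; it is not the Newton-step-based procedure this abstract develops, and the data are synthetic:

```python
import numpy as np
from scipy import stats

def feltz_miller_test(samples):
    """Asymptotic chi-squared test of H0: all k normal populations
    share a common coefficient of variation (Feltz & Miller, 1996)."""
    m = np.array([len(s) - 1 for s in samples], float)        # df per sample
    c = np.array([np.std(s, ddof=1) / np.mean(s) for s in samples])
    c_pool = (m * c).sum() / m.sum()                          # pooled CV
    d_ad = (m * (c - c_pool) ** 2).sum() / (c_pool ** 2 * (0.5 + c_pool ** 2))
    return d_ad, stats.chi2.sf(d_ad, df=len(samples) - 1)     # ~ chi2_{k-1}

rng = np.random.default_rng(1)
groups = [rng.normal(50.0, 5.0, size=40) for _ in range(3)]   # equal true CVs
stat, p = feltz_miller_test(groups)
```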

  2. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminth fauna is provided. As a result of studies on 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  3. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  4. Test of methods for retrospective activity size distribution determination from filter samples

    International Nuclear Information System (INIS)

    Meisenberg, Oliver; Tschiersch, Jochen

    2015-01-01

    Determining the activity size distribution of radioactive aerosol particles requires sophisticated and heavy equipment, which makes measurements at a large number of sites difficult and expensive. Therefore, three methods for a retrospective determination of size distributions from aerosol filter samples in the laboratory were tested for their applicability. Extraction into a carrier liquid with subsequent nebulisation showed size distributions with a slight but correctable bias towards larger diameters compared with the original size distribution. Yields on the order of magnitude of 1% could be achieved. Sonication-assisted extraction into a carrier liquid caused a coagulation mode to appear in the size distribution. Sonication-assisted extraction into the air did not show acceptable results due to small yields. The method of extraction into a carrier liquid without sonication was applied to aerosol samples from Chernobyl in order to calculate inhalation dose coefficients for ¹³⁷Cs based on the individual size distribution. The effective dose coefficient is about half of that calculated with a default reference size distribution. - Highlights: • Activity size distributions can be recovered after aerosol sampling on filters. • Extraction into a carrier liquid and subsequent nebulisation is appropriate. • This facilitates the determination of activity size distributions for individuals. • Size distributions from this method can be used for individual dose coefficients. • Dose coefficients were calculated for the workers at the new Chernobyl shelter

  5. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  6. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2012-09-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  7. A New Wind Turbine Generating System Model for Balanced and Unbalanced Distribution Systems Load Flow Analysis

    Directory of Open Access Journals (Sweden)

    Ahmet Koksoy

    2018-03-01

    Wind turbine generating systems (WTGSs), which are conventionally connected to high-voltage transmission networks, have frequently been employed as distributed generation units in today’s distribution networks. In practice, the distribution networks always have unbalanced bus voltages and line currents due to the uneven distribution of single- or double-phase loads over the three phases, the asymmetry of the lines, etc. Accordingly, in this study, for the load flow analysis of distribution networks, a Conventional Fixed-speed Induction Generator (CFIG) based WTGS, one of the most widely used WTGS types, is modelled under unbalanced voltage conditions. The Developed model has active and reactive power expressions in terms of induction machine impedance parameters, terminal voltages and input power. The validity of the Developed model is confirmed with the experimental results obtained in a test system. The results of the slip calculation based phase-domain model (SCP model), which was previously proposed in the literature for CFIG based WTGSs under unbalanced voltages, are also given for comparison. Finally, the Developed model and the SCP model are implemented in the load flow analysis of the IEEE 34-bus test system with CFIG based WTGSs and unbalanced loads. Thus, it is clearly pointed out that the results of the load flow analysis implemented with both models are very close to each other, and the Developed model is computationally more efficient than the SCP model.

  8. Integration of the SSPM and STAGE with the MPACT Virtual Facility Distributed Test Bed.

    Energy Technology Data Exchange (ETDEWEB)

    Cipiti, Benjamin B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shoman, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Material Protection Accounting and Control Technologies (MPACT) program within DOE NE is working toward a 2020 milestone to demonstrate a Virtual Facility Distributed Test Bed. The goal of the Virtual Test Bed is to link all MPACT modeling tools, technology development, and experimental work to create a Safeguards and Security by Design capability for fuel cycle facilities. The Separation and Safeguards Performance Model (SSPM) forms the core safeguards analysis tool, and the Scenario Toolkit and Generation Environment (STAGE) code forms the core physical security tool. These models are used to design and analyze safeguards and security systems and generate performance metrics. Work over the past year has focused on how these models will integrate with the other capabilities in the MPACT program and specific model changes to enable more streamlined integration in the future. This report describes the model changes and plans for how the models will be used more collaboratively. The Virtual Facility is not designed to integrate all capabilities into one master code, but rather to maintain stand-alone capabilities that communicate results between codes more effectively.

  9. Different goodness of fit tests for Rayleigh distribution in ranked set sampling

    Directory of Open Access Journals (Sweden)

    Amer Al-Omari

    2016-03-01

    In this paper, different goodness of fit tests for the Rayleigh distribution are considered based on simple random sampling (SRS) and ranked set sampling (RSS) techniques. The performance of the suggested estimators is evaluated in terms of the power of the tests by using Monte Carlo simulation. It is found that the suggested RSS tests perform better than their counterparts in SRS.
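The Monte Carlo power-evaluation machinery can be illustrated for the SRS case with a loop around a Kolmogorov-Smirnov test of the Rayleigh null. The sample size, alternative distribution, and replication count below are arbitrary illustrative choices, and the RSS variants studied in the paper are not reproduced:

```python
import numpy as np
from scipy import stats

def ks_rayleigh_rejection_rate(sample_draw, n=30, n_rep=300, alpha=0.05, seed=7):
    """Monte Carlo rejection rate of a KS test of the Rayleigh(scale=1)
    null, for samples produced by `sample_draw(rng, n)`."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_rep):
        x = sample_draw(rng, n)
        # Compare the empirical CDF with the fully specified Rayleigh CDF.
        if stats.kstest(x, stats.rayleigh(scale=1.0).cdf).pvalue < alpha:
            rejections += 1
    return rejections / n_rep

# Size under the null: the data really are Rayleigh(1).
size = ks_rayleigh_rejection_rate(
    lambda rng, n: stats.rayleigh(scale=1.0).rvs(n, random_state=rng))
# Power against an exponential alternative.
power = ks_rayleigh_rejection_rate(lambda rng, n: rng.exponential(1.0, n))
```

Under the null the rejection rate should hover near the nominal 5% level, while under the alternative it gives the empirical power; the paper's comparison repeats this for each test statistic under both SRS and RSS.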

  10. Conventional fuel tank blunt impact tests : test and analysis results

    Science.gov (United States)

    2014-04-02

    The Federal Railroad Administration's Office of Research and Development is conducting research into fuel tank crashworthiness. A series of impact tests are planned to measure fuel tank deformation under two types of dynamic loading conditi...

  11. Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results

    Science.gov (United States)

    Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul

    1992-01-01

    The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.

  12. Spectrometric Analysis for Pulse Jet Mixer Testing

    International Nuclear Information System (INIS)

    ZEIGLER, KRISTINE

    2004-01-01

    The Analytical Development Section (ADS) was tasked with providing support for a Hanford River Protection Program-Waste Treatment Program (RPP-WTP) project test involving absorption analysis for non-Newtonian pulse jet mixer testing of small-scale (PJM) and prototype (CRV) tanks with sparging. Tanks filled with clay were mixed with various amounts of powdered dye as a tracer. The objective of the project was to determine the best mixing protocol (nozzle velocity, number of spargers used, total air flow, etc.) by determining the percent mixed volume through the use of an ultraviolet-visible (UV-Vis) spectrometer. The dye concentration within a sample could be correlated to the volume fraction mixed in the tank. Samples were received in vials; a series of dilutions was generated from the clay, allowed to equilibrate, then centrifuged and siphoned for the supernate liquid to be analyzed by absorption spectroscopy. Equilibration and thorough mixing of the samples were continuous issues, with dilution curves being difficult to obtain. Despite these technical issues, useful data were obtained for the evaluation of various mix conditions

  13. Distributed Analysis Experience using Ganga on an ATLAS Tier2 infrastructure

    International Nuclear Information System (INIS)

    Fassi, F.; Cabrera, S.; Vives, R.; Fernandez, A.; Gonzalez de la Hoz, S.; Sanchez, J.; March, L.; Salt, J.; Kaci, M.; Lamas, A.; Amoros, G.

    2007-01-01

    The ATLAS detector will explore the high-energy frontier of particle physics, collecting the proton-proton collisions delivered by the LHC (Large Hadron Collider). Starting in spring 2008, the LHC will produce more than 10 petabytes of data per year. The tiered hierarchy adopted for the computing model at the LHC is: Tier-0 (CERN), and Tier-1 and Tier-2 centres distributed around the world. The ATLAS Distributed Analysis (DA) system has the goal of enabling physicists to perform Grid-based analysis on distributed data using distributed computing resources. The IFIC Tier-2 facility is participating in several aspects of DA. In support of the ATLAS DA activities, a prototype is being tested, deployed and integrated. The analysis data-processing applications are based on the Athena framework. GANGA, developed by the LHCb and ATLAS experiments, allows simple switching between testing on a local batch system and large-scale processing on the Grid, hiding Grid complexities. GANGA provides physicists with an integrated environment for job preparation, bookkeeping and archiving, and job splitting and merging. The experience with the deployment, configuration and operation of the DA prototype will be presented. Experience gained using the DA system and GANGA in top physics analysis will be described. (Author)

  14. ON ESTIMATION AND HYPOTHESIS TESTING OF THE GRAIN SIZE DISTRIBUTION BY THE SALTYKOV METHOD

    Directory of Open Access Journals (Sweden)

    Yuri Gulbin

    2011-05-01

    The paper considers the problem of the validity of unfolding the grain size distribution with the back-substitution method. Due to the ill-conditioned nature of unfolding matrices, it is necessary to evaluate the accuracy and precision of parameter estimation and to verify the possibility of testing the expected grain size distribution on the basis of intersection size histogram data. In order to review these questions, computer modeling was used to compare size distributions obtained stereologically with those possessed by three-dimensional model aggregates of grains with a specified shape and random size. Results of the simulations are reported and ways of improving the conventional stereological techniques are suggested. It is shown that new improvements in the estimating and testing procedures enable grain size distributions to be unfolded more efficiently.
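The back-substitution (Saltykov) unfolding referred to above can be sketched for spherical grains as follows. The class width and the monodisperse check are illustrative choices, and the ill-conditioning the paper analyzes shows up here as the small diagonal coefficients of the triangular system:

```python
import numpy as np

def saltykov_unfold(n_a, delta):
    """Unfold a histogram of planar intersection diameters into a 3-D
    sphere-diameter distribution by Saltykov back-substitution.

    n_a[i]  : intersections per unit area with diameter in class
              (i*delta, (i+1)*delta]
    returns : n_v[j], spheres per unit volume with diameter in class
              (j*delta, (j+1)*delta]
    """
    n = len(n_a)
    d = delta * np.arange(n + 1)                  # class boundaries
    # p[i, j]: expected intersections per unit area in class i produced
    # by unit volume density of spheres with diameter d[j + 1].
    p = np.zeros((n, n))
    for j in range(n):
        D = d[j + 1]
        for i in range(j + 1):
            p[i, j] = np.sqrt(D**2 - d[i]**2) - np.sqrt(D**2 - d[i + 1]**2)
    # Back-substitute from the largest size class downwards.
    n_v = np.zeros(n)
    for j in range(n - 1, -1, -1):
        n_v[j] = (n_a[j] - p[j, j + 1:] @ n_v[j + 1:]) / p[j, j]
    return n_v

# Sanity check: a monodisperse aggregate of spheres of one diameter.
delta, n_cls, true_nv = 1.0, 5, 4.0
d = delta * np.arange(n_cls + 1)
D = d[-1]
n_a = np.array([true_nv * (np.sqrt(D**2 - d[i]**2) - np.sqrt(D**2 - d[i + 1]**2))
                for i in range(n_cls)])
recovered = saltykov_unfold(n_a, delta)
```

With exact (noise-free) inputs the mass is recovered entirely in the largest class; with real, noisy histogram counts the same triangular solve can produce negative class densities, which is precisely the instability the paper examines.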

  15. Distribution and histologic effects of intravenously administered amorphous nanosilica particles in the testes of mice

    International Nuclear Information System (INIS)

    Morishita, Yuki; Yoshioka, Yasuo; Satoh, Hiroyoshi; Nojiri, Nao; Nagano, Kazuya; Abe, Yasuhiro; Kamada, Haruhiko; Tsunoda, Shin-ichi; Nabeshi, Hiromi; Yoshikawa, Tomoaki; Tsutsumi, Yasuo

    2012-01-01

    Highlights: ► There is rising concern regarding the potential health risks of nanomaterials. ► Few studies have investigated the effect of nanomaterials on the reproductive system. ► Here, we evaluated the intra-testicular distribution of nanosilica particles. ► We showed that nanosilica particles can penetrate the blood-testis barrier. ► These data provide basic information on ways to create safer nanomaterials. -- Abstract: Amorphous nanosilica particles (nSP) are being utilized in an increasing number of applications such as medicine, cosmetics, and foods. The reduction of the particle size to the nanoscale not only provides benefits to diverse scientific fields but also poses potential risks. Several reports have described the in vivo and in vitro toxicity of nSP, but few studies have examined their effects on the male reproductive system. The aim of this study was to evaluate the testicular distribution and histologic effects of systemically administered nSP. Mice were injected intravenously with nSP with diameters of 70 nm (nSP70) or conventional microsilica particles with diameters of 300 nm (nSP300) on two consecutive days. The intratesticular distribution of these particles 24 h after the second injection was analyzed by transmission electron microscopy. nSP70 were detected within Sertoli cells and spermatocytes, including in the nuclei of spermatocytes. No nSP300 were observed in the testis. Next, mice were injected intravenously with 0.4 or 0.8 mg nSP70 every other day for a total of four administrations. Testes were harvested 48 h and 1 week after the last injection and stained with hematoxylin–eosin for histologic analysis. Histologic findings in the testes of nSP70-treated mice did not differ from those of control mice. Taken together, our results suggest that nSP70 can penetrate the blood-testis barrier and the nuclear membranes of spermatocytes without producing apparent testicular injury.

  16. Distribution and histologic effects of intravenously administered amorphous nanosilica particles in the testes of mice

    Energy Technology Data Exchange (ETDEWEB)

    Morishita, Yuki [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Yoshioka, Yasuo, E-mail: yasuo@phs.osaka-u.ac.jp [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Satoh, Hiroyoshi; Nojiri, Nao [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Nagano, Kazuya [Laboratory of Biopharmaceutical Research, National Institute of Biomedical Innovation, 7-6-8 Saitoasagi, Ibaraki, Osaka 567-0085 (Japan); Abe, Yasuhiro [Cancer Biology Research Center, Sanford Research/USD, 2301 E. 60th Street N, Sioux Falls, SD 57104 (United States); Kamada, Haruhiko; Tsunoda, Shin-ichi [Laboratory of Biopharmaceutical Research, National Institute of Biomedical Innovation, 7-6-8 Saitoasagi, Ibaraki, Osaka 567-0085 (Japan); The Center for Advanced Medical Engineering and Informatics, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Nabeshi, Hiromi [Division of Foods, National Institute of Health Sciences, 1-18-1, Kamiyoga, Setagaya-ku, Tokyo 158-8501 (Japan); Yoshikawa, Tomoaki [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Tsutsumi, Yasuo, E-mail: ytsutsumi@phs.osaka-u.ac.jp [Laboratory of Toxicology and Safety Science, Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Laboratory of Biopharmaceutical Research, National Institute of Biomedical Innovation, 7-6-8 Saitoasagi, Ibaraki, Osaka 567-0085 (Japan); The Center for Advanced Medical Engineering and Informatics, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2012-04-06

    Highlights: ► There is rising concern regarding the potential health risks of nanomaterials. ► Few studies have investigated the effect of nanomaterials on the reproductive system. ► Here, we evaluated the intra-testicular distribution of nanosilica particles. ► We showed that nanosilica particles can penetrate the blood-testis barrier. ► These data provide basic information on ways to create safer nanomaterials. -- Abstract: Amorphous nanosilica particles (nSP) are being utilized in an increasing number of applications such as medicine, cosmetics, and foods. The reduction of the particle size to the nanoscale not only provides benefits to diverse scientific fields but also poses potential risks. Several reports have described the in vivo and in vitro toxicity of nSP, but few studies have examined their effects on the male reproductive system. The aim of this study was to evaluate the testicular distribution and histologic effects of systemically administered nSP. Mice were injected intravenously with nSP with diameters of 70 nm (nSP70) or conventional microsilica particles with diameters of 300 nm (nSP300) on two consecutive days. The intratesticular distribution of these particles 24 h after the second injection was analyzed by transmission electron microscopy. nSP70 were detected within Sertoli cells and spermatocytes, including in the nuclei of spermatocytes. No nSP300 were observed in the testis. Next, mice were injected intravenously with 0.4 or 0.8 mg nSP70 every other day for a total of four administrations. Testes were harvested 48 h and 1 week after the last injection and stained with hematoxylin-eosin for histologic analysis. Histologic findings in the testes of nSP70-treated mice did not differ from those of control mice. Taken together, our results suggest that nSP70 can penetrate the blood-testis barrier and the

  17. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in the welded area, affected by geometrical inhomogeneity, irregular welded surface and weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement - the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  18. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular weld surface and weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement: the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, and can help to develop more accurate numerical tools for fatigue life prediction.

  19. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  20. Leakage localisation method in a water distribution system based on sensitivity matrix: methodology and real test

    OpenAIRE

    Pascual Pañach, Josep

    2010-01-01

    Leaks are present in all water distribution systems. In this paper a method for leakage detection and localisation is presented. It uses pressure measurements and simulation models. Leakage localisation methodology is based on pressure sensitivity matrix. Sensitivity is normalised and binarised using a common threshold for all nodes, so a signatures matrix is obtained. A pressure sensor optimal distribution methodology is developed too, but it is not used in the real test. To validate this...

  1. Statistical Analysis of Video Frame Size Distribution Originating from Scalable Video Codec (SVC)

    Directory of Open Access Journals (Sweden)

    Sima Ahmadpour

    2017-01-01

    Full Text Available Designing an effective and high-performance network requires an accurate characterization and modeling of network traffic. The modeling of video frame sizes is normally applied in simulation studies and mathematical analysis, and in generating streams for testing and compliance purposes. Moreover, video traffic is expected to be a major source of multimedia traffic in future heterogeneous networks. Therefore, the statistical distribution of video data can be used as an input for performance modeling of networks. This paper identifies the theoretical distribution that best matches the video traces in terms of their statistical properties, selecting the best-fitting distribution using both a graphical method and a hypothesis test. The data set used in this article consists of layered video traces generated with the Scalable Video Codec (SVC) video compression technique from three different movies.

  2. Westinghouse-GOTHIC distributed parameter modelling for HDR test E11.2

    International Nuclear Information System (INIS)

    Narula, J.S.; Woodcock, J.

    1994-01-01

    The Westinghouse-GOTHIC (WGOTHIC) code is a sophisticated mathematical computer code designed specifically for the thermal hydraulic analysis of nuclear power plant containment and auxiliary buildings. The code is capable of sophisticated flow analysis via the solution of mass, momentum, and energy conservation equations. Westinghouse has investigated the use of subdivided noding to model the flow patterns of hydrogen following its release into a containment atmosphere. For the investigation, several simple models were constructed to represent a scale similar to the German HDR containment. The calculational models were simplified to test the basic capability of the plume modeling methods to predict stratification while minimizing the number of parameters. A large empty volume was modeled, with the same volume and height as HDR. A scenario was selected that would be expected to stratify stably, and the effects of noding on the prediction of stratification were studied. A single-phase hot gas was injected into the volume at a height similar to that of HDR test E11.2, and no heat sinks were modeled. Helium was released into the calculational models, and the resulting flow patterns were judged relative to the expected results. For each model, only the number of subdivisions within the containment volume was varied. The investigation of noding schemes has provided evidence of the capability of subdivided (distributed parameter) noding. The results also showed that highly inaccurate flow patterns could be obtained by using an insufficient number of subdivided nodes. This presents a significant challenge to the containment analyst, who must weigh the benefits of increased noding against the penalties the noding may incur in computational efficiency. Clearly, however, an incorrect noding choice may yield erroneous results even if great care has been taken in accurately modeling all other characteristics of the containment. (author). 9 refs., 9 figs

  3. First experience and adaptation of existing tools to ATLAS distributed analysis

    International Nuclear Information System (INIS)

    De La Hoz, S.G.; Ruiz, L.M.; Liko, D.

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale in ATLAS. Up to 10000 jobs were processed on about 100 sites in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC file catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's distributed analysis efforts into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using several grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval. (orig.)

  4. Componential distribution analysis of food using near infrared ray image

    Science.gov (United States)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of food related to its "deliciousness" are usually evaluated by componential analysis, which determines the content and type of components in the food. However, componential analysis cannot resolve the spatial detail of a sample, and the measurement is time-consuming. We propose a method to measure the two-dimensional distribution of components in food using near-infrared (IR) images. The advantage of our method is its ability to visualize invisible components. Many components in food have characteristic absorption and reflection of light in the IR range. The component content is measured using the subtraction of images at two wavelengths of near-IR light. In this paper, we describe a method to measure food components using near-IR image processing, and we show an application that visualizes the saccharose distribution in a pumpkin.

  5. Sodium leakage and combustion tests. Measurement and distribution of droplet size using various spray nozzles

    International Nuclear Information System (INIS)

    Nagai, Keiichi; Hirabayashi, Masaru; Onojima, T.; Gunji, Minoru; Ara, Kuniaki; Oki, Yoshihisa

    1999-04-01

    In order to develop a numerical code simulating sodium fires initiated by the dispersion of droplets, measured data on droplet diameter and its distribution are needed. In the present experiment the distribution of droplet diameters was measured using water, oil and sodium. The tests elucidated the factors influencing droplet diameter. In addition, we sought to develop a similarity law between water and sodium. The droplet size distribution of sodium using the large diameter droplet (Elnozzle) was predicted. (J.P.N.)

  6. Impulse tests on distribution transformers protected by means of spark gaps

    Energy Technology Data Exchange (ETDEWEB)

    Pykaelae, M.L.; Palva, V. [Helsinki Univ. of Technology, Otaniemi (Finland). High Voltage Institute; Niskanen, K. [ABB Corporate Research, Vaasa (Finland)

    1997-12-31

    Distribution transformers in rural networks have to cope with transient overvoltages, even those caused by direct lightning strokes to the lines. In Finland, the 24 kV network conditions, such as wooden pole lines, high soil resistivity and an isolated-neutral network, lead to fast transient overvoltages. Impulse testing of pole-mounted distribution transformers (≤ 200 kVA) protected by means of spark gaps was studied. Different failure detection methods were used. The results can be used as background information for standardization work dealing with distribution transformers protected by means of spark gaps. (orig.) 9 refs.

  7. Impulse tests on distribution transformers protected by means of spark gaps

    Energy Technology Data Exchange (ETDEWEB)

    Pykaelae, M L; Palva, V [Helsinki Univ. of Technology, Otaniemi (Finland). High Voltage Institute; Niskanen, K [ABB Corporate Research, Vaasa (Finland)

    1998-12-31

    Distribution transformers in rural networks have to cope with transient overvoltages, even those caused by direct lightning strokes to the lines. In Finland, the 24 kV network conditions, such as wooden pole lines, high soil resistivity and an isolated-neutral network, lead to fast transient overvoltages. Impulse testing of pole-mounted distribution transformers (≤ 200 kVA) protected by means of spark gaps was studied. Different failure detection methods were used. The results can be used as background information for standardization work dealing with distribution transformers protected by means of spark gaps. (orig.) 9 refs.

  8. Development of Ada language control software for the NASA power management and distribution test bed

    Science.gov (United States)

    Wright, Ted; Mackin, Michael; Gantose, Dave

    1989-01-01

    The Ada language software developed to control the NASA Lewis Research Center's Power Management and Distribution testbed is described. The testbed is a reduced-scale prototype of the electric power system to be used on space station Freedom. It is designed to develop and test hardware and software for a 20-kHz power distribution system. The distributed, multiprocessor, testbed control system has an easy-to-use operator interface with an understandable English-text format. A simple interface for algorithm writers that uses the same commands as the operator interface is provided, encouraging interactive exploration of the system.

  9. Test and Evaluation Station (TESt) - A Control System for the ALICE-HMPID Liquid Distribution Prototype

    CERN Document Server

    Maatta, E; CERN. Geneva; Swoboda, Detlef; Lecoeur, G

    1999-01-01

    The sub-detectors and systems in the ALICE experiment [1] are of various types. However, during physics runs, all devices necessary for the operation of the detector must be accessible and controllable through a common computer interface. Throughout all other periods each sub-detector requires maintenance, upgrading or test operation. To this end, access independent of other sub-detectors must be guaranteed. These basic requirements impose a fair number of constraints on the architecture and components of the Detector Control System (DCS). The purpose of the TESt project was to construct a stand-alone unit for a specific sub-system of an ALICE detector in order to gain first experience with commercial products for detector control. Although the control system includes only a small number of devices and is designed for a particular application, it nevertheless covers all layers of a complete system and can be extended or used in different applications. The control system prototype has been...

  10. Basic distribution free identification tests for small size samples of environmental data

    Energy Technology Data Exchange (ETDEWEB)

    Federico, A.G.; Musmeci, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples contain a small number of data points, and often the assumption of normal distributions is not realistic. On the other hand, the diffusion of today's powerful personal computers opens new opportunities based on a massive use of CPU resources. The paper reviews the problem, introducing the feasibility of two non-parametric approaches based on intrinsic equiprobability properties of the data samples. The first is based on full re-sampling, while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given, based on the Chernobyl children contamination data.
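
The full re-sampling idea can be sketched with a stdlib-only two-sample permutation test; the data values, resample count and the mean-difference statistic below are illustrative assumptions, not the paper's program:

```python
import random
import statistics

def permutation_test(a, b, n_resamples=5000, seed=1):
    """Distribution-free two-sample test on the difference of means.

    Under H0 the pooled observations are exchangeable, so the observed
    mean difference is compared against its re-sampling distribution.
    Returns a two-sided p-value estimate.
    """
    rng = random.Random(seed)
    observed = statistics.fmean(a) - statistics.fmean(b)
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        diff = statistics.fmean(pooled[:len(a)]) - statistics.fmean(pooled[len(a):])
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_resamples

# Small environmental-style samples (values invented for illustration)
same = permutation_test([5.1, 4.8, 5.3, 5.0, 4.9], [5.2, 4.7, 5.1, 5.0])
shifted = permutation_test([5.1, 4.8, 5.3, 5.0, 4.9], [7.2, 6.9, 7.4, 7.1])
```

With samples this small the exact permutation distribution has only C(9,4) = 126 splits, so the achievable p-values are coarse, exactly the regime the abstract targets.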

  11. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.
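
For contrast with the paper's distribution-valued method, the baseline "representative-type" approach it improves on amounts to classical PCA run on the distribution centers alone; a minimal sketch with synthetic centers (all names and data are assumptions):

```python
import numpy as np

def pca(X, k=2):
    """Classical PCA via eigen-decomposition of the covariance matrix."""
    Xc = X - X.mean(axis=0)                 # center each variable
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigval)[::-1]        # sort components by variance
    return Xc @ eigvec[:, order[:k]], eigval[order]

rng = np.random.default_rng(42)
centers = rng.normal(size=(100, 4))         # stand-in for distribution centers
centers[:, 1] = 2.0 * centers[:, 0] + 0.1 * centers[:, 1]  # correlated pair
scores, variances = pca(centers, k=2)
```

The paper's contribution is precisely what this baseline discards: the within-observation variance of each normal-distribution-valued datum also enters the covariance structure.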

  12. Growing axons analysis by using Granulometric Size Distribution

    International Nuclear Information System (INIS)

    Gonzalez, Mariela A; Ballarin, Virginia L; Rapacioli, Melina; CelIn, A R; Sanchez, V; Flores, V

    2011-01-01

    Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analyses of growing neurites are usually difficult because their thinness and low contrast prevent clear observation of their shape, number, length and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a relevant parameter indicative of axonal growth spatial orientation, namely the angle of deviation of the growing direction. The developed algorithms automatically quantify this orientation, facilitating the analysis of these images, which is important given the large number of images that need to be processed in this type of study.
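
A granulometric size distribution can be sketched as the foreground area surviving binary openings of increasing size; the toy image, square structuring elements and function name below are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np
from scipy import ndimage

def granulometric_size_distribution(image, max_radius=5):
    """Foreground area surviving binary openings of increasing size.

    The drop between consecutive sizes (the pattern spectrum) tells how
    much of the image lives at each scale, which separates thin elongated
    structures from thick blobs.
    """
    areas = [int(image.sum())]
    for r in range(1, max_radius + 1):
        structure = np.ones((2 * r + 1, 2 * r + 1), dtype=bool)
        areas.append(int(ndimage.binary_opening(image, structure=structure).sum()))
    return areas

# Toy stand-in for a micrograph: a thin 3-pixel-wide bar (axon-like) next
# to an 11x11 blob (soma-like).
img = np.zeros((40, 40), dtype=bool)
img[5:8, 2:30] = True       # thin structure: removed once the opening exceeds its width
img[20:31, 20:31] = True    # thick blob: survives every opening up to 11x11
areas = granulometric_size_distribution(img)
```

The thin bar disappears at the 5x5 opening while the blob persists, so the area drop pinpoints the scale of the axon-like structure; oriented (line-shaped) structuring elements would recover the growth direction the abstract mentions.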

  13. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

    Full Text Available The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on Box's scheme, McCormack's method and the Diffusive Scheme. Each HM formulation is reviewed, together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters and wave speeds, so the Courant number differs in each pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). With respect to numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, which makes them recommendable for the analysis of water hammer in water distribution systems.
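
For orientation, the MOC baseline against which the hybrid methods are compared can be sketched for a single frictionless pipe with a reservoir upstream and an instantaneously closed valve downstream; every parameter value here is an illustrative assumption, not the paper's network:

```python
# Single frictionless pipe: reservoir (constant head H0) upstream,
# valve slammed shut downstream. All numbers are illustrative.
a, g = 1000.0, 9.81        # wave speed (m/s), gravity (m/s^2)
L, N = 1000.0, 10          # pipe length (m), number of reaches
H0, V0 = 50.0, 1.0         # initial head (m) and velocity (m/s)
B = a / g                  # characteristic impedance (head per unit velocity)

def moc_step(H, V):
    """One dt = dx/a update of the C+/C- characteristic equations."""
    n = len(H) - 1
    Hn, Vn = H[:], V[:]
    for i in range(1, n):
        cp = H[i - 1] + B * V[i - 1]   # C+ arriving from upstream
        cm = H[i + 1] - B * V[i + 1]   # C- arriving from downstream
        Hn[i] = 0.5 * (cp + cm)
        Vn[i] = (cp - cm) / (2.0 * B)
    # upstream reservoir: head fixed, velocity from the C- characteristic
    Hn[0], Vn[0] = H0, (H0 - (H[1] - B * V[1])) / B
    # downstream valve closed instantly: velocity zero, head from C+
    Hn[n], Vn[n] = H[n - 1] + B * V[n - 1], 0.0
    return Hn, Vn

H, V = [H0] * (N + 1), [V0] * (N + 1)
for _ in range(N):                     # march for L/a seconds at Courant = 1
    H, V = moc_step(H, V)
```

At the valve the head settles at the classical Joukowsky surge H0 + a·V0/g (about 152 m here); the paper's point is that when pipes force Courant numbers below 1, plain MOC needs interpolation, which is where the hybrid schemes come in.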

  14. Statistical analysis of the spatial distribution of galaxies and clusters

    International Nuclear Information System (INIS)

    Cappi, Alberto

    1993-01-01

    This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, aiming to describe the framework of the formation of structures, tracing the history of the Universe from the Planck time, t_P = 10^-43 s, at a temperature corresponding to 10^19 GeV, to the present epoch. The most common statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four delineates some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters, with different statistical tools, like correlations, percolation, the void probability function and counts in cells; the same scaling-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters too belong to the fundamental plane of elliptical galaxies, and gives a discussion of its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and I present some observational work on nearby and distant galaxy clusters. In particular, I show the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme "A galaxy redshift survey in the south galactic pole region", in which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr

  15. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability, in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the exponential distribution is found to be the best fit for modeling the failure rate.
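
The distribution-ranking step can be sketched by fitting Weibull and exponential models to a synthetic time-to-failure sample and comparing AIC; the data, the two-family shortlist and the AIC criterion are assumptions for illustration, not the Relex 2009 workflow:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Stand-in time-to-failure sample (hours): Weibull with shape 2, i.e. a
# rising failure rate; purely synthetic, not the circuit data set.
failures = rng.weibull(2.0, size=500) * 1000.0

fits = {}
for name, dist in {"weibull_min": stats.weibull_min, "expon": stats.expon}.items():
    params = dist.fit(failures, floc=0)              # anchor location at zero
    loglike = np.sum(dist.logpdf(failures, *params))
    n_free = len(params) - 1                         # loc was held fixed
    fits[name] = 2 * n_free - 2 * loglike            # AIC: lower is better

best = min(fits, key=fits.get)
```

Because the exponential is the shape-1 special case of the Weibull, a shape parameter far from 1 (here near 2) is exactly the situation where the two fits diverge, mirroring the paper's finding that different parameters can prefer different families.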

  16. A Generic Danish Distribution Grid Model for Smart Grid Technology Testing

    DEFF Research Database (Denmark)

    Cha, Seung-Tae; Wu, Qiuwei; Østergaard, Jacob

    2012-01-01

    This paper describes the development of a generic Danish distribution grid model for smart grid technology testing based on the Bornholm power system. The frequency dependent network equivalent (FDNE) method has been used in order to accurately preserve the desired properties and characteristics. The equivalent was validated by comparing the transient response of the original Bornholm power system model and the developed generic model under significant fault conditions. The results clearly show that the equivalent generic distribution grid model retains the dynamic characteristics of the original system, and can be used as a generic Smart Grid benchmark model for testing purposes.

  17. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  18. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
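
The TAG-mediated access pattern the paper compares against direct AOD/ESD access (cut on compact event-level metadata first, touch full event records only for the survivors) can be sketched in-memory; the tag fields, cut values and dict-based "event store" are invented for illustration:

```python
# Event-level metadata ("TAGs") for 100 hypothetical events; field names
# and cut values are invented, not ATLAS TAG schema.
tags = [{"event": i, "n_muons": i % 3, "met": 10.0 * i} for i in range(100)]

# Step 1: cheap selection on the compact TAG records only.
selected_ids = [t["event"] for t in tags if t["n_muons"] >= 2 and t["met"] > 100.0]

# Step 2: only the selected events' full records are read ("skimming");
# the event store is simulated here by a dict.
event_store = {i: {"event": i, "payload": "..."} for i in range(100)}
skim = [event_store[i] for i in selected_ids]
```

The workload-partitioning question the paper studies then becomes how to group `selected_ids` into jobs so each job reads contiguous, co-located event data rather than scattering I/O across files.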

  19. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    The huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R and D project (http://cern.ch/it-proj-diane), currently carried out by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows them to be easily replaced with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.

  20. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may be also used in other contexts such as detector simulation. The aim of DIANE R&D project (http://cern.ch/it-proj-diane) currently held by CERN IT/API group is to create a generic, component-based framework for distributed, parallel data processing in master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows to easily replace them with modul...

  1. Measurement of distribution coefficients using a radial injection dual-tracer test

    International Nuclear Information System (INIS)

    Pickens, J.F.; Jackson, R.E.; Inch, K.J.; Merritt, W.F.

    1981-01-01

    The dispersive and adsorptive properties of a sandy aquifer were evaluated by using a radial injection dual-tracer test with ¹³¹I as the nonreactive tracer and ⁸⁵Sr as the reactive tracer. The tracer migration was monitored by using multilevel point-sampling devices located at various radial distances and depths. Nonequilibrium physical and chemical adsorption effects for ⁸⁵Sr were treated as a spreading or dispersion mechanism in the breakthrough curve analysis. The resulting effective dispersivity values for ⁸⁵Sr were typically a factor of 2 to 5 larger than those obtained for ¹³¹I. The distribution coefficient (K_d^Sr) values obtained from analysis of the breakthrough curves at three depths and two radial distances ranged from 2.6 to 4.5 ml/g. These compare favorably with values obtained by separation of fluids from solids in sediment cores, by batch experiments on core sediments and by analysis of a 25-year-old radioactive waste plume in another part of the same aquifer. Correlations of adsorbed ⁸⁵Sr radioactivity with grain size fractions demonstrated preferential adsorption to the coarsest fraction and to the finest fraction. The relative amounts of electrostatically and specifically adsorbed ⁸⁵Sr on the aquifer sediments were determined with desorption experiments on core sediments using selective chemical extractants. The withdrawal phase breakthrough curves for the well, obtained immediately following the injection phase, showed essentially full tracer recoveries for both ¹³¹I and ⁸⁵Sr. Relatively slow desorption of ⁸⁵Sr provided further indication of the nonequilibrium nature of the adsorption-desorption phenomena.
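
In breakthrough-curve analyses of this kind, a distribution coefficient is commonly derived from the retardation of the reactive tracer relative to the nonreactive one via R = 1 + (rho_b / n) * Kd; the sketch below uses illustrative numbers, not the paper's field data:

```python
def kd_from_retardation(t_reactive, t_conservative, bulk_density, porosity):
    """Distribution coefficient Kd (ml/g) from breakthrough arrival times.

    R = t_reactive / t_conservative     (retardation factor)
    R = 1 + (rho_b / n) * Kd            (linear equilibrium sorption)
    """
    R = t_reactive / t_conservative
    return (R - 1.0) * porosity / bulk_density

# Illustrative numbers only: the reactive tracer arrives 20x later than the
# conservative one in a sand with bulk density 1.7 g/cm3 and porosity 0.35.
kd = kd_from_retardation(t_reactive=200.0, t_conservative=10.0,
                         bulk_density=1.7, porosity=0.35)
```

This linear-equilibrium relation is the idealization that the paper's nonequilibrium spreading effects perturb, which is why the authors also fold sorption kinetics into an effective dispersivity.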

  2. Simple Algorithms to Calculate Asymptotic Null Distributions of Robust Tests in Case-Control Genetic Association Studies in R

    Directory of Open Access Journals (Sweden)

    Wing Kam Fung

    2010-02-01

    Full Text Available The case-control study is an important design for testing association between genetic markers and a disease. The Cochran-Armitage trend test (CATT) is one of the most commonly used statistics for the analysis of case-control genetic association studies. The asymptotically optimal CATT can be used when the underlying genetic model (mode of inheritance) is known. However, for most complex diseases the underlying genetic models are unknown. Thus, tests robust to genetic model misspecification are preferable to the model-dependent CATT. Two robust tests, MAX3 and the genetic model selection (GMS) test, were recently proposed. Their asymptotic null distributions are often obtained by Monte Carlo simulations, because they either have not been fully studied or involve multiple integrations. In this article, we study how the components of each robust statistic are correlated, and find a linear dependence among the components. Using this new finding, we propose simple algorithms to calculate the asymptotic null distributions of MAX3 and GMS, which greatly reduce the computing intensity. Furthermore, we have developed the R package Rassoc, implementing the proposed algorithms to calculate the empirical and asymptotic p-values for MAX3 and GMS as well as other commonly used tests in case-control association studies. For illustration, Rassoc is applied to the analysis of case-control data for the 17 most significant SNPs reported in four genome-wide association studies.
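
One common large-sample form of the CATT statistic can be computed directly from genotype counts; this is a stdlib Python illustration with invented counts, not the paper's R package Rassoc:

```python
import math

def catt(cases, controls, weights=(0, 1, 2)):
    """Cochran-Armitage trend test for (aa, Aa, AA) genotype counts.

    weights=(0,1,2) encodes the additive model; (0,1,1) and (0,0,1) give the
    dominant and recessive models. Returns (Z, chi-square); large-sample
    form, no continuity correction.
    """
    n = [r + s for r, s in zip(cases, controls)]
    R, N = sum(cases), sum(cases) + sum(controls)
    W1 = sum(w * ni for w, ni in zip(weights, n))
    W2 = sum(w * w * ni for w, ni in zip(weights, n))
    num = N * sum(w * r for w, r in zip(weights, cases)) - R * W1
    z = num * math.sqrt(N / (R * (N - R) * (N * W2 - W1 * W1)))
    return z, z * z

# Invented genotype counts with a visible risk-allele trend among cases
z, chi2 = catt(cases=(10, 40, 50), controls=(30, 50, 20))
```

MAX3, as described in the abstract, takes the maximum of |Z| over the three weight vectors, which is why the correlation between the three component statistics determines its null distribution.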

  3. Geographically distributed hybrid testing & collaboration between geotechnical centrifuge and structures laboratories

    Science.gov (United States)

    Ojaghi, Mobin; Martínez, Ignacio Lamata; Dietz, Matt S.; Williams, Martin S.; Blakeborough, Anthony; Crewe, Adam J.; Taylor, Colin A.; Madabhushi, S. P. Gopal; Haigh, Stuart K.

    2018-01-01

    Distributed Hybrid Testing (DHT) is an experimental technique designed to capitalise on advances in modern networking infrastructure to overcome traditional laboratory capacity limitations. By coupling the heterogeneous test apparatus and computational resources of geographically distributed laboratories, DHT provides the means to take on complex, multi-disciplinary challenges with new forms of communication and collaboration. To introduce the opportunity and practicability afforded by DHT, an exemplar multi-site test is presented here in which a dedicated fibre network and a suite of custom software are used to connect the geotechnical centrifuge at the University of Cambridge with a variety of structural dynamics loading apparatus at the University of Oxford and the University of Bristol. While centrifuge time-scaling prevents real-time rates of loading in this test, such experiments may be used to gain valuable insights into physical phenomena, test procedure and accuracy. These and other related experiments have led to the development of the real-time DHT technique and the creation of a flexible framework that aims to facilitate future distributed tests within the UK and beyond. As a further example, a real-time DHT experiment between structural labs using this framework for testing across the Internet is also presented.

  4. Posterior cerebral artery Wada test: sodium amytal distribution and functional deficits

    Energy Technology Data Exchange (ETDEWEB)

    Urbach, H.; Schild, H.H. [Dept. of Radiology/Neuroradiology, Univ. of Bonn (Germany); Klemm, E.; Biersack, H.J. [Bonn Univ. (Germany). Klinik fuer Nuklearmedizin; Linke, D.B.; Behrends, K.; Schramm, J. [Dept. of Neurosurgery, Univ. of Bonn (Germany)

    2001-04-01

    Inadequate sodium amytal delivery to the posterior hippocampus during the intracarotid Wada test has led to development of selective tests. Our purpose was to show the sodium amytal distribution in the posterior cerebral artery (PCA) Wada test and to relate it to functional deficits during the test. We simultaneously injected 80 mg sodium amytal and 14.8 MBq 99mTc-hexamethylpropyleneamine oxime (HMPAO) into the P2-segment of the PCA in 14 patients with temporal lobe epilepsy. To show the skull, we injected 116 MBq 99mTc-HDP intravenously. Sodium amytal distribution was determined by high-resolution single-photon emission computed tomography (SPECT). In all patients, HMPAO was distributed throughout the parahippocampal gyrus and hippocampus; it was also seen in the occipital lobe in all cases and in the thalamus in 11. Eleven patients were awake and cooperative; one was slightly uncooperative due to speech comprehension difficulties and perseveration. All patients showed contralateral hemianopia during the test. Four patients had nominal dysphasia for 1-3 min. None developed motor deficits or had permanent neurological deficits. Neurological deficits due to inactivation of extrahippocampal areas thus do not grossly interfere with neuropsychological testing during the test. (orig.)

  5. Aeroelastic Analysis of a Distributed Electric Propulsion Wing

    Science.gov (United States)

    Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer

    2017-01-01

    An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (registered trademark) doublet lattice aerodynamics are compared to those based on FUN3D Reynolds-Averaged Navier-Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions, which were found to be well converged. No oscillatory instability was found, only divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.

  6. An environmental testing facility for Space Station Freedom power management and distribution hardware

    Science.gov (United States)

    Jackola, Arthur S.; Hartjen, Gary L.

    1992-01-01

    The plans for a new test facility, including new environmental test systems presently under construction and the major environmental Test Support Equipment (TSE) used therein, are addressed. This all-new Rocketdyne facility will perform space simulation environmental tests on Power Management and Distribution (PMAD) hardware for Space Station Freedom (SSF) at the Engineering Model, Qualification Model, and Flight Model levels of fidelity. Testing will include Random Vibration in three axes, Thermal Vacuum, Thermal Cycling, and Thermal Burn-in, as well as numerous electrical functional tests. The facility is designed to support a relatively high throughput of hardware under test, while maintaining the high standards required for a man-rated space program.

  7. Quality parameters analysis of optical imaging systems with enhanced focal depth using the Wigner distribution function

    Science.gov (United States)

    Zalvidea; Colautti; Sicre

    2000-05-01

    An analysis of the Strehl ratio and the optical transfer function as imaging quality parameters of optical elements with enhanced focal length is carried out by employing the Wigner distribution function. To this end, we use four different pupil functions: a full circular aperture, a hyper-Gaussian aperture, a quartic phase plate, and a logarithmic phase mask. A comparison is performed between the quality parameters and test images formed by these pupil functions at different defocus distances.
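
    The Strehl ratio used above as an imaging-quality parameter can be illustrated numerically. This is a generic pupil-averaging sketch (not the Wigner distribution function method of the paper), and the quartic phase-plate strength used in the example is an assumed value:

```python
import cmath
import math

def strehl_ratio(phase_fn, n=200):
    """Strehl ratio of a circular pupil with phase aberration phi(r, theta):
    the on-axis PSF intensity relative to the unaberrated pupil, computed as
    |<exp(i*phi)>|^2 averaged over the unit-radius pupil on an n x n grid."""
    total = 0 + 0j
    count = 0
    for i in range(n):
        for j in range(n):
            x = (i + 0.5) / n * 2.0 - 1.0
            y = (j + 0.5) / n * 2.0 - 1.0
            if x * x + y * y <= 1.0:
                total += cmath.exp(1j * phase_fn(math.hypot(x, y),
                                                 math.atan2(y, x)))
                count += 1
    return abs(total / count) ** 2

# Unaberrated pupil -> Strehl ratio of exactly 1.
print(strehl_ratio(lambda r, t: 0.0))  # -> 1.0

# A quartic phase plate phi = a*r**4 (one of the masks mentioned above,
# with an ASSUMED strength a = 2 rad) lowers the on-axis intensity.
print(strehl_ratio(lambda r, t: 2.0 * r ** 4) < 1.0)  # -> True
```

    Defocus can be explored the same way by adding a quadratic term to the phase function, which is how extended-depth-of-focus masks trade peak intensity for tolerance to defocus.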

  8. Accelerated testing statistical models, test plans, and data analysis

    CERN Document Server

    Nelson, Wayne B

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. ". . . a goldmine of knowledge on accelerated life testing principles and practices . . . one of the very few capable of advancing the science of reliability. It definitely belongs in every bookshelf on engineering." -Dev G.

  9. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-09-01

    Full Text Available The accelerated life tests provide quick information on the lifetime distributions by testing materials or products at higher than normal levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is a log-linear model. Constant stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Numerical methods such as Laplace approximation and Markov Chain Monte Carlo are used to solve the complicated integrals.

  10. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-12-01

    Full Text Available The accelerated life tests provide quick information on the lifetime distributions by testing materials or products at higher than normal levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is a log-linear model. Constant stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Numerical methods such as Laplace approximation and Markov Chain Monte Carlo are used to solve the complicated integrals.

  11. Testing nuclear parton distributions with pA collisions at the LHC

    CERN Document Server

    Quiroga-Arias, Paloma; Wiedemann, Urs Achim

    2010-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution function (PDF) in the proton. The extension of these analyses to nuclear parton distributions (nPDF) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of non-linear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distributions, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here we argue that a proton-nucleus collision program at...

  12. Testing collinear factorization and nuclear parton distributions with pA collisions at the LHC

    CERN Document Server

    Quiroga-Arias, Paloma; Wiedemann, Urs Achim

    2011-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution function (PDF) in the proton. The extension of these analyses to nuclear parton distributions (nPDF) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of non-linear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distributions, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here we argue that a proton-nucleus collision program a...

  13. Standard test method for distribution coefficients of inorganic species by the batch method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the determination of distribution coefficients of chemical species to quantify uptake onto solid materials by a batch sorption technique. It is a laboratory method primarily intended to assess sorption of dissolved ionic species subject to migration through pores and interstices of site-specific geomedia. It may also be applied to other materials such as manufactured adsorption media and construction materials. Application of the results to long-term field behavior is not addressed in this method. Distribution coefficients for radionuclides in selected geomedia are commonly determined for the purpose of assessing potential migratory behavior of contaminants in the subsurface of contaminated sites and waste disposal facilities. This test method is also applicable to parametric studies of the variables and mechanisms that contribute to the measured distribution coefficient. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement a...
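
    The kind of quantity this method produces can be sketched with the usual mass-balance formula for a batch test: the sorbed concentration per gram of solid divided by the equilibrium solution concentration. This is a generic illustration with invented numbers, not the procedure or symbols of the standard itself:

```python
def batch_kd(c_initial, c_final, solution_volume_ml, solid_mass_g):
    """Distribution coefficient K_d (mL/g) from a batch sorption test.

    Assumes the entire decrease in solution concentration is due to
    sorption onto the solid (no losses to vessel walls, precipitation,
    etc.), which the laboratory procedure must verify with blanks."""
    sorbed_per_gram = (c_initial - c_final) * solution_volume_ml / solid_mass_g
    return sorbed_per_gram / c_final

# Illustrative (invented) numbers: 30 mL of solution at 100 activity
# units/mL contacted with 1 g of solid, equilibrating at 40 units/mL.
print(batch_kd(100.0, 40.0, 30.0, 1.0))  # -> 45.0 (mL/g)
```

    A larger K_d means stronger partitioning onto the solid and hence slower predicted migration of the species through the geomedium.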

  14. Modeling of the CIGRE Low Voltage Test Distribution Network and the Development of Appropriate Controllers

    DEFF Research Database (Denmark)

    Mustafa, Ghullam; Bak-Jensen, Birgitte; Mahat, Pukar

    2013-01-01

    The fluctuating nature of some Distributed Generation (DG) sources can cause power quality related problems such as power frequency oscillations and voltage fluctuations. DG penetration is expected to increase in the future, and hence control actions are required to deal with the power quality issues. The main focus of this paper is on the development of controllers for a distribution system with different DGs, especially the development of a Photovoltaic (PV) controller using a Static Compensator (STATCOM) controller, and on the modeling of a Battery Storage System (BSS), also based on a STATCOM controller. The control system is tested in the distribution test network set up by CIGRE. The new PV controller approach is designed so that it can control the AC and DC voltage of the PV converter during dynamic conditions. The battery controller is also developed in such a way that it can...

  15. Testing species distribution models across space and time: high latitude butterflies and recent warming

    DEFF Research Database (Denmark)

    Eskildsen, Anne; LeRoux, Peter C.; Heikkinen, Risto K.

    2013-01-01

    Aim. To quantify whether species distribution models (SDMs) can reliably forecast species distributions under observed climate change. In particular, to test whether the predictive ability of SDMs depends on species traits or the inclusion of land cover and soil type, and whether distributional changes at expanding range margins can be predicted accurately. Location. Finland. Methods. Using 10-km resolution butterfly atlas data from two periods, 1992–1999 (t1) and 2002–2009 (t2), with a significant between-period temperature increase, we modelled the effects of climatic warming on butterfly distributions under climate change. Model performance was lower with independent compared to non-independent validation and improved when land cover and soil type variables were included, compared to climate-only models. SDMs performed less well for highly mobile species and for species with long...

  16. Synchronous Design and Test of Distributed Passive Radar Systems Based on Digital Broadcasting and Television

    Directory of Open Access Journals (Sweden)

    Wan Xianrong

    2017-02-01

    Full Text Available Digital broadcasting and television are important classes of illuminators of opportunity for passive radars. Distributed, multistatic structures are the development trend for passive radars. Most modern digital broadcasting and television systems work on a network, which not only provides a natural basis for distributed passive radar but also places higher requirements on the design of passive radar systems. Among those requirements, precise synchronization among the receivers and transmitters, as well as among multiple receiving stations, which mainly involves frequency and time synchronization, is the first to be solved. To satisfy the synchronization requirements of distributed passive radars, a synchronization scheme based on GPS is presented in this paper. Moreover, an effective scheme based on the China Mobile Multimedia Broadcasting signal is proposed to test the system synchronization performance. Finally, the reliability of the synchronization design is verified via distributed multistatic passive radar experiments.

  17. U.S.: proposed federal legislation to allow condom distribution and HIV testing in prison.

    Science.gov (United States)

    Dolinsky, Anna

    2007-05-01

    Representative Barbara Lee (D-CA) is reintroducing legislation in the U.S. House of Representatives that would require federal correctional facilities to allow community organizations to distribute condoms and provide voluntary counselling and testing for HIV and STDs for inmates. The bill has been referred to the House Judiciary Committee's Subcommittee on Crime, Terrorism, and Homeland Security.

  18. The design and test of VME clock distribution module of the Daya Bay RPC readout system

    International Nuclear Information System (INIS)

    Zhao Heng; Liang Hao; Zhou Yongzhao

    2011-01-01

    This paper describes the design of the VME Clock Distribution module of the Daya Bay RPC readout system, including the function and hardware structure of the module and the logic design of its on-board FPGA. After the module was built and debugged, a series of tests was performed to check its function and stability. (authors)

  19. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  20. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  1. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Full Text Available Abstract Background The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can directly be applied to every system built using Gromacs. We additionally provide an R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.

  2. Failure propagation tests and analysis at PNC

    International Nuclear Information System (INIS)

    Tanabe, H.; Miyake, O.; Daigo, Y.; Sato, M.

    1984-01-01

    Failure propagation tests have been conducted using the Large Leak Sodium Water Reaction Test Rig (SWAT-1) and the Steam Generator Safety Test Facility (SWAT-3) at PNC in order to establish the safety design of the LMFBR prototype Monju steam generators. Test objectives are to provide data for selecting a design basis leak (DBL), data on the time history of failure propagations, data on the mechanism of the failures, and data on re-use of tubes in the steam generators that have suffered leaks. Eighteen fundamental tests have been performed in an intermediate leak region using the SWAT-1 test rig, and ten failure propagation tests have been conducted in the region from a small leak to a large leak using the SWAT-3 test facility. From the test results it was concluded that a dominant mechanism was tube wastage, and it took more than one minute until each failure propagation occurred. Also, the total leak rate in full sequence simulation tests including a water dump was far less than that of one double-ended-guillotine (DEG) failure. Using such experimental data, a computer code, LEAP (Leak Enlargement and Propagation), has been developed for the purpose of estimating the possible maximum leak rate due to failure propagation. This paper describes the results of the failure propagation tests and the model structure and validation studies of the LEAP code. (author)

  3. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. 
Appendix A provides a technical report on the results of

  4. MCNP(TM) Release 6.1.1 beta: Creating and Testing the Code Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Cox, Lawrence J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Casswell, Laura [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-06-12

    This report documents the preparations for and testing of the production release of MCNP6™ 1.1 beta through RSICC at ORNL. It addresses tests on supported operating systems (Linux, MacOSX, Windows) with the supported compilers (Intel, Portland Group and gfortran). Verification and Validation test results are documented elsewhere. This report does not address in detail the overall packaging of the distribution. Specifically, it does not address the nuclear and atomic data collection, the other included software packages (MCNP5, MCNPX and MCNP6) and the collection of reference documents.

  5. Determination of the residence time distribution in the Cabrero sewage stabilization pond using tritium as a radioactive tracer

    International Nuclear Information System (INIS)

    Diaz, Francisco; Duran, Oscar; Henriquez, Pedro; Vega, Pedro; Padilla, Liliana; Gonzalez, David; Garcia Agudo, Edmundo

    2000-01-01

    This work was prepared by the Chilean and International Atomic Energy Agencies and covers the hydrodynamic functioning of sewage stabilization ponds studied with tracers. The selected plant is in the city of Cabrero, 500 km south of Santiago: a rectangular facultative pond with a surface area of 7100 m2 and a maximum volume of 12,327 m3 that receives an average flow of 20 l/s, serving a population of 7000 individuals. The work aims to characterize the flow that enters the pond by means of a radioactive tracer test, in which the incoming water is marked and its passage out is measured to establish the residence time distribution. Tritium, in the form of tritiated water, was selected as the tracer and was poured into the water flow at the distribution channel at the pond entrance. Samples taken at the outflow were distilled and counted in a liquid scintillation counter to determine the tritium concentration, with the flow measured simultaneously. A mean residence time of 5.3 days was obtained, and analysis of the residence time distribution shows that the tracer leaves quickly, indicating poor flow distribution in the pond, with a major short circuit and probable dead zones.
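
    The residence-time analysis above rests on the first moment of the outlet tracer curve. A minimal sketch of that calculation, with invented sample data (not the Cabrero measurements):

```python
def mean_residence_time(times_days, conc):
    """Mean residence time from outlet tracer concentrations sampled at
    `times_days`: the first moment of the breakthrough curve,
    t_bar = integral(t*C dt) / integral(C dt), evaluated here with the
    trapezoidal rule on irregularly spaced samples."""
    num = den = 0.0
    for i in range(1, len(times_days)):
        dt = times_days[i] - times_days[i - 1]
        num += 0.5 * (times_days[i] * conc[i]
                      + times_days[i - 1] * conc[i - 1]) * dt
        den += 0.5 * (conc[i] + conc[i - 1]) * dt
    return num / den

# Invented outlet curve peaking early, as in a pond with short-circuiting.
t = [0.0, 1.0, 2.0, 4.0, 8.0, 16.0]
c = [0.0, 5.0, 3.0, 1.5, 0.5, 0.0]
print(round(mean_residence_time(t, c), 2))
```

    Comparing this first moment against the nominal hydraulic residence time (pond volume divided by flow rate) is the standard way to diagnose short-circuiting: a measured mean well below the nominal value indicates that much of the flow bypasses the pond volume, as reported above.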

  6. A statistical design for testing apomictic diversification through linkage analysis.

    Science.gov (United States)

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

    The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from the two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between two plant populations or species as the parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utilization and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.

  7. On-line test of power distribution prediction system for boiling water reactors

    International Nuclear Information System (INIS)

    Nishizawa, Y.; Kiguchi, T.; Kobayashi, S.; Takumi, K.; Tanaka, H.; Tsutsumi, R.; Yokomi, M.

    1982-01-01

    A power distribution prediction system for boiling water reactors has been developed, and its on-line performance test has been carried out at an operating commercial reactor. This system predicts the power distribution or thermal margin in advance of control rod operations and core flow rate changes. The system consists of an on-line computer system, an operator's console with a color cathode-ray tube, and plant data input devices. Its main functions are present power distribution monitoring, power distribution prediction, and power-up trajectory prediction. The calculation method is based on a simplified nuclear thermal-hydraulic calculation, combined with a method of model identification to the actual reactor core state. The on-line test has confirmed that the predicted power distribution (readings of the traversing in-core probe) agrees with the measured data within 6% root-mean-square. The computing time required for one prediction calculation step is at most 1.5 min on an HIDIC-80 on-line computer.

  8. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against a negative control), suggesting its usefulness in supporting bioequivalence determination of a test product against a reference product when both possess multimodal PSDs.
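
    For 1-D histograms on a common size grid, the EMD metric used above has a well-known closed form: the accumulated absolute difference of the two cumulative distributions. A minimal sketch with invented PSD histograms (the paper's full method additionally wraps this distance in the PBE statistical test):

```python
def emd_1d(p, q):
    """Earth mover's distance between two histograms on the same ordered,
    equally spaced size grid. In 1-D, EMD reduces to the accumulated
    absolute CDF difference, in units of one bin width. The histograms
    are normalized to total mass 1 before comparison."""
    assert len(p) == len(q), "profiles must share the same size grid"
    sp, sq = float(sum(p)), float(sum(q))
    cdf_gap, emd = 0.0, 0.0
    for pi, qi in zip(p, q):
        cdf_gap += pi / sp - qi / sq
        emd += abs(cdf_gap)
    return emd

# Invented example: a bimodal test profile against a unimodal reference.
test_psd = [0, 4, 1, 0, 3, 0]
ref_psd  = [0, 2, 3, 2, 1, 0]
print(emd_1d(test_psd, ref_psd))  # -> 0.5
```

    Unlike a Euclidean bin-by-bin distance, this metric accounts for how far mass must move along the size axis, which is why it remains meaningful for multimodal profiles where D50 and SPAN are not descriptive.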

  9. 21 CFR 809.40 - Restrictions on the sale, distribution, and use of OTC test sample collection systems for drugs...

    Science.gov (United States)

    2010-04-01

    ... OTC test sample collection systems for drugs of abuse testing. 809.40 Section 809.40 Food and Drugs... Restrictions on the sale, distribution, and use of OTC test sample collection systems for drugs of abuse testing. (a) Over-the-counter (OTC) test sample collection systems for drugs of abuse testing (§ 864.3260...

  10. Optimal design of accelerated life tests for an extension of the exponential distribution

    International Nuclear Information System (INIS)

    Haghighi, Firoozeh

    2014-01-01

    Accelerated life tests quickly provide information on the lifetime distribution of products by testing them at higher than usual levels of stress. In this paper, the lifetime of a product at any level of stress is assumed to have an extension of the exponential distribution. This new family was recently introduced by Nadarajah and Haghighi (2011 [1]); it can be used as an alternative to the gamma, Weibull and exponentiated exponential distributions. The scale parameter of the lifetime distribution at constant stress levels is assumed to be a log-linear function of the stress levels, and a cumulative exposure model holds. For this model, the maximum likelihood estimates (MLEs) of the parameters, as well as the Fisher information matrix, are derived. The asymptotic variance of the scale parameter at a design stress is adopted as the optimization objective and its expression is provided using the maximum likelihood method. A Monte Carlo simulation study is carried out to examine the performance of these methods. Asymptotic confidence intervals for the parameters and a hypothesis test for the parameter of interest are constructed
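    As a concrete illustration of the single-stress building block, the sketch below simulates from the Nadarajah-Haghighi extension, F(x) = 1 - exp(1 - (1 + lam*x)**alpha), and recovers the parameters by maximum likelihood. The parameter values are assumed for the example; the paper's log-linear stress link and optimal-design step are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Assumed true parameters for the illustration.
alpha_true, lam_true = 2.0, 0.5

# Inverse-CDF sampling: x = ((1 - log(1 - u))**(1/alpha) - 1) / lam.
u = rng.uniform(size=5000)
x = ((1.0 - np.log(1.0 - u)) ** (1.0 / alpha_true) - 1.0) / lam_true

def neg_loglik(theta):
    a, lam = theta
    if a <= 0.0 or lam <= 0.0:
        return np.inf
    z = 1.0 + lam * x
    # log f(x) = log(a*lam) + (a - 1)*log(z) + 1 - z**a
    return -np.sum(np.log(a * lam) + (a - 1.0) * np.log(z) + 1.0 - z**a)

mle = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
alpha_hat, lam_hat = mle.x
```

    With the MLEs in hand, the observed information matrix could be inverted to approximate the asymptotic variances on which the design criterion is built.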

  11. Analysis of techniques for measurement of the size distribution of solid particles

    Directory of Open Access Journals (Sweden)

    F. O. Arouca

    2005-03-01

    Determination of the size distribution of solid particles is fundamental for analysis of the performance of several pieces of equipment used for solid-fluid separation. The main objective of this work is to compare the results obtained with two traditional methods for determination of the size distribution of powdery solids: the gamma-ray attenuation technique (GRAT) and the LADEQ test tube technique. The effect of draining the suspension in the two techniques used was also analyzed. The GRAT can supply the particle size distribution of solids through the monitoring of solids concentration in experiments on batch settling of dilute suspensions. The results show that use of the peristaltic pump in the GRAT and the LADEQ methods produced a significant difference between the values obtained for the parameters of the particle size model.

  12. DRAM fault analysis and test generation

    NARCIS (Netherlands)

    Al-Ars, Z.

    2005-01-01

    Dynamic random access memories (DRAMs) are the most widely used type of memory in the market today, due to their important application as the main memory of the personal computer (PC). These memories are tested by their manufacturers in an ad hoc way that results in an expensive test process ...

  13. Analysis of the Rapid Chloride Migration test

    NARCIS (Netherlands)

    Spiesz, P.R.; Ballari, M.; Brouwers, H.J.H.; Ferreira, R. M.; Gulikers, J.; Andrade, C.

    2009-01-01

    In this study the Rapid Chloride Migration test (RCM), standardized as NT Build 492 and BAW-Merkblatt, is reviewed. Since traditional natural diffusion tests are laborious, time consuming and costly, they are not always preferred from a practical point of view. To overcome these disadvantages, ...

  14. Design and analysis of multiaxial creep tests

    International Nuclear Information System (INIS)

    Mallett, R.H.; Dhalla, A.K.; Yocolano, J.T.

    1974-01-01

    A procedure is described for presenting the complete data as obtained from tests of thin-walled tubular creep test specimens. Thereafter, a procedure for processing the data is presented. The processed data is based in part upon results of detailed inelastic finite element analyses performed to determine uniform and constant stress quantities and effective gage lengths. (U.S.)

  15. Uncertainty Visualization Using Copula-Based Analysis in Mixed Distribution Models.

    Science.gov (United States)

    Hazarika, Subhashis; Biswas, Ayan; Shen, Han-Wei

    2018-01-01

    Distributions are often used to model uncertainty in many scientific datasets. To preserve the correlation among the spatially sampled grid locations in a dataset, various standard multivariate distribution models have been proposed in the visualization literature. These models treat each grid location as a univariate random variable which models the uncertainty at that location. Standard multivariate distributions (both parametric and nonparametric) assume that all the univariate marginals are of the same type/family of distribution. But in reality, different grid locations show different statistical behavior, which may not be modeled best by the same type of distribution. In this paper, we propose a new multivariate uncertainty modeling strategy to address the needs of uncertainty modeling in scientific datasets. Our proposed method is based on a statistically sound multivariate technique called the copula, which makes it possible to separate the process of estimating the univariate marginals from the process of modeling dependency, unlike the standard multivariate distributions. The modeling flexibility offered by our proposed method makes it possible to design distribution fields which can have different types of distribution (Gaussian, histogram, KDE, etc.) at the grid locations, while maintaining the correlation structure at the same time. Depending on the results of various standard statistical tests, we can choose an optimal distribution representation at each location, resulting in more cost-efficient modeling without significantly sacrificing analysis quality. To demonstrate the efficacy of our proposed modeling strategy, we extract and visualize uncertain features like isocontours and vortices in various real world datasets. We also study various modeling criteria to help users in the task of univariate model selection.
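    The separation the paper exploits can be sketched with a Gaussian copula: the dependence is fixed by a correlated latent Gaussian, while each grid location keeps its own marginal (here a gamma at one location and a normal at the other; all numbers are illustrative assumptions, not from the paper's datasets).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 20000
rho = 0.8

# Step 1: dependency structure from a correlated latent Gaussian.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Step 2: map to uniforms -- this is the (Gaussian) copula.
u = stats.norm.cdf(z)

# Step 3: apply a *different* marginal at each grid location.
x0 = stats.gamma.ppf(u[:, 0], a=2.0)                 # location 0: gamma marginal
x1 = stats.norm.ppf(u[:, 1], loc=5.0, scale=0.5)     # location 1: normal marginal

# Rank correlation survives the marginal transforms.
r, _ = stats.spearmanr(x0, x1)
```

    Because steps 2-3 are monotone transforms, the rank correlation of (x0, x1) matches that of the latent Gaussian even though the marginals differ in family.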

  16. Analysis of acidic properties of distribution transformer oil insulation ...

    African Journals Online (AJOL)

    This paper examined the acidic properties of distribution transformer oil insulation in service at Jericho distribution network Ibadan, Nigeria. Five oil samples each from six distribution transformers (DT1, DT2, DT3, DT4 and DT5) making a total of thirty samples were taken from different installed distribution transformers all ...

  17. Reliability analysis of water distribution systems under uncertainty

    International Nuclear Information System (INIS)

    Kansal, M.L.; Kumar, Arun; Sharma, P.B.

    1995-01-01

    In most developing countries, water distribution networks (WDN) are of the intermittent type because of the shortage of safe drinking water. Failure of a pipeline(s) in such cases causes not only a fall in one or more nodal heads but also poor connectivity of the source with various demand nodes of the system. Most previous works have used a two-step algorithm based on a pathset or cutset approach for connectivity analysis. The computations become more cumbersome when the connectivity of all demand nodes, taken together with that of the supply, is evaluated. In the present paper, network connectivity based on the concept of the Appended Spanning Tree (AST) is suggested to compute global network connectivity, which is defined as the probability of the source node being connected with all the demand nodes simultaneously. The concept of the AST has distinct advantages, as it attacks the problem directly rather than indirectly as most studies so far have done. Since a water distribution system is repairable, a general expression for pipeline availability using the failure/repair rates is considered. Furthermore, the sensitivity of the global reliability estimates to likely errors in the estimated failure/repair rates of the various pipelines is also studied
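    The global connectivity measure defined above can be approximated with a small Monte Carlo sketch using a union-find structure. The toy topology and availability figures are assumptions for illustration; the paper's AST method computes this probability analytically rather than by simulation.

```python
import random

# Toy network: node 0 is the source; 1-4 are demand nodes.
# Each pipe is (node_a, node_b, availability).
pipes = [(0, 1, 0.95), (0, 2, 0.95), (1, 2, 0.90),
         (1, 3, 0.90), (2, 4, 0.90), (3, 4, 0.85)]
demand = [1, 2, 3, 4]

def global_connectivity(pipes, demand, trials=20000, seed=1):
    """Monte Carlo estimate of P(source 0 reaches ALL demand nodes)."""
    rng = random.Random(seed)
    n_nodes = 1 + max(max(a, b) for a, b, _ in pipes)
    hits = 0
    for _ in range(trials):
        parent = list(range(n_nodes))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]   # path halving
                i = parent[i]
            return i

        for a, b, avail in pipes:
            if rng.random() < avail:            # pipe is up in this trial
                parent[find(a)] = find(b)
        if all(find(d) == find(0) for d in demand):
            hits += 1
    return hits / trials

p_global = global_connectivity(pipes, demand)
```

    The same loop extends naturally to availabilities computed from failure/repair rates, which is where the repairable-system expression in the abstract enters.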

  18. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day, and there is a lack of comprehensive tools that can analyze these streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require a large amount of computational resources and a long time to produce accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm that can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  19. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping up production, but can also be applied to enhance established manufacturing.
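    The Monte Carlo step can be sketched with a deliberately simple linear surrogate: sample the input-parameter scatter, push it through the metamodel, and read off the efficiency distribution and first-order variance shares. The two inputs, their distributions, and the coefficients below are invented for illustration; the paper uses a calibrated multi-parameter metamodel of the PERC process.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical input-parameter scatter of a production line.
j0 = rng.normal(250.0, 30.0, n)   # saturation current density, fA/cm^2
rs = rng.normal(0.80, 0.10, n)    # series resistance, ohm*cm^2

# Linear surrogate metamodel around a 17.62%-efficient nominal cell.
eta = 17.62 - 0.004 * (j0 - 250.0) - 0.9 * (rs - 0.80)

# For a linear additive model, the first-order (Sobol-type) variance
# shares follow directly from coefficient^2 * input variance.
var_j0 = (0.004 * 30.0) ** 2
var_rs = (0.9 * 0.10) ** 2
share_j0 = var_j0 / (var_j0 + var_rs)
```

    The variance shares identify which process parameter to tighten first; in the paper this role is played by the full variance-based sensitivity analysis on the nonlinear metamodel.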

  20. Primitive Path Analysis and Stress Distribution in Highly Strained Macromolecules.

    Science.gov (United States)

    Hsu, Hsiao-Ping; Kremer, Kurt

    2018-01-16

    Polymer material properties are strongly affected by entanglement effects. For long polymer chains and composite materials, they are expected to be at the origin of many technically important phenomena, such as shear thinning or the Mullins effect, which microscopically can be related to topological constraints between chains. Starting from fully equilibrated highly entangled polymer melts, we investigate the effect of isochoric elongation on the entanglement structure and force distribution of such systems. Theoretically, the related viscoelastic response usually is discussed in terms of the tube model. We relate stress relaxation in the linear and nonlinear viscoelastic regimes to a primitive path analysis (PPA) and show that tension forces both along the original paths and along primitive paths, that is, the backbone of the tube, in the stretching direction correspond to each other. Unlike homogeneous relaxation along the chain contour, the PPA reveals a so far not observed long-lived clustering of topological constraints along the chains in the deformed state.

  1. Percentiles of the null distribution of two maximum lod score tests.

    Science.gov (United States)

    Ulgen, Ayse; Yoo, Yun Joo; Gordon, Derek; Finch, Stephen J; Mendell, Nancy R

    2004-01-01

    We here consider the null distribution of the maximum lod score (LOD-M) obtained upon maximizing over transmission model parameters (penetrance values, dominance, and allele frequency) as well as the recombination fraction. Also considered is the lod score maximized over a fixed choice of genetic model parameters and recombination-fraction values set prior to the analysis (MMLS) as proposed by Hodge et al. The objective is to fit parametric distributions to MMLS and LOD-M. Our results are based on 3,600 simulations of samples of n = 100 nuclear families ascertained for having one affected member and at least one other sibling available for linkage analysis. Each null distribution is approximately a mixture p·χ²(0) + (1 − p)·χ²(v). The values of MMLS appear to fit the mixture 0.20·χ²(0) + 0.80·χ²(1.6). The mixture distribution 0.13·χ²(0) + 0.87·χ²(2.8) appears to describe the null distribution of LOD-M. From these results we derive a simple method for obtaining critical values of LOD-M and MMLS. Copyright 2004 S. Karger AG, Basel
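    The critical-value recipe implied by these mixtures is direct to code: a mixture p·χ²(0) + (1 − p)·χ²(v) puts probability p at zero, so the upper tail comes entirely from the continuous part. The α = 0.05 level and the lod-scale conversion by 2·ln 10 below are standard choices, not taken from the paper.

```python
from math import log
from scipy.stats import chi2

def mixture_critical_value(p_zero, df, alpha=0.05):
    """Upper-alpha critical value of p_zero*chi2(0) + (1 - p_zero)*chi2(df).

    chi2(0) is a point mass at zero, so P(X > x) = (1 - p_zero) * sf(x, df).
    """
    return chi2.isf(alpha / (1.0 - p_zero), df)

crit_mmls = mixture_critical_value(0.20, 1.6)   # MMLS null mixture
crit_lodm = mixture_critical_value(0.13, 2.8)   # LOD-M null mixture

# On the lod scale, chi-square values divide by 2*ln(10) (about 4.605).
lod_crit_mmls = crit_mmls / (2.0 * log(10.0))
```

    Note that SciPy's chi2 accepts the non-integer degrees of freedom (1.6 and 2.8) fitted in the paper.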

  2. Well test analysis in fractured media

    Energy Technology Data Exchange (ETDEWEB)

    Karasaki, K.

    1987-04-01

    The behavior of fracture systems under well test conditions and methods for analyzing well test data from fractured media are investigated. Several analytical models are developed to be used for analyzing well test data from fractured media. Numerical tools that may be used to simulate fluid flow in fractured media are also presented. Three types of composite models for constant flux tests are investigated. These models are based on the assumption that a fracture system under well test conditions may be represented by two concentric regions, one representing a small number of fractures that dominates flow near the well, and the other representing average conditions farther away from the well. Type curves are presented that can be used to find the flow parameters of these two regions and the extent of the inner concentric region. Several slug test models with different geometric conditions that may be present in fractured media are also investigated. A finite element model that can simulate transient fluid flow in fracture networks is used to study the behavior of various two-dimensional fracture systems under well test conditions. A mesh generator that can be used to model mass and heat flow in a fractured-porous medium is presented.
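    The simplest member of this family of models, the single-region constant-flux (Theis) response, can be sketched as below; the composite two-region type curves generalize it with separate inner and outer parameters. All numerical values here are assumed, not taken from the report.

```python
import numpy as np
from scipy.special import exp1   # exponential integral E1 = Theis well function

# Assumed aquifer and test parameters.
Q = 1.0e-3   # m^3/s, constant pumping rate
T = 1.0e-4   # m^2/s, transmissivity
S = 1.0e-5   # storativity (dimensionless)
r = 10.0     # m, distance to observation well

t = np.logspace(1, 6, 50)            # s, observation times
u = r**2 * S / (4.0 * T * t)         # dimensionless time argument
s = Q / (4.0 * np.pi * T) * exp1(u)  # drawdown, m
```

    The composite models replace the single (T, S) pair with inner and outer values and match the two solutions at the region boundary, which is what produces the characteristic bends in the type curves.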

  3. Using GIFTS on the Cray-1 for the large coil test facility test: stand design analysis

    International Nuclear Information System (INIS)

    Baudry, T.V.; Gray, W.H.

    1981-06-01

    The GIFTS finite element program has been used extensively throughout the Large Coil Test Facility (LCTF) test stand design analysis. Effective use has been made of GIFTS both as a preprocessor to other finite element programs and as a complete structural analysis package. The LCTF test stand design involved stress analysis ranging from simple textbook-type problems to very complicated three-dimensional structural problems. Two areas of the design analysis are discussed

  4. Finite Element Analysis and Test Results Comparison for the Hybrid Wing Body Center Section Test Article

    Science.gov (United States)

    Przekop, Adam; Jegley, Dawn C.; Rouse, Marshall; Lovejoy, Andrew E.

    2016-01-01

    This report documents the comparison of test measurements and predictive finite element analysis results for a hybrid wing body center section test article. The testing and analysis efforts were part of the Airframe Technology subproject within the NASA Environmentally Responsible Aviation project. Test results include full field displacement measurements obtained from digital image correlation systems and discrete strain measurements obtained using both unidirectional and rosette resistive gauges. The most significant results are presented for the five critical load cases exercised during the test. The final test to failure, after severe damage was inflicted on the test article, is also documented. Overall, good agreement between predicted and actual behavior of the test article is found.

  5. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back-to-back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.
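    The flavor of the SOP optimization can be sketched with a toy two-feeder case: choose the transferred power that minimizes feeder I²R losses plus a converter loss term. Powell's method used in the paper is replaced here by a bounded scalar minimization, and all network figures are invented, not the 33-bus test case.

```python
from scipy.optimize import minimize_scalar

# Toy two-feeder network: an SOP transfers P (kW) from the heavily
# loaded feeder 1 to feeder 2.
V = 11.0               # kV, nominal voltage
R1, R2 = 0.5, 0.5      # ohm, lumped feeder resistances
L1, L2 = 800.0, 200.0  # kW, feeder loads
sop_loss = 0.02        # converter loss per kW transferred (back-to-back VSCs)

def losses(P):
    i1 = (L1 - P) / V   # feeder currents in proportional units
    i2 = (L2 + P) / V
    return R1 * i1**2 + R2 * i2**2 + sop_loss * abs(P)

res = minimize_scalar(losses, bounds=(0.0, 500.0), method="bounded")
```

    Without the converter loss term the optimum would equalize the two feeder loadings at P = 300 kW; modeling the SOP's own losses pulls the optimum slightly below that, which is why they must be included, as in the paper.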

  6. The Space Station Module Power Management and Distribution automation test bed

    Science.gov (United States)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module Power Management And Distribution (SSM/PMAD) automation test bed project was begun at NASA/Marshall Space Flight Center (MSFC) in the mid-1980s to develop an autonomous, user-supportive power management and distribution test bed simulating the Space Station Freedom Hab/Lab modules. As the test bed has matured, many new technologies and projects have been added. The author focuses on three primary areas. The first area is the overall accomplishments of the test bed itself. These include a much-improved user interface, a more efficient expert system scheduler, improved communication among the three expert systems, and initial work on adding intermediate levels of autonomy. The second area is the addition of a more realistic power source to the SSM/PMAD test bed; this project is called the Large Autonomous Spacecraft Electrical Power System (LASEPS). The third area is the completion of a virtual link between the SSM/PMAD test bed at MSFC and the Autonomous Power Expert at Lewis Research Center.

  7. Empirically testing the relationship between income distribution, perceived value of money and pay satisfaction

    Directory of Open Access Journals (Sweden)

    Azman Ismail

    2009-07-01

    Compensation management literature highlights that income has three major features: salary, bonus and allowance. If the level and/or amount of income are distributed to employees based on proper rules, this may increase pay satisfaction. More importantly, a thorough investigation in this area reveals that the effect of income distribution on pay satisfaction is not consistent if perceived value of money is present in organizations. The nature of this relationship is less emphasized in the pay distribution literature. Therefore, this study was conducted to measure the effect of the perceived value of money and income distribution on pay satisfaction using 136 usable questionnaires gathered from employees of a city-based local authority in Sabah, Malaysia (MSLAUTHORITY). Outcomes of hierarchical regression analysis showed that the interaction between perceived value of money and income distribution significantly correlated with pay satisfaction. This result confirms that perceived value of money does act as a moderating variable in the income distribution model of the organizational sample. In addition, discussion and implications of this study are elaborated.
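    The moderation test amounts to a hierarchical regression in which a product term is added to the main effects. A minimal simulated sketch (all coefficients and the sample size are invented, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Standardized predictor scores (simulated respondents).
income_dist = rng.normal(0.0, 1.0, n)   # income-distribution fairness score
value_money = rng.normal(0.0, 1.0, n)   # perceived value of money score

# Simulated pay satisfaction with a true moderation effect of 0.5.
noise = rng.normal(0.0, 0.5, n)
pay_sat = (1.0 + 0.3 * income_dist + 0.2 * value_money
           + 0.5 * income_dist * value_money + noise)

# Hierarchical step 2: main effects plus the interaction (product) term.
X = np.column_stack([np.ones(n), income_dist, value_money,
                     income_dist * value_money])
beta, *_ = np.linalg.lstsq(X, pay_sat, rcond=None)
```

    A non-zero product-term coefficient is what "perceived value of money acts as a moderating variable" means operationally.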

  8. NASA Langley Distributed Propulsion VTOL Tilt-Wing Aircraft Testing, Modeling, Simulation, Control, and Flight Test Development

    Science.gov (United States)

    Rothhaar, Paul M.; Murphy, Patrick C.; Bacon, Barton J.; Gregory, Irene M.; Grauer, Jared A.; Busan, Ronald C.; Croom, Mark A.

    2014-01-01

    Control of complex Vertical Take-Off and Landing (VTOL) aircraft traversing from hovering to wing-borne flight mode and back poses notoriously difficult modeling, simulation, control, and flight-testing challenges. This paper provides an overview of the techniques and advances required to develop the GL-10 tilt-wing, tilt-tail, long endurance, VTOL aircraft control system. The GL-10 prototype's unusual and complex configuration requires application of state-of-the-art techniques and some significant advances in wind tunnel infrastructure automation, efficient Design Of Experiments (DOE) tunnel test techniques, modeling, multi-body equations of motion, multi-body actuator models, simulation, control algorithm design, and flight test avionics, testing, and analysis. The following compendium surveys the key disciplines required to develop an effective control system for this challenging vehicle in this ongoing effort.

  9. Test and evaluation of load converter topologies used in the Space Station Freedom power management and distribution dc test bed

    Science.gov (United States)

    Lebron, Ramon C.; Oliver, Angela C.; Bodi, Robert F.

    1991-01-01

    Power component hardware in support of the Space Station Freedom dc Electric Power System was tested. One type of breadboard hardware tested is the dc Load Converter Unit, which constitutes the power interface between the electric power system and the actual load. These units are dc-to-dc converters that provide the final system regulation before power is delivered to the load. Three load converters were tested: a series resonant converter, a series inductor switch-mode converter, and a switching full-bridge forward converter. The topology, operating principles, and test results are described in general. A comparative analysis of the three units is given with respect to efficiency, regulation, short circuit behavior (protection), and transient characteristics.

  10. Distributional patterns of cecropia (Cecropiaceae: a panbiogeographic analysis

    Directory of Open Access Journals (Sweden)

    Franco Rosselli Pilar

    1997-06-01

    A panbiogeographic analysis of the distributional patterns of 60 species of Cecropia was carried out. Based on the distributional ranges of 36 species, we found eight generalized tracks for Cecropia species, whereas the distributional patterns of 24 species were uninformative for the analysis. The major concentration of species of Cecropia is in the Neotropical Andean region, where there are three generalized tracks and two nodes. The northern Andes in Colombia and Ecuador are richer than the central Andes in Perú; they contain two generalized tracks, one to the west and another to the east, each formed by the individual tracks of eight species. There are four generalized tracks outside the Andean region: two in the Amazonian region, in Guayana-Pará and in Manaus; one in Roraima; one in Serra do Mar in the Atlantic forest of Brazil; and one in Central America. Speciation in Cecropia may be related to the first Andean uplift.

  11. Observations in the statistical analysis of NBG-18 nuclear graphite strength tests

    International Nuclear Information System (INIS)

    Hindley, Michael P.; Mitchell, Mark N.; Blaine, Deborah C.; Groenwold, Albert A.

    2012-01-01

    Highlights: ► Statistical analysis of NBG-18 nuclear graphite strength tests. ► Weibull and normal distributions are tested for all data. ► A bimodal distribution in the CS data is confirmed. ► The CS data set has the lowest variance. ► A combined data set is formed and has a Weibull distribution. - Abstract: The purpose of this paper is to report on the selection of a statistical distribution chosen to represent the experimental material strength of NBG-18 nuclear graphite. Three large sets of samples were tested during the material characterisation of the Pebble Bed Modular Reactor and Core Structure Ceramics materials. These sets of samples are tensile strength, flexural strength and compressive strength (CS) measurements. A relevant statistical fit is determined and the goodness of fit is also evaluated for each data set. The data sets are also normalised for ease of comparison, and combined into one representative data set. The validity of this approach is demonstrated. A second failure mode distribution is found in the CS test data. Identifying this failure mode supports similar observations made in the past. The success of fitting the Weibull distribution to the normalised data sets allows us to improve the basis for the estimates of the variability. This could also imply that the variability in the graphite strength for the different strength measures is based on the same flaw distribution and is thus a property of the material.
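    The Weibull-fitting step can be sketched with SciPy on simulated strengths. The shape and scale used to generate the data are illustrative, not the NBG-18 parameters; a bimodal data set like the report's CS measurements would show up as a poor single-Weibull fit in the same goodness-of-fit check.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Simulated strength data standing in for one normalized data set.
strength = stats.weibull_min.rvs(c=10.0, scale=20.0, size=2000,
                                 random_state=rng)

# Fit a two-parameter Weibull (location pinned at zero) by maximum
# likelihood, then check the goodness of fit.
shape, loc, scale = stats.weibull_min.fit(strength, floc=0.0)
ks = stats.kstest(strength, "weibull_min", args=(shape, loc, scale))
```

    Fitting each normalized data set this way, then the combined set, mirrors the report's workflow; a low KS p-value on one set (as with the bimodal CS data) flags a second failure mode.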

  12. Well test analysis in fractured media

    Energy Technology Data Exchange (ETDEWEB)

    Karasaki, K.

    1986-04-01

    In this study, the behavior of fracture systems under well test conditions and methods for analyzing well test data from fractured media are investigated. Several analytical models are developed to be used for analyzing well test data from fractured media. Numerical tools that may be used to simulate fluid flow in fractured media are also presented. Three types of composite models for constant flux tests are investigated. Several slug test models with different geometric conditions that may be present in fractured media are also investigated. A finite element model that can simulate transient fluid flow in fracture networks is used to study the behavior of various two-dimensional fracture systems under well test conditions. A mesh generator that can be used to model mass and heat flow in a fractured-porous medium is presented. This model develops an explicit solution in the porous matrix as well as in the discrete fractures. Because the model does not require the assumptions of the conventional double porosity approach, it may be used to simulate cases where double porosity models fail.

  13. Westinghouse-GOTHIC modeling of NUPEC's hydrogen mixing and distribution test M-4-3

    International Nuclear Information System (INIS)

    Ofstun, R.P.; Woodcock, J.; Paulsen, D.L.

    1994-01-01

    NUPEC (NUclear Power Engineering Corporation) ran a series of hydrogen mixing and distribution tests which were completed in April 1992. These tests were performed in a 1/4 linearly scaled model containment and were specifically designed to be used for computer code validation. The results of test M-4-3, along with predictions from several computer codes, were presented to the participants of ISP-35 (a blind test comparison of code-calculated results with data from NUPEC test M-7-1) at a meeting in March 1993. Test M-4-3, which was similar to test M-7-1, released a mixture of steam and helium into a steam generator compartment located on the lower level of containment. The majority of codes did well at predicting the global pressure and temperature trends; however, some typical lumped-parameter modeling problems were identified at that time. In particular, the models had difficulty predicting the temperatures and helium concentrations in the so-called 'dead-ended volumes' (pressurizer compartment and in-core chase region). Modeling the dead-ended compartments using a single lumped-parameter volume did not yield the appropriate temperature and helium response within that volume. The Westinghouse-GOTHIC (WGOTHIC) computer code is capable of modeling in one, two or three dimensions (or any combination thereof). This paper describes the WGOTHIC modeling of the dead-ended compartments for NUPEC test M-4-3 and gives comparisons to the test data. 1 ref., 1 tab., 14 figs

  14. Temperature Distribution Simulation of a Polymer Bearing Based on Real Tribological Tests

    Directory of Open Access Journals (Sweden)

    Artur Król

    2015-09-01

    Polymer bearings are widely used due to their dry-lubrication mechanism, low weight, corrosion resistance and maintenance-free operation. They are applied in various tribological pairs, e.g. household appliances, mechatronic systems, medical devices, food machines and many more. However, their use is limited by a high coefficient of thermal expansion and softening at elevated temperature, especially when working outside the recommended pv factors. Modifying the bearing design to achieve better characteristics under more demanding conditions requires a full understanding of the mechanical and thermal phenomena of bearing operation. The first step was to observe the thermal behavior of a polymer bearing under real test conditions (50, 100 and 150 rpm at 350 and 700 N) until constant values of temperature and moment of friction were achieved. The collected data were subsequently used in the design of a temperature distribution model. Thermal simulations of the polymer bearing were done using the commercial software package ANSYS Fluent, which is based on the finite volume method. All calculations were performed on a 3D geometrical model that included the polymer bearing, its housing, the shaft and some volume of the surrounding air. The heat generation caused by friction forces was implemented as a volumetric heat source. All three main heat transfer mechanisms (conduction, convection and radiation) were included in the heat transfer calculations, and the air flow around the bearing and adjacent parts was directly solved. The unknown parameters of the numerical model were adjusted by comparing the results of the computer calculations with the measured temperature rise. In the presented work the calculations were limited to steady-state conditions, but the model may also be used in transient analysis. DOI: http://dx.doi.org/10.5755/j01.ms.21.3.7342
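    The volumetric-source idea can be checked in one dimension against a closed-form solution: steady conduction through a polymer wall with uniform frictional heating reduces to -k·T'' = q, whose exact peak temperature rise is q·L²/(8k). The material values and geometry below are rough assumptions, not the ANSYS Fluent model.

```python
import numpy as np

k = 0.25      # W/(m*K), typical order for a polymer
q = 5.0e5     # W/m^3, assumed volumetric frictional heat source
L = 2.0e-3    # m, wall thickness
n = 201
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]

# Finite differences for -k*T'' = q with T(0) = T(L) = 40 C.
A = np.zeros((n, n))
b = np.full(n, q * dx**2 / k)
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -1.0, 2.0, -1.0
A[0, 0] = A[-1, -1] = 1.0       # Dirichlet boundary rows
b[0] = b[-1] = 40.0
T = np.linalg.solve(A, b)

# Analytic check: T(x) = 40 + q*x*(L - x)/(2k), peak rise q*L^2/(8k) = 1 K.
peak = T.max() - 40.0
```

    The same balance, a volumetric source conducted out to the boundaries, governs the full 3D model once convection and radiation are added at the outer surfaces.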

  15. Comment on the asymptotics of a distribution-free goodness of fit test statistic.

    Science.gov (United States)

    Browne, Michael W; Shapiro, Alexander

    2015-03-01

    In a recent article Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness of fit test statistic is incomplete because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed how Browne's proof can be completed satisfactorily but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) test statistic by using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all complicating issues associated with them.

  16. Bremsstrahlung converter debris shields: test and analysis

    International Nuclear Information System (INIS)

    Reedy, E.D. Jr.; Perry, F.C.

    1983-10-01

    Electron beam accelerators are commonly used to create bremsstrahlung x-rays for effects testing. Typically, the incident electron beam strikes a sandwich of three materials: (1) a conversion foil, (2) an electron scavenger, and (3) a debris shield. Several laboratories, including Sandia National Laboratories, are developing bremsstrahlung x-ray sources with much larger test areas (approx. 200 to 500 cm²) than ever used before. Accordingly, the debris shield will be much larger than before and subject to loads which could cause shield failure. To prepare for this eventuality, a series of tests was run on the Naval Surface Weapons Center's Casino electron beam accelerator (approx. 1 MeV electrons, 100 ns FWHM pulse, 45 kJ beam energy). The primary goal of these tests was to measure the stress pulse which loads a debris shield. These measurements were made with carbon gages mounted on the back of the converter sandwich. At an electron beam fluence of about 1 kJ/cm², the measured peak compressive stress was typically in the 1 to 2 kbar range. The measured peak compressive stress scaled roughly linearly with fluence as the fluence level was increased to 10 kJ/cm². The duration of the compressive pulse was on the order of microseconds. In addition to the stress wave measurements, a limited number of tests were made to investigate the type of damage generated in several potential shield materials.

  17. Dictionaries and distributions: Combining expert knowledge and large scale textual data content analysis: Distributed dictionary representation.

    Science.gov (United States)

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2018-02-01

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
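
    The core DDR scoring step can be sketched in a few lines. The tiny 3-dimensional "embeddings" below are invented for illustration; in practice one would use pretrained vectors (e.g. word2vec or GloVe), as the method assumes a distributed semantic space.

```python
import numpy as np

# Minimal sketch of the DDR idea: score a text by the cosine similarity
# between the mean embedding of a concept dictionary and the mean embedding
# of the text's words. The toy vectors below are invented for illustration.
toy_vectors = {
    "happy":     np.array([0.90, 0.10, 0.00]),
    "joyful":    np.array([0.80, 0.20, 0.10]),
    "delighted": np.array([0.85, 0.15, 0.05]),
    "sad":       np.array([-0.90, 0.10, 0.00]),
    "miserable": np.array([-0.80, 0.20, 0.10]),
    "table":     np.array([0.00, 0.00, 1.00]),
}

def mean_vector(words):
    vecs = [toy_vectors[w] for w in words if w in toy_vectors]
    return np.mean(vecs, axis=0)

def ddr_score(dictionary, text_words):
    d, t = mean_vector(dictionary), mean_vector(text_words)
    return float(d @ t / (np.linalg.norm(d) * np.linalg.norm(t)))

positive_dict = ["happy", "joyful", "delighted"]
s_pos = ddr_score(positive_dict, ["happy", "table"])    # > 0: leans positive
s_neg = ddr_score(positive_dict, ["sad", "miserable"])  # < 0: leans away
```

    Unlike word counting, a span of text scores nonzero even when it contains no dictionary word, as long as its words sit near the dictionary in the embedding space.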

  18. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2013-05-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with some licensed primary users under an interference temperature constraint. We assume that the DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit error rate performance metrics. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an analysis for a random vector quantization design algorithm. Specifically, the approximate statistics functions of the squared inner product between the optimal and quantized vectors are derived. With these statistics, we analyze the outage performance. Furthermore, the effects of channel estimation error and number of primary users on the system performance are investigated. Finally, optimal power adaptation and cochannel interference are considered and analyzed. Numerical and simulation results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.
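
    The limited-feedback step analyzed above can be illustrated with a common random vector quantization design (an illustrative model, not the paper's exact system): the receiver picks, from a codebook of 2^B random unit vectors, the codeword maximizing the squared inner product with the optimal beamforming vector.

```python
import numpy as np

# RVQ sketch (assumed illustrative parameters): quantize the optimal
# beamformer with a random unit-vector codebook and measure the loss
# 1 - |c^H w|^2, which shrinks as the number of feedback bits B grows.
rng = np.random.default_rng(5)
M = 4   # beamforming vector dimension (assumed)
B = 6   # feedback bits -> 64 codewords

def random_codebook(size, dim):
    C = rng.normal(size=(size, dim)) + 1j * rng.normal(size=(size, dim))
    return C / np.linalg.norm(C, axis=1, keepdims=True)

h = rng.normal(size=M) + 1j * rng.normal(size=M)  # channel realization
w_opt = h / np.linalg.norm(h)                     # optimal beamformer

C = random_codebook(2**B, M)
gains = np.abs(C.conj() @ w_opt) ** 2             # squared inner products
idx = int(np.argmax(gains))
w_q = C[idx]                                      # quantized beamformer
quantization_loss = 1.0 - gains[idx]              # in [0, 1)
```

    The statistics of this squared inner product are exactly the quantities whose approximations drive the outage analysis in the abstract.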

  19. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Full Text Available Modern power systems consist of many interconnected synchronous generators with different inertia constants, connected by a large transmission network, with ever increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications are stored in different formats in a heterogeneous environment, and the applications themselves have been developed and deployed on different platforms and in different language paradigms. Interoperability between power system applications therefore becomes a major issue. The main aim of this paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible, loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services, which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.

  20. A distribution-free test for anomalous gamma-ray spectra

    International Nuclear Information System (INIS)

    Chan, Kung-sik; Li, Jinzheng; Eichinger, William; Bai, Er-Wei

    2014-01-01

    Gamma-ray spectra are increasingly acquired when monitoring cross-border traffic or in area searches for lost or orphan special nuclear material (SNM). The signal in such data is generally weak, resulting in poorly resolved spectra and making it hard to detect the presence of SNM. We develop a new test for detecting anomalous spectra by characterizing the complete shape change in a spectrum relative to background radiation; the proposed method may serve as a tripwire for routine screening for SNM. We show that, with increasing detection time, the limiting distribution of the test statistic is given by a functional of the Brownian bridge. The efficacy of the proposed method is illustrated by simulations. - Highlights: • We develop a new non-parametric test for detecting anomalous gamma-ray spectra. • The proposed test has good empirical power for detecting weak signals. • It can serve as an effective tripwire for invoking more thorough scrutiny of the source.
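
    A distribution-free tripwire of this kind can be sketched with a Kolmogorov-Smirnov-type statistic (an illustrative analogue, not the authors' exact test): compare the empirical CDF of observed photon energies against the known background CDF; the null distribution of such a supremum statistic is likewise a functional of the Brownian bridge.

```python
import numpy as np

# Illustrative KS-type tripwire on synthetic "spectra". A narrow peak over
# a flat background produces a large supremum deviation of the empirical
# CDF from the background CDF; a background-only spectrum does not.
rng = np.random.default_rng(0)

def ks_statistic(sample, background_cdf):
    x = np.sort(sample)
    n = len(x)
    f = background_cdf(x)
    d_plus = np.max(np.arange(1, n + 1) / n - f)
    d_minus = np.max(f - np.arange(0, n) / n)
    return max(d_plus, d_minus)

background_cdf = lambda e: e                       # uniform energies on [0, 1]
bg_sample = rng.uniform(0.0, 1.0, 2000)            # consistent with background
anomalous = np.clip(rng.normal(0.3, 0.02, 2000), 0.0, 1.0)  # narrow "peak"

d_bg = ks_statistic(bg_sample, background_cdf)     # small under background
d_an = ks_statistic(anomalous, background_cdf)     # large for the anomaly
```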

  1. Non-parametric comparison of histogrammed two-dimensional data distributions using the Energy Test

    International Nuclear Information System (INIS)

    Reid, Ivan D; Lopes, Raul H C; Hobson, Peter R

    2012-01-01

    When monitoring complex experiments, comparison is often made between regularly acquired histograms of data and reference histograms which represent the ideal state of the equipment. With the larger HEP experiments now ramping up, there is a need for automation of this task since the volume of comparisons could overwhelm human operators. However, the two-dimensional histogram comparison tools available in ROOT have been noted in the past to exhibit shortcomings. We discuss a newer comparison test for two-dimensional histograms, based on the Energy Test of Aslan and Zech, which provides more conclusive discrimination between histograms of data coming from different distributions than methods provided in a recent ROOT release.
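
    The energy statistic itself is compact enough to sketch directly for 2-D point sets, using the logarithmic distance weighting R(r) = -ln r of Aslan and Zech (histogram input would be handled similarly via bin centres weighted by counts); the sample sizes and distributions below are invented for illustration.

```python
import numpy as np

# Sketch of the Aslan-Zech two-sample energy statistic in two dimensions:
# Phi = sum of weighted within-sample distances minus weighted cross-sample
# distances; a larger value indicates less compatible samples.
def energy_statistic(a, b, eps=1e-12):
    def weights(u, v):
        d = np.linalg.norm(u[:, None, :] - v[None, :, :], axis=2)
        return -np.log(d + eps)
    n, m = len(a), len(b)
    aa = weights(a, a); np.fill_diagonal(aa, 0.0)   # drop self-pairs
    bb = weights(b, b); np.fill_diagonal(bb, 0.0)
    ab = weights(a, b)
    return aa.sum() / (2 * n * n) + bb.sum() / (2 * m * m) - ab.sum() / (n * m)

rng = np.random.default_rng(1)
same = energy_statistic(rng.normal(0.0, 1.0, (200, 2)),
                        rng.normal(0.0, 1.0, (200, 2)))
diff = energy_statistic(rng.normal(0.0, 1.0, (200, 2)),
                        rng.normal(1.0, 1.0, (200, 2)))   # shifted sample
```

    In a monitoring setting, the statistic for a data/reference pair would be compared against a permutation-derived null to flag histograms drawn from a different distribution.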

  2. Equilibrium quality and mass flux distributions in an adiabatic three-subchannel test section

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Maganas, A.

    1993-01-01

    An experiment was designed to measure the fully-developed quality and mass flux distributions in an adiabatic three-subchannel test section. The three subchannels had the geometrical characteristics of the corner, side, and interior subchannels of a BWR-5 rod bundle. Data collected with Refrigerant-144 at pressures ranging from 7 to 14 bar, simulating operation with water in the range 55 to 103 bar, are reported. The average mass flux and quality in the test section were in the ranges 1300 to 1750 kg/m²·s and -0.03 to 0.25, respectively. The data are analyzed and presented in various forms.

  3. A hydrostatic leak test for water pipeline by using distributed optical fiber vibration sensing system

    Science.gov (United States)

    Wu, Huijuan; Sun, Zhenshi; Qian, Ya; Zhang, Tao; Rao, Yunjiang

    2015-07-01

    A hydrostatic leak test for a water pipeline using a distributed optical fiber vibration sensing (DOVS) system based on phase-sensitive OTDR technology is studied in this paper. By monitoring one end of a common communication optical fiber cable laid along the inner wall of the pipe, water leakages can be detected and located easily. Different apertures under different pressures are tested, and the results show that the DOVS responds well when the aperture is equal to or larger than 4 mm and the inner pressure reaches 0.2 MPa for a steel pipe with DN 91 cm × EN 2 cm.

  4. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a data-driven method, based on the K-means clustering algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after a detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
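
    The scenario-generation step can be sketched with plain Lloyd-style k-means on daily profiles; the synthetic "sunny" and "overcast" PV profiles below are invented for illustration and stand in for the paper's measured data.

```python
import numpy as np

# Sketch: cluster synthetic daily PV profiles into a few representative
# planning scenarios with k-means (invented profiles, fixed initial centers).
def kmeans(X, centers, iters=50):
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(len(centers))])
    return labels, centers

rng = np.random.default_rng(2)
t = np.linspace(0.0, 24.0, 24)
sunny = np.maximum(0.0, np.sin((t - 6.0) / 12.0 * np.pi))  # midday PV peak
days = np.vstack([sunny + rng.normal(0.0, 0.05, 24) for _ in range(30)] +
                 [0.2 + rng.normal(0.0, 0.05, 24) for _ in range(30)])

# Two representative scenarios: "sunny" vs "overcast" days.
labels, centers = kmeans(days, centers=days[[0, 30]])
```

    The resulting cluster centres serve as the reduced scenario set over which the GA/TOPSIS siting-and-sizing optimization is then run.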

  5. Improvement of the CULTEX® exposure technology by radial distribution of the test aerosol.

    Science.gov (United States)

    Aufderheide, Michaela; Heller, Wolf-Dieter; Krischenowski, Olaf; Möhle, Niklas; Hochrainer, Dieter

    2017-07-05

    The exposure of cell-based systems cultivated on microporous membranes at the air-liquid interface (ALI) has been accepted as an appropriate approach to simulate the exposure of cells of the respiratory tract to native airborne substances. The efficiency of such an exposure procedure with regard to stability and reproducibility depends on an optimal design of the interface between the cellular test system and the exposure technique. Current exposure systems favor dynamic guidance of the airborne substances to the surface of the cells in specially designed exposure devices. Two module types, based on a linear or radial feed of the test atmosphere to the test system, were used for these studies. Historically, development started with the linearly designed version, the CULTEX® glass modules, which fulfilled the basic requirements for running ALI exposure studies (Mohr and Durst, 2005). Instability in the distribution of different atmospheres to the cells led us to create a new exposure module characterized by stable and reproducible radial guidance of the aerosol to the cells: the CULTEX® RFS (Mohr et al., 2010). In this study, we describe the differences between the two systems with regard to particle distribution and deposition, clarifying the advantages and disadvantages of a radial versus a linear aerosol distribution concept.

  6. Testing collinear factorization and nuclear parton distributions with pA collisions at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Quiroga-Arias, Paloma [Departamento de Física de Partículas and IGFAE, Universidade de Santiago de Compostela, 15706 Santiago de Compostela (Spain); Milhano, Jose Guilherme [CENTRA, Departamento de Física, Instituto Superior Tecnico (IST), Av. Rovisco Pais 1, P-1049-001 Lisboa (Portugal); Wiedemann, Urs Achim, E-mail: pquiroga@fpaxpl.usc.es [Physics Department, Theory Unit, CERN, CH-1211 Geneve 23 (Switzerland)

    2011-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution functions (PDFs) in the proton. The extension of these analyses to nuclear parton distributions (nPDFs) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of non-linear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distributions, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here we argue that a proton-nucleus collision program at the LHC would provide a set of measurements allowing for unprecedented tests of the factorization assumption underlying global nPDF fits.

  7. Testing nuclear parton distributions with pA collisions at the TeV scale

    International Nuclear Information System (INIS)

    Quiroga-Arias, Paloma; Milhano, Jose Guilherme; Wiedemann, Urs Achim

    2010-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution functions (PDFs) in the proton. The extension of these analyses to nuclear parton distribution functions (nPDFs) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of nonlinear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distributions, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here, we argue that a proton-nucleus collision program at the Large Hadron Collider would provide a set of measurements that allow for unprecedented tests of the factorization assumption underlying global nPDF fits.

  8. Analysis of distribution systems with a high penetration of distributed generation

    DEFF Research Database (Denmark)

    Lund, Torsten

    Since the mid eighties, a large number of wind turbines and distributed combined heat and power plants (CHPs) have been connected to the Danish power system. Especially in the Western part, comprising Jutland and Funen, the penetration is high compared to the load demand. In some periods the wind power alone can cover the entire load demand. The objective of the work is to investigate the influence of wind power and distributed combined heat and power production on the operation of the distribution systems. Where other projects have focused on the modeling and control of the generators and prime movers, the focus of this project is on the operation of an entire distribution system with several wind farms and CHPs. Firstly, the subject of allocation of power system losses in a distribution system with distributed generation is treated. A new approach to loss allocation based on current injections...

  9. Finite element analysis of thermal stress distribution in different ...

    African Journals Online (AJOL)

    Nigerian Journal of Clinical Practice ... Von Mises and thermal stress distributions were evaluated. Results: In all ... distribution. Key words: Amalgam, finite element method, glass ionomer cement, resin composite, thermal stress ...

  10. Children's Perceptions of Tests: A Content Analysis

    Science.gov (United States)

    Bulgan, Gokce

    2018-01-01

    Anxiety that students experience during test taking negatively influences their academic achievement. Understanding how students perceive tests and how they feel during test taking could help in taking effective preventive measures. Hence, the current study focused on assessing children's perceptions of tests using content analysis. The sample…

  11. Analysis of the Astronomy Diagnostic Test

    Science.gov (United States)

    Brogt, Erik; Sabers, Darrell; Prather, Edward E.; Deming, Grace L.; Hufnagel, Beth; Slater, Timothy F.

    2007-01-01

    Seventy undergraduate class sections were examined from the database of Astronomy Diagnostic Test (ADT) results of Deming and Hufnagel to determine if course format correlated with ADT normalized gain scores. Normalized gains were calculated for four different classroom scenarios: lecture, lecture with discussion, lecture with lab, and lecture…
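
    The normalized gain referred to above is conventionally the Hake-style gain; the abstract does not give the formula, so the definition and numbers below are my illustration of the standard convention.

```python
# Hake-style normalized gain, the figure of merit typically used with ADT
# pre/post class means: the fraction of the possible improvement realized.
# g = (post% - pre%) / (100% - pre%)
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    return (post_pct - pre_pct) / (100.0 - pre_pct)

g = normalized_gain(35.0, 48.0)  # class mean rises from 35% to 48% -> g = 0.2
```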

  12. Stability analysis of distributed order fractional Chen system.

    Science.gov (United States)

    Aminikhah, H; Refahi Sheikhani, A; Rezazadeh, H

    2013-01-01

    We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results.

  13. Stability Analysis of Distributed Order Fractional Chen System

    Science.gov (United States)

    Aminikhah, H.; Refahi Sheikhani, A.; Rezazadeh, H.

    2013-01-01

    We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results. PMID:24489508

  14. Immunochromatographic diagnostic test analysis using Google Glass.

    Science.gov (United States)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2014-03-25

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostic tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of Google Glass through its hands-free, voice-controlled interface and digitally transmitted to a server for processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results, accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health.

  15. Factory Gate Pricing: An Analysis of the Dutch Retail Distribution

    NARCIS (Netherlands)

    H.M. le Blanc; F. Cruijssen (Frans); H.A. Fleuren; M.B.M. de Koster (René)

    2004-01-01

    Factory Gate Pricing (FGP) is a relatively new phenomenon in retail distribution. Under FGP, products are no longer delivered at the retailer distribution center, but collected by the retailer at the factory gates of the suppliers. Owing to both the asymmetry in the distribution networks

  16. Factory Gate Pricing : An Analysis of the Dutch Retail Distribution

    NARCIS (Netherlands)

    Le Blanc, H.M.; Cruijssen, F.C.A.M.; Fleuren, H.A.; de Koster, M.B.M.

    2004-01-01

    Factory Gate Pricing (FGP) is a relatively new phenomenon in retail distribution. Under FGP, products are no longer delivered at the retailer distribution center, but collected by the retailer at the factory gates of the suppliers. Owing to both the asymmetry in the distribution networks (the supplier

  17. Data Link Test and Analysis System/ATCRBS Transponder Test System Technical Reference

    Science.gov (United States)

    1990-05-01

    This document references material for personnel using or making software changes to the Data Link Test and Analysis System (DATAS) for Air Traffic Control Radar Beacon System (ATCRBS) transponder testing and data collection. This is one of a se...

  18. Non-regularized inversion method from light scattering applied to ferrofluid magnetization curves for magnetic size distribution analysis

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2014-01-01

    A numerical inversion method known from the analysis of light scattering by colloidal dispersions is now applied to magnetization curves of ferrofluids. The distribution of magnetic particle sizes or dipole moments is determined without assuming that the distribution is unimodal or of a particular shape. The inversion method enforces positive number densities via a non-negative least squares procedure. It is tested successfully on experimental and simulated data for ferrofluid samples with known multimodal size distributions. The created computer program MINORIM is made available on the web. - Highlights: • A method from light scattering is applied to analyze ferrofluid magnetization curves. • A magnetic size distribution is obtained without prior assumption of its shape. • The method is tested successfully on ferrofluids with a known size distribution. • The practical limits of the method are explored with simulated data including noise. • This method is implemented in the program MINORIM, freely available online.
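
    The inversion idea can be sketched with a Langevin kernel (the standard superparamagnetic model; the grids and the bimodal "distribution" below are dimensionless values invented for illustration) and SciPy's non-negative least squares solver.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch: model the magnetization curve as a linear superposition of
# Langevin terms over a grid of candidate dipole moments, then recover the
# non-negative number densities with NNLS (no regularization, no assumed
# distribution shape).
def langevin(x):
    return 1.0 / np.tanh(x) - 1.0 / x   # L(x) = coth(x) - 1/x, x > 0 here

H = np.linspace(0.05, 10.0, 60)            # applied-field grid
m = np.linspace(0.2, 5.0, 25)              # candidate dipole moments
K = m[None, :] * langevin(np.outer(H, m))  # kernel: moment m_j at field H_i

w_true = np.zeros(25)
w_true[5], w_true[18] = 1.0, 0.5           # bimodal "distribution"
data = K @ w_true                          # noiseless synthetic curve

w_hat, resid = nnls(K, data)               # non-negative least squares
```

    On noiseless consistent data the fitted curve matches exactly; with measurement noise the recovered densities broaden, which is the practical limit the authors explore.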

  19. On the state of acoustic emission analysis in pressure vessel and model vessel testing

    International Nuclear Information System (INIS)

    Morgner, W.; Theis, K.; Henke, F.; Imhof, D.

    1985-01-01

    In the GDR acoustic emission analysis is being applied primarily in connection with hydraulic pressure testing of vessels in chemical industry. It is, however, also used for testing and monitoring of equipment and components in other branches of industry. The state-of-the-art is presented with regard to equipment needed, training of personnel, licensing of testing methods and appropriate testing procedures. In particular, the evaluation of the sum curves and amplitude distributions is explained, using rupture tests of two oxygen cylinders and a compressed-air bottle as examples. (author)

  20. Statistical Analysis of Wave Climate Data Using Mixed Distributions and Extreme Wave Prediction

    Directory of Open Access Journals (Sweden)

    Wei Li

    2016-05-01

    Full Text Available The investigation of various aspects of the wave climate at a wave energy test site is essential for the development of reliable and efficient wave energy conversion technology. This paper presents studies of the wave climate based on nine years of wave observations from the 2005–2013 period, measured with a wave measurement buoy at the Lysekil wave energy test site located off the west coast of Sweden. A detailed analysis of the wave statistics is carried out to reveal the characteristics of the wave climate at this specific test site. The long-term extreme waves are estimated by applying the Peak over Threshold (POT) method to the measured wave data. The significant wave height and the maximum wave height at the test site for different return periods are also compared. In this study, a new approach using a mixed-distribution model is proposed to describe the long-term behavior of the significant wave height, and it shows an impressive goodness of fit to wave data from the test site. The mixed-distribution model is also applied to measured wave data from four other sites, which illustrates the general applicability of the proposed model. The methodologies used in this paper can be applied to general wave climate analysis of wave energy test sites: to estimate extreme waves for the survivability assessment of wave energy converters, and to characterize the long-term wave climate in order to forecast the wave energy resource of the test sites and the energy production of the wave energy converters.
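
    The POT step can be sketched as follows; the synthetic "daily" significant-wave-height record, threshold choice, and return period below are invented for illustration and stand in for the buoy data.

```python
import numpy as np
from scipy.stats import genpareto

# Sketch of Peaks-over-Threshold: fit a generalized Pareto distribution to
# significant-wave-height exceedances over a high threshold and read off an
# N-year return level (the quantile exceeded once in N years on average).
rng = np.random.default_rng(3)
years = 9.0
hs = rng.weibull(1.5, int(years * 365)) * 1.2   # synthetic "daily" Hs, metres

u = np.quantile(hs, 0.95)                       # threshold
exc = hs[hs > u] - u                            # exceedances over threshold
xi, _, sigma = genpareto.fit(exc, floc=0.0)     # fix location at 0

lam = exc.size / years                          # mean exceedances per year
N = 50.0                                        # return period, years
return_level = u + genpareto.ppf(1.0 - 1.0 / (lam * N),
                                 xi, loc=0.0, scale=sigma)
```

    In practice one would also decluster the exceedances so that a single storm does not contribute several dependent peaks.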

  1. Cerebrospinal Fluid (CSF) Analysis: MedlinePlus Lab Test Information

    Science.gov (United States)

    ... K. Brunner & Suddarth's Handbook of Laboratory and Diagnostic Tests. 2nd Ed, Kindle. Philadelphia: Wolters Kluwer Health, Lippincott Williams & Wilkins; c2014. Cerebrospinal Fluid Analysis; 144 p. Johns ...

  2. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standard ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States standard ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, it suggests the use of the hypergeometric distribution to calculate the parameters of sampling plans, avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, the discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing, rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing in the NP sense can produce a better understanding of applications even beyond the usual areas of industry and commerce, such as public health and political polling. With the new procedures, both sample size and sampling error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot tolerance percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise why consumer risk is necessarily associated with type II error. The resolution of these questions is new to the literature.
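
    The paper's first point is easy to check numerically. The sampling plan below (lot N = 500, sample n = 80, acceptance number c = 2, D = 25 defectives, i.e. 5% defective) is invented for illustration; with the sample a sizeable fraction of the lot, the exact hypergeometric acceptance probability visibly differs from the binomial approximation.

```python
from scipy.stats import hypergeom, binom

# Acceptance probability P(defectives in sample <= c) for a finite lot,
# computed exactly (hypergeometric) vs the infinite-lot approximation
# (binomial). Plan parameters are illustrative, not from a standard.
N, D, n, c = 500, 25, 80, 2

p_hyper = hypergeom.cdf(c, N, D, n)   # exact finite-lot probability
p_binom = binom.cdf(c, n, D / N)      # binomial approximation

# Sampling without replacement narrows the distribution (finite-population
# correction), so the lower tail below the mean of 4 defectives shrinks:
# p_hyper < p_binom for this plan.
```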

  3. A more powerful test based on ratio distribution for retention noninferiority hypothesis.

    Science.gov (United States)

    Deng, Ling; Chen, Gang

    2013-03-11

    Rothmann et al. (2003) proposed a method for the statistical inference of the fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect versus the control effect in the context of a time-to-event endpoint. One major concern using this method in the design of an NI trial is that, with a limited sample size, the power of the study is usually very low. This makes an NI trial impractical, particularly when using a time-to-event endpoint. To improve power, Wang et al. (2006) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under the null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption of Rothmann's test, that the observed control effect is always positive, i.e., that the observed hazard ratio for placebo over the control is greater than 1, is no longer necessary. Without assuming equal variance under the null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced when using the proposed ratio test for a fraction retention NI hypothesis.

  4. Dynamic models for transient stability analysis of transmission and distribution systems with distributed generation : an overview

    NARCIS (Netherlands)

    Boemer, J.C.; Gibescu, M.; Kling, W.L.

    2009-01-01

    Distributed Generation is increasing in nowadays power systems. Small scale systems such as photovoltaic, biomass or small cogeneration plants are connected to the distribution level, while large wind farms will be connected to the transmission level. Both trends lead to a replacement of large

  5. Development of a web service for analysis in a distributed network.

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.

  6. Development of a Web Service for Analysis in a Distributed Network

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. 
Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among

  7. Pressure Distribution Tests on a Series of Clark Y Biplane Cellules with Special Reference to Stability

    Science.gov (United States)

    Noyes, Richard W

    1933-01-01

    The pressure distribution data discussed in this report represent the results of part of an investigation of the factors affecting the aerodynamic safety of airplanes. The present tests were made on semispan, circular-tipped Clark Y airfoil models mounted in the conventional manner on a separation plane. Pressure readings were made simultaneously at all test orifices at each of 20 angles of attack between -8 degrees and +90 degrees. The results of the tests on each wing arrangement are compared on the bases of maximum normal force coefficient, lateral stability at a low rate of roll, and relative longitudinal stability. Tabular data are also presented giving the center-of-pressure location of each wing.

  8. Rod internal pressure quantification and distribution analysis using Frapcon

    Energy Technology Data Exchange (ETDEWEB)

    Jessee, Matthew Anderson [ORNL; Wieselquist, William A [ORNL; Ivanov, Kostadin [Pennsylvania State University, University Park

    2015-09-01

    This report documents work performed supporting the Department of Energy (DOE) Office of Nuclear Energy (NE) Fuel Cycle Technologies Used Fuel Disposition Campaign (UFDC) under work breakdown structure element 1.02.08.10, ST Analysis. In particular, this report fulfills the M4 milestone M4FT-15OR0810036, Quantify effects of power uncertainty on fuel assembly characteristics, within work package FT-15OR081003 ST Analysis-ORNL. This research was also supported by the Consortium for Advanced Simulation of Light Water Reactors (http://www.casl.gov), an Energy Innovation Hub (http://www.energy.gov/hubs) for Modeling and Simulation of Nuclear Reactors under U.S. Department of Energy Contract No. DE-AC05-00OR22725. The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. A methodology is developed which tracks inter-cycle assembly movements and assembly batch fabrication information to build individual FRAPCON inputs for each evaluated WBN1 fuel rod. An alternate model for the amount of helium released from the zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layer is derived and applied to FRAPCON output data to quantify the RIP and CHS for these types of fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have RIP predictions that are elevated when compared to fuel rods without IFBA layers (i.e., standard rods), despite the fact that IFBA rods often have reduced fill pressures and annular fuel pellets. The primary contributor to elevated RIP predictions at burnups less than and greater than 30 GWd

  9. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data, and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN; for simplicity of comparison, age and gender were used to adjust for population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests, and SAN normalization performed better than normalization using the other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
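A rough sketch of the idea (not the authors' exact SAN formulation): standardize each laboratory value within its clinico-epidemiologic subgroup at the local site, then rescale it onto the corresponding subgroup's mean and standard deviation in a reference population, so distributions align across sites. Group labels and reference statistics here are hypothetical inputs:

```python
from statistics import mean, stdev

def subgroup_adjusted_normalize(values, groups, ref_stats):
    """Map each lab value onto its reference subgroup's scale.

    values:    local lab results
    groups:    subgroup label per value, e.g. (age_band, sex)
    ref_stats: {group: (ref_mean, ref_sd)} from a reference population
    """
    local = {}
    for g in set(groups):
        xs = [v for v, gg in zip(values, groups) if gg == g]
        local[g] = (mean(xs), stdev(xs))     # per-subgroup local stats
    out = []
    for v, g in zip(values, groups):
        m, s = local[g]
        rm, rs = ref_stats[g]
        out.append((v - m) / s * rs + rm)    # z-score, then rescale
    return out
```

After this transformation, every subgroup's mean and standard deviation match the reference, which is the property the abstract attributes to SAN ("means and standard deviations of distributions identical under population structure-adjusted conditions").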

  10. Qualification test and analysis report: solar collectors

    Energy Technology Data Exchange (ETDEWEB)

    1978-12-01

    Test results show that the Owens-Illinois Sunpak™ Model SEC 601 air-cooled collector meets the national standards and codes as defined in the Subsystem Performance Specification and Verification Plan of NASA/MSFC Contract NAS8-32259, dated October 28, 1976. The architectural and engineering firm of Smith, Hinchman and Grylls, Detroit, Michigan, acted in the capacity of the independent certification agency. The program calls for the development, fabrication, qualification and delivery of an air-liquid solar collector for solar heating, combined heating and cooling, and/or hot water systems.

  11. The ganga user interface for physics analysis and distributed resources

    CERN Document Server

    Soroko, A; Adams, D; Harrison, K; Charpentier, P; Maier, A; Mato, P; Moscicki, J T; Egede, U; Martyniak, J; Jones, R; Patrick, G N

    2004-01-01

    A physicist analysing data from the LHC experiments will have to deal with data and computing resources that are distributed across multiple locations and have different access methods. Ganga helps by providing a uniform high-level interface to the different low-level solutions for the required tasks, ranging from the specification of input data to the retrieval and post-processing of the output. For LHCb and ATLAS the goal is to assist in running jobs based on the Gaudi/Athena C++ framework. Ganga is written in python and presents the user with a single GUI rather than a set of different applications. It uses pluggable modules to interact with external tools for operations such as querying metadata catalogues, job configuration and job submission. At start-up, the user is presented with a list of templates for common analysis tasks, and information about ongoing tasks is stored from one invocation to the next. Ganga can also be used through a command line interface. This closely mirrors the functionality of ...

  12. Reliability analysis applied to structural tests

    Science.gov (United States)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  13. Distribution

    Science.gov (United States)

    John R. Jones

    1985-01-01

    Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....

  14. The analysis of annual dose distributions for radiation workers

    International Nuclear Information System (INIS)

    Mill, A.J.

    1984-05-01

    The system of dose limitation recommended by the ICRP includes the requirement that no worker shall exceed the current dose limit of 50 mSv/a. Continuous exposure at this limit corresponds to an annual death rate comparable with that of 'high risk' industries. In practice, there is a distribution of doses with an arithmetic mean lower than the dose limit. In its 1977 report, UNSCEAR defined a reference dose distribution for the purposes of comparison. However, this two-parameter distribution does not show the departure from log-normality normally observed in actual distributions at doses which are a significant proportion of the annual limit. In this report an alternative model is suggested, based on a three-parameter log-normal distribution. The third parameter is an 'effective dose limit', and such a model fits very well the departure from log-normality observed in actual dose distributions. (author)
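For orientation, a two-parameter log-normal baseline can be fitted from the log-moments of recorded doses (the report's three-parameter variant, with its "effective dose limit", is not sketched here). The population parameters and sample are hypothetical:

```python
import math
import random
import statistics

random.seed(7)
# Hypothetical annual doses (mSv) drawn from a log-normal population
mu, sigma = math.log(2.0), 0.8
doses = [random.lognormvariate(mu, sigma) for _ in range(5000)]

# Fit by the method of moments on log-doses
logs = [math.log(d) for d in doses]
mu_hat, sigma_hat = statistics.mean(logs), statistics.stdev(logs)

# Estimated fraction of workers above a 50 mSv/a limit
z = (math.log(50.0) - mu_hat) / sigma_hat
frac_above = 0.5 * math.erfc(z / math.sqrt(2))
print(mu_hat, sigma_hat, frac_above)
```

The fitted tail fraction is what a two-parameter model predicts; the paper's point is precisely that real dose distributions bend away from this prediction near the limit, motivating the third parameter.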

  15. A Maximum Entropy Approach to Loss Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Marco Bee

    2013-03-01

    In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; therefore, this methodology is very general, as it nests most classical parametric approaches. Sampling the ME distribution is essential in many contexts, such as loss models constructed via compound distributions. Given the difficulties in carrying out exact simulation, we propose an innovative algorithm, obtained by means of an extension of Adaptive Importance Sampling (AIS), for the approximate simulation of the ME distribution. Several numerical experiments confirm that the AIS-based simulation technique works well, and an application to insurance data gives further insights into the usefulness of the method for modelling, estimating and simulating loss distributions.
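A minimal discrete illustration of entropy maximization under a moment constraint (not the paper's AIS algorithm): with a single mean constraint, the ME distribution takes the Gibbs form p_i ∝ exp(−λ·x_i), and λ can be located by bisection because the constrained mean is monotone in λ. Support and target mean are arbitrary:

```python
import math

def max_entropy_mean(support, target_mean, lo=-50.0, hi=50.0, iters=200):
    """ME pmf on `support` subject to sum(p_i * x_i) == target_mean.

    With one mean constraint the entropy maximizer has the Gibbs form
    p_i proportional to exp(-lam * x_i); lam is found by bisection,
    using the fact that the constrained mean is decreasing in lam.
    """
    def pmf(lam):
        e = [-lam * x for x in support]
        m = max(e)                        # shift exponents to avoid overflow
        w = [math.exp(v - m) for v in e]
        z = sum(w)
        return [wi / z for wi in w]

    for _ in range(iters):
        mid = (lo + hi) / 2
        p = pmf(mid)
        if sum(pi * x for pi, x in zip(p, support)) > target_mean:
            lo = mid
        else:
            hi = mid
    return pmf((lo + hi) / 2)
```

For example, on support 0..10 with target mean 3, the resulting pmf matches the mean constraint and has strictly higher Shannon entropy than any other feasible distribution, such as the two-point mixture putting 0.7 at 0 and 0.3 at 10.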

  16. System related testing and analysis of FRECOPA

    International Nuclear Information System (INIS)

    Durin, C.

    1992-01-01

    Results from the French Cooperative Payload (FRECOPA) system analysis are presented. FRECOPA was one of the numerous experiments flown on the Long Duration Exposure Facility (LDEF) satellite. In our flight configuration (LEO orbit, trailing edge), the environment was a better vacuum than at the leading edge, with many thermal cycles (32000) and a large amount of UV radiation (11100 equivalent sun hours). The satellite was mainly bombarded by micro-particles; it saw a low atomic flux and minor doses of protons and electrons

  17. Speed testing of Sliding spectrum analysis

    International Nuclear Information System (INIS)

    Frenski, Emil; Manolev, Dimitar

    2013-01-01

    The standard method for spectrum analysis in DSP is the Discrete Fourier Transform (DFT), typically implemented using a Fast Fourier Transform (FFT) algorithm. The reconstruction of the time-domain signal is then performed by the IFFT (Inverse Fast Fourier Transform) algorithm. The FFT calculates the spectral components in a window, on a block-by-block basis. If that window is moved by one sample, it is obvious that most of the information will remain the same. This article shows how to measure the execution time of scripts implementing the SDFT (Sliding DFT) algorithm, written in MATLAB.
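The single-bin update behind the sliding DFT (the recurrence itself, independent of the article's MATLAB timing scripts) can be sketched as follows; window length and bin index are arbitrary:

```python
import cmath

def sliding_dft_bin(window, stream, k):
    """Track DFT bin k of a length-N sliding window.

    Per new sample the bin is updated in O(1) via
        X_k <- (X_k - x_oldest + x_newest) * exp(+2j*pi*k/N)
    instead of recomputing an N-point transform.
    Returns the final window contents and the final bin value.
    """
    N = len(window)
    twiddle = cmath.exp(2j * cmath.pi * k / N)
    # Direct DFT of the initial window, computed once
    Xk = sum(window[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
    buf = list(window)
    for x_new in stream:
        x_old = buf.pop(0)
        buf.append(x_new)
        Xk = (Xk - x_old + x_new) * twiddle
    return buf, Xk
```

This O(1)-per-sample update versus an O(N log N) FFT per shift is exactly why execution-time comparisons of SDFT implementations are of interest.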

  18. A Distributional Analysis of the Gender Wage Gap in Bangladesh

    OpenAIRE

    Salma Ahmed; Pushkar Maitra

    2011-01-01

    This paper decomposes the gender wage gap along the entire wage distribution into an endowment effect and a discrimination effect, taking into account possible selection into full-time employment. Applying a new decomposition approach to the Bangladesh Labour Force Survey (LFS) data, we find that women are paid less than men everywhere on the wage distribution and that the gap is larger at the lower end of the distribution. Discrimination against women is the primary determinant of the wage gap. W...

  19. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility patterns. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analyses of the finish-time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of partial finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners in eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition strengthens again in the last course of the middle stage, there is a transition from a Gaussian distribution back to a log-normal one at the last stage. This study may enrich research on human mobility patterns and draw attention to the velocity features of human mobility.
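Distinguishing log-normal from Gaussian velocity distributions, as in the findings above, comes down to comparing fitted likelihoods; a bare-bones version with synthetic data (not the marathon records, and with hypothetical parameters):

```python
import math
import random

def gauss_loglik(xs):
    """Maximized log-likelihood of a normal fit (MLE mean and variance)."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def lognorm_loglik(xs):
    """Maximized log-likelihood of a log-normal fit: a normal fit on the
    logs plus the change-of-variables (Jacobian) term -sum(log x)."""
    logs = [math.log(x) for x in xs]
    return gauss_loglik(logs) - sum(logs)

random.seed(3)
# Synthetic "velocities" from a log-normal population
velocities = [random.lognormvariate(2.4, 0.35) for _ in range(2000)]
print(lognorm_loglik(velocities), gauss_loglik(velocities))
```

On genuinely log-normal data the log-normal likelihood dominates, mirroring how the stage-by-stage transitions in the paper can be detected from the data.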

  20. Development and testing of a diagnostic system for intelligent distributed control at EBR-2

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ruhl, D.W.; Klevans, E.H.; Robinson, G.E.

    1990-01-01

    A diagnostic system is under development for demonstration of intelligent distributed control at the Experimental Breeder Reactor II (EBR-II). In the first phase of the project a diagnostic system is being developed for the EBR-II steam plant based on the DISYS expert systems approach. Current testing uses recorded plant data and data from simulated plant faults. The dynamical simulation of the EBR-II steam plant uses the Babcock and Wilcox (B&W) Modular Modeling System (MMS). At EBR-II the diagnostic system operates on a UNIX workstation and receives live plant data from the plant Data Acquisition System (DAS). Future work will seek implementation of the steam plant diagnostic in a distributed manner using UNIX-based computers and a Bailey microprocessor-based control system. 10 refs., 6 figs

  1. Field test of a continuous-variable quantum key distribution prototype

    International Nuclear Information System (INIS)

    Fossier, S; Debuisschert, T; Diamanti, E; Villing, A; Tualle-Brouri, R; Grangier, P

    2009-01-01

    We have designed and realized a prototype that implements a continuous-variable quantum key distribution (QKD) protocol based on coherent states and reverse reconciliation. The system uses time and polarization multiplexing for optimal transmission and detection of the signal and phase reference, and employs sophisticated error-correction codes for reconciliation. The security of the system is guaranteed against general coherent eavesdropping attacks. The performance of the prototype was tested over preinstalled optical fibres as part of a quantum cryptography network combining different QKD technologies. The stable and automatic operation of the prototype over 57 h yielded an average secret key distribution rate of 8 kbit/s over a 3 dB loss optical fibre, including the key extraction process and all quantum and classical communication. This system is therefore ideal for securing communications in metropolitan size networks with high-speed requirements.

  2. Geometry of the q-exponential distribution with dependent competing risks and accelerated life testing

    Science.gov (United States)

    Zhang, Fode; Shi, Yimin; Wang, Ruibing

    2017-02-01

    In the information geometry suggested by Amari (1985) and Amari et al. (1987), a parametric statistical model can be regarded as a differentiable manifold with the parameter space as a coordinate system. Noting that the q-exponential distribution plays an important role in Tsallis statistics (see Tsallis, 2009), this paper investigates the geometry of the q-exponential distribution with dependent competing risks and accelerated life testing (ALT). A copula function based on the q-exponential function, which can be considered a generalized Gumbel copula, is discussed to illustrate the structure of the dependent random variables. Employing two iterative algorithms, simulation results are given to compare the performance of the estimations and the levels of association under different hybrid progressive censoring schemes (HPCSs).
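The q-exponential underlying both the distribution and the generalized Gumbel copula is simple to state: e_q(x) = [1 + (1−q)x]^(1/(1−q)) on the region where the base is positive, recovering the ordinary exponential as q → 1. A small sketch of the function and its limit:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)) for q != 1,
    reducing to exp(x) as q -> 1; returns 0 where the base is
    non-positive (the usual cutoff convention)."""
    if q == 1:
        return math.exp(x)
    base = 1 + (1 - q) * x
    if base <= 0:
        return 0.0
    return base ** (1 / (1 - q))
```

For instance, e_q(2) with q = 0.5 is (1 + 0.5·2)² = 4, and values of q very close to 1 approach exp(x).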

  3. Wavefront-error evaluation by mathematical analysis of experimental Foucault-test data

    Science.gov (United States)

    Wilson, R. G.

    1975-01-01

    The diffraction theory of the Foucault test provides an integral formula expressing the complex amplitude and irradiance distribution in the Foucault pattern of a test mirror (lens) as a function of wavefront error. Recent literature presents methods of inverting this formula to express wavefront error in terms of irradiance in the Foucault pattern. The present paper describes a study in which the inversion formulation was applied to photometric Foucault-test measurements on a nearly diffraction-limited mirror to determine wavefront errors for direct comparison with ones determined from scatter-plate interferometer measurements. The results affirm the practicability of the Foucault test for quantitative wavefront analysis of very small errors, and they reveal the fallacy of the prevalent belief that the test is limited to qualitative use only. Implications of the results with regard to optical testing and the potential use of the Foucault test for wavefront analysis in orbital space telescopes are discussed.

  4. Core test reactor shield cooling system analysis

    International Nuclear Information System (INIS)

    Larson, E.M.; Elliott, R.D.

    1971-01-01

    System requirements for cooling the shield within the vacuum vessel for the core test reactor are analyzed. The total heat to be removed by the coolant system is less than 22,700 Btu/hr, with an additional 4600 Btu/hr to be removed by the 2-inch thick steel plate below the shield. The maximum temperature of the concrete in the shield can be kept below 200 0 F if the shield plug walls are kept below 160 0 F. The walls of the two ''donut'' shaped shield segments, which are cooled by the water from the shield and vessel cooling system, should operate below 95 0 F. The walls of the center plug, which are cooled with nitrogen, should operate below 100 0 F. (U.S.)

  5. Parametric Analysis of a Hover Test Vehicle using Advanced Test Generation and Data Analysis

    Science.gov (United States)

    Gundy-Burlet, Karen; Schumann, Johann; Menzies, Tim; Barrett, Tony

    2009-01-01

    Large complex aerospace systems are generally validated in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. This is due to the large parameter space, and complex, highly coupled nonlinear nature of the different systems that contribute to the performance of the aerospace system. We have addressed the factors deterring such an analysis by applying a combination of technologies to the area of flight envelope assessment. We utilize n-factor (2,3) combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. The data generated is automatically analyzed through a combination of unsupervised learning using a Bayesian multivariate clustering technique (AutoBayes) and supervised learning of critical parameter ranges using the machine-learning tool TAR3, a treatment learner. Covariance analysis with scatter plots and likelihood contours are used to visualize correlations between simulation parameters and simulation results, a task that requires tool support, especially for large and complex models. We present results of simulation experiments for a cold-gas-powered hover test vehicle.
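The 2-factor ("all-pairs") case of the n-factor combinatorial variation mentioned above can be sketched with a simple greedy generator; the parameter names and values below are hypothetical, and production tools use far more efficient constructions:

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedy all-pairs (2-factor) covering suite.

    params: dict mapping parameter name -> list of candidate values.
    Returns a list of test cases (dicts) such that every value pair of
    every two parameters occurs in at least one case.
    """
    names = sorted(params)
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add((i, va, j, vb))
    suite = []
    while uncovered:
        # Pick the full-factorial candidate covering the most new pairs
        best, best_cov = None, -1
        for case in product(*(params[n] for n in names)):
            cov = sum(1 for (i, va, j, vb) in uncovered
                      if case[i] == va and case[j] == vb)
            if cov > best_cov:
                best, best_cov = case, cov
        suite.append(dict(zip(names, best)))
        uncovered = {(i, va, j, vb) for (i, va, j, vb) in uncovered
                     if not (best[i] == va and best[j] == vb)}
    return suite
```

For three parameters with three values each, the full factorial has 27 cases, while an all-pairs suite needs far fewer while still exercising every two-way interaction, which is the point of the n-factor approach.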

  6. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
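The parallel pattern described (the same simulation evaluated at many parameter samples, with results gathered centrally for uncertainty propagation) can be sketched generically with Python's standard library; the `simulation` function here is a stand-in, not PAPIRUS itself:

```python
from concurrent.futures import ThreadPoolExecutor
from statistics import mean, stdev

def simulation(k):
    """Stand-in for an engineering code evaluated at parameter value k."""
    return k ** 2 + 1.0

def propagate(samples, workers=4):
    """Evaluate the model at each parameter sample in parallel and
    summarize the output distribution (uncertainty propagation)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        outputs = list(pool.map(simulation, samples))  # order-preserving
    return mean(outputs), stdev(outputs)
```

In a real framework the worker pool would dispatch runs of the simulation code to remote clients, but the structure (scatter parameter samples, gather outputs, summarize the distribution) is the same.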

  7. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  8. Repeatability study of replicate crash tests: A signal analysis approach.

    Science.gov (United States)

    Seppi, Jeremy; Toczyski, Jacek; Crandall, Jeff R; Kerrigan, Jason

    2017-10-03

    To provide an objective basis on which to evaluate the repeatability of vehicle crash test methods, a recently developed signal analysis method was used to evaluate correlation of sensor time history data between replicate vehicle crash tests. The goal of this study was to evaluate the repeatability of rollover crash tests performed with the Dynamic Rollover Test System (DRoTS) relative to other vehicle crash test methods. Test data from DRoTS tests, deceleration rollover sled (DRS) tests, frontal crash tests, frontal offset crash tests, small overlap crash tests, small overlap impact (SOI) crash tests, and oblique crash tests were obtained from the literature and publicly available databases (the NHTSA vehicle database and the Insurance Institute for Highway Safety TechData) to examine crash test repeatability. Signal analysis of the DRoTS tests showed that force and deformation time histories had good to excellent repeatability, whereas vehicle kinematics showed only fair repeatability due to the vehicle mounting method for one pair of tests and slightly dissimilar mass properties (2.2%) in a second pair of tests. Relative to the DRS, the DRoTS tests showed very similar or higher levels of repeatability in nearly all vehicle kinematic data signals with the exception of global X' (road direction of travel) velocity and displacement due to the functionality of the DRoTS fixture. Based on the average overall scoring metric of the dominant acceleration, DRoTS was found to be as repeatable as all other crash tests analyzed. Vertical force measures showed good repeatability and were on par with frontal crash barrier forces. Dynamic deformation measures showed good to excellent repeatability as opposed to poor repeatability seen in SOI and oblique deformation measures. 
Using the signal analysis method outlined in this article, the DRoTS was shown to have repeatability equal to or better than that of the crash test methods used in government regulatory and consumer evaluation testing.

  9. Testing DARKexp against energy and density distributions of Millennium-II halos

    Energy Technology Data Exchange (ETDEWEB)

    Nolting, Chris; Williams, Liliya L.R. [School of Physics and Astronomy, University of Minnesota, 116 Church Street SE, Minneapolis, MN, 55454 (United States); Boylan-Kolchin, Michael [Department of Astronomy, The University of Texas at Austin, 2515 Speedway, Stop C1400, Austin, TX, 78712 (United States); Hjorth, Jens, E-mail: nolting@astro.umn.edu, E-mail: llrw@astro.umn.edu, E-mail: mbk@astro.as.utexas.edu, E-mail: jens@dark-cosmology.dk [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, Copenhagen, DK-2100 Denmark (Denmark)

    2016-09-01

    We test the DARKexp model for relaxed, self-gravitating, collisionless systems against equilibrium dark matter halos from the Millennium-II simulation. While limited tests of DARKexp against simulations and observations have been carried out elsewhere, this is the first time the testing is done with a large sample of simulated halos spanning a factor of ∼ 50 in mass, and using independent fits to density and energy distributions. We show that DARKexp, a one-shape-parameter family, provides very good fits to the shapes of density profiles, ρ(r), and differential energy distributions, N(E), of individual simulated halos. The best-fit shape parameters φ₀ obtained from the two types of fits are correlated, though with scatter. Our most important conclusions come from ρ(r) and N(E) that have been averaged over many halos. These show that the bulk of the deviations between DARKexp and individual Millennium-II halos comes from halo-to-halo fluctuations, likely driven by substructure and other density perturbations. The average ρ(r) and N(E) are quite smooth and follow DARKexp very closely. The only deviation that remains after averaging is small, and located at the most bound energies for N(E) and the smallest radii for ρ(r). Since the deviation is confined to 3–4 smoothing lengths, and is larger for low mass halos, it is likely due to numerical resolution effects.

  10. Distributed Leadership in Drainage Basin Management: A Critical Analysis of ‘River Chief Policy’ from a Distributed Leadership Perspective

    Science.gov (United States)

    Zhang, Liuyi

    2018-02-01

    Water resources management has been more significant than ever since the official document stipulated ‘three red lines’ to strictly control water usage and water pollution, accelerating the promotion of the ‘River Chief Policy’ throughout China. The policy introduces creative approaches that involve people from different administrative levels and distributes power to increase drainage basin management efficiency. Its execution resembles features of distributed leadership theory, a widely acknowledged Western leadership theory with an innovative perspective suited to the modern world. This paper analyses the policy from a distributed leadership perspective using Taylor’s critical policy analysis framework.

  11. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  12. Testing the lognormality of the galaxy and weak lensing convergence distributions from Dark Energy Survey maps

    International Nuclear Information System (INIS)

    Clerkin, L.; Kirk, D.; Manera, M.; Lahav, O.; Abdalla, F.

    2016-01-01

    It is well known that the probability distribution function (PDF) of galaxy density contrast is approximately lognormal; whether the PDF of mass fluctuations derived from weak lensing convergence (κWL) is lognormal is less well established. We derive PDFs of the galaxy and projected matter density distributions via the counts-in-cells (CiC) method. We use maps of galaxies and weak lensing convergence produced from the Dark Energy Survey Science Verification data over 139 deg². We test whether the underlying density contrast is well described by a lognormal distribution for the galaxies, the convergence and their joint PDF. We confirm that the galaxy density contrast distribution is well modelled by a lognormal PDF convolved with Poisson noise at angular scales from 10 to 40 arcmin (corresponding to physical scales of 3–10 Mpc). We note that as κWL is a weighted sum of the mass fluctuations along the line of sight, its PDF is expected to be only approximately lognormal. We find that the κWL distribution is well modelled by a lognormal PDF convolved with Gaussian shape noise at scales between 10 and 20 arcmin, with a best-fitting χ²/dof of 1.11 compared to 1.84 for a Gaussian model, corresponding to p-values 0.35 and 0.07, respectively, at a scale of 10 arcmin. Above 20 arcmin a simple Gaussian model is sufficient. The joint PDF is also reasonably fitted by a bivariate lognormal. As a consistency check, we compare the variances derived from the lognormal modelling with those directly measured via CiC. Lastly, our methods are validated against maps from the MICE Grand Challenge N-body simulation.
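    The χ²/dof comparison described above can be sketched with synthetic data. This is an illustration only, not derived from the DES maps; the lognormal sample, bin choices, and seed are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic density contrast delta = rho/rho_bar - 1, lognormal by construction
delta = rng.lognormal(mean=0.0, sigma=0.5, size=20_000) - 1.0

counts, edges = np.histogram(delta, bins=40, range=(-0.9, 3.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Fit both models by maximum likelihood, then score each with a chi^2/dof-style statistic
mu, sig = stats.norm.fit(delta)
shape, loc, scale = stats.lognorm.fit(delta, floc=-1.0)  # support is delta > -1

def chi2_dof(model_pdf, n_params):
    resid = counts - model_pdf
    dof = len(counts) - n_params
    return np.sum(resid**2 / np.maximum(model_pdf, 1e-12)) / dof

chi2_gauss = chi2_dof(stats.norm.pdf(centers, mu, sig), 2)
chi2_logn = chi2_dof(stats.lognorm.pdf(centers, shape, loc, scale), 2)
print(chi2_logn < chi2_gauss)  # the lognormal model should fit better here
```

    A lower χ²/dof for the lognormal model mirrors the 1.11 versus 1.84 comparison reported in the abstract.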

  13. Field tests applying multi-agent technology for distributed control. Virtual power plants and wind energy

    Energy Technology Data Exchange (ETDEWEB)

    Schaeffer, G.J.; Warmer, C.J.; Hommelberg, M.P.F.; Kamphuis, I.G.; Kok, J.K. [Energy in the Built Environment and Networks, Petten (Netherlands)

    2007-01-15

    Multi-agent technology is state of the art ICT. It is not yet widely applied in power control systems. However, it has a large potential for bottom-up, distributed control of a network with large-scale renewable energy sources (RES) and distributed energy resources (DER) in future power systems. At least two major European R and D projects (MicroGrids and CRISP) have investigated its potential. Both grid-related as well as market-related applications have been studied. This paper will focus on two field tests, performed in the Netherlands, applying multi-agent control by means of the PowerMatcher concept. The first field test focuses on the application of multi-agent technology in a commercial setting, i.e. by reducing the need for balancing power in the case of intermittent energy sources, such as wind energy. In this case the flexibility is used of demand and supply of industrial and residential consumers and producers. Imbalance reduction rates of over 40% have been achieved applying the PowerMatcher, and with a proper portfolio even larger rates are expected. In the second field test the multi-agent technology is used in the design and implementation of a virtual power plant (VPP). This VPP digitally connects a number of micro-CHP units, installed in residential dwellings, into a cluster that is controlled to reduce the local peak demand of the common low-voltage grid segment the micro-CHP units are connected to. In this way the VPP supports the local distribution system operator (DSO) to defer reinforcements in the grid infrastructure (substations and cables)

  14. Field tests applying multi-agent technology for distributed control. Virtual power plants and wind energy

    International Nuclear Information System (INIS)

    Schaeffer, G.J.; Warmer, C.J.; Hommelberg, M.P.F.; Kamphuis, I.G.; Kok, J.K.

    2007-01-01

    Multi-agent technology is state of the art ICT. It is not yet widely applied in power control systems. However, it has a large potential for bottom-up, distributed control of a network with large-scale renewable energy sources (RES) and distributed energy resources (DER) in future power systems. At least two major European R and D projects (MicroGrids and CRISP) have investigated its potential. Both grid-related as well as market-related applications have been studied. This paper will focus on two field tests, performed in the Netherlands, applying multi-agent control by means of the PowerMatcher concept. The first field test focuses on the application of multi-agent technology in a commercial setting, i.e. by reducing the need for balancing power in the case of intermittent energy sources, such as wind energy. In this case the flexibility is used of demand and supply of industrial and residential consumers and producers. Imbalance reduction rates of over 40% have been achieved applying the PowerMatcher, and with a proper portfolio even larger rates are expected. In the second field test the multi-agent technology is used in the design and implementation of a virtual power plant (VPP). This VPP digitally connects a number of micro-CHP units, installed in residential dwellings, into a cluster that is controlled to reduce the local peak demand of the common low-voltage grid segment the micro-CHP units are connected to. In this way the VPP supports the local distribution system operator (DSO) to defer reinforcements in the grid infrastructure (substations and cables)

  15. Pi-Sat: A Low Cost Small Satellite and Distributed Spacecraft Mission System Test Platform

    Science.gov (United States)

    Cudmore, Alan

    2015-01-01

    Current technology and budget trends indicate a shift in satellite architectures from large, expensive single satellite missions to small, low cost distributed spacecraft missions. At the center of this shift is the SmallSat/CubeSat architecture. The primary goal of the Pi-Sat project is to create a low cost, easy to use Distributed Spacecraft Mission (DSM) test bed to facilitate the research and development of next-generation DSM technologies and concepts. This test bed also serves as a realistic software development platform for SmallSat and CubeSat architectures. The Pi-Sat is based on the popular $35 Raspberry Pi single-board computer featuring a 700 MHz ARM processor, 512 MB of RAM, a flash memory card, and a wealth of I/O options. The Raspberry Pi runs the Linux operating system and can easily run Code 582's Core Flight System flight software architecture. The low cost and high availability of the Raspberry Pi make it an ideal platform for Distributed Spacecraft Mission and CubeSat software development. The Pi-Sat models currently include a Pi-Sat 1U Cube, a Pi-Sat Wireless Node, and a Pi-Sat CubeSat processor card. The Pi-Sat project takes advantage of many popular trends in the Maker community, including low cost electronics, 3D printing, and rapid prototyping, in order to provide a realistic platform for flight software testing, training, and technology development. The Pi-Sat has also provided fantastic hands-on training opportunities for NASA summer interns and Pathways students.

  16. Distribution analysis of segmented wave sea clutter in littoral environments

    CSIR Research Space (South Africa)

    Strempel, MD

    2015-10-01

    Full Text Available are then fitted against the K-distribution. It is shown that the approach can accurately describe specific sections of the wave with a reduced error between actual and estimated distributions. The improved probability density function (PDF) representation...

  17. Evaluation of power control concepts using the PMAD systems test bed. [Power Management and Distribution

    Science.gov (United States)

    Beach, R. F.; Kimnach, G. L.; Jett, T. A.; Trash, L. M.

    1989-01-01

    The Lewis Research Center's Power Management and Distribution (PMAD) System testbed and its use in the evaluation of control concepts applicable to the NASA Space Station Freedom electric power system (EPS) are described. The facility was constructed to allow testing of control hardware and software in an environment functionally similar to the space station electric power system. Control hardware and software have been developed to allow operation of the testbed power system in a manner similar to a supervisory control and data acquisition (SCADA) system employed by utility power systems for control. The system hardware and software are described.

  18. Acceptance Sampling Plans Based on Truncated Life Tests for Sushila Distribution

    Directory of Open Access Journals (Sweden)

    Amer Ibrahim Al-Omari

    2018-03-01

    Full Text Available An acceptance sampling plan problem based on truncated life tests, when the lifetime follows a Sushila distribution, is considered in this paper. For various acceptance numbers, confidence levels, and values of the ratio between the fixed experiment time and the specified mean lifetime, the minimum sample sizes required to ascertain a specified mean life were found. The operating characteristic function values of the suggested sampling plans and the producer’s risk are presented. Some tables are provided and the results are illustrated by an example with a real data set.
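    The minimum-sample-size search behind such tables can be sketched generically: find the smallest n such that observing at most c failures by the truncation time is sufficiently unlikely when the true mean life equals the specified value. The exponential CDF below stands in for the Sushila CDF (an assumption for illustration; the paper's plans use the Sushila distribution), and the time ratio 0.628 is a conventional table value:

```python
import math
from scipy.stats import binom

def min_sample_size(c, t_over_mu, p_star):
    """Smallest n such that a lot whose true mean life equals the specified
    mean passes the test (<= c failures before time t) with probability
    at most 1 - p_star."""
    p_fail = 1.0 - math.exp(-t_over_mu)  # P(fail before t) under an exponential life (stand-in)
    n = c + 1
    while binom.cdf(c, n, p_fail) > 1.0 - p_star:
        n += 1
    return n

n = min_sample_size(c=2, t_over_mu=0.628, p_star=0.95)
print(n)
```

    Swapping in the Sushila CDF for `p_fail` would reproduce the structure of the paper's tables.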

  19. Vascular plants of the Nevada Test Site and Central-Southern Nevada: ecologic and geographic distributions

    Energy Technology Data Exchange (ETDEWEB)

    Beatley, J.C.

    1976-01-01

    The physical environment of the Nevada Test Site and surrounding area is described with regard to physiography, geology, soils, and climate. A discussion of plant associations is given for the Mojave Desert, Transition Desert, and Great Basin Desert. The vegetation of disturbed sites is discussed with regard to introduced species as well as endangered and threatened species. Collections of vascular plants were made during 1959 to 1975. The plants, belonging to 1093 taxa and 98 families are listed together with information concerning ecologic and geographic distributions. Indexes to families, genera, and species are included. (HLW)

  20. Collider detector beam line test table: a structural analysis

    International Nuclear Information System (INIS)

    Leininger, M.B.

    1983-01-01

    The apparatus which sweeps calorimeter and endwall modules through the beam during testing is called a beam line test table. Because of rather stringent requirements for the physical positioning of the modules, an analysis is performed here to determine the modifications to the current test table design which will minimize deflections of the table under load

  1. Analysis and optimization of blood-testing procedures.

    NARCIS (Netherlands)

    Bar-Lev, S.K.; Boxma, O.J.; Perry, D.; Vastazos, L.P.

    2017-01-01

    This paper is devoted to the performance analysis and optimization of blood testing procedures. We present a queueing model of two queues in series, representing the two stages of a blood-testing procedure. Service (testing) in stage 1 is performed in batches, whereas it is done individually in stage 2.

  2. An adaptive Mantel-Haenszel test for sensitivity analysis in observational studies.

    Science.gov (United States)

    Rosenbaum, Paul R; Small, Dylan S

    2017-06-01

    In a sensitivity analysis in an observational study with a binary outcome, is it better to use all of the data or to focus on subgroups that are expected to experience the largest treatment effects? The answer depends on features of the data that may be difficult to anticipate, a trade-off between unknown effect-sizes and known sample sizes. We propose a sensitivity analysis for an adaptive test similar to the Mantel-Haenszel test. The adaptive test performs two highly correlated analyses, one focused analysis using a subgroup, one combined analysis using all of the data, correcting for multiple testing using the joint distribution of the two test statistics. Because the two component tests are highly correlated, this correction for multiple testing is small compared with, for instance, the Bonferroni inequality. The test has the maximum design sensitivity of two component tests. A simulation evaluates the power of a sensitivity analysis using the adaptive test. Two examples are presented. An R package, sensitivity2x2xk, implements the procedure. © 2016, The International Biometric Society.
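    The correction for multiple testing via the joint distribution of two correlated statistics can be illustrated numerically. This is a sketch under assumed values (correlation 0.9, one-sided α = 0.05), not the sensitivity2x2xk code: because the two tests are highly correlated, the joint critical value sits well below the Bonferroni cutoff.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm
from scipy.optimize import brentq

alpha, rho = 0.05, 0.9  # overall level and assumed correlation of the two tests
mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

# Critical value c with P(max(Z1, Z2) > c) = alpha, i.e. 1 - P(Z1 <= c, Z2 <= c) = alpha
joint_crit = brentq(lambda c: 1.0 - mvn.cdf([c, c]) - alpha, 1.0, 4.0)
bonf_crit = norm.ppf(1.0 - alpha / 2.0)  # Bonferroni cutoff for two one-sided tests

print(round(joint_crit, 3), round(bonf_crit, 3))
```

    The joint critical value lies between the single-test cutoff (≈1.645) and the Bonferroni cutoff (≈1.96), which is the sense in which the correction "is small compared with the Bonferroni inequality."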

  3. TRACG post-test analysis of panthers prototype tests of SBWR passive containment condenser

    International Nuclear Information System (INIS)

    Fitch, J.R.; Billig, P.F.; Abdollahian, D.; Masoni, P.

    1997-01-01

    As part of the validation effort for application of the TRACG code to the Simplified Boiling Water Reactor (SBWR), calculations have been performed for the various test facilities which are part of the SBWR design and technology certification program. These calculations include post-test calculations for tests in the PANTHERS Passive Containment Condenser (PCC) test program. Sixteen tests from the PANTHERS/PCC test matrix were selected for post-test analysis. This set includes three steady-state pure-steam tests, nine steady-state steam-air tests, and four transient tests. The purpose of this paper is to present and discuss the results of the post-test analysis. The paper includes a brief description of the PANTHERS/PCC test facility and test matrix, a description of the PANTHERS/PCC post-test TRACG model and the manner in which the various types of tests in the post-test evaluation were simulated, and a presentation of the results of the TRACG simulation

  4. ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores

    Science.gov (United States)

    Allalouf, Avi

    2014-01-01

    The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…

  5. Correlation analysis for forced vibration test of the Hualien large scale seismic test (LSST) program

    International Nuclear Information System (INIS)

    Sugawara, Y.; Sugiyama, T.; Kobayashi, T.; Yamaya, H.; Kitamura, E.

    1995-01-01

    The correlation analysis for a forced vibration test of a 1/4-scale containment SSI test model constructed in Hualien, Taiwan was carried out for the case of after backfilling. Prior to this correlation analysis, the structural properties were revised to adjust the calculated fundamental frequency in the fixed base condition to that derived from the test results. A correlation analysis was carried out using the Lattice Model which was able to estimate the soil-structure effects with embedment. The analysis results coincide well with test results and it is concluded that the mathematical soil-structure interaction model established by the correlation analysis is efficient in estimating the dynamic soil-structure interaction effect with embedment. This mathematical model will be applied as a basic model for simulation analysis of earthquake observation records. (author). 3 refs., 12 figs., 2 tabs

  6. Equilibrium quality and mass flux distributions in an adiabatic three-subchannel test section

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Maganas, A.

    1995-01-01

    An experiment was designed to measure the fully developed quality and mass flux distributions in an adiabatic three-subchannel test section. The three subchannels had the geometrical characteristics of the corner, side, and interior subchannels of a boiling water reactor (BWR-5) rod bundle. Data collected with Refrigerant-114 at pressures ranging from 7 to 14 bars, simulating operation with water in the range 55 to 103 bars, are reported. The average mass flux and quality in the test section were in the ranges 1,300 to 1,750 kg/m²·s and −0.03 to 0.25, respectively. The data are analyzed and presented in various forms

  7. Skewness and kurtosis analysis for non-Gaussian distributions

    Science.gov (United States)

    Celikoglu, Ahmet; Tirnakli, Ugur

    2018-06-01

    In this paper we address a number of pitfalls regarding the use of kurtosis as a measure of deviations from the Gaussian. We treat kurtosis in both its standard definition and that which arises in q-statistics, namely q-kurtosis. We have recently shown that the relation proposed by Cristelli et al. (2012) between skewness and kurtosis can only be verified for relatively small data sets, independently of the type of statistics chosen; however it fails for sufficiently large data sets, if the fourth moment of the distribution is finite. For infinite fourth moments, kurtosis is not defined as the size of the data set tends to infinity. For distributions with finite fourth moments, the size, N, of the data set for which the standard kurtosis saturates to a fixed value, depends on the deviation of the original distribution from the Gaussian. Nevertheless, using kurtosis as a criterion for deciding which distribution deviates further from the Gaussian can be misleading for small data sets, even for finite fourth moment distributions. Going over to q-statistics, we find that although the value of q-kurtosis is finite in the range of 0 < q < 3, this quantity is not useful for comparing different non-Gaussian distributed data sets, unless the appropriate q value, which truly characterizes the data set of interest, is chosen. Finally, we propose a method to determine the correct q value and thereby to compute the q-kurtosis of q-Gaussian distributed data sets.
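    The separation between Gaussian and non-Gaussian sample kurtosis for a finite-fourth-moment distribution can be sketched numerically. This is an illustration under assumed choices, not the paper's analysis: a Student-t with df = 10 stands in for a non-Gaussian distribution (its excess kurtosis, 6/(df − 4) = 1, is finite, and df > 8 keeps the variance of the kurtosis estimator finite so the estimate is stable at large N).

```python
import numpy as np
from scipy.stats import kurtosis, t

rng = np.random.default_rng(3)
n = 10**6
gauss = rng.standard_normal(n)
heavy = t.rvs(df=10, size=n, random_state=rng)  # excess kurtosis 6/(df - 4) = 1

k_gauss = kurtosis(gauss)  # Fisher convention: 0 for a Gaussian
k_heavy = kurtosis(heavy)
print(round(k_gauss, 2), round(k_heavy, 2))
```

    For smaller df (infinite higher moments) the same estimate fluctuates wildly with N, which is the small-sample pitfall the abstract warns about.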

  8. Is Middle-Upper Arm Circumference “normally” distributed? Secondary data analysis of 852 nutrition surveys

    Directory of Open Access Journals (Sweden)

    Severine Frison

    2016-05-01

    Full Text Available Abstract Background Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. Methods This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise “non-normal” distributions. Results The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro–Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D’Agostino test) and 196 (36.8 %) had a kurtosis different from that of the normal distribution (Anscombe–Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were “normalised”, and 59.7 % after LOESS. Box-Cox power transformation had similar results, with 57 % of distributions approximating “normal” after transformation. Applying Box-Cox transformation after Spline or LOESS smoothing increased that proportion to 82.4 % and 82.7 %, respectively.
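    The normality checks named above map directly onto standard library calls. A minimal sketch on a synthetic right-skewed sample (the gamma parameters and sample size are assumptions, not survey data): Shapiro–Wilk for overall normality, D'Agostino's skewness test, and a Box-Cox power transformation to normalise the sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic, strictly positive, right-skewed "MUAC-like" sample (mm)
muac = rng.gamma(shape=9.0, scale=15.0, size=900)

w, p_shapiro = stats.shapiro(muac)  # H0: the sample is normally distributed
z, p_skew = stats.skewtest(muac)    # D'Agostino's skewness test

transformed, lam = stats.boxcox(muac)     # Box-Cox power transformation
w2, p_after = stats.shapiro(transformed)

print(p_shapiro < 0.05, p_after > p_shapiro)
```

    The skewed raw sample fails the Shapiro–Wilk test, while the Box-Cox-transformed sample is far closer to normal, mirroring the "normalisation" step in the study.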

  9. Analysis of discrete and continuous distributions of ventilatory time constants from dynamic computed tomography

    International Nuclear Information System (INIS)

    Doebrich, Marcus; Markstaller, Klaus; Karmrodt, Jens; Kauczor, Hans-Ulrich; Eberle, Balthasar; Weiler, Norbert; Thelen, Manfred; Schreiber, Wolfgang G

    2005-01-01

    In this study, an algorithm was developed to measure the distribution of pulmonary time constants (TCs) from dynamic computed tomography (CT) data sets during a sudden airway pressure step up. Simulations with synthetic data were performed to test the methodology as well as the influence of experimental noise. Furthermore the algorithm was applied to in vivo data. In five pigs sudden changes in airway pressure were imposed during dynamic CT acquisition in healthy lungs and in a saline lavage ARDS model. The fractional gas content in the imaged slice (FGC) was calculated by density measurements for each CT image. Temporal variations of the FGC were analysed assuming a model with a continuous distribution of exponentially decaying time constants. The simulations proved the feasibility of the method. The influence of experimental noise could be well evaluated. Analysis of the in vivo data showed that in healthy lungs ventilation processes can be more likely characterized by discrete TCs whereas in ARDS lungs continuous distributions of TCs are observed. The temporal behaviour of lung inflation and deflation can be characterized objectively using the described new methodology. This study indicates that continuous distributions of TCs reflect lung ventilation mechanics more accurately compared to discrete TCs
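    The discrete-time-constant case described above can be sketched as a nonlinear least-squares fit. The two-compartment model, parameter values, and noise level below are illustrative assumptions, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def fgc(t, a1, tau1, a2, tau2):
    """Two-compartment fractional gas content after a pressure step-up."""
    return a1 * (1.0 - np.exp(-t / tau1)) + a2 * (1.0 - np.exp(-t / tau2))

t = np.linspace(0.0, 10.0, 200)  # seconds
rng = np.random.default_rng(2)
# Synthetic signal with time constants 0.4 s and 3.0 s plus measurement noise
y = fgc(t, 0.3, 0.4, 0.2, 3.0) + rng.normal(0.0, 0.002, t.size)

popt, _ = curve_fit(fgc, t, y, p0=[0.25, 0.5, 0.25, 2.0])
a1, tau1, a2, tau2 = popt
print(round(min(tau1, tau2), 1), round(max(tau1, tau2), 1))
```

    The continuous-distribution analysis in the study generalises this by replacing the two discrete terms with an integral over a spectrum of decaying exponentials.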

  10. Analysis of the Effects of Streamwise Lift Distribution on Sonic Boom Signature

    Science.gov (United States)

    Yoo, Paul

    2013-01-01

    Investigation of sonic boom has been one of the major areas of study in aeronautics due to the benefits a low-boom aircraft has in both civilian and military applications. This work conducts a numerical analysis of the effects of streamwise lift distribution on the shock coalescence characteristics. A simple wing-canard-stabilator body model is used in the numerical simulation. The streamwise lift distribution is varied by fixing the canard at a deflection angle while trimming the aircraft with the wing and the stabilator at the desired lift coefficient. The lift and the pitching moment coefficients are computed using the Missile DATCOM v. 707. The flow field around the wing-canard- stabilator body model is resolved using the OVERFLOW-2 flow solver. Overset/ chimera grid topology is used to simplify the grid generation of various configurations representing different streamwise lift distributions. The numerical simulations are performed without viscosity unless it is required for numerical stability. All configurations are simulated at Mach 1.4, angle-of-attack of 1.50, lift coefficient of 0.05, and pitching moment coefficient of approximately 0. Four streamwise lift distribution configurations were tested.

  11. Performance Analysis of Radial Distribution Systems with UPQC and D-STATCOM

    Science.gov (United States)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-08-01

    This paper presents an effective method for finding the optimum locations of a unified power quality conditioner (UPQC) and a distributed static compensator (D-STATCOM) in a radial distribution system. The bus having the minimum losses is selected as the candidate bus for UPQC placement, and the optimal location of the D-STATCOM is found by the power loss index (PLI) method: the PLI values of all the buses are calculated, and the bus having the highest PLI value is the most favourable and is thus selected as the candidate bus for D-STATCOM placement. The main contributions of this paper are: (i) finding the optimum location of the UPQC in a radial distribution system (RDS) based on minimum power loss; (ii) finding the optimal size of the UPQC which offers minimum losses; (iii) calculation of the annual energy saving using the UPQC and D-STATCOM; (iv) cost analysis with and without UPQC and D-STATCOM placement; and (v) comparison of results with and without UPQC and D-STATCOM placement in the RDS. The algorithm is tested on the IEEE 33-bus and 69-bus radial distribution systems using MATLAB software.
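    The PLI ranking step can be sketched as a min-max scaling of per-bus loss reductions. The bus numbers and loss-reduction values below are hypothetical (in practice they come from repeated load-flow runs), and one common PLI normalisation is assumed:

```python
def power_loss_index(loss_reduction):
    """Scale per-bus loss reductions into [0, 1]; 1 marks the best bus."""
    lo, hi = min(loss_reduction), max(loss_reduction)
    return [(x - lo) / (hi - lo) for x in loss_reduction]

# Hypothetical loss reduction (kW) obtained by compensating each candidate bus
reduction = {6: 12.4, 7: 18.9, 8: 31.2, 9: 27.5, 10: 9.8}
pli = dict(zip(reduction, power_loss_index(list(reduction.values()))))
best_bus = max(pli, key=pli.get)
print(best_bus, round(pli[best_bus], 2))  # the candidate bus has PLI = 1.0
```

    The bus scoring PLI = 1.0 is the candidate for D-STATCOM placement.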

  12. Development of a Test Facility to Simulate the Reactor Flow Distribution of APR+

    International Nuclear Information System (INIS)

    Euh, D. J.; Cho, S.; Youn, Y. J.; Kim, J. T.; Kang, H. S.; Kwon, T. S.

    2011-01-01

    Recently the design of a new reactor, APR+, an advanced type of APR1400, has been under development. In order to analyze the thermal margin and hydraulic characteristics of APR+, quantification tests for flow and pressure distribution with conservation of the flow geometry are necessary. Hetsroni (1967) proposed four principal parameters for a hydraulic model representing a nuclear reactor prototype: geometry, relative roughness, Reynolds number, and Euler number. He concluded that the Euler number should be similar in the prototype and model under preservation of the aspect ratio of the flow path. The effect of the Reynolds number at its higher values on the Euler number is rather small, since the dependency of the form and frictional loss coefficients on the Reynolds number is seen to be small. ABB-CE has carried out several reactor flow model test programs, mostly for its prototype reactors; a series of tests was conducted using a 3/16-scale reactor model (see Lee et al., 2001). Lee et al. (1991) performed experimental studies using a 1/5.03-scale reactor flow model of Yonggwang nuclear units 3 and 4. They showed that the measured data met the acceptance criteria and were suitable for their intended use in performance and safety analyses. The design of the current test facility was based on conservation of the Euler number, the ratio of pressure drop to dynamic pressure, in a sufficiently turbulent region with a high Reynolds number. By reference to the previous studies, the APR+ design is linearly reduced at a 1/5 length ratio with a 1/2 velocity scale, which yields a 1/39.7 Reynolds number scaling ratio. In the present study, the design features of the facility, named 'ACOP', built to investigate the flow and pressure distributions, are described
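    The similarity bookkeeping above can be sketched as follows (my own illustration, not from the paper). The Euler number Eu = Δp/(ρV²) is preserved between prototype and model, while the Reynolds number Re = VL/ν scales with the length ratio, the velocity ratio, and the ratio of kinematic viscosities. A viscosity ratio ν_model/ν_prototype of about 3.97 is inferred here from the quoted 1/39.7 figure; it is an assumption, not a value stated in the record.

```python
def euler_number(dp, rho, v):
    """Ratio of pressure drop to dynamic pressure, Eu = dp / (rho * v^2)."""
    return dp / (rho * v**2)

def reynolds_ratio(length_ratio, velocity_ratio, viscosity_ratio=1.0):
    """Re_model / Re_prototype; viscosity_ratio = nu_model / nu_prototype."""
    return length_ratio * velocity_ratio / viscosity_ratio

same_fluid = reynolds_ratio(1/5, 1/2)        # 1/10 if both used the same fluid
as_quoted = reynolds_ratio(1/5, 1/2, 3.97)   # ~1/39.7, matching the stated ratio
print(round(1 / same_fluid, 1), round(1 / as_quoted, 1))
```

    A fluid-property ratio is needed to reconcile the geometric 1/10 with the quoted 1/39.7, consistent with the model operating at different fluid conditions from the prototype.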

  13. Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgindy, Tarek [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rossol, Michael N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Vergara, Claudio [MIT; Domingo, Carlos Mateo [IIT Comillas; Postigo, Fernando [IIT Comillas; de Cuadra, Fernando [IIT Comillas; Gomez, Tomas [IIT Comillas; Duenas, Pablo [MIT; Luke, Max [MIT; Li, Vivian [MIT; Vinoth, Mohan [GE Grid Solutions; Kadankodu, Sree [GE Grid Solutions

    2017-08-09

    The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and co-ordination with GRID DATA repository teams to house these datasets for public access will also be discussed.

  14. Wind Tunnel Tests for Wind Pressure Distribution on Gable Roof Buildings

    Science.gov (United States)

    2013-01-01

    Gable roof buildings are widely used as industrial buildings. Based on wind tunnel tests with rigid models, wind pressure distributions on gable roof buildings with different aspect ratios were measured simultaneously. Several characteristics of the measured wind pressure field on the model surfaces were analyzed, including mean wind pressure, fluctuating wind pressure, peak negative wind pressure, and proper orthogonal decomposition results of the measured wind pressure field. The results show that extremely high local suction often occurs at the leading edges of the longitudinal wall and windward roof, at the roof corners, and at the roof ridge, which are the locations severely damaged under strong wind. The aspect ratio of the building has a certain effect on the mean wind pressure coefficients, and the effect depends on the wind attack angle. Compared with the experimental results, the region division of roof corner and roof ridge from AIJ2004 is more reasonable than those from CECS102:2002 and MBMA2006. The contributions of the first several eigenvectors to the overall wind pressure distributions become much larger. The investigation offers some basic understanding for estimating wind load distribution on gable roof buildings and facilitates wind-resistant design of cladding components and their connections considering the wind load path. PMID:24082851

  15. Testing the mutual information expansion of entropy with multivariate Gaussian distributions.

    Science.gov (United States)

    Goethe, Martin; Fita, Ignacio; Rubi, J Miguel

    2017-12-14

    The mutual information expansion (MIE) represents an approximation of the configurational entropy in terms of low-dimensional integrals. It is frequently employed to compute entropies from simulation data of large systems, such as macromolecules, for which brute-force evaluation of the full configurational integral is intractable. Here, we test the validity of MIE for systems consisting of more than m = 100 degrees of freedom (dofs). The dofs are distributed according to multivariate Gaussian distributions which were generated from protein structures using a variant of the anisotropic network model. For the Gaussian distributions, we have semi-analytical access to the configurational entropy as well as to all contributions of MIE. This allows us to accurately assess the validity of MIE for different situations. We find that MIE diverges for systems containing long-range correlations which means that the error of consecutive MIE approximations grows with the truncation order n for all tractable n ≪ m. This fact implies severe limitations on the applicability of MIE, which are discussed in the article. For systems with correlations that decay exponentially with distance, MIE represents an asymptotic expansion of entropy, where the first successive MIE approximations approach the exact entropy, while MIE also diverges for larger orders. In this case, MIE serves as a useful entropy expansion when truncated up to a specific truncation order which depends on the correlation length of the system.
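    For multivariate Gaussians every term in the expansion is available in closed form, which is what makes the test above semi-analytical. A minimal sketch of the exact entropy and its second-order MIE approximation, using the standard Gaussian entropy H = ½ ln det(2πeΣ); the covariance matrix is invented for illustration:

```python
import numpy as np
from itertools import combinations

def gaussian_entropy(cov):
    """Differential entropy of a multivariate Gaussian: 0.5 * ln det(2*pi*e*cov)."""
    cov = np.atleast_2d(cov)
    _sign, logdet = np.linalg.slogdet(2.0 * np.pi * np.e * cov)
    return 0.5 * logdet

def mie_order2(cov):
    """Second-order mutual information expansion:
    sum of marginal entropies minus sum of pairwise mutual informations."""
    m = cov.shape[0]
    h1 = sum(gaussian_entropy(cov[i, i]) for i in range(m))
    mi2 = 0.0
    for i, j in combinations(range(m), 2):
        pair = cov[np.ix_([i, j], [i, j])]
        mi2 += (gaussian_entropy(cov[i, i]) + gaussian_entropy(cov[j, j])
                - gaussian_entropy(pair))
    return h1 - mi2

cov = np.array([[1.0, 0.6, 0.0],
                [0.6, 1.0, 0.3],
                [0.0, 0.3, 1.0]])
print(gaussian_entropy(cov), mie_order2(cov))
```

For m = 2 degrees of freedom the second-order expansion is exact; the study's point is that for large m with long-range correlations the truncation error grows with the expansion order.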

  16. Analysis of a Pareto Mixture Distribution for Maritime Surveillance Radar

    Directory of Open Access Journals (Sweden)

    Graham V. Weinberg

    2012-01-01

    Full Text Available The Pareto distribution has been shown to be an excellent model for X-band high-resolution maritime surveillance radar clutter returns. Given the success of mixture distributions in radar, it is thus of interest to consider the effect of Pareto mixture models. This paper introduces a formulation of a Pareto intensity mixture distribution and investigates coherent multilook radar detector performance using this new clutter model. Clutter parameter estimates are derived from data sets produced by the Defence Science and Technology Organisation's Ingara maritime surveillance radar.
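    A Pareto intensity mixture of the kind described can be sketched as follows. The Pareto Type II (Lomax) form and all parameter values here are assumptions for illustration, not the fitted Ingara values:

```python
def lomax_pdf(x, shape, scale):
    """Pareto Type II (Lomax) density with tail index `shape` and scale `scale`."""
    return (shape / scale) * (1.0 + x / scale) ** (-(shape + 1.0))

def lomax_cdf(x, shape, scale):
    return 1.0 - (1.0 + x / scale) ** (-shape)

def mixture_pdf(x, components):
    """components: list of (weight, shape, scale); weights sum to 1."""
    return sum(w * lomax_pdf(x, a, b) for w, a, b in components)

def mixture_cdf(x, components):
    return sum(w * lomax_cdf(x, a, b) for w, a, b in components)

# two illustrative clutter components
comps = [(0.7, 3.0, 1.0), (0.3, 5.0, 2.5)]
print(mixture_pdf(1.0, comps))
```

Detector thresholds for a given false-alarm probability then follow from inverting `mixture_cdf`, which is where the mixture parameters estimated from data enter.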

  17. The gluon distribution at small x - a phenomenological analysis

    International Nuclear Information System (INIS)

    Harriman, P.N.; Martin, A.D.; Stirling, W.J.; Roberts, R.G.

    1990-03-01

    The size of the gluon distribution at small x has important implications for phenomenology at future high energy hadron-hadron and lepton-hadron colliders. We extend a recent global parton distribution fit to investigate the constraints on the gluon from deep inelastic and prompt photon data. In particular, we estimate a band of allowed gluon distributions with qualitatively different small-x behaviour and study the implications of these on a variety of cross sections at high energy pp and ep colliders. (author)

  18. Development, Demonstration, and Field Testing of Enterprise-Wide Distributed Generation Energy Management System: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, S.; Cooley, C.

    2005-01-01

    This report details progress on subcontract NAD-1-30605-1 between the National Renewable Energy Laboratory and RealEnergy (RE), the purpose of which is to describe RE's approach to the challenges it faces in implementing a nationwide fleet of clean cogeneration systems to serve contemporary energy markets. The Phase 2 report covers: utility tariff risk and its impact on market development; the effect of incentives on distributed energy markets; the regulatory effectiveness of interconnection in California; a survey of practical field interconnection issues; trend analysis for on-site generation; performance of dispatch systems; and information design hierarchy for combined heat and power.

  19. Results and Analysis from Space Suit Joint Torque Testing

    Science.gov (United States)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  20. RECONSTRUCTING REDSHIFT DISTRIBUTIONS WITH CROSS-CORRELATIONS: TESTS AND AN OPTIMIZED RECIPE

    International Nuclear Information System (INIS)

    Matthews, Daniel J.; Newman, Jeffrey A.

    2010-01-01

    Many of the cosmological tests to be performed by planned dark energy experiments will require extremely well-characterized photometric redshift measurements. Current estimates for cosmic shear are that the true mean redshift of the objects in each photo-z bin must be known to better than 0.002(1 + z), and the width of the bin must be known to ∼0.003(1 + z) if errors in cosmological measurements are not to be degraded significantly. A conventional approach is to calibrate these photometric redshifts with large sets of spectroscopic redshifts. However, at the depths probed by Stage III surveys (such as DES), let alone Stage IV (LSST, JDEM, and Euclid), existing large redshift samples have all been highly (25%-60%) incomplete, with a strong dependence of success rate on both redshift and galaxy properties. A powerful alternative approach is to exploit the clustering of galaxies to perform photometric redshift calibrations. Measuring the two-point angular cross-correlation between objects in some photometric redshift bin and objects with known spectroscopic redshift, as a function of the spectroscopic z, allows the true redshift distribution of a photometric sample to be reconstructed in detail, even if it includes objects too faint for spectroscopy or if spectroscopic samples are highly incomplete. We test this technique using mock DEEP2 Galaxy Redshift survey light cones constructed from the Millennium Simulation semi-analytic galaxy catalogs. From this realistic test, which incorporates the effects of galaxy bias evolution and cosmic variance, we find that the true redshift distribution of a photometric sample can, in fact, be determined accurately with cross-correlation techniques. We also compare the empirical error in the reconstruction of redshift distributions to previous analytic predictions, finding that additional components must be included in error budgets to match the simulation results. This extra error contribution is small for surveys that sample

  1. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, collecting close to 1 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first...

  2. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, and has collected over 5 fb-1 of data so far (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the...

  3. analysis of acidic properties of distribution transformer oil insulation

    African Journals Online (AJOL)

    user

    The system detects when the acid- ... rated above 500 kVA are classed as power transformers. Transformers rated at ... generate great impact in safety, reliability and cost of the electric ... the primary voltage of the electric distribution system to.

  4. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for the US DOE to use in making a determination on proposing energy conservation standards for distribution transformers, as required by the Energy Policy Act of 1992. The potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. The objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation standards for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  5. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River ... of taxa was recorded in marginal vegetation in the channels and lagoons, ... highlights the importance of maintaining a mosaic of aquatic habitats in the Delta.

  6. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

    Integration of distributed generation (DG) such as wind turbines into distribution systems is increasing all around the world because of their flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and as a result bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation on the influence of full converter based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as external grid short circuit power capacity, WT integration location, and connection type of the WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented...

  7. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River Delta, ... seasonally-flooded pools and temporary rain-filled pools in MGR and CI. ... biodiversity of the Okavango Delta, thereby contributing to its conservation.

  8. Distributed Multiscale Data Analysis and Processing for Sensor Networks

    National Research Council Canada - National Science Library

    Wagner, Raymond; Sarvotham, Shriram; Choi, Hyeokho; Baraniuk, Richard

    2005-01-01

    .... Second, the communication overhead of multiscale algorithms can become prohibitive. In this paper, we take a first step in addressing both shortcomings by introducing two new distributed multiresolution transforms...

  9. Tradespace Analysis Tool for Designing Earth Science Distributed Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — The ESTO 2030 Science Vision envisions the future of Earth Science to be characterized by 'many more distributed observations,' and 'formation-flying [missions that]...

  10. Plant management tools tested with a small-scale distributed generation laboratory

    International Nuclear Information System (INIS)

    Ferrari, Mario L.; Traverso, Alberto; Pascenti, Matteo; Massardo, Aristide F.

    2014-01-01

    Highlights: • Thermal grid innovative layouts. • Experimental rig for distributed generation. • Real-time management tool. • Experimental results for plant management. • Comparison with results from a complete optimization software. - Abstract: Optimization of power generation with smart grids is an important issue for the extensive sustainable development of distributed generation. Since an experimental approach is essential for implementing validated optimization software, the TPG research team of the University of Genoa has installed a laboratory facility for carrying out studies on polygeneration grids. The facility consists of two co-generation prime movers based on conventional technology: a 100 kWe gas turbine (mGT) and a 20 kWe internal combustion engine (ICE). The rig's high flexibility allows integration with renewable-source based devices, such as biomass-fed boilers and solar panels. Special attention was devoted to the design of the thermal distribution grid. To ensure applicability in medium-large districts composed of several buildings, each including energy users, generators, or both, an innovative layout based on two ring pipes was examined. Thermal storage devices were also included in order to have a complete hardware platform suitable for assessing the performance of different management tools. The test presented in this paper was carried out with both the mGT and the ICE connected to this innovative thermal grid, while users were emulated by means of fan coolers controlled by inverters. During this test the plant is controlled by a real-time model capable of calculating a machine performance ranking, which is necessary in order to split power demands between the prime movers (with the objective of decreasing marginal cost). A complete optimization tool devised by TPG (the ECoMP program) was also used in order to obtain theoretical results for the same machines and load values. The data obtained with ECoMP were compared with the
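    The performance-ranking step that splits the demand between the two prime movers can be illustrated with a minimal merit-order sketch. The capacities match the rig's machines, but the marginal-cost figures and the greedy ranking are invented for illustration; the paper's real-time model and the ECoMP optimizer are considerably richer:

```python
def merit_order_dispatch(demand_kw, units):
    """Split a power demand across prime movers, cheapest marginal cost first.

    units: list of (name, capacity_kw, marginal_cost_per_kwh)
    Returns {name: assigned_kw}; any demand beyond total capacity is unserved.
    """
    plan = {name: 0.0 for name, _, _ in units}
    remaining = demand_kw
    for name, capacity, _cost in sorted(units, key=lambda u: u[2]):
        take = min(capacity, remaining)
        plan[name] = take
        remaining -= take
    return plan

units = [("mGT", 100.0, 0.12),  # 100 kWe gas turbine, cost assumed
         ("ICE", 20.0, 0.08)]   # 20 kWe engine, cost assumed
print(merit_order_dispatch(90.0, units))  # ICE loaded first, mGT covers the rest
```

A real ranking would recompute the marginal costs online from measured machine efficiency at the current load point, which is what makes the real-time model in the paper necessary.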

  11. Burst Test Qualification Analysis of DWPF Canister-Plug Weld

    International Nuclear Information System (INIS)

    Gupta, N.K.; Gong, Chung.

    1995-02-01

    The DWPF canister closure system uses resistance welding for sealing the canister nozzle and plug to ensure leak tightness. The welding group at SRTC is using the burst test to qualify this seal weld in lieu of the shear test in ASME B&PV Code, Section IX, paragraph QW-196. The burst test is considered simpler and more appropriate than the shear test for this application. Although the geometry, loading, and boundary conditions are quite different in the two tests, structural analyses show similarity in the failure modes of the shear test in paragraph QW-196 and the burst test on the DWPF canister nozzle. Non-linear structural analyses are performed using finite element techniques to study the failure mode of the two tests. Actual test geometry and realistic stress-strain data for the 304L stainless steel and the weld material are used in the analyses. The finite element models are loaded until failure strains are reached. The failure mode in both tests is shear at the failure points. Based on these observations, it is concluded that the use of a burst test in lieu of the shear test for qualifying the canister-plug weld is acceptable. The burst test analysis for the canister-plug also yields burst pressures which compare favorably with the pressures measured during actual burst tests. Thus, the analysis also provides an estimate of the safety margins in the design of these vessels.

  12. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Phillips, Jason J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms (ATF), Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center, (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  13. Data analysis and mapping of the mountain permafrost distribution

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2017-04-01

    the permafrost occurrence where it is unknown, the mentioned supervised learning techniques inferred a classification function from labelled training data (pixels of permafrost absence and presence). Particular attention was given to the pre-processing of the dataset, with a study of its complexity and of the relation between the permafrost data and the employed environmental variables. The application of feature selection techniques completed this analysis and identified redundant or valueless predictors. Classification performance was assessed with the AUROC on independent validation sets (0.81 for LR, 0.85 for SVM and 0.88 for RF). At the micro scale, the obtained permafrost maps show results consistent with the field reality, thanks to the high resolution of the dataset (10 meters). Moreover, compared to classical models, the permafrost prediction is computed without resorting to altitude thresholds (above which permafrost may be found). Finally, as machine learning is a non-deterministic approach, the mountain permafrost distribution maps are presented and discussed together with corresponding uncertainty maps, which provide information on the quality of the results.
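    The AUROC figures quoted above can be computed directly from classifier scores as a rank statistic; the toy labels and scores below are invented for illustration:

```python
def auroc(labels, scores):
    """Area under the ROC curve: the probability that a randomly chosen
    positive example is scored above a randomly chosen negative one
    (ties counted as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy permafrost presence (1) / absence (0) pixels with model scores
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.5, 0.3, 0.1]
print(auroc(labels, scores))  # 8/9, one positive ranked below a negative
```

The quadratic pair loop is fine for a sketch; production implementations sort once and use the rank-sum form of the same statistic.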

  14. Spatial Distribution Analysis of Scrub Typhus in Korea

    OpenAIRE

    Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

    2013-01-01

    Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: A spatial distribution of Orientia tsutsugamushi occurrence using a geographic information system (GIS) is presented, and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions of scrub typhus incidence. The land use change of districts does...

  15. Analysis of transverse field distributions in Porro prism resonators

    Science.gov (United States)

    Litvin, Igor A.; Burger, Liesl; Forbes, Andrew

    2007-05-01

    A model to describe the transverse field distribution of the output beam from Porro prism resonators is proposed. The model allows the prediction of the output transverse field distribution by assuming that the main areas of loss are located at the apexes of the Porro prisms. Experimental work on a particular system showed some interesting correlations between the time domain behavior of the resonator and the transverse field output. These findings are presented and discussed.

  16. Analysis of Strengthening Steel Distribution Channel in Domestic Automotive Industry

    OpenAIRE

    Pangraksa, Sugeng; Djajadiningrat, Surna Tjahja

    2013-01-01

    Distribution has a strategic role in moving product from the manufacturer to the end user. The automotive industry needs a distribution channel with excellent data management, timely delivery management, excellent quality management, and competitive cost reduction. Krakatau Steel (KS) distributors have weaknesses in entering the current automotive market, which imposes tight prerequisites such as: consistency of product quality, good cooperation, close relationships, continuous cost reduction, and wide spread to...

  17. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not so simple as to lose the information from the alternative hypothesis. The first step is to transform the distributions of different test statistics (e.g., t, chi-square or F) to the distributions of the corresponding p-values. We then use a step function to approximate each p-value distribution by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
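    As a concrete instance of why multiplicity adjustment must enter power analysis, the sketch below computes the power of a one-sided z-test at the Bonferroni-adjusted level alpha/m. Plain Bonferroni is used purely for illustration here; it is not the paper's step-function model:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p):
    """Standard normal quantile by bisection (ample accuracy for this sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def bonferroni_power(delta, m, alpha=0.05):
    """Power of a one-sided z-test for an effect of delta standard errors,
    tested at the Bonferroni-adjusted level alpha/m."""
    z_crit = norm_ppf(1.0 - alpha / m)
    return 1.0 - norm_cdf(z_crit - delta)

for m in (1, 5, 20):
    print(m, round(bonferroni_power(3.0, m), 3))
```

Power falls monotonically as m grows, which is exactly the effect a multiplicity-aware p-value model has to capture.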

  18. Distributed generation: An empirical analysis of primary motivators

    International Nuclear Information System (INIS)

    Carley, Sanya

    2009-01-01

    Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment.

  19. Distributed generation: An empirical analysis of primary motivators

    Energy Technology Data Exchange (ETDEWEB)

    Carley, Sanya [Department of Public Policy and Center for Sustainable Energy, Environment, and Economic Development, University of North Carolina at Chapel Hill, CB3435, Chapel Hill, NC 27599 (United States)], E-mail: scarley@email.unc.edu

    2009-05-15

    Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment.

  20. Distributed generation. An empirical analysis of primary motivators

    Energy Technology Data Exchange (ETDEWEB)

    Carley, Sanya [Department of Public Policy and Center for Sustainable Energy, Environment, and Economic Development, University of North Carolina at Chapel Hill, CB3435, Chapel Hill, NC 27599 (United States)

    2009-05-15

    Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment. (author)

  1. A computational approach to discovering the functions of bacterial phytochromes by analysis of homolog distributions

    Directory of Open Access Journals (Sweden)

    Lamparter Tilman

    2006-03-01

    Full Text Available Abstract Background Phytochromes are photoreceptors, discovered in plants, that control a wide variety of developmental processes. They have also been found in bacteria and fungi, but for many species their biological role remains obscure. This work concentrates on the phytochrome system of Agrobacterium tumefaciens, a non-photosynthetic soil bacterium with two phytochromes. To identify proteins that might share common functions with phytochromes, a co-distribution analysis was performed on the basis of protein sequences from 138 bacteria. Results A database of protein sequences from 138 bacteria was generated. Each sequence was BLASTed against the entire database. The homolog distribution of each query protein was then compared with the homolog distribution of every other protein (target protein of the same species, and the target proteins were sorted according to their probability of co-distribution under random conditions. As query proteins, phytochromes from Agrobacterium tumefaciens, Pseudomonas aeruginosa, Deinococcus radiodurans and Synechocystis PCC 6803 were chosen along with several phytochrome-related proteins from A. tumefaciens. The Synechocystis photosynthesis protein D1 was selected as a control. In the D1 analyses, the ratio between photosynthesis-related proteins and those not related to photosynthesis among the top 150 in the co-distribution tables was > 3:1, showing that the method is appropriate for finding partner proteins with common functions. The co-distribution of phytochromes with other histidine kinases was remarkably high, although most co-distributed histidine kinases were not direct BLAST homologs of the query protein. This finding implies that phytochromes and other histidine kinases share common functions as parts of signalling networks. All phytochromes tested, with one exception, also revealed a remarkably high co-distribution with glutamate synthase and methionine synthase. This result implies a general role of
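
The co-distribution scoring described above can be sketched as a hypergeometric overlap test on homolog presence/absence profiles: given that a query has homologs in a of N genomes and a target in b, how likely is the observed overlap by chance? This is a minimal illustration under assumed 0/1 profile encoding, not the authors' implementation:

```python
from math import comb

def codistribution_pvalue(query, target):
    """Probability of at least the observed overlap between two
    presence/absence profiles under random placement (hypergeometric tail).
    query, target: lists of 0/1 flags, one entry per genome."""
    n = len(query)
    a = sum(query)   # genomes containing a homolog of the query protein
    b = sum(target)  # genomes containing a homolog of the target protein
    k = sum(q and t for q, t in zip(query, target))  # genomes shared by both
    total = comb(n, b)
    # P(overlap >= k): sum over all at-least-as-extreme overlaps
    return sum(comb(a, i) * comb(n - a, b - i)
               for i in range(k, min(a, b) + 1)) / total

# Toy profiles over 6 genomes: query present in 4, target in 3, overlap 3
q = [1, 1, 1, 1, 0, 0]
t = [1, 1, 1, 0, 0, 0]
p = codistribution_pvalue(q, t)
```

Sorting every target protein of a species by this p-value reproduces the kind of co-distribution table the abstract describes, with low p-values marking candidate functional partners.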

  2. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  3. Spectrum analysis of radiotracer residence time distribution for industrial and environmental applications

    International Nuclear Information System (INIS)

    Kasban, H.; Ashraf Hamid

    2014-01-01

    Radiotracer signal analysis and recognition still represent challenges in industrial and environmental applications, especially in residence time distribution (RTD) measurement. This paper presents a development of the RTD signal recognition method that is based on the power density spectrum (PDS). In this development, the features are extracted from the signals and/or from their higher-order statistics (HOS) (bispectrum and trispectrum) instead of the PDS. The HOS are estimated using direct, indirect and parametric estimation. The recognition results are analyzed and compared for the different HOS estimation methods in order to select the best one for the purpose of RTD signal recognition. Artificial neural networks are used for training and testing of the proposed method. The proposed method is tested using RTD signals obtained from measurements carried out using the radiotracer technique. The simulation results show that the parametric estimation of the trispectrum gives the highest recognition rate and is the most reliable for RTD signal recognition. (author)
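
Independent of the spectral features, a measured tracer response curve is conventionally reduced to RTD moments (mean residence time and variance) before any recognition step. A minimal sketch, assuming a sampled concentration curve; the function names and the synthetic exponential curve are illustrative only:

```python
import math

def rtd_moments(t, c):
    """Mean residence time and variance of a sampled tracer curve.
    t: sample times, c: detector counts/concentration (same length).
    Trapezoidal integration; assumes the curve has decayed to ~0 at t[-1]."""
    def trapz(y):
        return sum((y[i] + y[i + 1]) * (t[i + 1] - t[i]) / 2
                   for i in range(len(t) - 1))
    area = trapz(c)
    e = [ci / area for ci in c]                                # normalised RTD E(t)
    mrt = trapz([ti * ei for ti, ei in zip(t, e)])             # first moment
    var = trapz([(ti - mrt) ** 2 * ei for ti, ei in zip(t, e)])  # second central moment
    return mrt, var

# Synthetic exponential RTD with mean 2 s, finely sampled: MRT should approach 2
t = [i * 0.01 for i in range(4000)]
c = [math.exp(-ti / 2.0) for ti in t]
mrt, var = rtd_moments(t, c)
```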

  4. The Motivation Analysis Test: an historical and contemporary evaluation.

    Science.gov (United States)

    Bernard, Larry C; Walsh, R Patricia; Mills, Michael

    2005-04-01

    This is an historical review and contemporary empirical evaluation of the Motivation Analysis Test (MAT), one of the first tests to take a psychometric approach to the assessment of motivation. Reviews were quite positive, but the test is now over 50 years old. Nevertheless, it employs innovations in measurement not widely used in objective measurement then or now: (1) subtests with different formats, (2) disguised items, (3) speeded administration procedures, and (4) ipsative format and scoring procedures. These issues are discussed and a contemporary sample (N = 360) obtained to evaluate the Motivation Analysis Test in light of its innovative characteristics.

  5. Static and Dynamic Stability Analysis of Distributed Energy Resources Components with Storage Devices and Loads for Smart Grids

    DEFF Research Database (Denmark)

    Mihet-Popa, Lucian; Groza, V.

    2011-01-01

    Distributed energy resources (DER) comprise several technologies, such as diesel engines, small wind turbines, photovoltaic inverters, etc. The control of DER components with storage devices and (controllable) loads, such as batteries, capacitors, and dump loads, is central to the concept of the Smart Grids (SGs). A SG can operate interconnected to the main distribution grid or in islanded mode. This paper presents experimental tests for static and dynamic stability analysis carried out in a dedicated laboratory for research in distributed control and smart grids with a high share of renewable energy production. Moreover, to point out on a laboratory scale the coupling between DER and storage, and to effectively compensate wind fluctuations, a number of tests have been done. In order to find out the parameters of various types of DER components for dynamic simulation models, a number of tests...

  6. A Review of Power Distribution Test Feeders in the United States and the Need for Synthetic Representative Networks

    Directory of Open Access Journals (Sweden)

    Fernando E. Postigo Marcos

    2017-11-01

    Full Text Available Under the increasing penetration of distributed energy resources and new smart network technologies, distribution utilities face new challenges and opportunities to ensure reliable operations, manage service quality, and reduce operational and investment costs. Simultaneously, the research community is developing algorithms for advanced controls and distribution automation that can help to address some of these challenges. However, there is a shortage of realistic test systems that are publicly available for development, testing, and evaluation of such new algorithms. Concerns around revealing critical infrastructure details and customer privacy have severely limited the number of actual networks published and that are available for testing. In recent decades, several distribution test feeders and US-featured representative networks have been published, but the scale, complexity, and control data vary widely. This paper presents a first-of-a-kind structured literature review of published distribution test networks with a special emphasis on classifying their main characteristics and identifying the types of studies for which they have been used. This both aids researchers in choosing suitable test networks for their needs and highlights the opportunities and directions for further test system development. In particular, we highlight the need for building large-scale synthetic networks to overcome the identified drawbacks of current distribution test feeders.

  7. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  8. Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    Science.gov (United States)

    Rakovec, O.; Hill, M. C.; Clark, M. P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.

    2014-01-01

    This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based "local" methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative "bucket-style" hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
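
The core of the method, derivative-based local indices evaluated at many points in parameter space, can be sketched as follows. This is a toy illustration of the idea, not the published DELSA code; the two-parameter reservoir model, the prior variances, and the sampling ranges are assumptions:

```python
import random, math

def delsa_indices(model, sample, var, h=1e-6):
    """First-order local sensitivity indices at one parameter set.
    model: f(params) -> scalar output; sample: list of parameter values;
    var: assumed prior variance of each parameter. Each index is the share
    of local first-order output variance attributed to that parameter."""
    grads = []
    for j in range(len(sample)):
        up = sample[:]
        up[j] += h
        grads.append((model(up) - model(sample)) / h)  # finite-difference slope
    contrib = [g * g * v for g, v in zip(grads, var)]
    total = sum(contrib)
    return [c / total for c in contrib]

# Toy nonlinear reservoir: outflow after unit time with storage k and exponent a
def reservoir(p):
    k, a = p
    return k * math.exp(-a)

random.seed(0)
var = [0.04, 0.04]  # assumed prior variances for (k, a)
# Distribution of indices across 100 random points in parameter space:
# this is the "distributed" part -- importance can change from point to point
dist = [delsa_indices(reservoir,
                      [random.uniform(0.5, 1.5), random.uniform(0.1, 2.0)],
                      var)
        for _ in range(100)]
```

Inspecting how the indices in `dist` vary across samples mirrors the paper's finding that a parameter can dominate in one region of parameter space and be unimportant in another.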

  9. PanDA: distributed production and distributed analysis system for ATLAS

    International Nuclear Information System (INIS)

    Maeno, T

    2008-01-01

    A new distributed software system was developed in the fall of 2005 for the ATLAS experiment at the LHC. This system, called PanDA, provides an integrated service architecture with late binding of jobs, maximal automation through layered services, tight binding with the ATLAS Distributed Data Management system [1], advanced error discovery and recovery procedures, and other features. In this talk, we will describe the PanDA software system. Special emphasis will be placed on the evolution of PanDA based on one and a half years of real experience in carrying out Computer System Commissioning data production [2] for ATLAS. The architecture of PanDA is well suited for the computing needs of the ATLAS experiment, which is expected to be one of the first HEP experiments to operate at the petabyte scale.

  10. The Aggregation of Individual Distributive Preferences through the Distributive Liberal Social Contract : Normative Analysis.

    OpenAIRE

    Jean Mercier-Ythier

    2010-01-01

    We consider abstract social systems of private property, made of n individuals endowed with non-paternalistic interdependent preferences, who interact through exchanges on competitive markets and Pareto-efficient lumpsum transfers. The transfers follow from a distributive liberal social contract defined as a redistribution of initial endowments such that the resulting market equilibrium allocation is both Pareto-efficient relative to individual interdependent preferences, and unanimously weak...

  11. EM algorithm for one-shot device testing with competing risks under exponential distribution

    International Nuclear Information System (INIS)

    Balakrishnan, N.; So, H.Y.; Ling, M.H.

    2015-01-01

    This paper provides an extension of the work of Balakrishnan and Ling [1] by introducing a competing risks model into a one-shot device testing analysis under an accelerated life test setting. An Expectation Maximization (EM) algorithm is then developed for the estimation of the model parameters. An extensive Monte Carlo simulation study is carried out to assess the performance of the EM algorithm and to compare the obtained results with the initial estimates obtained by the Inequality Constrained Least Squares (ICLS) method of estimation. Finally, we apply the EM algorithm to clinical data, ED01, to illustrate the method of inference developed here. - Highlights: • ALT data analysis for one-shot devices with competing risks is considered. • EM algorithm is developed for the determination of the MLEs. • The estimation of lifetimes under normal operating conditions is presented. • The EM algorithm improves the convergence rate

  12. A distributed microcomputer-controlled system for data acquisition and power spectral analysis of EEG.

    Science.gov (United States)

    Vo, T D; Dwyer, G; Szeto, H H

    1986-04-01

    A relatively powerful and inexpensive microcomputer-based system for the spectral analysis of the EEG is presented. High resolution and speed are achieved with the use of recently available large-scale integrated circuit technology with enhanced functionality (INTEL Math co-processors 8087), which can perform transcendental functions rapidly. The versatility of the system is achieved with a hardware organization that has distributed data acquisition capability, performed by the use of a microprocessor-based analog to digital converter with large resident memory (Cyborg ISAAC-2000). Compiled BASIC programs and assembly language subroutines perform, on-line or off-line, the fast Fourier transform and spectral analysis of the EEG, which is stored as soft as well as hard copy. Some results obtained from test application of the entire system in animal studies are presented.
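
In modern terms, the FFT-plus-spectral-analysis pipeline reduces to a periodogram followed by band-power extraction. A minimal sketch with a synthetic alpha-rhythm signal; the sampling rate, segment length, and band limits are illustrative assumptions, and a direct DFT is used for brevity instead of a fast transform:

```python
import math, cmath

def power_spectrum(x, fs):
    """Single-segment periodogram: list of (frequency_hz, power) pairs.
    x: digitized signal samples, fs: sampling rate in Hz."""
    n = len(x)
    mean = sum(x) / n
    x = [xi - mean for xi in x]  # remove DC offset before transforming
    spec = []
    for k in range(n // 2 + 1):
        s = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        spec.append((k * fs / n, abs(s) ** 2 / (n * fs)))
    return spec

def band_power(spec, lo, hi):
    """Total power in the frequency band [lo, hi] Hz (simple bin sum)."""
    return sum(p for f, p in spec if lo <= f <= hi)

# Synthetic "EEG": a pure 10 Hz alpha rhythm sampled at 128 Hz for 2 s
fs, n = 128, 256
x = [math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]
spec = power_spectrum(x, fs)
alpha = band_power(spec, 8, 13)   # alpha band captures the 10 Hz tone
beta = band_power(spec, 14, 30)   # beta band should be essentially empty
```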

  13. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. The assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether or not the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to the training as well as to the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, sustain a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for the assessment of the random assignment of carboquinone derivatives to training and test sets.
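
The assessment reduces to clustering the pooled descriptor space and verifying that every cluster contains both training and test compounds. A sketch with a plain K-means on synthetic two-dimensional "descriptors"; the data, the 2:1 split, and the cluster count are assumptions (the study itself used Molecular Descriptors Family on Vertices descriptors):

```python
import random, math

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means with Euclidean distance; returns a cluster id per point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: math.dist(p, centers[c]))
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:  # keep the old center if a cluster empties out
                centers[c] = tuple(sum(v) / len(members) for v in zip(*members))
    return labels

# Toy descriptor space: three well-separated groups of compounds,
# each split roughly 2:1 into training and test roles
rng = random.Random(42)
points, role = [], []
for cx, cy in [(0, 0), (5, 0), (0, 5)]:
    for i in range(30):
        points.append((cx + rng.gauss(0, 0.3), cy + rng.gauss(0, 0.3)))
        role.append("train" if i % 3 else "test")

labels = kmeans(points, 3)
# Proper assignment: every cluster holds both training and test compounds
ok = all({r for l, r in zip(labels, role) if l == c} == {"train", "test"}
         for c in range(3))
```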

  14. Alternatives Analysis for the Resumption of Transient Testing Program

    Energy Technology Data Exchange (ETDEWEB)

    Lee Nelson

    2013-11-01

    An alternatives analysis was performed for the resumption of transient testing. The analysis considered eleven alternatives, including both US and international facilities. A screening process was used to identify two viable alternatives from the original eleven. In addition, the alternatives analysis includes a no-action alternative as required by the National Environmental Policy Act (NEPA). The alternatives considered in this analysis included: 1. Restart the Transient Reactor Test Facility (TREAT); 2. Modify the Annular Core Research Reactor (ACRR), which includes construction of a new hot cell and installation of a new hodoscope; 3. No action.

  15. Development and Field Test of a Real-Time Database in the Korean Smart Distribution Management System

    Directory of Open Access Journals (Sweden)

    Sang-Yun Yun

    2014-03-01

    Full Text Available Recently, a distribution management system (DMS) that can conduct periodical system analysis and control by mounting various application programs has been actively developed. In this paper, we summarize the development and demonstration of a database structure that can perform real-time system analysis and control of the Korean smart distribution management system (KSDMS). The developed database structure consists of a common information model (CIM)-based off-line database (DB), a physical DB (PDB) for DB establishment of the operating server, a real-time DB (RTDB) for real-time server operation and remote terminal unit data interconnection, and an application common model (ACM) DB for running application programs. The ACM DB for real-time system analysis and control of the application programs was developed by using a parallel table structure and a linked list model, thereby providing fast input and output as well as a high execution speed of application programs. Furthermore, the ACM DB was configured with hierarchical and non-hierarchical data models to reflect the system models, improving DB size and operation speed through the reduction of system elements that are unnecessary for analysis and control. The proposed database model was implemented and tested at the Gochaing and Jeju offices using a real system. Through data measurement of the remote terminal units, and through the operation and control of the application programs using those measurements, the performance, speed, and integrity of the proposed database model were validated, thereby demonstrating that this model can be applied to real systems.

  16. The Test for Flow Characteristics of Tubular Fuel Assembly (II) - Experimental results and CFD analysis

    International Nuclear Information System (INIS)

    Park, Jong Hark; Chae, H. T.; Park, C.; Kim, H.

    2006-12-01

    A test facility had been established for the experiment of velocity distribution and pressure drop in a tubular fuel. A basic test had been conducted to examine the performance of the test loop and to verify the accuracy of measurement by pitot-tube. In this report, test results and CFD analysis for the hydraulic characteristics of a tubular fuel, following the previous tests, are described. Coolant velocities in all channels were measured using the pitot-tube, and the effect of flow rate change on the velocity distribution was also examined. The pressure drop through the tubular fuel was measured for various flow rates in the range of 1 kg/s to 21 kg/s to obtain a correlation of pressure drop with variation of flow rate. In addition, a CFD (Computational Fluid Dynamics) analysis was also done to find out the hydraulic characteristics of the tubular fuel, such as velocity distribution and pressure drop. As the results of CFD analysis can give us a detailed insight on coolant flow in the tubular fuel, the CFD method is a very useful tool to understand the flow structure and phenomena induced by fluid flow. The CFX-10, a commercial CFD code, was used in this study. The experimental and CFD results were compared with each other. The overall trend of velocity distribution by CFD analysis was somewhat different from that of the experiment, but the agreement is reasonable considering measurement uncertainties. The CFD prediction for pressure drop of a tubular fuel shows a tolerably good agreement with experiment, within an 8% difference.
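
A pressure-drop correlation over a flow-rate range is typically a power law fitted on log-log axes. A minimal sketch of that fit; the data below are fabricated for illustration (the report's measured values are not reproduced here), as are the coefficient values:

```python
import math

def fit_power_law(flow, dp):
    """Least-squares fit of dp = a * flow**b, linearized on log-log axes."""
    xs = [math.log(q) for q in flow]
    ys = [math.log(p) for p in dp]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = math.exp(ybar - b * xbar)
    return a, b

# Illustrative data only: flow in kg/s over the tested 1-21 kg/s range,
# pressure drop generated from dp = 0.8 * m**1.9 (turbulent-like exponent)
flow = [1, 3, 6, 9, 12, 15, 18, 21]
dp = [0.8 * m ** 1.9 for m in flow]
a, b = fit_power_law(flow, dp)  # recovers a ≈ 0.8, b ≈ 1.9 on noiseless data
```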

  17. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled method in the nonparametric bootstrap test that may overcome the problems associated with small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling against the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except for the Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with the pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
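
The pooled-resampling idea can be sketched as follows: under the null hypothesis both groups come from one distribution, so bootstrap replicates of both samples are drawn from the combined data, and the observed t statistic is referred to that null distribution. This is a minimal illustration of the general approach, not the authors' exact procedure; the example data are invented:

```python
import random, math

def tstat(x, y):
    """Unpaired t statistic (Welch form, unequal variances allowed)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

def pooled_bootstrap_test(x, y, b=4000, seed=0):
    """Two-sided bootstrap test for equal means with pooled resampling:
    both bootstrap samples are drawn from the combined data, which imposes
    the null hypothesis of a common distribution."""
    rng = random.Random(seed)
    pool = list(x) + list(y)
    t_obs = abs(tstat(x, y))
    hits = 0
    for _ in range(b):
        bx = [rng.choice(pool) for _ in range(len(x))]
        by = [rng.choice(pool) for _ in range(len(y))]
        if abs(tstat(bx, by)) >= t_obs:
            hits += 1
    return hits / b  # bootstrap p-value

# Invented small-sample example typical of lab experiments (n = 6 per arm)
control = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]
treated = [5.0, 5.4, 4.9, 5.6, 5.2, 5.1]
p = pooled_bootstrap_test(control, treated)
```

Because the resampling pools both arms, the null reference distribution is built from the data themselves, which is what lets the test keep its size at sample sizes where asymptotic t approximations are shaky.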

  18. Sample path analysis and distributions of boundary crossing times

    CERN Document Server

    Zacks, Shelemyahu

    2017-01-01

    This monograph is focused on the derivations of exact distributions of first boundary crossing times of Poisson processes, compound Poisson processes, and more general renewal processes. The content is limited to the distributions of first boundary crossing times and their applications to various stochastic models. This book provides the theory and techniques for exact computations of distributions and moments of level crossing times. In addition, these techniques could replace simulations in many cases, thus providing more insight into the phenomena studied. This book takes a general approach to studying telegraph processes and is based on nearly thirty published papers by the author and collaborators over the past twenty-five years. No prior knowledge of advanced probability is required, making the book widely accessible to students and researchers in applied probability, operations research, applied physics, and applied mathematics.

  19. A digital elevation analysis: Spatially distributed flow apportioning algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sang-Hyun; Kim, Kyung-Hyun [Pusan National University, Pusan (Korea); Jung, Sun-Hee [Korea Environment Institute (Korea)

    2001-06-30

    A flow determination algorithm is proposed for the distributed hydrologic model. The advantages of single and multiple flow direction schemes are selectively considered to address the drawbacks of existing algorithms. A spatially varied flow apportioning factor is introduced in order to accommodate the accumulated area from upslope cells. The channel initiation threshold area (CIT) concept is expanded and integrated into the spatially distributed flow apportioning algorithm in order to delineate a realistic channel network. Application to a field example suggests that the linearly distributed flow apportioning scheme provides some advantages over existing approaches, such as the relaxation of over-dissipation problems near channel cells, the connectivity of river cells, the continuity of saturated areas, and the avoidance of the optimization of the few parameters required in existing algorithms. The effects of grid size are explored spatially as well as statistically. (author). 28 refs., 7 figs.
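
The apportioning idea can be sketched on a small grid: cells are visited from highest to lowest elevation, and each passes its accumulated area to all lower neighbours in proportion to slope. This illustrates a generic slope-weighted multiple-flow-direction scheme, not the paper's spatially varied apportioning factor; the DEM is a made-up tilted plane:

```python
import math

def flow_accumulation(dem, cell=1.0):
    """Multiple-flow-direction accumulation: each cell apportions its
    upslope area among all lower neighbours in proportion to slope
    (elevation drop divided by distance to the neighbour)."""
    rows, cols = len(dem), len(dem[0])
    acc = [[cell * cell] * cols for _ in range(rows)]  # each cell's own area
    # Visit cells from highest to lowest so every donor is processed
    # before the cells it drains into.
    order = sorted(((dem[r][c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for z, r, c in order:
        weights = {}
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    drop = z - dem[nr][nc]
                    if drop > 0:  # only strictly lower neighbours receive flow
                        weights[(nr, nc)] = drop / math.hypot(dr, dc)
        total = sum(weights.values())
        for (nr, nc), w in weights.items():
            acc[nr][nc] += acc[r][c] * w / total
    return acc

# Tilted plane draining toward the last row: accumulation grows downslope
dem = [[3, 3, 3],
       [2, 2, 2],
       [1, 1, 1]]
acc = flow_accumulation(dem)
```

On this plane every row's total accumulation equals its own area plus everything upslope, so contributing area is conserved down to the outlet row; a CIT-style rule would then flag cells whose accumulation exceeds a threshold as channel cells.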

  20. Simulation and energy analysis of distributed electric heating system

    Science.gov (United States)

    Yu, Bo; Han, Shenchao; Yang, Yanchun; Liu, Mingyuan

    2018-02-01

    A distributed electric heating system assists a solar heating system by using an air-source heat pump. The air-source heat pump as an auxiliary heat source can make up for the defects of a conventional solar thermal system and provide highly efficient operation 24 hours a day. It has practical value and significance for reducing emissions and promoting building energy efficiency. Using the Polysun software, the system is simulated and compared with an ordinary electric boiler heating system. The simulation results show that with the distributed electric heating system, 5844.5 kWh of fuel and energy consumption is saved and 3135 kg of carbon-dioxide emissions are avoided for the same energy request. The effect of conserving energy and reducing emissions using the distributed electric heating system is thus very obvious.

  1. Influence of parafunctional loading and prosthetic connection on stress distribution: a 3D finite element analysis.

    Science.gov (United States)

    Torcato, Leonardo Bueno; Pellizzer, Eduardo Piza; Verri, Fellippo Ramos; Falcón-Antenucci, Rosse Mary; Santiago Júnior, Joel Ferreira; de Faria Almeida, Daniel Augusto

    2015-11-01

    Clinicians should consider parafunctional occlusal load when planning treatment. Prosthetic connections can reduce the stress distribution on an implant-supported prosthesis. The purpose of this 3-dimensional finite element study was to assess the influence of parafunctional loading and prosthetic connections on stress distribution. Computer-aided design software was used to construct 3 models. Each model was composed of bone and an implant (external hexagon, internal hexagon, or Morse taper) with a crown. Finite element analysis software was used to generate the finite element mesh and establish the loading and boundary conditions. A normal force (200-N axial load and 100-N oblique load) and a parafunctional force (1000-N axial and 500-N oblique load) were applied. Results were visualized as the maximum principal stress. Three-way analysis of variance and the Tukey test were performed, and the percentage contribution of each variable to the stress concentration was calculated from a sum-of-squares analysis. Stress was concentrated around the implant at the cortical bone, and the models with the external hexagon implant showed the highest stresses.

  2. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    International Nuclear Information System (INIS)

    Iskandar, Ismed; Gondokaryono, Yudi Satria

    2016-01-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another will change the value of the standard deviation in the opposite direction. Given perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of the maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range

  3. Full scale lightning surge tests of distribution transformers and secondary systems

    International Nuclear Information System (INIS)

    Goedde, G.L.; Dugan, R.C. Sr.; Rowe, L.D.

    1992-01-01

    This paper reports that low-side surges are known to cause failures of distribution transformers. They also subject load devices to overvoltages. A full-scale model of a residential service has been set up in a laboratory and subjected to impulses approximating lightning strokes. The tests were made to determine the impulse characteristics of the secondary system and to test the validity of previous analyses. Among the variables investigated were stroke location, the balance of the surges in the service cable, and the effectiveness of arrester protection. Low-side surges were found to consist of two basic components: the natural frequency of the system and the inductive response of the system to the stroke current. The latter component is responsible for transformer failures, while the former may be responsible for discharge spots often found around secondary bushings. Arresters at the service entrance are effective in diverting most of the energy from a lightning strike, but may not protect sensitive loads; additional local protection is also needed. The tests affirmed previous simulations and uncovered additional phenomena as well.
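
    The "inductive response" component can be put in numbers with a back-of-envelope L·di/dt estimate. Every figure below (cable loop inductance per metre, drop length, current share, front time) is an illustrative assumption, not a value from the tests reported above.

```python
# Back-of-envelope estimate of the inductive ("L di/dt") component of a
# low-side surge.  All numbers are illustrative assumptions, not test data.
L_PER_M = 0.4e-6    # assumed loop inductance of the service cable, H/m
LENGTH_M = 30.0     # assumed service-drop length, m
I_PEAK = 10e3       # assumed share of stroke current in the neutral, A
T_FRONT = 2e-6      # assumed front time of the surge current, s

di_dt = I_PEAK / T_FRONT                      # average rate of rise, A/s
v_inductive = L_PER_M * LENGTH_M * di_dt      # V = L * di/dt
print(f"inductive voltage component ~ {v_inductive / 1e3:.0f} kV")
```

    Even with these modest assumptions the inductive component reaches tens of kilovolts, which is why it, rather than the natural-frequency component, is the one associated with transformer failure.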

  4. Interactive microbial distribution analysis using BioAtlas

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

    2017-01-01

    ... to analyze microbial distribution in a location-specific context. BioAtlas is an interactive web application that closes this gap between sequence databases, taxonomy profiling and geo/body-location information. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human body maps and (iii) user-defined maps. It further allows for (iv) uploading of own sample data, which can be placed on existing maps to (v) browse the distribution of the associated taxonomies. Finally, BioAtlas enables users to (vi) contribute custom maps (e.g. for plants or animals) and to map...

  5. Core Flow Distribution from Coupled Supercritical Water Reactor Analysis

    Directory of Open Access Journals (Sweden)

    Po Hu

    2014-01-01

    Full Text Available This paper introduces an extended code package, PARCS/RELAP5, to analyze the steady state of the US reference SCWR design. An 8 × 8 quarter-core model in PARCS and a reactor core model in RELAP5 are used to study the core flow distribution under various steady-state conditions. The possibility of moderator flow reversal is found in some hot moderator channels. Different moderator flow orifice strategies, both uniform across the core and nonuniform based on the power distribution, are explored with the goal of preventing the reversal.

  6. Energy efficiency analysis of reconfigured distribution system for practical loads

    Directory of Open Access Journals (Sweden)

    Pawan Kumar

    2016-09-01

    Full Text Available In a deregulated rate structure, the performance evaluation of a distribution system for energy efficiency includes loss minimization, improved power quality, loadability limit, and reliability and availability of supply. Energy efficiency changes with variation in the loading pattern and the load behaviour. Further, the nature of the load at each node is not explicitly of any one type; rather, its characteristics depend upon the node voltages. In most cases, the load is assumed to be constant power (real and reactive). In this paper, voltage-dependent practical loads are represented with a composite load model, and the energy-efficiency performance of the distribution system for practical loads is evaluated in different configurations of a 33-node system.
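
    A composite load model of the kind mentioned above is commonly written in ZIP form: each load is split into constant-impedance, constant-current, and constant-power fractions of the nominal demand. The coefficients below are illustrative only, not the paper's values.

```python
def zip_load(v_pu, p0, q0, zip_p=(0.4, 0.3, 0.3), zip_q=(0.5, 0.3, 0.2)):
    """Composite (ZIP) load at per-unit voltage v_pu.

    zip_* = (constant-impedance, constant-current, constant-power) fractions;
    each triple must sum to 1.  Coefficients here are illustrative only.
    """
    zp, ip, pp = zip_p
    zq, iq, pq = zip_q
    p = p0 * (zp * v_pu ** 2 + ip * v_pu + pp)   # real power, same units as p0
    q = q0 * (zq * v_pu ** 2 + iq * v_pu + pq)   # reactive power
    return p, q

# a 100 kW / 60 kvar nominal load seen at a node operating at 0.95 pu voltage
p, q = zip_load(0.95, p0=100.0, q0=60.0)
print(f"P = {p:.2f} kW, Q = {q:.3f} kvar")
```

    The constant-power assumption the abstract criticizes corresponds to zip_p = (0, 0, 1); any other split makes the drawn power fall with node voltage, which changes computed losses and loadability.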

  7. Advances in the analysis of pressure interference tests

    Energy Technology Data Exchange (ETDEWEB)

    Martinez R, N. [Petroleos Mexicanos, PEMEX, Mexico City (Mexico); Samaniego V, F. [Univ. Nacional Autonoma de Mexico (Mexico)

    2010-12-15

    This paper presented an extension, for radial, linear, and spherical flow conditions, of the El-Khatib method for analyzing pressure interference tests through utilization of the pressure derivative. Conventional analysis of interference tests considers only radial flow, but in some reservoirs the physical field conditions are such that linear or spherical flow prevails. The INTERFERAN system, a user-friendly computer code for the automatic analysis of pressure interference tests, was also discussed and demonstrated by way of 2 field cases. INTERFERAN relies on the principle of superposition in time and space to interpret a test of several wells with variable histories of production, injection, or both. The first field case addressed interference tests conducted in the naturally fractured geothermal field of Klamath Falls, and the second was conducted in a river-formed bed in which linear flow conditions are dominant. The analysis was deemed to be reliable. 13 refs., 1 tab., 7 figs.
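
    The superposition principle that such codes rely on can be illustrated with the classical line-source (radial-flow) solution, where the pressure drop is proportional to the exponential integral E1. Everything below is a generic sketch, not INTERFERAN itself: transmissibility lumps kh/μ, diffusivity is k/(φμc_t), and consistent (SI) units are assumed throughout.

```python
import math

EULER_GAMMA = 0.5772156649015329

def exp_integral_e1(x, terms=60):
    """E1(x) via its convergent series; adequate for the small arguments here."""
    total = 0.0
    term = 1.0                      # running (-x)^n / n!
    for nn in range(1, terms + 1):
        term *= -x / nn
        total -= term / nn          # accumulates -sum (-x)^n / (n * n!)
    return -EULER_GAMMA - math.log(x) + total

def delta_p(rate_history, r, t, diffusivity, transmissibility):
    """Line-source pressure drop at radius r, time t, by superposition in time.

    rate_history: chronologically sorted (start_time, rate) pairs; units are
    assumed consistent (SI), with transmissibility = k*h/mu and
    diffusivity = k / (phi * mu * c_t).
    """
    dp, prev_rate = 0.0, 0.0
    for t_start, rate in rate_history:
        if t <= t_start:
            break
        x = r * r / (4.0 * diffusivity * (t - t_start))
        dp += ((rate - prev_rate) / (4.0 * math.pi * transmissibility)
               * exp_integral_e1(x))
        prev_rate = rate
    return dp

# one active well produced at rate 1.0 from t=0, then shut in at t=5000;
# pressure response seen at an observation point 10 length-units away
history = [(0.0, 1.0), (5000.0, 0.0)]
print(delta_p(history, r=10.0, t=1e4, diffusivity=1.0, transmissibility=1.0))
```

    Superposition in space (several wells) is the same sum again, one term per well at its own distance r; the derivative-based diagnostics and the linear/spherical-flow extensions of the paper are beyond this sketch.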

  8. Minnesota urban partnership agreement national evaluation : content analysis test plan.

    Science.gov (United States)

    2009-11-17

    This report presents the content analysis test plan for the Minnesota Urban Partnership Agreement (UPA) under the United States Department of Transportation (U.S. DOT) UPA Program. The Minnesota UPA projects focus on reducing congestion by employing ...

  9. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  10. Scaling analysis for the OSU AP600 test facility (APEX)

    International Nuclear Information System (INIS)

    Reyes, J.N.

    1998-01-01

    In this paper, the authors summarize the key aspects of a state-of-the-art scaling analysis (Reyes et al. (1995)) performed to establish the facility design and test conditions for the advanced plant experiment (APEX) at Oregon State University (OSU). This scaling analysis represents the first, and most comprehensive, application of the hierarchical two-tiered scaling (H2TS) methodology (Zuber (1991)) in the design of an integral system test facility. The APEX test facility, designed and constructed on the basis of this scaling analysis, is the most accurate geometric representation of a Westinghouse AP600 nuclear steam supply system. The OSU APEX test facility has served to develop an essential component of the integral system database used to assess the AP600 thermal hydraulic safety analysis computer codes. (orig.)

  11. Nonlinear Analysis and Preliminary Testing Results of a Hybrid Wing Body Center Section Test Article

    Science.gov (United States)

    Przekop, Adam; Jegley, Dawn C.; Rouse, Marshall; Lovejoy, Andrew E.; Wu, Hsi-Yung T.

    2015-01-01

    A large test article was recently designed, analyzed, fabricated, and successfully tested up to the representative design ultimate loads to demonstrate that stiffened composite panels with through-the-thickness reinforcement are a viable option for the next generation large transport category aircraft, including non-conventional configurations such as the hybrid wing body. This paper focuses on finite element analysis and test data correlation of the hybrid wing body center section test article under mechanical, pressure and combined load conditions. Good agreement between predictive nonlinear finite element analysis and test data is found. Results indicate that a geometrically nonlinear analysis is needed to accurately capture the behavior of the non-circular pressurized and highly-stressed structure when the design approach permits local buckling.

  12. Analysis of DCI cask drop test onto reinforced concrete pad

    International Nuclear Information System (INIS)

    Ito, C.; Kato, Y.; Hattori, S.; Shirai, K.; Misumi, M.; Ozaki, S.

    1993-01-01

    In a cask-storage facility, a cask may be subjected to an impact load as a result of a free drop onto the floor caused by cask mishandling. We performed drop tests of casks onto a reinforced concrete (RC) slab representing the floor of a facility, as well as simulation analyses [Kato et al.]. This paper describes the details of the FEM analysis and the calculated results, and compares them with the drop test results. (J.P.N.)

  13. Preliminary test results and CFD analysis for moderator circulation test at Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, H.T. [Korea Atomic Energy Research Inst., Daejeon (Korea, Republic of); Im, S.H.; Sung, H.J. [Korea Advanced Inst. of Science and Tech., Daejeon (Korea, Republic of); Seo, H.; Bang, I.C. [Ulsan National Inst. of Science and Tech., Ulsan (Korea, Republic of)

    2014-07-01

    Korea Atomic Energy Research Institute (KAERI) is carrying out a scaled-down moderator test program to simulate the CANDU6 moderator circulation phenomena during steady-state operation and accident conditions. This research program includes the construction of the Moderator Circulation Test (MCT) facility, production of validation data for self-reliant CFD tools, and development of an optical measurement system using Particle Image Velocimetry (PIV). The MCT facility includes a primary circulation loop (pipe lines, a primary-side pump, a heat exchanger, valves, and flow meters) and a secondary-side loop (pipe lines, a secondary-side pump, and an external cooling tower). In the present work, the loop leakage test and non-heating test are performed, and the PIV technique is used to measure the velocity distributions in the scaled moderator tank of the MCT under isothermal test conditions. The preliminary PIV measurement data are obtained and compared with CFX code predictions. (author)

  14. Preliminary test results and CFD analysis for moderator circulation test at Korea

    International Nuclear Information System (INIS)

    Kim, H.T.; Im, S.H.; Sung, H.J.; Seo, H.; Bang, I.C.

    2014-01-01

    Korea Atomic Energy Research Institute (KAERI) is carrying out a scaled-down moderator test program to simulate the CANDU6 moderator circulation phenomena during steady-state operation and accident conditions. This research program includes the construction of the Moderator Circulation Test (MCT) facility, production of validation data for self-reliant CFD tools, and development of an optical measurement system using Particle Image Velocimetry (PIV). The MCT facility includes a primary circulation loop (pipe lines, a primary-side pump, a heat exchanger, valves, and flow meters) and a secondary-side loop (pipe lines, a secondary-side pump, and an external cooling tower). In the present work, the loop leakage test and non-heating test are performed, and the PIV technique is used to measure the velocity distributions in the scaled moderator tank of the MCT under isothermal test conditions. The preliminary PIV measurement data are obtained and compared with CFX code predictions. (author)

  15. Modeling the isotopic evolution of snowpack and snowmelt: Testing a spatially distributed parsimonious approach.

    Science.gov (United States)

    Ala-Aho, Pertti; Tetzlaff, Doerthe; McNamara, James P; Laudon, Hjalmar; Kormos, Patrick; Soulsby, Chris

    2017-07-01

    Use of stable water isotopes has become increasingly popular in quantifying water flow paths and travel times in hydrological systems using tracer-aided modeling. In snow-influenced catchments, snowmelt produces a traceable isotopic signal, which differs from the original snowfall isotopic composition because of isotopic fractionation in the snowpack. These fractionation processes in snow are relatively well understood, but representing their spatiotemporal variability in tracer-aided studies remains a challenge. We present a novel, parsimonious modeling method to account for the snowpack isotope fractionation and estimate isotope ratios in snowmelt water in a fully spatially distributed manner. Our model introduces two calibration parameters that alone account for the isotopic fractionation caused by sublimation from interception and ground snow storage, and snowmelt fractionation progressively enriching the snowmelt runoff. The isotope routines are linked to a generic process-based snow interception-accumulation-melt model facilitating simulation of spatially distributed snowmelt runoff. We use a synthetic modeling experiment to demonstrate the functionality of the model algorithms in different landscape locations and under different canopy characteristics. We also provide a proof-of-concept model test and successfully reproduce isotopic ratios in snowmelt runoff sampled with snowmelt lysimeters in two long-term experimental catchments with contrasting winter conditions. To our knowledge, the method is the first such tool to allow estimation of the spatially distributed nature of isotopic fractionation in snowpacks and the resulting isotope ratios in snowmelt runoff. The method can thus provide a useful tool for tracer-aided modeling to better understand the integrated nature of flow, mixing, and transport processes in snow-influenced catchments.
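
    A deliberately crude, single-point sketch of the two-parameter idea follows. It is NOT the authors' model (their parameters, process representation, and spatial coupling differ); it only illustrates how one parameter can enrich the pack as a fraction of it sublimates while a second offsets meltwater from the pack composition, with a mass-balance feedback on the remaining snow. All parameter values are invented.

```python
# Hypothetical two-parameter snowpack isotope scheme (illustration only).
EPS_SUB = 8.0    # assumed pack enrichment (permil) per unit fraction sublimated
EPS_MELT = 0.5   # assumed meltwater depletion offset (permil)

def daily_step(swe, d_pack, snowfall, d_snow, sublimation, melt):
    """Advance the pack one day; returns (swe, d_pack, d_melt).

    swe and fluxes in mm water equivalent; d_* are delta-18O in permil.
    """
    if swe + snowfall > 0.0:                  # mix new snowfall into the pack
        d_pack = (swe * d_pack + snowfall * d_snow) / (swe + snowfall)
    swe += snowfall
    s = min(sublimation, swe)
    if swe > 0.0 and s > 0.0:                 # sublimation enriches what remains
        d_pack += EPS_SUB * (s / swe)
        swe -= s
    d_melt = float("nan")
    m = min(melt, swe)
    if m > 0.0:                               # melt leaves slightly depleted...
        d_melt = d_pack - EPS_MELT
        swe -= m
        if swe > 0.0:                         # ...so the leftover pack enriches
            d_pack += EPS_MELT * m / swe
    return swe, d_pack, d_melt
```

    Running this daily per grid cell, with cell-specific interception and melt, is what makes the approach spatially distributed; the paper's calibration fits its two parameters against lysimeter-sampled snowmelt isotope ratios.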

  16. Analysis of temperature distribution in a heat conducting fiber with ...

    African Journals Online (AJOL)

    The temperature distribution in a heat conducting fiber is computed using the Galerkin Finite Element Method in the present study. The weak form of the governing differential equation is obtained and nodal temperatures for linear and quadratic interpolation functions for different mesh densities are calculated for Neumann ...
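
    The Galerkin procedure with linear elements can be sketched end to end on an assumed model problem: a convecting fin, -kA·T'' + hP·(T - T_inf) = 0, with a fixed base temperature and an insulated tip (homogeneous Neumann condition). All problem data below are invented, not taken from the paper, and the nodal temperatures are checked against the analytic fin solution.

```python
import math

# assumed model problem (illustrative coefficients)
kA, hP = 1.0, 4.0                  # conduction kA and convection hP terms
T_inf, T_base, L = 25.0, 100.0, 1.0
n_el = 20                          # number of linear elements
h = L / n_el
n = n_el + 1                       # number of nodes

# assemble global stiffness matrix and load vector from element matrices
K = [[0.0] * n for _ in range(n)]
F = [0.0] * n
for e in range(n_el):
    ke = [[kA / h + hP * h / 3.0, -kA / h + hP * h / 6.0],
          [-kA / h + hP * h / 6.0, kA / h + hP * h / 3.0]]
    fe = hP * T_inf * h / 2.0
    for a in range(2):
        F[e + a] += fe
        for b in range(2):
            K[e + a][e + b] += ke[a][b]

K[0] = [1.0] + [0.0] * (n - 1)   # Dirichlet condition at the base node;
F[0] = T_base                    # the zero-flux tip needs no modification

# naive Gauss-Jordan solve; fine for this small system
for i in range(n):
    piv = K[i][i]
    K[i] = [v / piv for v in K[i]]
    F[i] /= piv
    for j in range(n):
        if j != i and K[j][i] != 0.0:
            f = K[j][i]
            K[j] = [vj - f * vi for vj, vi in zip(K[j], K[i])]
            F[j] -= f * F[i]
T = F   # nodal temperatures

# analytic fin solution for comparison
m_fin = math.sqrt(hP / kA)
def exact(x):
    return T_inf + (T_base - T_inf) * math.cosh(m_fin * (L - x)) / math.cosh(m_fin * L)

print(f"tip temperature: FEM {T[-1]:.3f}, exact {exact(L):.3f}")
```

    Quadratic interpolation, as studied in the paper, changes only the element matrices (3 x 3 per element) and convergence rate; the assembly and solve are the same.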

  17. Polybutadiene latex particle size distribution analysis utilizing a disk centrifuge

    NARCIS (Netherlands)

    Verdurmen, E.M.F.J.; Albers, J.G.; German, A.L.

    1994-01-01

    Polybutadiene (I) latexes prepared by emulsifier-free emulsion polymerization and having particle diameters of 50-300 nm, for both unimodal and bimodal particle size distributions, were analyzed by the line-start (LIST) method in a Brookhaven disk centrifuge photosedimentometer. A special spin fluid was designed to
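
    The physics behind disk-centrifuge sizing is Stokes' law in a centrifugal field: a particle injected at radius r0 reaches the detector radius rd at a time set by its diameter. Every number below (spin-fluid viscosity, density difference, geometry, speed) is an illustrative assumption, not a condition from the paper.

```python
import math

# Stokes-law sizing for a line-start disk centrifuge run (illustrative values).
ETA = 1.0e-3                              # spin-fluid viscosity, Pa*s
DRHO = 50.0                               # particle-fluid density difference, kg/m^3
OMEGA = 2.0 * math.pi * 10000.0 / 60.0    # disk speed, rad/s (10,000 rpm)
R0, RD = 0.045, 0.048                     # injection and detector radii, m

def stokes_diameter(t, eta=ETA, drho=DRHO, omega=OMEGA, r0=R0, rd=RD):
    """Diameter (m) of a particle that reaches the detector at time t (s)."""
    return math.sqrt(18.0 * eta * math.log(rd / r0) / (drho * omega ** 2 * t))

# with these assumed conditions, a ten-minute arrival is in the latex size range
print(f"{stokes_diameter(600.0) * 1e9:.0f} nm")
```

    Scanning the detector signal over time and applying this mapping gives the full size distribution; resolving bimodal distributions, as in the paper, depends on the spin-fluid density gradient suppressing streaming of the injected line.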

  18. Resonance analysis in parallel voltage-controlled Distributed Generation inverters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Chen, Zhe

    2013-01-01

    Thanks to the fast responses of the inner voltage and current control loops, the dynamic behavior of parallel voltage-controlled Distributed Generation (DG) inverters depends not only on the stability of load sharing among them, but also on the interactions between the voltage control loops...

  19. Analysis of the Relationship between Shared Leadership and Distributed Leadership

    Science.gov (United States)

    Goksoy, Suleyman

    2016-01-01

    Problem Statement: The current study's purpose is: First, to examine the relationship between shared leadership and distributed leadership, which, despite having many similar aspects in theory and practice, are defined as separate concepts. Second, to compare the two approaches and dissipate the theoretical contradictions. In this sense, the main…

  20. Comparative Analysis of Possible Designs for Flexible Distribution System Operation

    DEFF Research Database (Denmark)

    Lin, Jeremy; Knezovic, Katarina

    2016-01-01

    ... for achieving the most efficient utilization of these resources while meeting the forecasted load. In this paper, we present possible system design frameworks proposed for flexible distribution system operation. Critical evaluations and comparison of these models are made based on a number of key attributes...