WorldWideScience

Sample records for precursor distribution analysis

  1. Trending analysis of precursor events

    International Nuclear Information System (INIS)

    Watanabe, Norio

    1998-01-01

    The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of their potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in annual reports. This paper describes the trends in initiating events and dominant sequences for 459 precursors identified in the ASP Program during the 1969-94 period and also discusses a comparison with the dominant sequences predicted in past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods: 1969-81, 1984-87 and 1988-94. Although different models were used in the ASP analyses for these three periods, the distributions of precursors by dominant sequence show similar trends. For example, sequences involving loss of both main and auxiliary feedwater were identified in many PWR events, and sequences involving loss of both high- and low-pressure coolant injection were found in many BWR events. It was also found that these dominant sequences were comparable to those determined to be dominant in past PRA predictions. In addition, a list of the 459 identified precursors is provided in the Appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)

  2. MID Max: LC–MS/MS Method for Measuring the Precursor and Product Mass Isotopomer Distributions of Metabolic Intermediates and Cofactors for Metabolic Flux Analysis Applications

    DEFF Research Database (Denmark)

    McCloskey, Douglas; Young, Jamey D.; Xu, Sibei

    2016-01-01

    The analytical challenges of acquiring accurate isotopic data on intracellular metabolic intermediates for stationary, nonstationary, and dynamic metabolic flux analysis (MFA) are numerous. This work presents MID Max, a novel LC–MS/MS workflow, acquisition, and isotopomer deconvolution method for MFA … that takes advantage of additional scan types to maximize the number of mass isotopomer distributions (MIDs) that can be acquired in a given experiment. The analytical method was found to measure the MIDs of 97 metabolites, corresponding to 74 unique metabolite-fragment pairs (32 precursor spectra and 42 …

  3. Probabilistic precursor analysis - an application of PSA

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Gopika, V.; Sanyasi Rao, V.V.S.; Vaze, K.K.

    2011-01-01

    Incidents are inevitably part of the operational life of any complex industrial facility, and it is hard to predict how various contributing factors combine to cause the outcome. However, it should be possible to detect the existence of latent conditions that, together with the triggering failure(s), result in abnormal events. These incidents are called precursors. Precursor study, by definition, focuses on how a particular event might have developed adversely. This paper focuses on events that can be analyzed to assess their potential to develop into a core damage situation, looks into extending Probabilistic Safety Assessment (PSA) techniques to precursor studies, and explains the benefits through a typical case study. A preliminary probabilistic precursor analysis has been carried out for a typical NPP. The major advantage of this approach is its strong potential for augmenting event analysis, which is currently carried out on a purely deterministic basis. (author)
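
The quantitative step described above — mapping an observed event onto a risk model — can be sketched as a toy cut-set quantification in which the observed failures are set to probability 1 and the model is requantified. The cut set, basic-event names, and probabilities below are hypothetical, invented purely for illustration; real PSA models contain thousands of cut sets.

```python
import math

def ccdp(cut_sets, p, observed_failures):
    """Conditional core damage probability under the rare-event
    approximation: set each observed failure to probability 1, then
    sum the products of basic-event probabilities over the cut sets."""
    q = dict(p)
    for basic_event in observed_failures:
        q[basic_event] = 1.0
    return sum(math.prod(q[be] for be in cs) for cs in cut_sets)

# Hypothetical single-cut-set model: core damage requires the
# initiating event plus failure of both injection trains.
cut_sets = [("IE", "MAIN_INJ", "BACKUP_INJ")]
p = {"IE": 0.1, "MAIN_INJ": 1e-2, "BACKUP_INJ": 1e-2}

baseline = ccdp(cut_sets, p, [])                  # 1e-5
event    = ccdp(cut_sets, p, ["IE", "MAIN_INJ"])  # 1e-2
```

The ratio of the event value to the baseline (here a factor of 1000) is one common measure of how risk-significant the precursor was.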

  4. Comparison exercise of probabilistic precursor analysis

    International Nuclear Information System (INIS)

    Fauchille, V.; Babst, S.

    2004-01-01

    From 2000 to 2003, a comparison exercise concerning accident precursor programs was performed by IRSN, GRS, and NUPEC (Japan). The objective of this exercise was to compare the methodologies used to quantify the conditional core damage probability of incidents that can be considered accident precursors. The exercise provided interesting results concerning the interpretation of such events. In general, the participants identified similar scenarios of potential degradation. However, for several dominant sequences, differences in the results were noticed. These differences can be attributed to variations in plant design, management strategy, and methodological approach. For several reasons, comparison of human reliability analyses was difficult, and a future exercise could perhaps provide more information on this subject. On the other hand, interesting outcomes were obtained from the quantification of both common cause failures and potential common cause failures. (orig.)

  5. Operational experience feedback with precursor analysis

    International Nuclear Information System (INIS)

    Koncar, M.; Ferjancic, M.; Muehleisen, A.; Vojnovic, D.

    2003-01-01

    Practical operating experience is a valuable source of information for improving the safety and reliability of nuclear power plants. An operational experience feedback (OEF) system manages this aspect of NPP operation. The traditional ways of investigating operational events, such as root cause analysis (RCA), are predominantly qualitative. RCA, as part of the OEF system, provides technical guidance and management expectations for assessing the root cause to prevent recurrence, covering the following areas: conditions preceding the event, sequence of events, equipment performance and system response, human performance considerations, equipment failures, precursors to the event, plant response and follow-up, radiological considerations, regulatory process considerations, and safety significance. The root cause of an event is recognized when there is no known answer to the question 'why has it happened?' regarding a relevant condition that may have affected the event. From that point the OEF proceeds with actions taken in response to events; utilization, dissemination, and exchange of operating experience information; and, finally, a review of the effectiveness of the OEF. Analysis of an event and the selection, prioritization, and implementation of recommended corrective/preventive actions can be enhanced by taking into account the information and insights derived from PSA-based analysis. A PSA-based method, called probabilistic precursor event analysis (PPEA), complements the RCA approach by focusing on how an event might have developed adversely; it implies the mapping of an operational event onto a probabilistic risk model of the plant in order to obtain a quantitative assessment of the safety significance of the event. Owing to its quantitative nature, PSA-based event analysis provides an appropriate prioritization of corrective actions.
    PPEA defines requirements for the PSA model and code, identifies input requirements and elaborates the following …

  6. Statistical study of spatio-temporal distribution of precursor solar flares associated with major flares

    Science.gov (United States)

    Gyenge, N.; Ballai, I.; Baranyi, T.

    2016-07-01

    The aim of the present investigation is to study the spatio-temporal distribution of precursor flares during the 24 h interval preceding M- and X-class major flares, and the evolution of follower flares. Information on associated (precursor and follower) flares is provided by the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) flare list, while the major flares are observed by the Geostationary Operational Environmental Satellite (GOES) system satellites between 2002 and 2014. There are distinct evolutionary differences between the spatio-temporal distributions of associated flares over the roughly one-day period, depending on the type of the main flare. The spatial distribution was characterized by the normalized frequency distribution of the quantity δ (the distance between the major flare and its precursor flare, normalized by the sunspot group diameter) in four 6 h time intervals before the major event. The precursors of X-class flares have a double-peaked spatial distribution for more than half a day prior to the major flare, but this changes to a lognormal-like distribution roughly 6 h prior to the event. The precursors of M-class flares show a lognormal-like distribution in each 6 h subinterval. In each case, the most frequent sites of the precursors in the active region are within a distance of about 0.1 sunspot-group diameters from the site of the major flare. Our investigation shows that the build-up of energy is more effective than its release through precursors.
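
The binning used above (normalized distance δ, four 6 h windows) amounts to simple bookkeeping, which can be sketched as follows. The record layout and the demo values are hypothetical, chosen only to illustrate the procedure, not taken from the RHESSI or GOES catalogues.

```python
def delta_histograms(precursors, bin_edges):
    """Split precursor flares into four 6 h windows before the major
    flare; within each window, build the normalized frequency
    distribution of delta = distance / sunspot-group diameter.
    Each record is (hours_before_flare, distance, group_diameter)."""
    windows = {w: [] for w in range(4)}   # 0: 0-6 h, ..., 3: 18-24 h
    for hours, distance, diameter in precursors:
        if 0 <= hours < 24:
            windows[int(hours // 6)].append(distance / diameter)
    hists = {}
    for w, deltas in windows.items():
        counts = [0] * (len(bin_edges) - 1)
        for d in deltas:
            for j in range(len(counts)):
                if bin_edges[j] <= d < bin_edges[j + 1]:
                    counts[j] += 1
                    break
        n = len(deltas) or 1              # avoid division by zero
        hists[w] = [c / n for c in counts]
    return hists

# Hypothetical demo records (hours before flare, distance, diameter):
demo = [(1.0, 0.05, 1.0), (2.0, 0.5, 1.0), (7.0, 0.1, 1.0)]
hists = delta_histograms(demo, [0.0, 0.25, 0.5, 0.75, 1.0])
```

Normalizing each window by its own event count is what makes the four histograms comparable despite different precursor counts per window.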

  7. Cyclotide Evolution: Insights from the Analyses of Their Precursor Sequences, Structures and Distribution in Violets (Viola)

    Directory of Open Access Journals (Sweden)

    Sungkyu Park

    2017-12-01

    Full Text Available Cyclotides are a family of plant proteins that are characterized by a cyclic backbone and a knotted disulfide topology. Their cyclic cystine knot (CCK) motif makes them exceptionally resistant to thermal, chemical, and enzymatic degradation. By disrupting cell membranes, cyclotides function as host defense peptides, exhibiting insecticidal, anthelmintic, antifouling, and molluscicidal activities. In this work, we provide the first insight into the evolution of this family of plant proteins by studying the Violaceae, in particular species of the genus Viola. We discovered 157 novel precursor sequences by transcriptomic analysis of six Viola species: V. albida var. takahashii, V. mandshurica, V. orientalis, V. verecunda, V. acuminata, and V. canadensis. By combining these precursor sequences with the phylogenetic classification of Viola, we infer the distribution of cyclotides across 63% of the species in the genus (i.e., ~380 species). Using full precursor sequences from transcriptomes, we show an evolutionary link to the structural diversity of the cyclotides, and further classify the cyclotides by sequence signatures from the non-cyclotide domain. Transcriptomes were also compared with cyclotide expression at the peptide level, determined using liquid chromatography-mass spectrometry. Furthermore, the novel cyclotides discovered were associated with the emergence of new biological functions.

  8. Progress in characterizing submonolayer island growth: Capture-zone distributions, growth exponents, & hot precursors

    Science.gov (United States)

    Einstein, Theodore L.; Pimpinelli, Alberto; González, Diego Luis; Morales-Cifuentes, Josue R.

    2015-09-01

    In studies of epitaxial growth, analysis of the distribution of the areas of capture zones (i.e. proximity polygons or Voronoi tessellations with respect to island centers) is often the best way to extract the critical nucleus size i. For non-random nucleation the normalized areas s of these Voronoi cells are well described by the generalized Wigner distribution (GWD) Pβ(s) = a s^β exp(−b s²), particularly in the central region 0.5 < s < 2, where the data are least noisy. Extensive Monte Carlo simulations reveal inadequacies of our earlier mean-field analysis, suggesting β = i + 2 for diffusion-limited aggregation (DLA). Since simulations generate orders of magnitude more data than experiments, they permit close examination of the tails of the distribution, which differ from the simple GWD form; one refinement is based on a fragmentation model. We also compare analyses based on the island-size distribution and on the scaling of island density with flux. Modifications appear for attachment-limited aggregation (ALA). We focus on the experimental system para-hexaphenyl on amorphous mica, comparing the results of the three analysis techniques and reconciling them via a novel model of hot precursors based on rate equations, pointing out the existence of intermediate scaling regimes between DLA and ALA.
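
The GWD quoted above is fully determined once β is fixed, because the standard conventions (unit normalization and unit mean, since s is the capture-zone area divided by its average) pin down a and b in terms of gamma functions: b = [Γ((β+2)/2)/Γ((β+1)/2)]² and a = 2b^((β+1)/2)/Γ((β+1)/2). A minimal numerical sketch of that bookkeeping:

```python
import math

def gwd_constants(beta):
    """Constants of the generalized Wigner distribution
    P(s) = a * s**beta * exp(-b * s**2), fixed by requiring that the
    distribution integrate to 1 and have mean <s> = 1."""
    g1 = math.gamma((beta + 1) / 2)
    g2 = math.gamma((beta + 2) / 2)
    b = (g2 / g1) ** 2
    a = 2 * b ** ((beta + 1) / 2) / g1
    return a, b

def gwd(s, beta):
    a, b = gwd_constants(beta)
    return a * s ** beta * math.exp(-b * s ** 2)

# Riemann-sum check of normalization and mean for beta = i + 2 = 5
ds = 1e-4
norm = sum(gwd(k * ds, 5) * ds for k in range(1, 100_000))
mean = sum((k * ds) * gwd(k * ds, 5) * ds for k in range(1, 100_000))
```

Recomputing the constants on every call is wasteful but keeps the sketch short; a fit of β to measured capture-zone areas would cache them.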

  9. Analysis of dependent failures in the ORNL precursor study

    International Nuclear Information System (INIS)

    Ballard, G.M.

    1985-01-01

    The study of dependent failures (or common cause/mode failures) in the safety assessment of potentially hazardous plant is one of the significant areas of uncertainty in probabilistic safety studies. One major reason for this uncertainty is that data on dependent failures are apparently not available in sufficient quantity to assist in the development and validation of models. The incident reports compiled for the ORNL study on Precursors to Severe Core Damage Accidents (NUREG/CR-2497) provide an opportunity to examine the importance of dependent failures in the most significant incidents of recent reactor operations, to assess the success of probabilistic risk assessment (PRA) methods in accounting for the contribution of dependent failures, and to identify the most significant problem areas among the dependent failure incidents. In this paper the incidents compiled in NUREG/CR-2497 have been analysed, and events involving multiple failures that were not independent have been identified. From this analysis it is clear that dependent failures are a very significant contributor to the precursor incidents. The method of enumerating accident frequency used in NUREG/CR-2497 can be shown to take account of dependent failures, and this may be a significant factor contributing to the apparent difference between the precursor accident frequency and typical PRA frequencies.
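
One standard parametric treatment of dependent failures, offered here only as background to the discussion (it is not the enumeration method of NUREG/CR-2497), is the beta-factor model: a fraction β of a component's failure probability is assumed to be a common-cause event that fails all redundant trains at once. The numbers below are illustrative placeholders.

```python
def redundant_system_unavailability(lam, beta, n_trains):
    """Beta-factor common-cause model (rare-event approximation).
    A train fails with total probability lam; a fraction beta of
    that is a common-cause event taking out all trains together."""
    independent = ((1 - beta) * lam) ** n_trains
    common_cause = beta * lam
    return independent + common_cause

# Illustrative numbers only: per-demand failure probability 1e-3,
# beta = 0.1, two redundant trains.
p_sys = redundant_system_unavailability(1e-3, 0.1, 2)
# Without the common-cause term the result would be ~1e-6;
# with it, ~1e-4: the dependent-failure contribution dominates.
```

This two-orders-of-magnitude gap is exactly why underestimating dependent failures makes PRA frequencies look optimistic next to precursor-based estimates.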

  10. Molecular analysis of precursor lesions in familial pancreatic cancer.

    Directory of Open Access Journals (Sweden)

    Tatjana Crnogorac-Jurcevic

    Full Text Available With less than a 5% survival rate, pancreatic adenocarcinoma (PDAC) is almost uniformly lethal. In order to make a significant impact on survival of patients with this malignancy, it is necessary to diagnose the disease early, when curative surgery is still possible. Detailed knowledge of the natural history of the disease and the molecular events leading to its progression is therefore critical. We have analysed the precursor lesions, PanINs, from prophylactic pancreatectomy specimens of patients from four different kindreds at high risk of familial pancreatic cancer who were treated for histologically proven PanIN-2/3. Thus, the material was procured before pancreatic cancer had developed, rather than from PanINs in a tissue field that already contains cancer. Genome-wide transcriptional profiling using such unique specimens was performed. Bulk frozen sections displaying the most extensive, but not microdissected, PanIN-2/3 lesions were used in order to obtain a holistic view of both the precursor lesions and their microenvironment. A panel of 76 commonly dysregulated genes that underlie neoplastic progression from normal pancreas to PanINs and PDAC was identified. In addition to shared genes, some differences between the PanINs of individual families, as well as between the PanINs and PDACs, were also seen; this was particularly pronounced in the stromal and immune responses. Our comprehensive analysis of precursor lesions without the invasive component provides definitive molecular proof that PanIN lesions beget cancer. We demonstrate the need for accumulation of transcriptomic changes during the progression of PanIN to PDAC, both in the epithelium and in the surrounding stroma. The identified 76-gene signature of PDAC progression presents a rich candidate pool for the development of early diagnostic and/or surveillance markers, as well as potential novel preventive/therapeutic targets, for both familial and sporadic …

  11. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    Science.gov (United States)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  12. Proteomic analysis of osteogenic differentiation of dental follicle precursor cells

    DEFF Research Database (Denmark)

    Morsczeck, Christian; Petersen, Jørgen; Völlner, Florian

    2009-01-01

    … of differentiation. In the present study we applied 2-DE combined with capillary-LC-MS/MS analysis to profile differentially regulated proteins upon differentiation of dental follicle precursor cells (DFPCs). Out of 115 differentially regulated proteins, glutamine synthetase, lysosomal proteinase cathepsin B, plastin 3 T-isoform, beta-actin, superoxide dismutases, and transgelin were found to be highly up-regulated, whereas cofilin-1, pro-alpha 1 collagen, destrin, prolyl 4-hydrolase and dihydrolipoamide dehydrogenase were found to be highly down-regulated. The bioinformatic analyses suggest that proteins associated with cell cycle progression and protein metabolism were down-regulated and proteins involved in catabolism, cell motility and biological quality were up-regulated. These results display the general physiological state of DFPCs before and after osteogenic … The group of up-regulated proteins …

  13. Learning from Trending, Precursor Analysis, and System Failures

    Energy Technology Data Exchange (ETDEWEB)

    Youngblood, R. W. [Idaho National Laboratory, Idaho Falls, ID (United States); Duffey, R. B. [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-11-01

    Models of reliability growth relate current system unreliability to currently accumulated experience. But “experience” comes in different forms. Looking back after a major accident, one is sometimes able to identify previous events or measurable performance trends that were, in some sense, signaling the potential for that major accident: potential that could have been recognized and acted upon, but was not recognized until the accident occurred. This could be a previously unrecognized cause of accidents, or underestimation of the likelihood that a recognized potential cause would actually operate. Despite improvements in the state of practice of modeling of risk and reliability, operational experience still has a great deal to teach us, and work has been going on in several industries to try to do a better job of learning from experience before major accidents occur. It is not enough to say that we should review operating experience; there is too much “experience” for such general advice to be considered practical. The paper discusses the following: 1. The challenge of deciding what to focus on in analysis of operating experience. 2. Comparing what different models of learning and reliability growth imply about trending and precursor analysis.

  14. NASA Accident Precursor Analysis Handbook, Version 1.0

    Science.gov (United States)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    … accident precursors by evaluating anomaly occurrences for their system safety implications and, through both analytical and deliberative methods used to project to other circumstances, identifying those that portend more serious consequences to come if effective corrective action is not taken. APA builds upon existing safety analysis processes currently in practice within NASA, leveraging their results to provide an improved understanding of overall system risk. As such, APA represents an important dimension of safety evaluation; as operational experience is acquired, precursor information is generated such that it can be fed back into system safety analyses to risk-inform safety improvements. Importantly, APA utilizes anomaly data to predict risk, whereas standard reliability and PRA approaches utilize failure data, which are often limited and rare.

  15. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to prepare for CMS distributed analysis are presented, followed by the user experience in current analysis activities.

  16. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  17. Progressing from Identification and Functional Analysis of Precursor Behavior to Treatment of Self-Injurious Behavior

    Science.gov (United States)

    Dracobly, Joseph D.; Smith, Richard G.

    2012-01-01

    This multiple-study experiment evaluated the utility of assessing and treating severe self-injurious behavior (SIB) based on the outcomes of a functional analysis of precursor behavior. In Study 1, a precursor to SIB was identified using descriptive assessment and conditional probability analyses. In Study 2, a functional analysis of precursor…

  18. Product Distribution from Precursor Bite Angle Variation in Multitopic Alkyne Metathesis: Evidence for a Putative Kinetic Bottleneck.

    Science.gov (United States)

    Moneypenny, Timothy P; Yang, Anna; Walter, Nathan P; Woods, Toby J; Gray, Danielle L; Zhang, Yang; Moore, Jeffrey S

    2018-05-02

    In the dynamic synthesis of covalent organic frameworks and molecular cages, the typical synthetic approach involves heuristic methods of discovery. While this approach has yielded many remarkable products, the ability to predict the structural outcome of subjecting a multitopic precursor to dynamic covalent chemistry (DCC) remains a challenge in the field. The synthesis of covalent organic cages is a prime example of this phenomenon, where precursors designed with the intention of affording a specific product may deviate dramatically when the DCC synthesis is attempted. As such, rational design principles are needed to accelerate discovery in cage synthesis using DCC. Herein, we test the hypothesis that precursor bite angle contributes significantly to the energy landscape and product distribution in multitopic alkyne metathesis (AM). By subjecting a series of precursors with varying bite angles to AM, we experimentally demonstrate that the product distribution, and convergence toward product formation, is strongly dependent on this geometric attribute. Surprisingly, we discovered that precursors with the ideal bite angle (60°) do not afford the most efficient pathway to the product. The systematic study reported here illustrates how seemingly minor adjustments in precursor geometry greatly affect the outcome of DCC systems. This research illustrates the importance of fine-tuning precursor geometric parameters in order to successfully realize desirable targets.

  19. Processing, distribution, and function of VGF, a neuronal and endocrine peptide precursor.

    Science.gov (United States)

    Levi, Andrea; Ferri, Gian-Luca; Watson, Elizabeth; Possenti, Roberta; Salton, Stephen R J

    2004-08-01

    1. The vgf gene encodes a neuropeptide precursor with a restricted pattern of expression, limited to a subset of neurons in the central and peripheral nervous systems and to specific populations of endocrine cells in the adenohypophysis, adrenal medulla, gastrointestinal tract, and pancreas. In responsive neurons, vgf transcription is upregulated by neurotrophins, the basis for the original identification of VGF as nerve growth factor (NGF)-inducible in PC12 cells (A. Levi, J. D. Eldridge, and B. M. Paterson, Science 229:393-395, 1985). 2. In this review, we summarize data concerning the transcriptional regulation of vgf in vitro and the structural organization of the vgf promoter, as well as the transcription factors that regulate its activity. 3. On the basis of in situ hybridization and immunohistochemical studies, the in vivo tissue-specific expression of VGF during differentiation and in the adult is summarized. 4. Parallel biochemical data are reviewed, addressing the proteolytic processing of the pro-VGF precursor within the secretory compartment of neuroendocrine cells. 5. Finally, analysis of the phenotype of VGF knockout mice is discussed, implying a nonredundant role for VGF products in the regulation of energy storage and expenditure.

  20. Helper T lymphocyte precursor frequency analysis in alloreactivity detection

    International Nuclear Information System (INIS)

    Cukrova, V.; Dolezalova, L.; Loudova, M.; Vitek, A.

    1998-01-01

    The utility of IL-2-secreting helper T lymphocyte precursor (HTLp) frequency testing has been evaluated for detecting alloreactivity. The frequency of HTLp was estimated by limiting dilution assay. High HTLp frequencies were detected in 20 out of 30 HLA-matched unrelated pairs (67%). A comparison of HTLp and CTLp (cytotoxic T lymphocyte precursor) frequencies in HLA-matched unrelated pairs showed that the two examinations are not fully interchangeable in detecting alloreactivity, which could suggest the utility of combined testing of both HTLp and CTLp frequencies for alloreactivity assessment. In contrast, only five positive HTLp values were found among 28 HLA-genotypically identical siblings (18%). Previous CTLp limiting dilution studies showed very low or undetectable CTLp frequencies in that group. Therefore, the HTLp assay remains the only cellular in vitro technique detecting alloreactivity in these combinations. (authors)
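
Limiting dilution estimates conventionally rest on the single-hit Poisson model: if precursors occur at frequency f, the probability that a well seeded with N responder cells contains none is exp(−f·N), so the fraction of negative wells yields f. The following sketch uses invented well counts for illustration; the study's actual doses and readouts are not given in the record.

```python
import math

def precursor_frequency(cells_per_well, negative_wells, total_wells):
    """Single-hit Poisson estimate from one dilution level of a
    limiting dilution assay: fraction of negative wells
    F0 = exp(-f * N)  =>  f = -ln(F0) / N."""
    f0 = negative_wells / total_wells
    if not 0 < f0 < 1:
        raise ValueError("need both negative and positive wells")
    return -math.log(f0) / cells_per_well

# Invented example: 24 wells at 50,000 cells each, 9 stay negative.
f = precursor_frequency(50_000, 9, 24)
```

In practice several dilution levels are assayed and f is obtained by regressing ln(F0) against N, which also checks the single-hit assumption.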

  1. Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence

    Science.gov (United States)

    Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

    2004-01-01

    Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

  2. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a…

  3. DNA precursor compartmentation in mammalian cells: distribution and rates of equilibration between nucleus and cytoplasm

    International Nuclear Information System (INIS)

    Leeds, J.M.

    1986-01-01

    A rapid nuclear isolation technique was adapted in order to examine the question of DNA precursor compartmentation in mammalian cells. Using this method, a reproducible proportion of the cellular nucleotides remained associated with the isolated nuclei. Examination of exponentially growing HeLa cells at several different cell densities showed that the nuclei contained a constant but distinct proportion of each dNTP. The nuclear dATP and dTTP concentrations were equal at all densities examined, even though the dTTP whole-cell pool was 150% of the dATP whole-cell pool. The nuclear portion of the whole-cell pools was roughly equal to the volume occupied by the nucleus. The nuclear-cytoplasmic dNTP pool distribution did not change throughout the cell cycle of synchronized Chinese hamster ovary (CHO) cells. The rates at which either radiolabeled cytidine or deoxycytidine equilibrated with the nuclear and whole-cell dCTP pools of G1 and S phase CHO cells were compared. Experiments comparing the labeling kinetics of ³H-thymidine in G1, S phase, and exponentially growing cells revealed that the S phase dTTP pool equilibrated with exogenously added thymidine faster than the G1 phase pool. The rate of equilibration in exponentially growing cells appeared to be a combination of that seen in G1 and S phases. Incorporation of ³H-thymidine into DNA occurred at the same linear rate in S phase and exponentially growing cells.

  4. Distribution of ozone and its precursors over Bay of Bengal during winter 2009: role of meteorology

    Directory of Open Access Journals (Sweden)

    L. M. David

    2011-09-01

    Measurements of ozone and NO2 were carried out in the marine environment of the Bay of Bengal (BoB) during the winter months December 2008–January 2009, as part of the second Integrated Campaign for Aerosols, gases and Radiation Budget conducted under the Geosphere Biosphere Programme of the Indian Space Research Organization. The ozone mixing ratio was found to be high in the head and the southeast BoB, with mean values of 61 ± 7 ppb and 53 ± 6 ppb, respectively. The mixing ratios of NO2 and CO were also relatively high in these regions. The spatial patterns were examined in the light of airflow patterns, air mass back trajectories, other meteorological conditions, and satellite-retrieved maps of tropospheric ozone, NO2, CO, and fire counts in and around the region. The distribution of these gases was strongly associated with transport from the adjoining land mass. Anthropogenic activities and forest fires/biomass burning over the Indo-Gangetic Plains and other East Asian regions contribute to ozone and its precursors over the BoB. The similarity in the spatial patterns suggests that their source regions could be more or less the same. Most of the diurnal patterns showed a decrease of the ozone mixing ratio during noon/afternoon, followed by a nighttime increase and a morning high. Over this oceanic region, photochemical production of ozone involving NO2 was not very active; water vapour played a major role in controlling the variation of ozone. An attempt is made to simulate ozone levels over the north and south BoB using the photochemical box model (NCAR-MM). The observed features were compared with those measured during earlier cruises conducted in different seasons.

  5. Reproducibility of aluminum foam properties: Effect of precursor distribution on the structural anisotropy and the collapse stress and its dispersion

    International Nuclear Information System (INIS)

    Nosko, M.; Simancik, F.; Florek, R.

    2010-01-01

    The porous structure of aluminum foam manufactured through the foaming of precursors containing a blowing agent is stochastic in nature, usually with a random distribution of pores of different size and shape, creating difficulties in the modeling and prediction of foam properties. In this study, the effect of the initial location of the precursor material in the mold on the foam structure and compression behavior was investigated. Structural characterization showed that the porosity distribution, surface skin thickness and pore orientation were affected by the location of the precursors in the mold and by the extrusion direction of the precursors. Moreover, compression tests demonstrated a significant effect of the structural anisotropy on the collapse stress and its dispersion. The collapse stress of the foam increased when the loading was applied parallel to the thicker surface skin or parallel to the preferential pore orientation, leading to a 20% difference in collapse stress. The dispersion of the collapse stress could be significantly decreased if the loading was performed with regard to the structural anisotropy.

  6. Accident sequence precursor analysis level 2/3 model development

    International Nuclear Information System (INIS)

    Lui, C.H.; Galyean, W.J.; Brownson, D.A.

    1997-01-01

    The US Nuclear Regulatory Commission's Accident Sequence Precursor (ASP) program currently uses simple Level 1 models to assess the conditional core damage probability for operational events occurring in commercial nuclear power plants (NPP). Since not all accident sequences leading to core damage will result in the same radiological consequences, it is necessary to develop simple Level 2/3 models that can be used to analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude of the resulting radioactive releases to the environment, and calculate the consequences associated with these releases. The simple Level 2/3 model development work was initiated in 1995, and several prototype models have been completed. Once developed, these simple Level 2/3 models are linked to the simple Level 1 models to provide risk perspectives for operational events. This paper describes the methods implemented for the development of these simple Level 2/3 ASP models and the process of linking them to the existing Level 1 models.

  7. Homotopy analysis solutions of point kinetics equations with one delayed precursor group

    International Nuclear Information System (INIS)

    Zhu Qian; Luo Lei; Chen Zhiyun; Li Haofeng

    2010-01-01

    The homotopy analysis method is proposed to obtain series solutions of nonlinear differential equations. Here it was applied to the point kinetics equations with one delayed precursor group: analytic series solutions were obtained and the algorithm was analysed. The results show that the computation time and precision of the algorithm meet engineering requirements. (authors)
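The point kinetics equations with one delayed precursor group are standard: dn/dt = ((ρ−β)/Λ)n + λC, dC/dt = (β/Λ)n − λC. As a hedged illustration (the parameter values and the RK4 scheme below are assumptions chosen for demonstration, not taken from the paper), a plain-Python numerical reference solution against which a series solution could be checked might look like:

```python
# Hand-rolled RK4 integration of the point kinetics equations with one
# delayed neutron precursor group. Parameter values are illustrative
# assumptions (typical thermal-reactor magnitudes), not from the paper.

def point_kinetics_rk4(rho, beta=0.0065, lam=0.08, Lambda=1e-4,
                       n0=1.0, t_end=1.0, dt=1e-4):
    """Integrate dn/dt = ((rho - beta)/Lambda)*n + lam*C and
    dC/dt = (beta/Lambda)*n - lam*C, starting from the equilibrium
    precursor concentration C0 = beta*n0/(Lambda*lam).
    Returns (n, C) at t = t_end."""
    n, C = n0, beta * n0 / (Lambda * lam)

    def f(n, C):
        dn = (rho - beta) / Lambda * n + lam * C
        dC = beta / Lambda * n - lam * C
        return dn, dC

    t = 0.0
    while t < t_end - 1e-12:
        k1n, k1C = f(n, C)
        k2n, k2C = f(n + 0.5 * dt * k1n, C + 0.5 * dt * k1C)
        k3n, k3C = f(n + 0.5 * dt * k2n, C + 0.5 * dt * k2C)
        k4n, k4C = f(n + dt * k3n, C + dt * k3C)
        n += dt / 6 * (k1n + 2 * k2n + 2 * k3n + k4n)
        C += dt / 6 * (k1C + 2 * k2C + 2 * k3C + k4C)
        t += dt
    return n, C
```

With zero reactivity the system stays at equilibrium, while a small positive step (ρ < β) produces the familiar prompt jump followed by slow growth on the delayed-neutron time scale.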

  8. Quantitative analysis of indigo and indigo precursors in leaves of Isatis spp. and Polygonum tinctorium.

    Science.gov (United States)

    Gilbert, Kerry G; Maule, Hamish G; Rudolph, Bernd; Lewis, Mervyn; Vandenburg, Harold; Sales, Ester; Tozzi, Sabrina; Cooke, David T

    2004-01-01

    Analysis of extracts from two woad species (Isatis tinctoria and Isatis indigotica) and Polygonum tinctorium revealed that only one indigo precursor (indican) was present in Polygonum, whereas two precursors were found in Isatis spp. The analysis used high performance liquid chromatography (HPLC) coupled to an evaporative light scattering detector (ELSD). In Isatis spp., the indigo precursors indican and a fraction representing isatan B were identified; the proportion of indican to isatan B differed between the two Isatis spp. tested. For the first time, it was possible to quantify the precursors in woad plant species, and the results were in good agreement with estimates from total indigo quantification using two different spectrophotometric methods or a derivatization technique.

  9. Retrospective analysis for detecting seismic precursors in groundwater argon content

    Directory of Open Access Journals (Sweden)

    P. F. Biagi

    2004-01-01

    interpretation of the Kamchatkian anomalies as precursors.

  10. Effect of precursor solutions stirring on deep level defects concentration and spatial distribution in low temperature aqueous chemical synthesis of zinc oxide nanorods

    Energy Technology Data Exchange (ETDEWEB)

    Alnoor, Hatim, E-mail: hatim.alnoor@liu.se; Chey, Chan Oeurn; Pozina, Galia; Willander, Magnus; Nur, Omer [Department of Science and Technology (ITN), Campus Norrköping, Linköping University, SE-601 74 Norrköping (Sweden); Liu, Xianjie; Khranovskyy, Volodymyr [Department of Physics, Chemistry and Biology (IFM), Linköping University, SE-583 81 Linköping (Sweden)

    2015-08-15

    Hexagonal c-axis oriented zinc oxide (ZnO) nanorods (NRs) with 120-300 nm diameters are synthesized via the low temperature aqueous chemical route at 80 °C on silver-coated glass substrates. The influence of varying the stirring duration of the precursor solutions on the concentration and spatial distribution of deep level defects in the ZnO NRs is investigated. Room temperature micro-photoluminescence (μ-PL) spectra were collected for all samples. Cathodoluminescence (CL) spectra of the as-synthesized NRs reveal a significant change in the intensity ratio of the near band edge emission (NBE) to the deep-level emission (DLE) peaks with increasing stirring duration. This is attributed to variation in the oxygen deficiency with increasing stirring duration, as suggested by X-ray photoelectron spectroscopy analysis. Spatially resolved CL spectra taken along individual NRs revealed that stirring the precursor solutions for a relatively short duration (1-3 h), which likely induces high supersaturation under thermodynamic equilibrium during the synthesis process, favors the formation of point defects concentrated towards the tip of the NRs. By contrast, stirring for a longer duration (5-15 h) induces low supersaturation, favoring the formation of point defects located at the bottom of the NRs. These findings demonstrate that it is possible to control the concentration and spatial distribution of deep level defects in ZnO NRs by varying the stirring duration of the precursor solutions.

  11. Thermal analysis methods in the characterization of photocatalytic titania precursors

    Czech Academy of Sciences Publication Activity Database

    Pulišová, Petra; Večerníková, Eva; Maříková, Monika; Balek, V.; Boháček, Jaroslav; Šubrt, Jan

    2012-01-01

    Roč. 108, č. 2 (2012), s. 489-492 ISSN 1388-6150 R&D Projects: GA MŠk 1M0577 Institutional research plan: CEZ:AV0Z40320502 Keywords : differential thermal analysis * thermogravimetry * emanation thermal analysis * titanium dioxide * photocatalyst Subject RIV: CA - Inorganic Chemistry Impact factor: 1.982, year: 2012

  12. Distribution of sulfur aerosol precursors in the SPCZ released by continuous volcanic degassing at Ambrym, Vanuatu

    Science.gov (United States)

    Lefèvre, Jérôme; Menkes, Christophe; Bani, Philipson; Marchesiello, Patrick; Curci, Gabriele; Grell, Georg A.; Frouin, Robert

    2016-08-01

    The Melanesian Volcanic Arc (MVA) emits about 12 kT d⁻¹ of sulfur dioxide (SO2) to the atmosphere from continuous passive (non-explosive) volcanic degassing, which contributes 20% of the global SO2 emission from volcanoes. Here we assess, from up-to-date and long-term observations, the SO2 emission of the Ambrym volcano, one of the dominant volcanoes in the MVA, and we investigate its role as sulfate precursor on the regional distribution of aerosols, using both satellite observations and model results at 1° × 1° spatial resolution from WRF-Chem/GOCART. Without considering aerosol forcing on clouds, our model parameterizations for convection, vertical mixing and cloud properties provide a reliable chemical weather representation, making possible a cross-examination of model solution and observations. This preliminary work enables the identification of biases and limitations affecting both the model (missing sources) and satellite sensors and algorithms (for aerosol detection and classification) and leads to the implementation of improved transport and aerosol processes in the modeling system. On the one hand, the model confirms a 50% underestimation of SO2 emissions due to satellite swath sampling of the Ozone Monitoring Instrument (OMI), consistent with field studies. The OMI irregular sampling also produces a level of noise that impairs its monitoring capacity during short-term volcanic events. On the other hand, the model reveals a large sensitivity on aerosol composition and Aerosol Optical Depth (AOD) due to choices of both the source function in WRF-Chem and size parameters for sea-salt in FlexAOD, the post-processor used to compute offline the simulated AOD. We then proceed to diagnosing the role of SO2 volcanic emission in the regional aerosol composition. The model shows that both dynamics and cloud properties associated with the South Pacific Convergence Zone (SPCZ) have a large influence on the oxidation of SO2 and on the transport pathways of

  13. IR spectral analysis for the diagnostics of crust earthquake precursors

    Directory of Open Access Journals (Sweden)

    R. M. Umarkhodgaev

    2012-11-01

    Some possible physical processes are analysed that cause, under conditions of additional ionisation in a pre-breakdown electric field, emissions in the infrared (IR) interval. The atmospheric transparency region of the IR spectrum at wavelengths of 7–15 μm is taken into account. This transparency region corresponds to spectral lines of minor atmospheric constituents such as CH4, CO2, N2O, NO2, NO, and O3. The possible intensities of the IR emissions observable in laboratories and in nature are estimated. The acceleration of electrons in the pre-breakdown electric field, before their attachment to molecules, is analysed. For daytime conditions, modifications of the absorption spectra of scattered solar emission are studied; for nighttime, variations of emission spectra may be used for the analysis.

  14. Distribution of precursor amyloid-β-protein messenger RNA in human cerebral cortex: relationship to neurofibrillary tangles and neuritic plaques

    International Nuclear Information System (INIS)

    Lewis, D.A.; Higgins, G.A.; Young, W.G.; Goldgaber, D.; Gajdusek, D.C.; Wilson, M.C.; Morrison, J.H.

    1988-01-01

    Neurofibrillary tangles (NFT) and neuritic plaques (NP), two neuropathological markers of Alzheimer disease, may both contain peptide fragments derived from the human amyloid β protein. However, the nature of the relationship between NFT and NP, and the source of the amyloid β proteins found in each, have remained unclear. The authors used in situ hybridization techniques to map the anatomical distribution of precursor amyloid-β-protein mRNA in the neocortex of brains from three subjects with no known neurologic disease and from five patients with Alzheimer disease. In brains from control subjects, positively hybridizing neurons were present in cortical regions and layers that contain a high density of neuropathological markers in Alzheimer disease, as well as in those loci that contain NP but few NFT. Quantitative analyses of in situ hybridization patterns within layers III and V of the superior frontal cortex revealed that the presence of high numbers of NFT in Alzheimer-diseased brains was associated with a decrease in the number of positively hybridizing neurons compared to controls and to Alzheimer-diseased brains with few NFT. These findings suggest that expression of precursor amyloid-β-protein mRNA may be necessary, but is clearly not sufficient, for NFT formation. In addition, these results may indicate that the amyloid β protein present in NP in a given region or layer of cortex is not derived from the resident neuronal cell bodies that express the mRNA for the precursor protein.

  15. Analysis of the Precursors, Simulants and Degradation Products of Chemical Warfare Agents.

    Science.gov (United States)

    Witkiewicz, Zygfryd; Neffe, Slawomir; Sliwka, Ewa; Quagliano, Javier

    2018-09-03

    Recent advances in the analysis of precursors, simulants and degradation products of chemical warfare agents (CWA) are reviewed. Fast and reliable analysis of precursors, simulants and CWA degradation products is extremely important at a time when more and more terrorist groups and radical non-state organizations use, or plan to use, chemical weapons to achieve their psychological, political and military goals. The review covers the open-source literature published after the Chemical Weapons Convention came into force (1997). The authors note that during the last 15 years a growing number of laboratories have focused not only on trace analysis of CWA (mostly nerve and blister agents) in environmental and biological samples, but also on instrumental analysis of the precursors and degradation products of these substances. Identification of low-level concentrations of CWA degradation products is often more important, and more difficult, than that of the original CWA, because of the lower concentrations and the very large number of compounds present in environmental and biological samples. Many of them are hydrolysis products and are present in samples in ionic form. For this reason, two or three instrumental methods are used to perform a reliable analysis of these substances.

  16. Laser damage in optical components: metrology, statistical and photo-induced analysis of precursor centres

    International Nuclear Information System (INIS)

    Gallais, L.

    2002-11-01

    This thesis deals with laser damage phenomena under nanosecond pulses in optical components such as glasses and dielectric and metallic thin films. First, work is done on laser damage metrology, in order to obtain accurate and reliable measurements of laser-induced damage probabilities with rigorous control of the test parameters. Then, using a specific model, we find densities of laser damage precursors in bulk glasses (a few tens per (100 μm)³) and at glass surfaces (about one precursor per μm³). The analysis is combined with morphology studies by atomic force microscopy to discuss the nature of the precursors and the damage process. The influence of wavelength (from 355 to 1064 nm) and of cumulated shots is also studied. Simulations are performed to study initiation mechanisms at these inclusions. This work gives an estimate of the complex index and size of the precursors, which permits a discussion of their possible detection by non-destructive tools. (author)

  17. Effect of precursor solutions stirring on deep level defects concentration and spatial distribution in low temperature aqueous chemical synthesis of zinc oxide nanorods

    Directory of Open Access Journals (Sweden)

    Hatim Alnoor

    2015-08-01

    Hexagonal c-axis oriented zinc oxide (ZnO) nanorods (NRs) with 120-300 nm diameters are synthesized via the low temperature aqueous chemical route at 80 °C on silver-coated glass substrates. The influence of varying the stirring duration of the precursor solutions on the concentration and spatial distribution of deep level defects in the ZnO NRs is investigated. Room temperature micro-photoluminescence (μ-PL) spectra were collected for all samples. Cathodoluminescence (CL) spectra of the as-synthesized NRs reveal a significant change in the intensity ratio of the near band edge emission (NBE) to the deep-level emission (DLE) peaks with increasing stirring duration. This is attributed to variation in the oxygen deficiency with increasing stirring duration, as suggested by X-ray photoelectron spectroscopy analysis. Spatially resolved CL spectra taken along individual NRs revealed that stirring the precursor solutions for a relatively short duration (1-3 h), which likely induces high supersaturation under thermodynamic equilibrium during the synthesis process, favors the formation of point defects concentrated towards the tip of the NRs. By contrast, stirring for a longer duration (5-15 h) induces low supersaturation, favoring the formation of point defects located at the bottom of the NRs. These findings demonstrate that it is possible to control the concentration and spatial distribution of deep level defects in ZnO NRs by varying the stirring duration of the precursor solutions.

  18. Radial transport processes as a precursor to particle deposition in drinking water distribution systems.

    Science.gov (United States)

    van Thienen, P; Vreeburg, J H G; Blokker, E J M

    2011-02-01

    Various particle transport mechanisms play a role in the build-up of discoloration potential in drinking water distribution networks. In order to enhance our understanding of and ability to predict this build-up, it is essential to recognize and understand their role. Gravitational settling with drag has primarily been considered in this context. However, since flow in water distribution pipes is nearly always in the turbulent regime, turbulent processes should be considered also. In addition to these, single particle effects and forces may affect radial particle transport. In this work, we present an application of a previously published turbulent particle deposition theory to conditions relevant for drinking water distribution systems. We predict quantitatively under which conditions turbophoresis, including the virtual mass effect, the Saffman lift force, and the Magnus force may contribute significantly to sediment transport in radial direction and compare these results to experimental observations. The contribution of turbophoresis is mostly limited to large particles (>50 μm) in transport mains, and not expected to play a major role in distribution mains. The Saffman lift force may enhance this process to some degree. The Magnus force is not expected to play any significant role in drinking water distribution systems.
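A common screening parameter behind conclusions of this kind is the dimensionless particle relaxation time τ⁺ = τ_p u*²/ν, which compares a particle's inertial response time to the turbulent wall time scale. The sketch below is a back-of-envelope illustration only: the particle density, friction velocities, and the τ⁺ ≳ 10 rule of thumb for turbophoresis-dominated deposition are assumed values, not numbers taken from the paper.

```python
# Back-of-envelope estimate of the dimensionless particle relaxation
# time tau+ = tau_p * u_*^2 / nu for a sediment particle in water.
# Deposition studies often place the turbophoresis-dominated regime
# around tau+ > ~10 (assumed rule of thumb; values are illustrative).

def tau_plus(d_p, rho_p, u_star, rho_f=1000.0, nu=1.0e-6):
    """d_p: particle diameter [m]; rho_p: particle density [kg/m^3];
    u_star: friction velocity [m/s]; rho_f: fluid density [kg/m^3];
    nu: kinematic viscosity [m^2/s] (water at ~20 °C)."""
    mu = nu * rho_f                        # dynamic viscosity [Pa s]
    tau_p = rho_p * d_p**2 / (18.0 * mu)   # Stokes relaxation time [s]
    return tau_p * u_star**2 / nu

# A 50 um particle: high-shear transport main vs low-flow distribution main
large_main = tau_plus(50e-6, 2500.0, u_star=0.10)
small_main = tau_plus(50e-6, 2500.0, u_star=0.02)
```

Even this crude estimate reproduces the qualitative conclusion above: for the same 50 μm particle, τ⁺ is an order of magnitude larger in the high-shear pipe, so inertial near-wall transport only begins to matter in transport mains.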

  19. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using them can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and many num...
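A minimal sketch of the kind of radial power-flow analysis such texts develop in full three-phase detail: a single-phase backward/forward sweep on a made-up two-segment feeder. All feeder data (source voltage, impedances, loads) are illustrative assumptions, not an example from the book.

```python
# Toy backward/forward sweep power flow for a single-phase equivalent
# of a radial distribution feeder. Feeder data are made-up values.

def sweep_power_flow(V_source, lines, loads, tol=1e-8, max_iter=50):
    """lines[i]: complex impedance [ohm] of the segment feeding bus i+1;
    loads[i]: complex power demand [VA] at bus i+1.
    Returns the list of complex bus voltages [V0, V1, ...]."""
    n = len(lines)
    V = [V_source] * (n + 1)
    for _ in range(max_iter):
        # Backward sweep: accumulate branch currents from the feeder end.
        I_branch = [0j] * n
        for i in range(n - 1, -1, -1):
            I_load = (loads[i] / V[i + 1]).conjugate()   # I = (S/V)*
            I_branch[i] = I_load + (I_branch[i + 1] if i + 1 < n else 0j)
        # Forward sweep: update bus voltages from the source downward.
        V_new = [V_source]
        for i in range(n):
            V_new.append(V_new[i] - lines[i] * I_branch[i])
        if max(abs(V_new[i] - V[i]) for i in range(n + 1)) < tol:
            return V_new
        V = V_new
    return V

# 2.4 kV source, two line segments, two lagging (inductive) loads
volts = sweep_power_flow(2400 + 0j,
                         lines=[0.3 + 0.6j, 0.5 + 1.0j],
                         loads=[100e3 + 50e3j, 80e3 + 40e3j])
```

As expected for an inductive feeder, the voltage magnitude falls monotonically from the source toward the end of the feeder; the full treatment extends the same sweep to unbalanced three-phase models of lines, transformers, and regulators.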

  20. Size distribution of silver nanoclusters induced by ion, electron, laser beams and thermal treatments of an organometallic precursor

    International Nuclear Information System (INIS)

    D'Urso, L.; Nicolosi, V.; Compagnini, G.; Puglisi, O.

    2004-01-01

    Recently, a huge variety of physical and chemical synthetic processes have been reported for preparing nanostructured materials made of very small (diameter < 50 nm) metallic clusters. Depending on the nature of the clusters, this new class of materials possesses interesting properties (electronic, optical, magnetic, catalytic) that can be tailored as a function of particle size and shape. Silver nanoparticles have been obtained by direct thermal treatment or by beam-enhanced decomposition (ion, electron and laser) of a silver organometallic compound (precursor) spin-coated onto suitable substrates. In this paper, we present the results of a study on the size distribution of such nanoparticles as a function of the different synthesis methods. It was found that the method employed strongly affects silver nanoparticle formation. Smaller silver nanoclusters were obtained after reduction by ion beam irradiation and thermal treatment, as observed using different techniques (AFM, XRD and UV-Vis).

  1. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure the performance of user jobs.

  2. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  3. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  4. Associations between maternal and paternal parenting behaviors, anxiety and its precursors in early childhood: A meta-analysis

    NARCIS (Netherlands)

    Möller, E.L.; Nikolić, M.; Majdandžić, M.; Bögels, S.M.

    2016-01-01

    In this meta-analysis we investigated differential associations between maternal and paternal parenting behaviors (overcontrol, overprotection, overinvolvement, autonomy granting, challenging parenting) and anxiety and its precursors (fearful temperament, behavioral inhibition, shyness) in children

  5. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  6. Functional Analysis of Precursors for Serious Problem Behavior and Related Intervention

    Science.gov (United States)

    Langdon, Nancy A.; Carr, Edward G.; Owen-DeSchryver, Jamie S.

    2008-01-01

    Precursor behaviors are innocuous behaviors that reliably precede the occurrence of problem behavior. Intervention efforts applied to precursors might prevent the occurrence of severe problem behavior. We examined the relationship between precursor behavior and problem behavior in three individuals with developmental disabilities. First, a…

  7. Thermogravimetric analysis of silicon carbide-silicon nitride polycarbosilazane precursor during pyrolysis from ambient to 1000 C

    Science.gov (United States)

    Ledbetter, F. E., III; Daniels, J. G.; Clemons, J. M.; Hundley, N. H.; Penn, B. G.

    1984-01-01

    Thermogravimetric analysis data are presented for the unmeltable polycarbosilazane precursor of silicon carbide-silicon nitride fibers, over the range from room temperature to 1000 °C in a nitrogen atmosphere, in order to establish the weight loss at various temperatures during the precursor's pyrolysis to the fiber material. The fibers obtained by this method are excellent candidates for applications where the oxidation of carbon fibers (above 400 °C) renders them unsuitable.

  8. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  9. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  10. Bioinformatic evidence for a widely distributed, ribosomally produced electron carrier precursor, its maturation proteins, and its nicotinoprotein redox partners

    Directory of Open Access Journals (Sweden)

    Haft Daniel H

    2011-01-01

    Full Text Available Abstract Background Enzymes in the radical SAM (rSAM) domain family serve in a wide variety of biological processes, including RNA modification, enzyme activation, bacteriocin core peptide maturation, and cofactor biosynthesis. Evolutionary pressures and relationships to other cellular constituents impose recognizable grammars on each class of rSAM-containing system, shaping patterns in results obtained through various comparative genomics analyses. Results An uncharacterized gene cluster found in many Actinobacteria and sporadically in Firmicutes, Chloroflexi, Deltaproteobacteria, and one Archaeal plasmid contains a PqqE-like rSAM protein family that includes Rv0693 from Mycobacterium tuberculosis. Members occur clustered with a strikingly well-conserved small polypeptide we designate "mycofactocin," similar in size to bacteriocins and PqqA, the precursor of pyrroloquinoline quinone (PQQ). Partial Phylogenetic Profiling (PPP) based on the distribution of these markers identifies the mycofactocin cluster, but also a second tier of high-scoring proteins. This tier, strikingly, is filled with up to thirty-one members per genome from three variant subfamilies that occur, one each, in three unrelated classes of nicotinoproteins. The pattern suggests these variant enzymes require not only NAD(P), but also the novel gene cluster. Further study was conducted using SIMBAL, a PPP-like tool, to search these nicotinoproteins for subsequences best correlated across multiple genomes to the presence of mycofactocin. For both the short-chain dehydrogenase/reductase (SDR) and iron-containing dehydrogenase families, aligning SIMBAL's top-scoring sequences to homologous solved crystal structures shows signals centered over NAD(P)-binding sites rather than over substrate-binding or active site residues. Previous studies on some of these proteins have revealed a non-exchangeable NAD cofactor, such that enzymatic activity in vitro requires an artificial electron acceptor such…

  11. Transcriptome analysis of bitter acid biosynthesis and precursor pathways in hop (Humulus lupulus

    Directory of Open Access Journals (Sweden)

    Clark Shawn M

    2013-01-01

    Full Text Available Abstract Background Bitter acids (e.g. humulone) are prenylated polyketides synthesized in the lupulin glands of the hop plant (Humulus lupulus) which are important contributors to the bitter flavour and stability of beer. Bitter acids are formed from acyl-CoA precursors derived from branched-chain amino acid (BCAA) degradation and C5 prenyl diphosphates from the methyl-D-erythritol 4-phosphate (MEP) pathway. We used RNA sequencing (RNA-seq) to obtain the transcriptomes of isolated lupulin glands, cones with glands removed, and leaves from high α-acid hop cultivars, and analyzed these datasets for genes involved in bitter acid biosynthesis, including the supply of major precursors. We also measured the levels of BCAAs, acyl-CoA intermediates, and bitter acids in glands, cones and leaves. Results Transcripts encoding all the enzymes of BCAA metabolism were significantly more abundant in lupulin glands, indicating that BCAA biosynthesis and subsequent degradation occurs in these specialized cells. Branched-chain acyl-CoAs and bitter acids were present at higher levels in glands compared with leaves and cones. RNA-seq analysis showed the gland-specific expression of the MEP pathway, enzymes of sucrose degradation and several transcription factors that may regulate bitter acid biosynthesis in glands. Two branched-chain aminotransferase (BCAT) enzymes, HlBCAT1 and HlBCAT2, were abundant, with gene expression quantification by RNA-seq and qRT-PCR indicating that HlBCAT1 was specific to glands while HlBCAT2 was present in glands, cones and leaves. Recombinant HlBCAT1 and HlBCAT2 catalyzed forward (biosynthetic) and reverse (catabolic) reactions with similar kinetic parameters. HlBCAT1 is targeted to mitochondria where it likely plays a role in BCAA catabolism. HlBCAT2 is a plastidial enzyme likely involved in BCAA biosynthesis. Phylogenetic analysis of the hop BCATs and those from other plants showed that they group into distinct biosynthetic (plastidial) and…

  12. Analysis of Precursors Prior to Rock Burst in Granite Tunnel Using Acoustic Emission and Far Infrared Monitoring

    Directory of Open Access Journals (Sweden)

    Zhengzhao Liang

    2013-01-01

    Full Text Available To understand the physical mechanism of the anomalous behaviors observed prior to rock burst, the acoustic emission (AE) and far infrared (FIR) techniques were applied to monitor the progressive failure of a rock tunnel model subjected to biaxial stresses. Images of the fracturing process, temperature changes of the tunnel, and spatiotemporal series of acoustic emission were simultaneously recorded during deformation of the model. The b-value derived from the amplitude distribution data of the AE was calculated to predict the tunnel rock burst. The results showed that the vertical stress enhanced the stability of the tunnel, and the tunnels with higher confining pressure demonstrated a more abrupt and stronger rock burst. Abnormal temperature changes around the wall were observed prior to the rock burst of the tunnel. Analysis of the AE events showed that a sudden drop followed by a quiet period could be considered a precursor for forecasting the rock burst hazard. Statistical analysis indicated that rock fragment spalling occurred earlier than the abnormal temperature changes, and the abnormal temperature changes occurred earlier than the descent of the AE b-value. The analysis indicated that the temperature changes were more sensitive than the AE b-value changes for predicting tunnel rock bursts.
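    The AE b-value referred to above is the slope of the Gutenberg-Richter frequency-magnitude relation. As a hedged sketch (the abstract does not specify its estimator), Aki's maximum-likelihood formula is one common way to compute it, with AE magnitude conventionally taken as peak amplitude in dB divided by 20:

    ```python
    import math

    def b_value_mle(amplitudes_db, threshold_db):
        """Aki maximum-likelihood b-value from AE peak amplitudes (dB).

        AE magnitude is conventionally taken as amplitude/20, so the
        Gutenberg-Richter relation log10 N = a - b*M applies directly.
        A falling b-value indicates a growing share of large events,
        often read as a rock-burst precursor.
        """
        mags = [a / 20.0 for a in amplitudes_db if a >= threshold_db]
        m_c = threshold_db / 20.0  # magnitude of completeness
        mean_m = sum(mags) / len(mags)
        return math.log10(math.e) / (mean_m - m_c)
    ```

    A sliding-window version of this estimate over the AE event stream would yield the kind of b-value time series the study tracks during tunnel loading.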

  13. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    Science.gov (United States)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  14. Confirmatory Factor Analysis of the Trinity Inventory of Precursors to Suicide (TIPS) and Its Relationship to Hopelessness and Depression

    Science.gov (United States)

    Smyth, Caroline L.; MacLachlan, Malcolm

    2005-01-01

    Numerous existing measures assess attitudes toward suicide yet fail to account for contextual factors. The Trinity Inventory of Precursors to Suicide (TIPS) is presented as an alternative, with implications for the development of prevention programs. Having previously reported an exploratory analysis of the TIPS, confirmatory factor analysis and…

  15. Migration and distribution of two populations of hippocampal granule cell precursors during the perinatal and postnatal periods

    International Nuclear Information System (INIS)

    Altman, J.; Bayer, S.A.

    1990-01-01

    Methacrylate-embedded sections and short-survival thymidine radiograms of the hippocampal dentate gyrus were examined in perinatal and postnatal rats in order to trace the site of origin and migration of the precursors of granule cells and study the morphogenesis of the granular layer. The densely packed, spindle-shaped cells of the secondary dentate matrix (a derivative of the primary dentate neuroepithelium) stream in a subpial position towards the granular layer of the internal dentate limb during the perinatal and early postnatal periods. By an accretionary process, the crest of the granular layer forms on day E21, and on the subsequent days the granular layer of the internal dentate limb expands progressively in a lateral direction. Granule cell differentiation, as judged by the transformation of polymorph, darkly staining small cells into rounder, lightly staining larger granule cells, follows the same gradient from the external dentate limb to the internal dentate limb. The secondary dentate matrix is in a process of dissolution by day P5. This matrix is the source of what will later become the outer shell of the granular layer, composed of early generated granule cells. The thicker inner shell of the granular layer, formed during the infantile and juvenile periods, derives from an intrinsic, tertiary germinal matrix. On day E22, the dentate migration of the secondary dentate matrix becomes partitioned into two components: (a) the subpial component of extradentate origin, referred to in this context as the first dentate migration, and (b) the second dentate migration. The latter is distributed in the basal polymorph layer throughout the entire dentate gyrus and is henceforth recognized as the tertiary dentate matrix. The tertiary dentate matrix is prominent between days P3 and P10.

  16. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  17. Analysis of Precursor Properties of mixed Al/Alumel Cylindrical Wire Arrays*

    Science.gov (United States)

    Stafford, A.; Safronova, A. S.; Kantsyrev, V. L.; Esaulov, A. A.; Weller, M. E.; Shrestha, I.; Osborne, G. C.; Shlyaptseva, V. V.; Keim, S. F.; Coverdale, C. A.; Chuvatin, A. S.

    2012-10-01

    Previous studies of mid-Z (Cu and Ni) cylindrical wire arrays (CWAs) on Zebra have found precursors with high electron temperatures of >300 eV. However, past experiments with Al CWAs did not find the same high temperature precursors. New precursor experiments using mixed Al/Alumel (Ni 95%, Si 2%, and Al 2%) cylindrical wire arrays have been performed to understand how the properties of the L-shell Ni precursor will change and whether an Al precursor will be observed. Time-gated spectra and pinhole images are used to determine precursor plasma conditions for comparison with previous Alumel precursor experiments. A full diagnostic set which included more than ten different beam-lines was implemented. Future work in this direction is discussed. *This work was supported by NNSA under DOE Cooperative Agreements DE-FC52-06NA27588, and in part by DE-FC52-06NA27586 and DE-FC52-06NA27616. Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.

  18. Comparative proteomics analysis of engineered Saccharomyces cerevisiae with enhanced biofuel precursor production.

    Directory of Open Access Journals (Sweden)

    Xiaoling Tang

    Full Text Available The yeast Saccharomyces cerevisiae was metabolically modified for enhanced biofuel precursor production by knocking out genes encoding mitochondrial isocitrate dehydrogenase and over-expressing a heterologous ATP-citrate lyase. A comparative iTRAQ-coupled 2D LC-MS/MS analysis was performed to obtain a global overview of ubiquitous protein expression changes in the engineered S. cerevisiae strains. More than 300 proteins were identified. Among these proteins, 37 were found to be differentially expressed in the engineered strains, and they were classified into specific categories based on their enzyme functions. Most of the proteins involved in the glycolytic and pyruvate branch-point pathways were found to be up-regulated, whereas the proteins involved in the respiration and glyoxylate pathways were found to be down-regulated in the engineered strains. Moreover, the metabolic modification of the S. cerevisiae cells resulted in a number of up-regulated proteins involved in the stress response, and differentially expressed proteins involved in amino acid metabolism and protein biosynthesis pathways. These LC-MS/MS-based proteomics results not only offered extensive information for identifying potential protein-protein interactions, signal pathways, and ubiquitous cellular changes elicited by the engineered pathways, but also provided a meaningful biological information platform serving further modification of yeast cells for enhanced biofuel production.

  19. Precursors of nitrogenous disinfection by-products in drinking water--A critical review and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bond, Tom [Department of Civil and Environmental Engineering, Imperial College London, London SW7 2AZ (United Kingdom); Templeton, Michael R.; Graham, Nigel [Department of Civil and Environmental Engineering, Imperial College London, London SW7 2AZ (United Kingdom)

    2012-10-15

    Highlights: ► The proportion of N-DBP formation attributable to specific precursors was calculated. ► Precursor concentrations are typically insufficient to account for observed N-DBP formation, except for CNX and NDMA. ► Amino acid precursors are easier to remove during water treatment than suggested by laboratory studies. - Abstract: In recent years research into the formation of nitrogenous disinfection by-products (N-DBPs) in drinking water - including N-nitrosodimethylamine (NDMA), the haloacetonitriles (HANs), haloacetamides (HAcAms), cyanogen halides (CNX) and halonitromethanes (HNMs) - has proliferated. This is partly due to the high reported toxicity of N-DBPs. In this review paper, information about the formation yields of N-DBPs from model precursors, and about environmental precursor occurrence, has been employed to assess the amount of N-DBP formation that is attributable to known precursors. It was calculated that for HANs and HAcAms, the concentrations of known precursors (mainly free amino acids) are insufficient to account for the observed concentrations of these N-DBP groups. However, at least in some waters, a significant proportion of CNX and NDMA formation can be explained by known precursors. Identified N-DBP precursors tend to be of low molecular weight and low electrostatic charge relative to bulk natural organic matter (NOM). This makes them recalcitrant to removal by water treatment processes, notably coagulation, as confirmed by a number of bench-scale studies. However, amino acids have been found to be easier to remove during water treatment than would be suggested by the known molecular properties of the individual free amino acids.
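    The attribution calculation described in the review — multiplying measured precursor concentrations by model-compound molar yields and comparing the total against observed DBP formation — can be sketched as follows. The compounds and numbers below are purely illustrative placeholders, not values from the review:

    ```python
    def attributable_fraction(precursors, observed_dbp_nmol_per_l):
        """Fraction of observed N-DBP formation explained by known precursors.

        precursors: list of (concentration in nmol/L, molar yield as a fraction).
        Explained formation is the sum of concentration * yield over precursors.
        """
        explained = sum(conc * dbp_yield for conc, dbp_yield in precursors)
        return explained / observed_dbp_nmol_per_l

    # Hypothetical amino-acid precursors of one HAN species (illustrative only):
    # (nmol/L in raw water, molar DBP yield on chlorination)
    known_precursors = [(120.0, 0.02), (80.0, 0.05), (40.0, 0.01)]
    fraction = attributable_fraction(known_precursors, observed_dbp_nmol_per_l=50.0)
    print(f"{fraction:.0%} of observed formation explained")
    ```

    A fraction well below one, as in this toy case, is the pattern the review reports for HANs and HAcAms: known precursors cannot account for the observed formation.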

  20. Distribution and urban-suburban differences in ground-level ozone and its precursors over Shenyang, China

    Science.gov (United States)

    Liu, Ningwei; Ren, Wanhui; Li, Xiaolan; Ma, Xiaogang; Zhang, Yunhai; Li, Bingkun

    2018-03-01

    Hourly mixing ratio data of ground-level ozone and its main precursors at ambient air quality monitoring sites in Shenyang during 2013-2015 were used to survey spatiotemporal variations in ozone. Then, the transport of ozone and its precursors among urban, suburban, and rural sites was examined. The correlations between ozone and some key meteorological factors were also investigated. Ozone and Ox mixing ratios in Shenyang were higher during warm seasons and lower during cold ones, while ozone precursors followed the opposite cycle. Ozone mixing ratios reached maximum and minimum values in the afternoon and morning, respectively, reflecting the significant influence of photochemical production during daytime and depletion via titration during nighttime. Compared to those in downtown Shenyang, ozone mixing ratios were higher, and peak values occurred later, in suburban and rural areas downwind under the prevailing wind. The differences were most significant in summer, when the ozone mixing ratios at one suburban downwind site exceeded those at the downtown site by up to 35.6 ppb. This suggests that photochemical production processes were significant during the transport of ozone precursors, particularly in warm seasons with sufficient sunlight. Temperature, total radiation, and wind speed all displayed positive correlations with ozone concentration, reflecting their important role in accelerating ozone formation. Generally, the correlations between ozone and meteorological factors were slightly stronger at suburban sites than in urban areas, indicating that ozone levels in suburban areas were more sensitive to these meteorological factors.
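    The quantity Ox in the abstract is conventionally the total oxidant O3 + NO2, which is insensitive to NO titration of ozone, and the reported meteorological correlations can be sketched with a plain Pearson coefficient. A minimal sketch with synthetic numbers (not the study's data):

    ```python
    import math

    def total_ox(o3_ppb, no2_ppb):
        """Total oxidant Ox = O3 + NO2 (ppb), robust to NO titration of O3."""
        return [o3 + no2 for o3, no2 in zip(o3_ppb, no2_ppb)]

    def pearson(x, y):
        """Pearson correlation coefficient between two equal-length series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / math.sqrt(sxx * syy)

    # Synthetic afternoon values: ozone rising with temperature (illustrative only)
    temp_c = [18, 21, 24, 27, 30, 33]
    o3 = [30, 38, 47, 55, 64, 72]
    no2 = [22, 18, 14, 12, 10, 9]
    print(pearson(temp_c, o3))   # strongly positive, as the study reports
    print(total_ox(o3, no2))
    ```

    Applied to hourly monitoring data, the same two helpers reproduce the style of analysis behind the study's correlation statements.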

  1. Investigations of temporal and spatial distribution of precursors SO2 and NO2 vertical columns in the North China Plain using mobile DOAS

    Science.gov (United States)

    Wu, Fengcheng; Xie, Pinhua; Li, Ang; Mou, Fusheng; Chen, Hao; Zhu, Yi; Zhu, Tong; Liu, Jianguo; Liu, Wenqing

    2018-02-01

    Recently, Chinese cities have suffered severe events of haze air pollution, particularly in the North China Plain (NCP). Investigating the temporal and spatial distribution of pollutants, emissions, and pollution transport is necessary to better understand the effect of various sources on air quality. In this study, we report on mobile differential optical absorption spectroscopy (mobile DOAS) observations of the vertical columns of the precursors SO2 and NO2 in the NCP in the summer of 2013 (11 June to 7 July). The different temporal and spatial distributions of SO2 and NO2 vertical column density (VCD) over this area are characterized under various wind fields. The results show that transport from the southern NCP strongly affects air quality in Beijing, and the transport route, particularly SO2 transport on the route of Shijiazhuang-Baoding-Beijing, is identified. In addition, the major contributors to SO2 along the route of Shijiazhuang-Baoding-Beijing are elevated sources, as opposed to the low area sources on the route of Dezhou-Cangzhou-Tianjin-Beijing; this was found using an interrelated analysis of in situ and mobile DOAS observations during the measurement periods. Furthermore, the discussion of hot spots near the city of Jinan shows that the average observed widths of the polluted air masses are 11.83 and 17.23 km, associated with air-mass diffusion, approximately 60 km away from the emission sources based on geometric estimation. Finally, a reasonable agreement exists between the Ozone Monitoring Instrument (OMI) and mobile DOAS observations, with a correlation coefficient (R2) of 0.65 for NO2 VCDs. Both datasets also have a similar spatial pattern. The fitted slope of 0.55 is significantly less than unity, which can reflect the contamination of local sources; OMI observations need improved sensitivity to near-surface emission sources through improvements in the retrieval algorithm or in satellite resolution.

  2. Use of one delayed-neutron precursor group in transient analysis

    International Nuclear Information System (INIS)

    Diamond, D.J.

    1983-01-01

    In most reactor dynamics calculations six groups of delayed-neutron precursors are usually accounted for. However, under certain circumstances it may be advantageous to simplify the calculation and utilize a single delayed-neutron group. The motivation for going to one precursor group is economy. For LWR transient codes that use point kinetics the equations are solved very rapidly and six precursor groups should always be used. However, codes with spatially dependent neutron kinetics are very long running, and the use of one precursor group may save computer costs without significantly impairing the accuracy of the results. Furthermore, in some codes, the elimination of five precursor groups makes additional memory available, which may be used to give a net increase in the accuracy of the calculations, e.g., by allowing for an increase in mesh density. In order to use one delayed-neutron precursor group it is necessary to derive a single decay constant, λ̄, which, along with the total (or one-group) delayed-neutron fraction β = Σᵢ βᵢ summed over the six groups, will adequately describe the transient precursor behavior. The present summary explains how a recommendation for λ̄ was derived.
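    One common collapse recipe (an illustrative choice; the summary's actual recommendation may differ) preserves the total delayed-neutron fraction and the mean precursor lifetime, giving λ̄ = β / Σᵢ(βᵢ/λᵢ). A sketch with representative six-group data of the kind tabulated for U-235 thermal fission (treated here as illustrative inputs):

    ```python
    # Representative six-group delayed-neutron data (illustrative values of the
    # kind found in standard references, not this paper's exact inputs)
    betas   = [0.000247, 0.001385, 0.001222, 0.002645, 0.000832, 0.000169]
    lambdas = [0.0127, 0.0317, 0.115, 0.311, 1.40, 3.87]   # decay constants, 1/s

    # Total (one-group) delayed-neutron fraction
    beta_total = sum(betas)

    # One-group decay constant that preserves the mean precursor lifetime:
    # lambda_bar = beta / sum(beta_i / lambda_i)
    lambda_bar = beta_total / sum(b / l for b, l in zip(betas, lambdas))

    print(f"beta = {beta_total:.5f}, lambda_bar = {lambda_bar:.4f} 1/s")
    ```

    This weighting favors the long-lived groups, which dominate the precursor inventory; other collapses (e.g., matching the asymptotic period for a given reactivity) give somewhat different λ̄, which is presumably why a specific recommendation was needed.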

  3. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management to existing protocol solutions, while taking into account

  4. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution in approximating different distributions.
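    As an illustration of the kind of comparison discussed (a sketch, not the paper's own simulation), the hazard functions h(t) = f(t)/S(t) of two common reliability distributions can be computed and compared directly:

    ```python
    import math

    def weibull_hazard(t, shape, scale):
        """Hazard rate h(t) = f(t)/S(t) for a Weibull(shape, scale)."""
        return (shape / scale) * (t / scale) ** (shape - 1)

    def exponential_hazard(t, rate):
        """Exponential hazard is constant in t (memoryless)."""
        return rate

    # A Weibull with shape 1 reduces to an exponential with rate 1/scale,
    # so the two hazards coincide; shape > 1 gives an increasing hazard
    # (wear-out), shape < 1 a decreasing one (infant mortality).
    for t in (0.5, 1.0, 2.0):
        assert math.isclose(weibull_hazard(t, 1.0, 2.0), exponential_hazard(t, 0.5))

    print(weibull_hazard(2.0, 2.0, 1.0))  # shape 2: hazard grows linearly in t
    ```

    Sweeping the shape parameter in such a sketch is one simple way to show the "flexibility" the abstract refers to: a single family reproduces constant, rising, and falling failure rates.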

  5. Distribution of 32P-phosphate in guinea pig central nervous system after intraventricular administration of precursor

    International Nuclear Information System (INIS)

    Bukovsky, V.; Hage Ali, M.A.; Mezes, V.

    1982-01-01

    Within 20 minutes after administration into the right lateral ventricle, precursor balance is achieved in both cerebral hemispheres, the brain stem and the cerebellum. A different course was observed in the medulla oblongata and the spinal cord. Twenty minutes following administration of the precursor, the amount of radioactivity decreased in the following order of sequence: the right hemisphere, the brain stem, the cerebellum, the left hemisphere, the medulla oblongata, and the spinal cord. Of the total amount of phosphates in the acid-soluble fraction of the whole brain and cerebellum, inorganic phosphates constitute 50%, and there are no differences between the two parts. Sixty minutes after administration of the labelled phosphate, the specific activity of total phosphates in the acid-soluble fraction of the brain and cerebellum is higher than the specific activity of inorganic phosphates. (author)

  6. Characterizing Submonolayer Growth of 6P on Mica: Capture Zone Distributions vs. Growth Exponents and the Role of Hot Precursors

    Science.gov (United States)

    Einstein, T. L.; Morales-Cifuentes, Josue; Pimpinelli, Alberto

    2015-03-01

    Analyzing capture-zone distributions (CZD) using the generalized Wigner distribution (GWD) has proved a powerful way to access the critical nucleus size i. Of the several systems to which the GWD has been applied, we consider 6P on mica, for which Winkler's group found i ≈ 3. Subsequently they measured the growth exponent α (island density scaling as F^α for flux F) of this system and found good scaling but different values at small and large F, which they attributed to DLA and ALA dynamics, but with larger values of i than found from the CZD analysis. We investigate this result in some detail. The third talk of this group describes a new universal relation between α and the characteristic exponent β of the GWD. The second talk reports the results of a proposed model that takes long-known transient ballistic adsorption into account, for the first time in a quantitative way. We find several intermediate scaling regimes, with distinctive values of α and an effective activation energy. One of these, rather than ALA, gives the best fit of the experimental data and a value of i consistent with the CZD analysis. Work at UMD supported by NSF CHE 13-05892.
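    The generalized Wigner distribution referred to above has the closed form P(s) = a s^β exp(−b s²), where s is the capture-zone size scaled by its mean and the constants a and b are fixed by normalization and unit mean; β is then linked to the critical nucleus size (β ≈ i + 2 in the Pimpinelli-Einstein picture). A minimal numerical sketch:

    ```python
    import math

    def gwd(s, beta):
        """Generalized Wigner distribution P(s) = a * s**beta * exp(-b * s**2),
        with a and b chosen so P integrates to 1 and has unit mean
        (s = capture-zone size divided by its average)."""
        b = (math.gamma((beta + 2) / 2) / math.gamma((beta + 1) / 2)) ** 2
        a = 2 * b ** ((beta + 1) / 2) / math.gamma((beta + 1) / 2)
        return a * s ** beta * math.exp(-b * s * s)

    # Numerical check of normalization and unit mean for beta = 5
    # (under beta ~ i + 2, this would correspond to a critical nucleus i = 3)
    ds = 1e-4
    grid = [k * ds for k in range(1, 60000)]
    norm = sum(gwd(s, 5) for s in grid) * ds
    mean = sum(s * gwd(s, 5) for s in grid) * ds
    print(round(norm, 3), round(mean, 3))
    ```

    Fitting this one-parameter shape to a measured CZD histogram and reading off β is the essence of the CZD route to i mentioned in the abstract.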

  7. Metabolomics Analysis of the Toxic Effects of the Production of Lycopene and Its Precursors

    Directory of Open Access Journals (Sweden)

    April M. Miguez

    2018-05-01

    Full Text Available Using cells as microbial factories enables highly specific production of chemicals with many advantages over chemical syntheses. A number of exciting new applications of this approach are in the area of precision metabolic engineering, which focuses on improving the specificity of target production. In recent work, we have used precision metabolic engineering to design lycopene-producing Escherichia coli for use as a low-cost diagnostic biosensor. To increase precursor availability and thus the rate of lycopene production, we heterologously expressed the mevalonate pathway. We found that simultaneous induction of these pathways increases lycopene production, but induction of the mevalonate pathway before induction of the lycopene pathway decreases both lycopene production and growth rate. Here, we aim to characterize the metabolic changes the cells may be undergoing during expression of either or both of these heterologous pathways. After establishing an improved method for quenching E. coli for metabolomics analysis, we used two-dimensional gas chromatography coupled to mass spectrometry (GCxGC-MS) to characterize the metabolomic profile of our lycopene-producing strains in growth conditions characteristic of our biosensor application. We found that the metabolic impacts of producing low, non-toxic levels of lycopene are of much smaller magnitude than the typical metabolic changes inherent to batch growth. We then used metabolomics to study differences in metabolism caused by the time of mevalonate pathway induction and the presence of the lycopene biosynthesis genes. We found that overnight induction of the mevalonate pathway was toxic to cells, but that the cells could recover if the lycopene pathway was not also heterologously expressed. The two pathways appeared to have an antagonistic metabolic effect that was clearly reflected in the cells’ metabolic profiles. The metabolites homocysteine and homoserine exhibited particularly interesting…

  8. Analysis of several iridoid and indole precursors of terpenoid indole alkaloids with a single HPLC run

    DEFF Research Database (Denmark)

    Dagnino, Denise; Schripsema, Jan; Verpoorte, Robert

    1996-01-01

    An isocratic HPLC system is described which allows the separation of the iridoid and indole precursors of terpenoid indole alkaloids, which are present in a single crude extract. The system consists of a column of LiChrospher 60 RP select B 5 μm, 250×4 mm (Merck) with an eluent of 1% formic acid...

  9. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which, combined with automation, underpin the emerging concept of the "smart grid". The book supports its theoretical concepts with real-world applications and MATLAB exercises.

  10. Analysis of root causes of major hazard precursors (hydrocarbon leaks) in the Norwegian offshore petroleum industry

    International Nuclear Information System (INIS)

    Vinnem, Jan Erik; Hestad, Jon Andreas; Kvaloy, Jan Terje; Skogdalen, Jon Espen

    2010-01-01

    The offshore petroleum industry in Norway reports major hazard precursors to the authorities, and data are available for the period 1996 through 2009. Barrier data have been reported since 2002, as have data from an extensive questionnaire survey covering working environment, organizational culture and perceived risk among all employees on offshore installations. Several attempts have been made to analyse different data sources in order to discover relations that may cast some light on possible root causes of major hazard precursors. These previous attempts were inconclusive. The study presented in this paper is the most extensive study performed so far. The data were analysed using linear regression. The conclusion is that there are significant correlations between the number of leaks and safety climate indicators. The discussion points to possible root causes of major accidents.
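    The linear-regression step described above can be sketched with a one-predictor ordinary least-squares fit; the numbers below are illustrative placeholders, not the survey's actual indicator values:

    ```python
    def ols(x, y):
        """Ordinary least squares for one predictor: returns (slope, intercept)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        slope = sxy / sxx
        return slope, my - slope * mx

    # Hypothetical installation-level data: mean safety-climate score (1-5,
    # higher = better) vs. annual hydrocarbon-leak count (illustrative only)
    climate = [2.1, 2.8, 3.0, 3.6, 4.2, 4.5]
    leaks = [9, 7, 6, 4, 3, 2]
    slope, intercept = ols(climate, leaks)
    print(slope)  # negative slope: better safety climate, fewer leaks
    ```

    A study of this kind would of course fit many indicators at once and test significance; this sketch only shows the direction-of-association reading that the abstract's conclusion rests on.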

  11. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network, modeled in Matlab/Simulink, takes into account detailed dynamic models of the generators. Fault simulations at various locations are...

  12. Accident Sequence Precursor Analysis for SGTR by Using Dynamic PSA Approach

    International Nuclear Information System (INIS)

    Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan

    2016-01-01

    To address this issue, this study proposes the sequence tree model for systematic analysis of accident sequences. Using the sequence tree model, all possible scenarios that require a specific safety action to prevent core damage can be identified, along with the success conditions of that safety action under complicated situations such as combined accidents. A sequence tree is a branching model that partitions plant conditions while accounting for plant dynamics. Because the sequence tree model reflects the plant dynamics arising from the interaction of different accident timings and plant conditions, and from the interaction between operator actions, mitigation systems, and the indicators used for operation, it can readily be used to develop a dynamic event tree model. The target safety action for this study is the feed-and-bleed (F and B) operation, which directly cools down the reactor cooling system (RCS) using the primary cooling system when residual heat removal by the secondary cooling system is unavailable. The target accidents were a TLOFW accident and a TLOFW accident with LOCA. Based on the conventional PSA model and indicators, a sequence tree model for the TLOFW accident was developed; from the results of a sampling analysis and data from the conventional PSA model, the CDF caused by Sequence no. 26 can be realistically estimated. For the TLOFW accident with LOCA, second-accident timings were categorized according to plant condition, indicators were selected as branch points using the flow chart and tables, and a corresponding sequence tree model was developed. If a sampling analysis is performed, practical accident sequences can be identified from the sequence analysis; if realistic distributions of the variables can be obtained for the sampling analysis, much more realistic accident sequences can be described. Moreover, if the initiating event frequency under a combined accident can be quantified, the sequence tree model
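    The core-damage quantification this kind of PSA work rests on reduces, in a conventional event tree, to multiplying an initiating-event frequency by the failure probabilities of the branches along a sequence. A toy sketch with entirely made-up numbers:

```python
# Toy event-tree quantification in the spirit of precursor/PSA analysis:
# a sequence's core damage frequency (CDF) is the initiating-event
# frequency times the failure probabilities along the sequence.
# All numbers below are hypothetical, not from the study.
ie_freq = 1e-3        # TLOFW initiating-event frequency, per year (assumed)
p_afw_fail = 0.05     # auxiliary feedwater fails on demand (assumed)
p_fnb_fail = 0.1      # feed-and-bleed operation fails (assumed)

cdf = ie_freq * p_afw_fail * p_fnb_fail
print(f"sequence CDF ~ {cdf:.1e} per year")  # 5.0e-06 per year
```

The dynamic approach in the paper refines exactly these branch probabilities by conditioning them on accident timing and plant state.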

  13. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  14. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system that exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems such as Xrootd, where data are distributed over the computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with distributed data from the ROOT command prompt and to get real-time feedback on analysis progress and intermediate results. We discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We also describe PROOF integration with the ATLAS distributed data management system and the prospects of running PROOF on geographically distributed analysis farms.

  15. Distributed analysis with PROOF in ATLAS collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S [Brookhaven National Laboratory, Upton, NY 11973 (United States); Benjamin, D [Duke University, Durham, NC 27708 (United States); Montoya, G Carillo; Guan, W; Mellado, B; Xu, N [University of Wisconsin-Madison, Madison, WI 53706 (United States); Cranmer, K; Shibata, A [New York University, New York, NY 10003 (United States)

    2010-04-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system that exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems such as Xrootd, where data are distributed over the computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with distributed data from the ROOT command prompt and to get real-time feedback on analysis progress and intermediate results. We discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We also describe PROOF integration with the ATLAS distributed data management system and the prospects of running PROOF on geographically distributed analysis farms.

  16. Planar Cell Polarity Breaks the Symmetry of PAR Protein Distribution prior to Mitosis in Drosophila Sensory Organ Precursor Cells.

    Science.gov (United States)

    Besson, Charlotte; Bernard, Fred; Corson, Francis; Rouault, Hervé; Reynaud, Elodie; Keder, Alyona; Mazouni, Khalil; Schweisguth, François

    2015-04-20

    During development, cell-fate diversity can result from the unequal segregation of fate determinants at mitosis. Polarization of the mother cell is essential for asymmetric cell division (ACD). It often involves the formation of a cortical domain containing the PAR complex proteins Par3, Par6, and atypical protein kinase C (aPKC). In the fly notum, sensory organ precursor cells (SOPs) divide asymmetrically within the plane of the epithelium and along the body axis to generate two distinct cells. Fate asymmetry depends on the asymmetric localization of the PAR complex. In the absence of planar cell polarity (PCP), SOPs divide with a random planar orientation but still asymmetrically, showing that PCP is dispensable for PAR asymmetry at mitosis. To study when and how the PAR complex localizes asymmetrically, we have used a quantitative imaging approach to measure the planar polarization of the proteins Bazooka (Baz, fly Par3), Par6, and aPKC in living pupae. By using imaging of functional GFP-tagged proteins with image processing and computational modeling, we find that Baz, Par6, and aPKC become planar polarized prior to mitosis in a manner independent of the AuroraA kinase and that PCP is required for the planar polarization of Baz, Par6, and aPKC during interphase. This indicates that a "mitosis rescue" mechanism establishes asymmetry at mitosis in PCP mutants. This study therefore identifies PCP as the initial symmetry-breaking signal for the planar polarization of PAR proteins in asymmetrically dividing SOPs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Theory and use of GIRAFFE for analysis of decay characteristics of delayed-neutron precursors in an LMFBR

    International Nuclear Information System (INIS)

    Gross, K.C.

    1980-07-01

    The application of the computer code GIRAFFE (General Isotope Release Analysis For Failed Elements) written in FORTRAN IV is described. GIRAFFE was designed to provide parameter estimates of the nonlinear discrete-measurement models that govern the transport and decay of delayed-neutron precursors in a liquid-metal fast breeder reactor (LMFBR). The code has been organized into a set of small, relatively independent and well-defined modules to facilitate modification and maintenance. The program logic, the numerical techniques, and the methods of solution used by the code are presented, and the functions of the MAIN program and of each subroutine are discussed
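    The transport-and-decay models GIRAFFE estimates involve sums of exponentials over delayed-neutron precursor groups. As a hedged illustration of the underlying physics only (the six-group constants below are approximate textbook values for thermal fission of U-235, not parameters from the report):

```python
import numpy as np

# Illustrative six-group delayed-neutron precursor decay: each group i
# decays as C_i(t) = C_i(0) * exp(-lambda_i * t), so the total delayed-
# neutron emission rate is a sum of exponentials. Constants are
# approximate textbook U-235 values, used here only for illustration.
lam = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])    # decay constants, 1/s
frac = np.array([0.033, 0.219, 0.196, 0.395, 0.115, 0.042])   # relative group yields

def activity(t):
    # total delayed-neutron emission rate, normalized to 1 at t = 0
    a = frac * lam * np.exp(-lam * t)
    return a.sum() / (frac * lam).sum()

print(f"relative delayed-neutron source at t = 10 s: {activity(10):.3f}")
```

GIRAFFE's actual task is the inverse problem: estimating such model parameters from measured delayed-neutron signals of failed elements.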

  18. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

    conditional modes are applied to this problem. The Kalman filter is described as a powerful tool for modelling two-dimensional data. Motivated by the development of the reduced update Kalman filter, we propose a reduced update Kalman smoother which offers considerable computational savings. Kriging...... on hybridisation analysis, which comprises matching a grid to an arrayed set of DNA-clones spotted onto a hybridisation filter. The line process has proven to provide a satisfactory modelling of shifted fields (subgrids) in the hybridisation grid, and a two-staged hierarchical grid matching scheme which...

  19. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    Determining the location of distribution facilities is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk; new problems then arise, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta. This brand is in the top 5 fast-food restaurant chains by retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center. The network analysis results show a more efficient process, i.e. a shorter distance in the distribution process.
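    The cluster-analysis stage can be sketched with a plain k-means on hypothetical store coordinates, with the resulting centroids serving as candidate distribution-center sites. The data, number of clusters and clustering details below are invented and may differ from the study's:

```python
import numpy as np

# Toy sketch of the first two stages: cluster store coordinates with
# k-means; cluster centroids become candidate distribution-center sites.
rng = np.random.default_rng(1)
stores = np.vstack([rng.normal([0, 0], 0.5, (20, 2)),    # synthetic cluster A
                    rng.normal([5, 5], 0.5, (20, 2))])   # synthetic cluster B

def kmeans(points, k, iters=50):
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        centroids = np.array([points[labels == j].mean(axis=0)
                              if np.any(labels == j) else centroids[j]
                              for j in range(k)])
    return centroids, labels

centers, labels = kmeans(stores, k=2)
print(np.round(centers, 1))   # two candidate distribution-center locations
```

The paper's network-analysis stage would then evaluate travel distances from these candidate sites to the stores.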

  20. Hydroxide precursors to produce nanometric YCrO{sub 3}: Characterization and conductivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Durán, A., E-mail: dural@cnyn.unam.mx [Universidad Nacional Autónoma de México, Centro de Nanociencias y Nanotecnología, Km. 107 Carretera Tijuana-Ensenada, Apartado Postal 14, C.P. 22800, Ensenada, B.C. (Mexico); Meza F, C. [Universidad Nacional Autónoma de México, Centro de Nanociencias y Nanotecnología, Km. 107 Carretera Tijuana-Ensenada, Apartado Postal 14, C.P. 22800, Ensenada, B.C. (Mexico); Arizaga, Gregorio Guadalupe Carbajal, E-mail: gregoriocarbajal@yahoo.com.mx [Departamento de Química, Universidad de Guadalajara, Marcelino García Barragán 1421, C.P. 44430, Guadalajara, Jalisco (Mexico)

    2012-06-15

    Highlights: ► Y/Cr mixed hydroxide was precipitated with gaseous ammonia. ► The hydroxide treated at 1373 K formed YCrO{sub 3} crystals of 20 nm diameter. ► The electrical properties differ from those found with other synthesis methods. ► E{sub act} suggests small polarons as the conduction mechanism. -- Abstract: A precursor to perovskite-type YCrO{sub 3} was precipitated by bubbling gaseous ammonia into a solution of yttrium and chromium salts. X-ray diffraction showed that the as-prepared powders were amorphous. Thermal treatment between 1273 and 1373 K leads to the formation of polycrystalline YCrO{sub 3} with crystal sizes around 20 nm. High-resolution X-ray photoelectron spectra showed a uniform chemical environment for yttrium and chromium in the amorphous hydroxide and in crystalline YCrO{sub 3}. Shifts between the Y 3d{sub 5/2} and Cr 2p{sub 3/2} binding energies suggest redistribution or charge transfer between the yttrium and chromium ions in the YCrO{sub 3} structure. The electrical properties of YCrO{sub 3} whose precursors were precipitated with gaseous ammonia differ from those of samples prepared by combustion synthesis. The electrical conductivity presents a sudden increase at ∼473 K, which is associated with the grain size and morphology of the crystallites. The redistribution of charge between Y(III) and Cr(III) is thermally activated by the hopping of small polarons, characterized by the Arrhenius law as the conduction mechanism.
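    The Arrhenius analysis of small-polaron conduction mentioned at the end can be sketched as follows, using the standard small-polaron form σT = A·exp(-Ea/kBT). The activation energy and prefactor below are assumed values for the sketch, not the paper's results:

```python
import numpy as np

# Small-polaron hopping obeys sigma * T = A * exp(-Ea / (kB * T)),
# so ln(sigma*T) is linear in 1/T with slope -Ea/kB. We generate
# synthetic conductivity data and recover the assumed Ea by a linear fit.
kB = 8.617e-5                    # Boltzmann constant, eV/K
Ea_true = 0.30                   # assumed activation energy, eV
T = np.linspace(400, 700, 20)    # temperature, K
sigma = (1e4 / T) * np.exp(-Ea_true / (kB * T))   # synthetic conductivity

# linear fit: ln(sigma*T) = ln(A) - (Ea/kB) * (1/T)
slope, lnA = np.polyfit(1.0 / T, np.log(sigma * T), 1)
Ea_fit = -slope * kB
print(f"recovered Ea = {Ea_fit:.3f} eV")
```

In practice one fits measured σ(T) the same way and reads the activation energy off the slope of the Arrhenius plot.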

  1. Hydroxide precursors to produce nanometric YCrO3: Characterization and conductivity analysis

    International Nuclear Information System (INIS)

    Durán, A.; Meza F, C.; Arizaga, Gregorio Guadalupe Carbajal

    2012-01-01

    Highlights: ► Y/Cr mixed hydroxide was precipitated with gaseous ammonia. ► The hydroxide treated at 1373 K formed YCrO3 crystals of 20 nm diameter. ► The electrical properties differ from those found with other synthesis methods. ► E_act suggests small polarons as the conduction mechanism. -- Abstract: A precursor to perovskite-type YCrO3 was precipitated by bubbling gaseous ammonia into a solution of yttrium and chromium salts. X-ray diffraction showed that the as-prepared powders were amorphous. Thermal treatment between 1273 and 1373 K leads to the formation of polycrystalline YCrO3 with crystal sizes around 20 nm. High-resolution X-ray photoelectron spectra showed a uniform chemical environment for yttrium and chromium in the amorphous hydroxide and in crystalline YCrO3. Shifts between the Y 3d5/2 and Cr 2p3/2 binding energies suggest redistribution or charge transfer between the yttrium and chromium ions in the YCrO3 structure. The electrical properties of YCrO3 whose precursors were precipitated with gaseous ammonia differ from those of samples prepared by combustion synthesis. The electrical conductivity presents a sudden increase at ∼473 K, which is associated with the grain size and morphology of the crystallites. The redistribution of charge between Y(III) and Cr(III) is thermally activated by the hopping of small polarons, characterized by the Arrhenius law as the conduction mechanism.

  2. Rapid detection of sugar alcohol precursors and corresponding nitrate ester explosives using direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Sisco, Edward; Forbes, Thomas P

    2015-04-21

    This work highlights the rapid detection of nitrate ester explosives and their sugar alcohol precursors by direct analysis in real time mass spectrometry (DART-MS) using an off-axis geometry. Demonstration of the effect of various parameters, such as ion polarity and in-source collision induced dissociation (CID) on the detection of these compounds is presented. Sensitivity of sugar alcohols and nitrate ester explosives was found to be greatest in negative ion mode with sensitivities ranging from hundreds of picograms to hundreds of nanograms, depending on the characteristics of the particular molecule. Altering the in-source CID potential allowed for acquisition of characteristic molecular ion spectra as well as fragmentation spectra. Additional studies were completed to identify the role of different experimental parameters on the sensitivity for these compounds. Variables that were examined included the DART gas stream temperature, the presence of a related compound (i.e., the effect of a precursor on the detection of a nitrate ester explosive), incorporation of dopant species and the role of the analysis surface. It was determined that each variable affected the response and detection of both sugar alcohols and the corresponding nitrate ester explosives. From this work, a rapid and sensitive method for the detection of individual sugar alcohols and corresponding nitrate ester explosives, or mixtures of the two, has been developed, providing a useful tool in the real-world identification of homemade explosives.

  3. [ORION®: a simple and effective method for systemic analysis of clinical events and precursors occurring in hospital practice].

    Science.gov (United States)

    Debouck, F; Rieger, E; Petit, H; Noël, G; Ravinet, L

    2012-05-01

    Morbimortality review is now recommended by the French Health Authority (Haute Autorité de santé [HAS]) in all hospital settings. It can be complemented by Comités de retour d'expérience (CREX), which perform systemic analysis of event precursors that may potentially result in medical damage. As commonly captured by their current practice, medical teams may not favour systemic analysis of events occurring in their setting. They require an easy-to-use method, more or less intuitive and easy to learn. This is the reason why ORION® has been set up. ORION® is based on experience acquired in aeronautics, the main precursor in risk management, since aircraft crashes are considered unacceptable even though the mortality from aircraft crashes is extremely low compared to the mortality from medical errors in hospital settings. The systemic analysis is divided into six steps: (i) collecting data, (ii) rebuilding the chronology of facts, (iii) identifying the gaps, (iv) identifying contributing and influential factors, (v) proposing actions to put in place, (vi) writing the analysis report. When identifying contributing and influential factors, four kinds of factors favouring the event are considered: the technical domain, the working environment, organisation and procedures, and human factors. Although they are essential, human factors are not always considered correctly. The systemic analysis is done by a pilot, chosen among people trained in the method, who gathers information from all categories of people acting in the setting. ORION® is now used in more than 400 French hospital settings for systemic analysis of either morbimortality cases or event precursors. It is used, in particular, in 145 radiotherapy centres to support CREX. Being very simple to use and quasi-intuitive, ORION® is an asset for reaching the objectives defined by HAS: to set up effective morbi-mortality reviews (RMM) and CREX for improving the quality of care in hospital settings. By helping the

  4. ORION®: A simple and effective method for systemic analysis of clinical events and precursors occurring in hospital practice

    International Nuclear Information System (INIS)

    Debouck, F.; Petit, H.; Ravinet, L.; Rieger, E.; Noel, G.

    2012-01-01

    Purpose. - Morbi-mortality review is now recommended by the French Health Authority (Haute Autorité de santé [HAS]) in all hospital settings. It can be complemented by Comités de retour d'expérience (CREX), which perform systemic analysis of event precursors that may potentially result in medical damage. As commonly captured by their current practice, medical teams may not favour systemic analysis of events occurring in their setting. They require an easy-to-use method, more or less intuitive and easy to learn. This is the reason why ORION® has been set up. Methods. - ORION® is based on experience acquired in aeronautics, the main precursor in risk management, since aircraft crashes are considered unacceptable even though the mortality from aircraft crashes is extremely low compared to the mortality from medical errors in hospital settings. The systemic analysis is divided into six steps: (i) collecting data, (ii) rebuilding the chronology of facts, (iii) identifying the gaps, (iv) identifying contributing and influential factors, (v) proposing actions to put in place, (vi) writing the analysis report. When identifying contributing and influential factors, four kinds of factors favouring the event are considered: the technical domain, the working environment, organisation and procedures, and human factors. Although they are essential, human factors are not always considered correctly. The systemic analysis is done by a pilot, chosen among people trained in the method, who gathers information from all categories of people acting in the setting. Results. - ORION® is now used in more than 400 French hospital settings for systemic analysis of either morbi-mortality cases or event precursors. It is used, in particular, in 145 radiotherapy centres to support CREX. Conclusion. - Being very simple to use and quasi-intuitive, ORION® is an asset for reaching the objectives defined by HAS: to set up effective morbi-mortality reviews (RMM) and CREX for improving the quality of care in

  5. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
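    The linear combination assumption at the heart of DSM can be illustrated on a toy term distribution: if the observed mixture is M = λS + (1-λ)R, then given the seed irrelevance distribution S and the mixing weight λ, the relevance distribution R separates out algebraically. The vocabulary, weights and mixing coefficient below are all invented for the sketch:

```python
import numpy as np

# Toy term distributions over a 5-word vocabulary.
R_true = np.array([0.4, 0.3, 0.1, 0.1, 0.1])   # relevance (unknown in practice)
S = np.array([0.1, 0.1, 0.2, 0.3, 0.3])        # seed irrelevance distribution
lam = 0.35                                     # mixing weight
M = lam * S + (1 - lam) * R_true               # observed mixture distribution

# DSM's linear separation step: invert the linear combination.
R_est = (M - lam * S) / (1 - lam)
R_est = np.clip(R_est, 0, None)
R_est /= R_est.sum()                           # renormalize to a distribution
print(np.allclose(R_est, R_true))              # True
```

In real PRF settings λ and S must themselves be estimated, which is where the method's correlation- and divergence-based analysis comes in.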

  6. Associations between maternal and paternal parenting behaviors, anxiety and its precursors in early childhood: A meta-analysis.

    Science.gov (United States)

    Möller, Eline L; Nikolić, Milica; Majdandžić, Mirjana; Bögels, Susan M

    2016-04-01

    In this meta-analysis we investigated differential associations between maternal and paternal parenting behaviors (overcontrol, overprotection, overinvolvement, autonomy granting, challenging parenting) and anxiety and its precursors (fearful temperament, behavioral inhibition, shyness) in children (0-5 years). Two meta-analyses were conducted, one for mothers (k=28, N=5,728) and one for fathers (k=12, N=1,019). In general, associations between parenting and child anxiety were small. Associations between child anxiety and overcontrol, overprotection, and overinvolvement did not differ for mothers and fathers. Maternal autonomy granting was not significantly related to child anxiety, and no studies examined fathers' autonomy granting. A significant difference was found for challenging parenting: mothers' challenging parenting was not significantly related to child anxiety, whereas fathers' challenging parenting was related to less child anxiety. Post-hoc meta-analyses revealed that mothers' and fathers' parenting was more strongly related to children's anxiety symptoms than to child anxiety precursors. Moreover, the association between parenting and child anxiety symptoms was stronger for fathers than for mothers. In conclusion, although parenting plays only a small role in early childhood anxiety, fathers' parenting is at least as important as mothers'. Paternal challenging behavior even seems more important than maternal challenging behavior. Research is needed to determine whether challenging fathering can prevent child anxiety development. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Thermoset precursor

    International Nuclear Information System (INIS)

    Yamamoto, Y.

    1983-04-01

    This invention pertains to a distinctive thermoset precursor which is prepared by mixing a resin composition (A) which can be hardened by ionizing radiation, and a resin composition (B) which can be hardened by heat but cannot be hardened by, or is resistant to, ionizing radiation, and by coating or impregnating a molding or other substrate with a sheet or film of this mixture and irradiating this with an ionizing radiation. The principal components of composition (A) and (B) can be the following: (1) an acrylate or methacrylate and an epoxy resin and an epoxy resin hardener; (2) an unsaturated polyester resin and epoxy resin and an epoxy resin hardener; (3) a diacrylate or dimethacrylate or polyethylene glycol and an epoxy resin; (4) an epoxy acrylates or epoxy methacrylate obtained by the addition reaction of epoxy resin and acrylic or methacrylic acid

  8. Mood instability as a precursor to depressive illness: A prospective and mediational analysis.

    Science.gov (United States)

    Marwaha, Steven; Balbuena, Lloyd; Winsper, Catherine; Bowen, Rudy

    2015-06-01

    Mood instability levels are high in depression, but temporal precedence and potential mechanisms are unknown. The hypotheses tested were as follows: (1) mood instability is associated with depression cross-sectionally, (2) mood instability predicts new onset and maintenance of depression prospectively and (3) the mood instability-depression link is mediated by sleep problems, alcohol abuse and life events. Data from the National Psychiatric Morbidity Survey 2000 at baseline (N = 8580) and 18-month follow-up (N = 2413) were used. Regression modeling controlling for socio-demographic factors, anxiety and hypomanic mood was conducted, and multiple mediational analyses were used to test our conceptual path model. Mood instability was associated with depression cross-sectionally (odds ratio: 5.28; 95% confidence interval: [3.67, 7.59]) and predicted depression inception prospectively (odds ratio: 2.43; 95% confidence interval: [1.03, 5.76]; p = 0.042) after controlling for important confounders. Mood instability did not predict maintenance of depression. Sleep difficulties and severe problems with close friends and family significantly mediated the link between mood instability and new-onset depression (23.05% and 6.19% of the link, respectively). Alcohol abuse and divorce were not important mediators in the model. Mood instability is a precursor of a depressive episode, predicting its onset. Difficulties in sleep are a significant part of the pathway. Interventions targeting mood instability and sleep problems have the potential to reduce the risk of depression. © The Royal Australian and New Zealand College of Psychiatrists 2015.
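    The mediated-share estimate reported above can be sketched with a product-of-coefficients mediation analysis on synthetic continuous data. This is only a schematic of the idea (mood instability → sleep problems → depression); the actual study used categorical outcomes and different models on survey data:

```python
import numpy as np

# Synthetic data with a built-in indirect path through the mediator.
rng = np.random.default_rng(2)
n = 2000
mood = rng.normal(size=n)                        # exposure: mood instability
sleep = 0.5 * mood + rng.normal(size=n)          # mediator: sleep problems
dep = 0.3 * sleep + 0.4 * mood + rng.normal(size=n)   # outcome: depression

def ols(y, *xs):
    # least-squares coefficients for y ~ intercept + xs
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(sleep, mood)[1]          # exposure -> mediator
b = ols(dep, sleep, mood)[1]     # mediator -> outcome, adjusting for exposure
total = ols(dep, mood)[1]        # total effect of exposure on outcome

print(f"proportion mediated ~ {a * b / total:.2f}")
```

The product a·b is the indirect effect, and its ratio to the total effect is the "percent of the link" quantity quoted in the abstract.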

  9. Tricyanomethane and Its Ketenimine Tautomer: Generation from Different Precursors and Analysis in Solution, Argon Matrix, and as a Single Crystal.

    Science.gov (United States)

    Banert, Klaus; Chityala, Madhu; Hagedorn, Manfred; Beckers, Helmut; Stüker, Tony; Riedel, Sebastian; Rüffer, Tobias; Lang, Heinrich

    2017-08-01

    Solutions of azidomethylidenemalononitrile were photolyzed at low temperatures to produce the corresponding 2H-azirine and tricyanomethane, which were analyzed by low-temperature NMR spectroscopy. The latter product was also observed after short thermolysis of the azide precursor in solution whereas irradiation of the azide isolated in an argon matrix did not lead to tricyanomethane, but to unequivocal detection of the tautomeric ketenimine by IR spectroscopy for the first time. When the long-known "aquoethereal" greenish phase generated from potassium tricyanomethanide, dilute sulfuric acid, and diethyl ether was rapidly evaporated and sublimed, a mixture of hydronium tricyanomethanide and tricyanomethane was formed instead of the previously claimed ketenimine tautomer. Under special conditions of sublimation, single crystals of tricyanomethane could be isolated, which enabled the analysis of the molecular structure by X-ray diffraction. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in UPPAAL, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption and communication overhead. Our results show that the distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  11. Computational analysis of the atomic size effect in bulk metallic glasses and their liquid precursors

    International Nuclear Information System (INIS)

    Kokotin, V.; Hermann, H.

    2008-01-01

    The atomic size effect and its consequences for the ability of multicomponent liquid alloys to form bulk metallic glasses are analyzed in terms of the generalized Bernal model for liquids, following the hypothesis that maximum density in the liquid state improves the glass-forming ability. The maximum density that can be achieved in the liquid state is studied in the 2(N-1)-dimensional parameter space of N-component systems. Computer simulations reveal that the size ratio of the largest to the smallest atoms is most relevant for achieving maximum packing for N = 3-5, whereas the number of components plays a minor role. At small size ratios, the maximum packing density can be achieved by different atomic size distributions, whereas for medium size ratios the maximum density is always correlated to a concave size distribution. The relationship of the results to Miracle's efficient cluster packing model is also discussed

  12. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and a heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  13. Quantitative analysis of mineral powders by DRIFTS: Determination of SrCO3 in superconductor precursor powders

    DEFF Research Database (Denmark)

    Bak, J.; Kindl, B.

    1997-01-01

    An application of diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) has been demonstrated to be able to determine small concentrations, down to the 100-ppm level, of carbonates in powdery superconductor (SPC) precursor samples. The detection of carbonates in SPC precursor powders...

  14. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

    Applying survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius, at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure models. The division at 4 Earth radii separates small exoplanets from large exoplanets above it. When combined with the recently-discovered radius gap at 2 Earth radii, it supports the treatment of planets 2-4 Earth rad...
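The survival function used in this kind of analysis is simply the fraction of the sample at or above each radius. A minimal sketch of the computation follows; the sample radii below are invented for illustration, while the actual study uses the Kepler candidate catalog:

```python
import numpy as np

def survival_function(radii):
    """Empirical survival function S(R) = fraction of planets with radius >= R."""
    r = np.sort(np.asarray(radii, dtype=float))
    # At the i-th smallest radius, (len - i) planets are at or above it
    s = 1.0 - np.arange(len(r)) / len(r)
    return r, s

# Toy sample in Earth radii (hypothetical values)
radii = [0.8, 1.1, 1.5, 2.3, 3.1, 4.2, 6.0, 9.5, 11.0, 13.5]
r, s = survival_function(radii)
print(r[0], s[0])  # smallest radius has survival fraction 1.0
```

Plotting log S(R) against log R and looking for slope changes is one way such "natural divisions" in the radius distribution become visible.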

  15. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to manage these resources is very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration and interaction with the ATLAS data management system DQ2 is a key functionality of GANGA. An intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports, among other things, user analysis tasks with reconstructed data and small-scale production of Monte Carlo data.

  16. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Han Sul; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kim, Tae Wan [Incheon National University, Incheon (Korea, Republic of)

    2017-03-15

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state of the plant and the actions of the operator in an accident situation for risk quantification. This approach offers significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional PSA model (the so-called static PSA, or S-PSA, model), which is relatively static in comparison. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.

  17. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    International Nuclear Information System (INIS)

    Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan

    2017-01-01

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state of the plant and the actions of the operator in an accident situation for risk quantification. This approach offers significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional PSA model (the so-called static PSA, or S-PSA, model), which is relatively static in comparison. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.

  18. Precursors of nitrogenous disinfection by-products in drinking water – A critical review and analysis

    International Nuclear Information System (INIS)

    Bond, Tom; Templeton, Michael R.; Graham, Nigel

    2012-01-01

    Highlights: ► The proportion of N-DBP formation attributable to specific precursors was calculated. ► Precursor concentrations are typically insufficient to account for observed N-DBP formation, except for CNX and NDMA. ► Amino acid precursors are easier to remove during water treatment than laboratory studies suggest. - Abstract: In recent years research into the formation of nitrogenous disinfection by-products (N-DBPs) in drinking water – including N-nitrosodimethylamine (NDMA), the haloacetonitriles (HANs), haloacetamides (HAcAms), cyanogen halides (CNX) and halonitromethanes (HNMs) – has proliferated. This is partly due to the high reported toxicity of N-DBPs. In this review paper, information about the formation yields of N-DBPs from model precursors, and about environmental precursor occurrence, has been employed to assess the amount of N-DBP formation that is attributable to known precursors. It was calculated that for HANs and HAcAms, the concentrations of known precursors – mainly free amino acids – are insufficient to account for the observed concentrations of these N-DBP groups. However, at least in some waters, a significant proportion of CNX and NDMA formation can be explained by known precursors. Identified N-DBP precursors tend to be of low molecular weight and low electrostatic charge relative to bulk natural organic matter (NOM). This makes them recalcitrant to removal by water treatment processes, notably coagulation, as confirmed by a number of bench-scale studies. However, amino acids have been found to be easier to remove during water treatment than would be suggested by the known molecular properties of the individual free amino acids.
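The attribution calculation described in the abstract amounts to comparing the DBP yield expected from a known precursor with the DBP concentration actually observed. A minimal sketch of that arithmetic, with entirely hypothetical numbers (the function name and values are not from the paper):

```python
def attributable_fraction(precursor_nM, molar_yield, observed_dbp_nM):
    """Fraction of observed DBP formation explained by one known precursor:
    precursor concentration times its molar formation yield, divided by the
    observed DBP concentration. Illustrative only."""
    return precursor_nM * molar_yield / observed_dbp_nM

# Hypothetical: 50 nM of an amino acid precursor with a 2% molar HAN yield,
# against 10 nM of HAN observed in the finished water
print(attributable_fraction(50, 0.02, 10))  # 0.1, i.e. 10% of formation explained
```

Summing such fractions over all known precursors gives the kind of "proportion attributable" figure highlighted above; a total well below 1 indicates unidentified precursors.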

  19. Frequency analysis of cytotoxic T lymphocyte precursors in search for donors in bone marrow transplantation

    International Nuclear Information System (INIS)

    Cukrova, V.; Dolezalova, L.; Loudova, M.; Matejkova, E.; Korinkova, P.; Lukasova, M.; Stary, J.

    1995-01-01

    The usefulness of cytotoxic T lymphocyte precursor (CTLp) frequency analysis in the search for donors in bone marrow transplantation was studied. The frequency of anti-recipient CTLp was estimated by limiting dilution assay in HLA-matched unrelated, HLA partially matched related and HLA genotypically identical donors. The majority of patients examined were affected by different hematological malignancies. Allo-reactive CTLp recognizing non-HLA gene products were not detected in pre-transplant examination of two pairs of HLA-identical siblings. However, an increased incidence of allo-specific CTLp was identified in HLA-matched, mixed lymphocyte culture (MLC)-negative unrelated pairs. Thus, the CTLp assay revealed residual Class I incompatibilities that remained hidden in standard serotyping. In two matched unrelated pairs with high pretransplant CTLp frequency, severe acute graft-versus-host disease developed after bone marrow transplantation. Examination of other relatives of patients lacking an HLA-identical sibling showed the importance of Class I incompatibility for CTLp generation as well. The lack of correlation between CTLp frequency and HLA-D disparity could suggest that Class II antigens do not participate in CTLp induction. With one exception, we had good correlation between MLC and DNA analysis of Class II antigens, demonstrating that MLC gives interpretable results even in unrelated pairs. Our results demonstrate the significance of the CTLp frequency assay in detecting residual Class I incompatibilities in matched unrelated pairs and in assessing Class I compatibility in related pairs. It should therefore be used in the final selection of bone marrow transplantation donors. (author)

  20. Objective Bayesian Analysis of Skew-t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  1. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows calculation of the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distributions of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods are described that allow rough error estimates based on estimated SED uncertainties and integral SED sensitivities.

  2. Structural analysis and characterization of layer perovskite oxynitrides made from Dion-Jacobson oxide precursors

    International Nuclear Information System (INIS)

    Schottenfeld, Joshua A.; Benesi, Alan J.; Stephens, Peter W.; Chen, Gugang; Eklund, Peter C.; Mallouk, Thomas E.

    2005-01-01

    A three-layer oxynitride Ruddlesden-Popper phase Rb(1+x)Ca2Nb3O(10-x)Nx·yH2O (x=0.7-0.8, y=0.4-0.6) was synthesized by ammonolysis at 800 °C from the Dion-Jacobson phase RbCa2Nb3O10 in the presence of Rb2CO3. Incorporation of nitrogen into the layer perovskite structure was confirmed by XPS, combustion analysis, and MAS NMR. The water content was determined by thermal gravimetric analysis and the rubidium content by ICP-MS. A similar layered perovskite interconversion occurred in the two-layer Dion-Jacobson oxide RbLaNb2O7 to yield Rb(1+x)LaNb2O(7-x)Nx·yH2O (x=0.7-0.8, y=0.5-1.0). Both compounds were air- and moisture-sensitive, with rapid loss of nitrogen by oxidation and hydrolysis reactions. The structure of the three-layer oxynitride Rb1.7Ca2Nb3O9.3N0.7·0.5H2O was solved in space group P4/mmm, with a=3.887(3) and c=18.65(1) Å, by Rietveld refinement of X-ray powder diffraction data. The two-layer oxynitride structure Rb1.8LaNb2O6.3N0.7·1.0H2O was also determined in space group P4/mmm, with a=3.934(2) and c=14.697(2) Å. GSAS refinement of synchrotron X-ray powder diffraction data showed that the water molecules were intercalated between a double layer of Rb+ ions in both the two- and three-layer Ruddlesden-Popper structures. Optical band gaps were measured by diffuse reflectance UV-vis for both materials. An indirect band gap of 2.51 eV and a direct band gap of 2.99 eV were found for the three-layer compound, while an indirect band gap of 2.29 eV and a direct band gap of 2.84 eV were measured for the two-layer compound. Photocatalytic activity tests of the three-layer compound under 380 nm pass-filtered light with AgNO3 as a sacrificial electron acceptor gave a quantum yield of 0.025% for oxygen evolution.

  3. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects which are captured by the gravitational field of the Earth. We have analyzed the mass distribution of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region, while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing the observed scaling exponents to exponents observed in laboratory experiments and discuss the possibility that one can derive insight into the original shapes of the meteoroids.
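A common way to extract such a scaling exponent is a log-log rank regression on the fragment masses: if N(>m) ~ m**(-alpha), the slope of log N(>m) versus log m estimates -alpha. A sketch under that assumption, tested on synthetic power-law data (a single scaling region; the paper's showers may need piecewise fits):

```python
import numpy as np

def scaling_exponent(masses):
    """Estimate alpha in N(>m) ~ m**(-alpha) by least squares on the
    log-log rank plot. Illustrative sketch for a single scaling regime."""
    m = np.sort(np.asarray(masses, dtype=float))
    n_greater = np.arange(len(m), 0, -1)  # rank: fragments with mass >= m[i]
    slope, _ = np.polyfit(np.log(m), np.log(n_greater), 1)
    return -slope

# Synthetic fragment masses from a Pareto distribution with alpha = 1.5
rng = np.random.default_rng(0)
masses = (1.0 - rng.random(2000)) ** (-1.0 / 1.5)
print(round(scaling_exponent(masses), 2))  # should recover roughly 1.5
```

For showers with multiple scaling regimes, the same regression would be applied separately to each mass range after identifying the breakpoints.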

  4. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better performing ALICE analysis.

  5. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performing a SWOT analysis of the current situation of its self-built logistics system, identifying the problems in Jingdong Mall's current logistics distribution, and giving appropriate recommendations.

  6. Limiting dilution analysis for precursor frequency of Con A-responsive mouse Thy-1+ dendritic epidermal cells

    International Nuclear Information System (INIS)

    Takashima, A.; Bergstresser, P.R.; Nixon-Fulton, J.L.; Tigelaar, R.E.

    1986-01-01

    The authors have recently demonstrated in vitro proliferation of mouse Thy-1+ dendritic epidermal cells (EC) in response to Con A and IL-2. The purpose of the present study was to use limiting dilution analysis to determine the precursor frequency (PF) of Con A-responsive cells within EC enriched by Isolymph centrifugation for Thy-1+ cells (IEC). AKR IEC were cultured in 96-well U-plates (25-75 cells/well) with 2 μg/ml Con A and 2 x 10^5 irradiated (1600 R) AKR spleen cells/well. Cultures were harvested after 7-21 days following 3H-thymidine pulsing. Results indicated a PF within IEC of 1.5-4.5%. Inclusion of 10 U/ml IL-2 significantly enhanced proliferation in positive wells but did not alter this PF. In AKR mice, monoclonal antibody 20-10-5S has been shown to react with Thy-1+ EC, but not with peripheral T cells. FACS purification of IEC using 20-10-5S indicated that Con A responsiveness resides exclusively within the 20-10-5S+ population. The PF of Con A-responsive Thy-1+ EC was calculated by dividing the PF of IEC by the fraction of 20-10-5S+ cells (13-30%) in the IEC suspension. A significant proportion of Thy-1+ EC (∼12%) were found to possess Con A proliferative capacity. These studies will facilitate analysis at a clonal level of possible functional and phenotypic heterogeneity within the Thy-1+ EC population.
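Limiting dilution analysis of this kind conventionally rests on single-hit Poisson statistics: the fraction of negative wells F0 equals exp(-f·c) for precursor frequency f and c cells per well, so f = -ln(F0)/c. A minimal single-dose sketch (the plate numbers are hypothetical; real analyses fit several dilutions, e.g. by maximum likelihood or chi-square minimization):

```python
import math

def precursor_frequency(cells_per_well, n_wells, n_negative):
    """Single-dose limiting dilution estimate under the single-hit Poisson
    model: fraction of negative wells F0 = exp(-f * c), hence f = -ln(F0)/c.
    Sketch only; multi-dilution fits give the published PF values."""
    f0 = n_negative / n_wells
    return -math.log(f0) / cells_per_well

# Hypothetical plate: 50 cells/well, 96 wells, 30 wells with no proliferation
f = precursor_frequency(50, 96, 30)
print(f"~1 precursor per {1 / f:.0f} cells")
```

With these invented numbers the estimate is roughly one precursor per 43 cells, i.e. a PF of about 2.3%, in the same range as the 1.5-4.5% reported above.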

  7. Multiscale analysis: a way to investigate laser damage precursors in materials for high power applications at nanosecond pulse duration

    Science.gov (United States)

    Natoli, J. Y.; Wagner, F.; Ciapponi, A.; Capoulade, J.; Gallais, L.; Commandré, M.

    2010-11-01

    The mechanism of laser-induced damage in optical materials under high-power nanosecond laser irradiation is commonly attributed to the presence of precursor centers. Depending on the material and laser source, the precursors can have different origins. Some are clearly extrinsic, such as impurities or structural defects linked to the fabrication conditions. In most cases the center size, ranging from the sub-micrometer to the nanometer scale, does not permit easy detection by optical techniques before irradiation. Most often, only post-mortem observation of the optics makes it possible to prove the local origin of breakdown. Multiscale analyses, performed by changing the irradiation beam size, have been used to investigate the density, size and nature of laser damage precursors. Destructive methods such as raster scans, laser damage probability plots and morphology studies make it possible to deduce the precursor densities. Another experimental way to obtain information on the nature of precursors is to use non-destructive methods such as photoluminescence and absorption measurements. The destructive and non-destructive multiscale studies are also motivated by practical reasons. Indeed, LIDT studies of large optics such as those used in the LMJ or NIF projects are commonly performed on small samples and with table-top lasers whose characteristics change from one to another. In these conditions, it is necessary to know exactly the influence of the different experimental parameters and, above all, the spot-size effect on the final data. In this paper, we present recent developments in multiscale characterization and results obtained on optical coatings (surface case) and KDP crystal (bulk case).

  8. Analysis of coke precursor on catalyst and study on regeneration of catalyst in upgrading of bio-oil

    International Nuclear Information System (INIS)

    Guo, Xiaoya; Zheng, Yong; Zhang, Baohua; Chen, Jinyang

    2009-01-01

    Catalyst HZSM-5 was used in catalytic cracking upgrading of bio-oil. The coke precursor on the catalyst was analyzed by means of TGA, FTIR and 13C NMR. Precursors of coke deposited in the pores of the molecular sieve were mainly aromatic hydrocarbons with boiling points ranging from 350 °C to 650 °C. Precursors on the outer surface of the pellet were identified as saturated aliphatic hydrocarbons with boiling points below 200 °C. The activity of HZSM-5 was studied after regeneration. In terms of the yield of organic distillate and the formation rate of coke, results showed that catalytic activity changed moderately during the first three regenerations.

  9. Quantitative risk trends deriving from PSA-based event analyses. Analysis of results from U.S.NRC's accident sequence precursor program

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2004-01-01

    The United States Nuclear Regulatory Commission (U.S.NRC) has been carrying out the Accident Sequence Precursor (ASP) Program to identify and categorize precursors to potential severe core damage accident sequences using the probabilistic safety assessment (PSA) technique. The ASP Program has identified many risk-significant events as precursors that occurred at U.S. nuclear power plants. Although the results from the ASP Program include valuable information that could be useful for obtaining and characterizing risk-significant insights and for monitoring risk trends in the nuclear power industry, there have been only a few attempts to develop trends from the ASP results. The present study examines and discusses quantitative risk trends at the industry level, using two indicators derived from the results of the ASP analysis: the occurrence frequency of precursors and the annual core damage probability. It is shown that the core damage risk at U.S. nuclear power plants has been lowered and that the likelihood of risk-significant events has been decreasing remarkably. The present study also demonstrates that the two risk indicators used here can provide quantitative information useful for examining and monitoring risk trends and risk characteristics in the nuclear power industry. (author)
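An annual core damage probability of the kind used as the second indicator can be formed by combining the conditional core damage probabilities (CCDPs) of the year's precursors; assuming independent events, it is one minus the probability that none of them led to core damage. A sketch of that aggregation (the CCDP values are invented, and the ASP program's actual procedure may differ in detail):

```python
def annual_core_damage_probability(ccdps):
    """Combine the CCDPs of the precursors observed in one year, assuming
    independent events: P(at least one core damage) = 1 - prod(1 - p_i).
    Illustrative sketch, not the official ASP aggregation."""
    p_no_damage = 1.0
    for p in ccdps:
        p_no_damage *= (1.0 - p)
    return 1.0 - p_no_damage

# Hypothetical CCDPs of three precursors identified in a given year
print(annual_core_damage_probability([1e-4, 3e-5, 2e-6]))
```

For small CCDPs the result is close to the simple sum (here about 1.32e-4), since the cross terms are negligible; tracking this quantity year by year is what reveals the downward trend discussed above.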

  10. Single molecule analysis of c-myb alternative splicing reveals novel classifiers for precursor B-ALL.

    Directory of Open Access Journals (Sweden)

    Ye E Zhou

    The c-Myb transcription factor, a key regulator of proliferation and differentiation in hematopoietic and other cell types, has an N-terminal DNA binding domain and a large C-terminal domain responsible for transcriptional activation, negative regulation and determining target gene specificity. Overexpression and rearrangement of the c-myb gene (MYB) has been reported in some patients with leukemias and other types of cancers, implicating activated alleles of c-myb in the development of human tumors. Alternative RNA splicing can produce variants of c-myb with qualitatively distinct transcriptional activities that may be involved in transformation and leukemogenesis. Here, by performing a detailed, single molecule assay we found that c-myb alternative RNA splicing was elevated and much more complex in leukemia samples than in cell lines or CD34+ hematopoietic progenitor cells from normal donors. The results revealed that leukemia samples express more than 60 different c-myb splice variants, most of which have multiple alternative splicing events and were not detectable by conventional microarray or PCR approaches. For example, the single molecule assay detected 21 and 22 splice variants containing the 9B and 9S exons, respectively, most of which encoded unexpected variant forms of c-Myb protein. Furthermore, the detailed analysis identified some splice variants whose expression correlated with poor survival in a small cohort of precursor B-ALL samples. Our findings indicate that single molecule assays can reveal complexities in c-myb alternative splicing that have potential as novel biomarkers and could help explain the role of c-Myb variants in the development of human leukemia.

  11. CMS distributed data analysis with CRAB3

    Science.gov (United States)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  12. Buffered Communication Analysis in Distributed Multiparty Sessions

    Science.gov (United States)

    Deniélou, Pierre-Malo; Yoshida, Nobuko

    Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow considerably over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes, and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally, our work is applied to overcome a buffer overflow problem in the multi-buffering algorithm.
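The quantity being bounded is the peak number of in-flight messages per channel. As a toy stand-in for the static, session-type-based audit (which derives bounds without running the system), one can compute the same quantity dynamically from a global send/receive trace; the trace and channel names below are made up:

```python
def max_buffer_occupancy(trace):
    """Peak number of in-flight messages per channel, from a global trace of
    ('send', ch) / ('recv', ch) events. A dynamic toy illustration of the
    buffer bounds that the paper computes statically from session types."""
    depth, peak = {}, {}
    for op, ch in trace:
        d = depth.get(ch, 0) + (1 if op == 'send' else -1)
        depth[ch] = d
        peak[ch] = max(peak.get(ch, 0), d)
    return peak

trace = [('send', 'a'), ('send', 'a'), ('recv', 'a'), ('send', 'b'),
         ('send', 'a'), ('recv', 'a'), ('recv', 'a'), ('recv', 'b')]
print(max_buffer_occupancy(trace))  # {'a': 2, 'b': 1}
```

A static analysis must instead reason over all traces permitted by the protocol, which is why the causality audit over the global session type is needed.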

  13. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    overrepresentation score (SOS) and the geographic node divergence (GND) score, which together combine ecological and evolutionary patterns into a single framework and avoid many of the problems that characterize community phylogenetic methods in current use. This approach goes through each node in the phylogeny ... with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented...

  14. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations between initial flaw distributions and in-service flaw distributions are examined for structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. Results show that in-service flaw distributions are determined by initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from in-service flaw distributions.
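The Monte Carlo step described above can be sketched as: sample initial flaw depths from an assumed distribution, grow each flaw by a fatigue crack growth law (the Paris law is the standard LEFM choice), and examine the resulting in-service distribution. Every distribution and constant below is illustrative, not taken from the paper:

```python
import math
import random

def simulate_in_service_flaws(n=10000, cycles=1e6, C=1e-12, m=3.0, stress=100.0):
    """Monte Carlo sketch: sample initial flaw depths (mm) from an exponential
    distribution and grow each one by the Paris law da/dN = C * (dK)**m, with
    dK ~ stress * sqrt(pi * a). Constants and units are illustrative only."""
    grown = []
    for _ in range(n):
        a = random.expovariate(1.0 / 2.0)              # initial depth, mean 2 mm
        dk = stress * math.sqrt(math.pi * a * 1e-3)    # a converted to metres
        a += C * dk ** m * cycles * 1e3                # growth converted back to mm
        grown.append(a)
    return grown

random.seed(1)
flaws = simulate_in_service_flaws()
print(sum(flaws) / len(flaws))  # mean in-service flaw depth (mm)
```

Comparing the simulated in-service histogram against inspection data is how one checks the paper's claim that the initial distribution, not the growth rate, dominates the in-service distribution.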

  15. A simplified calculation procedure for mass isotopomer distribution analysis (MIDA) based on multiple linear regression.

    Science.gov (United States)

    Fernández-Fernández, Mario; Rodríguez-González, Pablo; García Alonso, J Ignacio

    2016-10-01

    We have developed a novel, rapid and easy calculation procedure for Mass Isotopomer Distribution Analysis based on multiple linear regression which allows the simultaneous calculation of the precursor pool enrichment and the fraction of newly synthesized labelled proteins (fractional synthesis) using linear algebra. To test this approach, we used the peptide RGGGLK as a model tryptic peptide containing three subunits of glycine. We selected glycine labelled in two 13C atoms (13C2-glycine) as the labelled amino acid to demonstrate that spectral overlap is not a problem in the proposed methodology. The developed methodology was tested first in vitro by changing the precursor pool enrichment from 10 to 40% of 13C2-glycine. Secondly, a simulated in vivo synthesis of proteins was designed by combining the natural abundance RGGGLK peptide and 10 or 20% 13C2-glycine at 1:1, 1:3 and 3:1 ratios. Precursor pool enrichments and fractional synthesis values were calculated with satisfactory precision and accuracy using a simple spreadsheet. This novel approach can provide a relatively rapid and easy means to measure protein turnover based on stable isotope tracers. Copyright © 2016 John Wiley & Sons, Ltd.
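
    The regression idea can be sketched as follows: the measured mass isotopomer distribution (MID) is a linear mixture of the natural-abundance MID and the MID of newly synthesized peptide at a given precursor enrichment, so ordinary least squares recovers the mixing fractions. This is a minimal Python sketch under simplifying assumptions (three glycines per peptide, natural 13C abundance ignored, enrichment found by a grid search rather than the paper's exact procedure):

    ```python
    import numpy as np
    from scipy.stats import binom

    def mid_labeled(p, n_gly=3):
        # MID over mass shifts 0, 2, 4, 6 Da from 0..3 incorporated 13C2-glycines
        return binom.pmf(np.arange(n_gly + 1), n_gly, p)

    def fit(measured, mid_natural, p_grid=np.linspace(0.01, 0.99, 99)):
        best = None
        for p in p_grid:
            X = np.column_stack([mid_natural, mid_labeled(p)])
            coef, *_ = np.linalg.lstsq(X, measured, rcond=None)
            rss = np.sum((X @ coef - measured) ** 2)
            if best is None or rss < best[0]:
                best = (rss, p, coef)
        _, p_best, (c_old, c_new) = best
        return p_best, c_new / (c_old + c_new)   # enrichment, fractional synthesis

    # Simulated measurement: 30% precursor enrichment, 50% newly synthesized protein
    mid_nat = np.array([1.0, 0.0, 0.0, 0.0])     # simplified natural MID
    measured = 0.5 * mid_nat + 0.5 * mid_labeled(0.30)
    p_hat, f_hat = fit(measured, mid_nat)
    print(round(p_hat, 2), round(f_hat, 2))
    ```

    For a simulated measurement built from 30% enrichment and 50% new protein, the fit recovers both values.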

  16. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental study on ionospheric earthquake precursors, program development on processes in the earthquake focus and physical mechanisms of formation of various type precursors are considered. Composition of experimental cosmic system for earthquake precursors monitoring is determined. 36 refs., 5 figs

  17. Development of pair distribution function analysis

    International Nuclear Information System (INIS)

    Vondreele, R.; Billinge, S.; Kwei, G.; Lawson, A.

    1996-01-01

    This is the final report of a 3-year LDRD project at LANL. It has become more and more evident that structural coherence in the CuO2 planes of high-Tc superconducting materials over some intermediate length scale (nm range) is important to superconductivity. In recent years, the pair distribution function (PDF) analysis of powder diffraction data has been developed for extracting structural information on these length scales. This project sought to expand and develop this technique, use it to analyze neutron powder diffraction data, and apply it to problems. Our particular interest is in high-Tc superconductors, although we planned to extend the study to the closely related perovskite ferroelectric materials and other materials where the local structure affects the properties and where detailed knowledge of the local and intermediate-range structure is important. In addition, we planned to carry out single-crystal experiments to look for diffuse scattering. This information augments the information from the PDF

  18. Corroded scale analysis from water distribution pipes

    Directory of Open Access Journals (Sweden)

    Rajaković-Ognjanović Vladana N.

    2011-01-01

    Full Text Available The subject of this study was the steel pipes that are part of Belgrade's drinking water supply network. In order to investigate the mutual effects of corrosion and water quality, the corrosion scales on the pipes were analyzed. The idea was to improve control of corrosion processes and prevent the impact of corrosion on water quality degradation. The instrumental methods used for corrosion scale characterization were: scanning electron microscopy (SEM), for the investigation of the corrosion scales on the analyzed sample surfaces and for the microstructural analysis of the corroded scales; X-ray diffraction (XRD), for the analysis of the solid forms present inside the scales; and BET adsorption isotherms, for surface area determination. Depending on the composition of the water next to the pipe surface, corrosion of iron results in the formation of different compounds and solid phases. The composition and structure of the iron scales in drinking water distribution pipes depend on the type of the metal and the composition of the aqueous phase. Their formation is probably governed by several factors that include water quality parameters such as pH, alkalinity, buffer intensity, natural organic matter (NOM) concentration, and dissolved oxygen (DO) concentration. Factors such as water flow patterns, seasonal fluctuations in temperature, and microbiological activity, as well as water treatment practices such as the application of corrosion inhibitors, can also influence corrosion scale formation and growth. Therefore, the corrosion scales found in iron and steel pipes are expected to have unique features for each site. Compounds found in iron corrosion scales often include goethite, lepidocrocite, magnetite, hematite, ferrous oxide, siderite, ferrous hydroxide, ferric hydroxide, ferrihydrite, calcium carbonate and green rusts. Iron scales have characteristic features that include: corroded floor, porous core that contains

  19. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Full Text Available Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development that requires special attention. Operation of distribution systems is accompanied by a number of factors that produce random data and a large number of unplanned interruptions. Research has shown that the predominant factors with a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behavior and presents reliability estimates for predominantly rural electrical distribution systems.

  20. silicon bipolar distributed oscillator design and analysis

    African Journals Online (AJOL)

    digital and analogue market, wired or wireless is making it necessary to operate ... is generally high; this additional power is supplied by the external dc source. ... distributed oscillator consists of a pair of transmission lines with characteristic ...

  1. Distributed energy store railguns experiment and analysis

    International Nuclear Information System (INIS)

    Holland, L.D.

    1984-01-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. The distributed energy store railgun used multiple current sources connected to the rails of a railgun at points distributed along the bore. These current sources (energy stores) are turned on in sequence as the projectile moves down the bore so that current is fed to the railgun from behind the armature. In this system the length of the rails that carry the full armature current is less than the total length of the railgun. If a sufficient number of energy stores is used, this removes the limitation on the length of a railgun. An additional feature of distributed energy store type railguns is that they can be designed to maintain a constant pressure on the projectile being accelerated. A distributed energy store railgun was constructed and successfully operated. In addition to this first demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed

  2. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  3. Analysis of mixed mode microwave distribution manifolds

    International Nuclear Information System (INIS)

    White, T.L.

    1982-09-01

    The 28-GHz microwave distribution manifold used in the ELMO Bumpy Torus-Scale (EBT-S) experiments consists of a toroidal metallic cavity, whose dimensions are much greater than a wavelength, fed by a source of microwave power. Equalization of the mixed mode power distribution to the 24 cavities of EBT-S is accomplished by empirically adjusting the coupling irises which are equally spaced around the manifold. The performance of the manifold to date has been very good, yet no analytical models exist for optimizing manifold transmission efficiency or for scaling this technology to the EBT-P manifold design. The present report develops a general model for mixed mode microwave distribution manifolds based on isotropic plane wave sources of varying amplitudes that are distributed toroidally around the manifold. The calculated manifold transmission efficiency for the most recent EBT-S coupling iris modification is 90%. This agrees with the average measured transmission efficiency. Also, the model predicts the coupling iris areas required to balance the distribution of microwave power while maximizing transmission efficiency, and losses in waveguide feeds connecting the irises to the cavities of EBT are calculated using an approach similar to the calculation of manifold losses. The model will be used to evaluate EBT-P manifold designs

  4. The mining of toxin-like polypeptides from EST database by single residue distribution analysis.

    Science.gov (United States)

    Kozlov, Sergey; Grishin, Eugene

    2011-01-31

    Novel high throughput sequencing technologies require permanent development of bioinformatics data processing methods. Among them, rapid and reliable identification of encoded proteins plays a pivotal role. To search for particular protein families, the amino acid sequence motifs suitable for selective screening of nucleotide sequence databases may be used. In this work, we suggest a novel method for simplified representation of protein amino acid sequences named Single Residue Distribution Analysis, which is applicable both for homology search and database screening. Using the procedure developed, a search for amino acid sequence motifs in sea anemone polypeptides was performed, and 14 different motifs with broad and low specificity were discriminated. The adequacy of motifs for mining toxin-like sequences was confirmed by their ability to identify 100% toxin-like anemone polypeptides in the reference polypeptide database. The employment of novel motifs for the search of polypeptide toxins in Anemonia viridis EST dataset allowed us to identify 89 putative toxin precursors. The translated and modified ESTs were scanned using a special algorithm. In addition to direct comparison with the motifs developed, the putative signal peptides were predicted and homology with known structures was examined. The suggested method may be used to retrieve structures of interest from the EST databases using simple amino acid sequence motifs as templates. The efficiency of the procedure for directed search of polypeptides is higher than that of most currently used methods. Analysis of 39939 ESTs of sea anemone Anemonia viridis resulted in identification of five protein precursors of earlier described toxins, discovery of 43 novel polypeptide toxins, and prediction of 39 putative polypeptide toxin sequences. In addition, two precursors of novel peptides presumably displaying neuronal function were disclosed.
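
    A motif-based database screen of the kind described can be sketched with a regular expression scan over translated ESTs. The motif below is hypothetical (the paper's 14 motifs are not reproduced here); it merely illustrates the broad, low-specificity cysteine-spacing pattern typical of anemone toxin scaffolds.

    ```python
    import re

    # Hypothetical toxin-like motif: six cysteines with bounded spacer lengths,
    # the common scaffold of many sea anemone polypeptide toxins (illustrative only)
    MOTIF = re.compile(r"C.{2,10}C.{2,10}C.{2,10}C.{2,10}C.{2,10}C")

    def scan(translated_ests):
        """Return the ids of translated ESTs whose sequence contains the motif."""
        return [est_id for est_id, seq in translated_ests.items() if MOTIF.search(seq)]

    ests = {
        "est1": "MKTLILAVLLCAQVFACGGGCSSDCKKHCGGGCTTTCGGGGCKK",  # cysteine-rich
        "est2": "MKTLILAVLLAQVFAGGGSSDKKHGGGTTTGGGGKK",          # no cysteines
    }
    print(scan(ests))  # → ['est1']
    ```

    In the actual workflow the hits would then be filtered further, e.g. by predicted signal peptides and homology to known toxin structures, as the abstract describes.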

  5. The mining of toxin-like polypeptides from EST database by single residue distribution analysis

    Directory of Open Access Journals (Sweden)

    Grishin Eugene

    2011-01-01

    Full Text Available Abstract Background Novel high throughput sequencing technologies require permanent development of bioinformatics data processing methods. Among them, rapid and reliable identification of encoded proteins plays a pivotal role. To search for particular protein families, the amino acid sequence motifs suitable for selective screening of nucleotide sequence databases may be used. In this work, we suggest a novel method for simplified representation of protein amino acid sequences named Single Residue Distribution Analysis, which is applicable both for homology search and database screening. Results Using the procedure developed, a search for amino acid sequence motifs in sea anemone polypeptides was performed, and 14 different motifs with broad and low specificity were discriminated. The adequacy of motifs for mining toxin-like sequences was confirmed by their ability to identify 100% toxin-like anemone polypeptides in the reference polypeptide database. The employment of novel motifs for the search of polypeptide toxins in Anemonia viridis EST dataset allowed us to identify 89 putative toxin precursors. The translated and modified ESTs were scanned using a special algorithm. In addition to direct comparison with the motifs developed, the putative signal peptides were predicted and homology with known structures was examined. Conclusions The suggested method may be used to retrieve structures of interest from the EST databases using simple amino acid sequence motifs as templates. The efficiency of the procedure for directed search of polypeptides is higher than that of most currently used methods. Analysis of 39939 ESTs of sea anemone Anemonia viridis resulted in identification of five protein precursors of earlier described toxins, discovery of 43 novel polypeptide toxins, and prediction of 39 putative polypeptide toxin sequences. In addition, two precursors of novel peptides presumably displaying neuronal function were disclosed.

  6. Distributed energy store railguns experiment and analysis

    Science.gov (United States)

    Holland, L. D.

    1984-02-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. A distributed energy storage railgun was constructed and successfully operated. In addition to this demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed. A simple simulation of the railgun system based on this model, but ignoring frictional drag, was compared with the experimental results. During the process of comparing results from the simulation and the experiment, the effect of significant frictional drag of the projectile on the sidewalls of the bore was observed.
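
    The "simple simulation ... ignoring frictional drag" mentioned above can be sketched in a few lines: with distributed stores holding the armature current constant, the accelerating force is F = L′I²/2 and the equations of motion integrate directly. All parameter values below are illustrative, not those of the experiment.

    ```python
    # Constant-current railgun model, no friction: F = L' * I^2 / 2
    Lp = 0.4e-6      # inductance gradient L' (H/m), illustrative
    I = 300e3        # armature current held constant by distributed stores (A)
    mass = 0.05      # projectile mass (kg)

    dt, t, x, v = 1e-6, 0.0, 0.0, 0.0
    while x < 1.0:                       # 1 m barrel
        a = 0.5 * Lp * I**2 / mass       # constant acceleration under constant current
        v += a * dt                      # semi-implicit Euler integration
        x += v * dt
        t += dt
    print(f"muzzle velocity ≈ {v:.0f} m/s after {t*1e3:.2f} ms")
    ```

    With constant current the model also delivers constant pressure on the projectile, which is the design feature the abstract attributes to distributed energy store railguns.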

  7. Field distribution analysis in deflecting structures

    Energy Technology Data Exchange (ETDEWEB)

    Paramonov, V.V. [Joint Inst. for Nuclear Research, Moscow (Russian Federation)

    2013-02-15

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics and to increase the luminosity. Bunch rotation is a transformation of a particle distribution in six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE₁ and HM₁. Criteria for the selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. Results of the study are confirmed by comparison with results of numerical simulations.

  8. Field distribution analysis in deflecting structures

    International Nuclear Information System (INIS)

    Paramonov, V.V.

    2013-02-01

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics and to increase the luminosity. Bunch rotation is a transformation of a particle distribution in six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE₁ and HM₁. Criteria for the selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. Results of the study are confirmed by comparison with results of numerical simulations.

  9. Statistical analysis of partial reduced width distributions

    International Nuclear Information System (INIS)

    Tran Quoc Thuong.

    1973-01-01

    The aim of this study was to develop rigorous methods for analysing experimental event distributions according to a χ² law and to check whether the number of degrees of freedom ν is compatible with the value 1 for the reduced neutron width distribution. Two statistical methods were used (the maximum-likelihood method and the method of moments); it was shown, in a few particular cases, that ν is compatible with 1. The difference between ν and 1, if it exists, should not exceed 3%. These results confirm the validity of the compound nucleus model [fr]
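
    Both estimation strategies are easy to reproduce on simulated data. The Python sketch below (hypothetical sample size) draws reduced widths from a χ² distribution with ν = 1 (the Porter-Thomas case), then recovers ν by the method of moments (Var = 2/ν for a unit-mean χ²_ν variable) and by maximum likelihood via an equivalent gamma fit.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Simulated reduced widths: chi-square with nu = 1, normalized to unit mean
    nu_true = 1
    widths = rng.chisquare(nu_true, size=5000) / nu_true

    # Method of moments: for a unit-mean chi2_nu variable, Var = 2/nu  =>  nu = 2/Var
    nu_mom = 2.0 / np.var(widths)

    # Maximum likelihood: a unit-mean chi2_nu variable is Gamma(nu/2, scale=2/nu)
    shape, _, _ = stats.gamma.fit(widths / widths.mean(), floc=0)
    nu_mle = 2.0 * shape

    print(round(nu_mom, 2), round(nu_mle, 2))
    ```

    Both estimates should land close to 1, mirroring the study's conclusion that ν is compatible with 1 for reduced neutron widths.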

  10. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). WND is motivated by the CLT in R³ and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
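
    The Monte Carlo modeling of a wrapped normal distribution on SO(3) can be sketched as: draw rotation vectors from a normal distribution in R³ and map them to rotations through the exponential map (axis-angle representation). A minimal Python sketch, with an arbitrary spread parameter:

    ```python
    import numpy as np
    from scipy.spatial.transform import Rotation

    rng = np.random.default_rng(2)
    sigma = 0.3    # illustrative spread parameter (radians)

    # WND on SO(3): rotation vectors ~ N(0, sigma^2 I) in R^3, mapped via the exp map
    rotvecs = rng.normal(0.0, sigma, size=(10_000, 3))
    rotations = Rotation.from_rotvec(rotvecs)

    angles = rotations.magnitude()   # misorientation angle of each sampled rotation
    print(round(angles.mean(), 3))
    ```

    The resulting sample can then serve as one component of a modeled orientation distribution function; the mean misorientation angle scales with sigma.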

  11. Distributed crack analysis of ceramic inlays

    NARCIS (Netherlands)

    Peters, M.C.R.B.; Vree, de J.H.P.; Brekelmans, W.A.M.

    1993-01-01

    In all-ceramic restorations, crack formation and propagation phenomena are of major concern, since they may result in intra-oral fracture. The objective of this study was calculation of damage in porcelain MOD inlays by utilization of a finite-element (FE) implementation of the distributed crack

  12. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA; GENTON, MARC G.; LISEO, BRUNERO

    2012-01-01

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric

  13. Analysis of refrigerant mal-distribution

    DEFF Research Database (Denmark)

    Kærn, Martin Ryhl; Elmegaard, Brian

    2009-01-01

    to be two straight tubes. The refrigerant maldistribution is then induced to the evaporator by varying the vapor quality at the inlet to each tube and the air-flow across each tube. Finally it is shown that mal-distribution can be compensated by an intelligent distributor, that ensures equal superheat...

  14. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    Full Text Available A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web systems modeling and design methodology have been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.

  15. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) will eventually reshape the future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent the power system...... operation will be altered. In this paper, quantitative results in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles, are presented. The analysis is based on the conditions for both a radial and a meshed distribution...... network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden....

  16. Economic analysis of efficient distribution transformer trends

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
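
    The TOC evaluation the report builds on is the standard two-factor formula TOC = purchase price + A × no-load loss + B × load loss, with A and B in $/W. The Python sketch below (hypothetical designs, prices and loss figures) illustrates how treating A and B as statistical distributions, as the report proposes, propagates uncertainty into the design comparison.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Two hypothetical designs: (bid price $, no-load loss W, load loss W)
    designs = {"low-loss": (12_000, 150.0, 900.0), "standard": (9_000, 250.0, 1400.0)}

    # Evaluation factors treated as uncertain, per the report's approach
    A = rng.normal(8.0, 1.0, 10_000)   # $/W capitalization of no-load loss
    B = rng.normal(2.0, 0.4, 10_000)   # $/W capitalization of load loss

    for name, (price, nl, ll) in designs.items():
        toc = price + A * nl + B * ll  # TOC = price + A*NL + B*LL, sampled
        print(f"{name}: mean TOC ${toc.mean():,.0f} (sd ${toc.std():,.0f})")
    ```

    Reporting the TOC spread rather than a point value makes explicit how sensitive the design ranking is to uncertainty in the evaluation factors.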

  17. Induction and Analysis of the Alkaloid Mitragynine Content of a Mitragyna speciosa Suspension Culture System upon Elicitation and Precursor Feeding

    Directory of Open Access Journals (Sweden)

    Nor Nahazima Mohamad Zuldin

    2013-01-01

    Full Text Available This study aimed to determine the effects of different concentrations and combinations of the phytohormones 2,4-dichlorophenoxy acetic acid (2,4-D, kinetin, 6-benzylaminopurine (BAP, and 1-naphthaleneacetic acid (NAA on callus induction and to demonstrate the role of elicitors and exogenous precursors on the production of mitragynine in a Mitragyna speciosa suspension culture. The best callus induction was achieved from petiole explants cultured on WPM that was supplemented with 4 mg L−1 2,4-D (70.83%. Calli were transferred to liquid media and agitated on rotary shakers to establish Mitragyna speciosa cell suspension cultures. The optimum settled cell volume was achieved in the presence of WPM that contained 3 mg L−1 2,4-D and 3% sucrose (9.47±0.4667 mL. The treatment of cultures with different concentrations of yeast extract and salicylic acid for different inoculation periods revealed that the highest mitragynine content as determined by HPLC was achieved from the culture treated with 250 mg L−1 yeast extract (9.275±0.082 mg L−1 that was harvested on day 6 of culturing; salicylic acid yielded low mitragynine content at all concentrations used. Tryptophan and loganin were used as exogenous precursors; the highest level of mitragynine production was achieved in cultures treated with 3 μM tryptophan and harvested at 6 days (13.226±1.98 mg L−1.

  18. Nonlinear Progressive Collapse Analysis Including Distributed Plasticity

    Directory of Open Access Journals (Sweden)

    Mohamed Osama Ahmed

    2016-01-01

    Full Text Available This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel-framed regular building structures. The emphasis in this paper is on the deformation response under the notionally removed column in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria – Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case-study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as the interior of the three-dimensional structural system. It is concluded that the use of moment-resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response, and that it is conservative to ignore the effects of distributed plasticity in determining the peak displacement response under the notionally removed column.

  19. Precursor incident program at EDF

    International Nuclear Information System (INIS)

    Fourest, B.; Maliverney, B.; Rozenholc, M.; Piovesan, C.

    1998-01-01

    The precursor program was started by EDF in 1994, after an investigation of the US NRC's Accident Sequence Precursor Program. Since then, reported operational events identified as Safety Outstanding Events have been analyzed whenever possible using probabilistic methods based on PSAs. The analysis provides an estimate of the protection remaining against core damage at the time the incident occurred. Measuring incident severity makes it possible to detect the incidents that are most important to safety. Moreover, the most efficient feedback actions can be derived from the main accident sequences identified through the analysis. Incident probabilistic analysis therefore provides a way to set priorities in terms of treatment and resource allocation, and thus to implement countermeasures preventing further occurrence and development of the most significant incidents. As some incidents cannot be analyzed using this method, probabilistic analysis can only be one among the methods used to assess the safety level of nuclear power plants. Nevertheless, it provides an interesting complement to classical deterministic studies. (author)

  20. The Distinction of Amyloid-β Protein Precursor (AβPP) Ratio in Platelet Between Alzheimer's Disease Patients and Controls: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Shi, Yachen; Gu, Lihua; Alsharif, Abdul Azeez; Zhang, Zhijun

    2017-01-01

    To systematically assess the clinical significance of the platelet amyloid-β protein precursor (AβPP) ratio between Alzheimer's disease (AD) patients and controls, 14 articles were selected in this analysis by a search of databases including PubMed and Web of Science up to December 2016. Random-effects models were used to calculate the standardized mean difference (SMD). Subgroup analyses were used to detect the cause of heterogeneity. The result showed a significant drop in the platelet AβPP ratio in AD patients compared to controls [SMD: -1.871; 95% CI: (-2.33, -1.41)]. Subgroup analysis revealed that race or the quality of studies may be the cause of the high heterogeneity. This meta-analysis concluded that there is a close association between the platelet AβPP ratio and AD. It is necessary to design a study with a sizable sample to further support the use of the platelet AβPP ratio as a biomarker of AD.
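
    Random-effects pooling of SMDs in such a meta-analysis is commonly done with the DerSimonian-Laird estimator. A minimal Python sketch with hypothetical per-study SMDs and variances (not the meta-analysis' actual data):

    ```python
    import numpy as np

    def random_effects(smd, var):
        """DerSimonian-Laird random-effects pooling of standardized mean differences."""
        smd, var = np.asarray(smd), np.asarray(var)
        w = 1.0 / var                                  # fixed-effect weights
        fixed = np.sum(w * smd) / np.sum(w)
        q = np.sum(w * (smd - fixed) ** 2)             # Cochran's Q
        df = len(smd) - 1
        tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1.0 / (var + tau2)                      # random-effects weights
        pooled = np.sum(w_re * smd) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    # Hypothetical per-study SMDs and variances, for illustration only
    pooled, ci = random_effects([-2.1, -1.5, -1.9, -1.6], [0.04, 0.06, 0.05, 0.08])
    print(round(pooled, 2), tuple(round(c, 2) for c in ci))
    ```

    The between-study variance tau² widens the confidence interval relative to a fixed-effect model, which is why random-effects models are preferred when heterogeneity is high, as it was here.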

  1. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  2. Analysis of the functional compatibility of SIV capsid sequences in the context of the FIV gag precursor.

    Directory of Open Access Journals (Sweden)

    César A Ovejero

    Full Text Available The formation of immature lentiviral particles is dependent on the multimerization of the Gag polyprotein at the plasma membrane of the infected cells. One key player in the virus assembly process is the capsid (CA domain of Gag, which establishes the protein-protein interactions that give rise to the hexagonal lattice of Gag molecules in the immature virion. To gain a better understanding of the functional equivalence between the CA proteins of simian and feline immunodeficiency viruses (SIV and FIV, respectively, we generated a series of chimeric FIV Gag proteins in which the CA-coding region was partially or totally replaced by its SIV counterpart. All the FIV Gag chimeras were found to be assembly-defective; however, all of them are able to interact with wild-type SIV Gag and be recruited into extracellular virus-like particles, regardless of the SIV CA sequences present in the chimeric FIV Gag. The results presented here markedly contrast with our previous findings showing that chimeric SIVs carrying FIV CA-derived sequences are assembly-competent. Overall, our data support the notion that although the SIV and FIV CA proteins share 51% amino acid sequence similarity and exhibit a similar organization, i.e., an N-terminal domain joined by a flexible linker to a C-terminal domain, their functional exchange between these different lentiviruses is strictly dependent on the context of the recipient Gag precursor.

  3. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation...... and load. This fact increases the number of stochastic inputs and dependence structures between them need to be considered. The deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  4. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibrational patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally born distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
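
    The envelope-spectrum comparison described above is typically computed with a Hilbert transform. The Python sketch below builds a synthetic vibration signal, a resonance amplitude-modulated at a hypothetical fault frequency, and recovers that frequency from the envelope spectrum; all parameters are illustrative.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 20_000                        # sampling rate (Hz), illustrative
    t = np.arange(0, 1.0, 1 / fs)

    # Synthetic bearing signal: a 3 kHz resonance amplitude-modulated at a
    # hypothetical 120 Hz fault frequency, plus noise
    fault_hz = 120
    carrier = np.sin(2 * np.pi * 3000 * t)
    signal = (1 + 0.8 * np.sin(2 * np.pi * fault_hz * t)) * carrier
    signal += 0.1 * np.random.default_rng(4).normal(size=t.size)

    # Envelope spectrum: magnitude spectrum of the mean-removed Hilbert envelope
    envelope = np.abs(hilbert(signal))
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    peak = freqs[np.argmax(spectrum[freqs < 1000])]
    print(peak)   # the fault frequency should dominate below 1 kHz
    ```

    For localized faults the envelope spectrum shows sharp peaks at the fault frequency and its harmonics; for distributed faults the pattern is more diffuse, which is the distinction the diagnostic procedure exploits.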

  5. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  6. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    AKOREDE et al: TOOL FOR POWER FLOW ANALYSIS AND DISTRIBUTED GENERATION OPTIMISATION ... greenhouse gas emissions and the current deregulation of electric energy ..... Visual composition and temporal behaviour of GUI.

  7. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  8. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods needed to process data from radiotracer experiments.
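
    A minimal sketch of the kind of RTD processing such software performs, computing the first two moments of a tracer impulse-response curve; the single-mixed-tank response, time constant, and detector scaling below are hypothetical:

    ```python
    import numpy as np

    def rtd_moments(t, c):
        """Mean residence time and variance from a tracer impulse-response curve.

        t : sample times; c : measured tracer concentration.
        The normalised curve E(t) = c / integral(c dt) is the residence time distribution.
        """
        area = np.trapz(c, t)
        e = c / area                       # E(t), unit area
        tau = np.trapz(t * e, t)           # mean residence time (first moment)
        var = np.trapz((t - tau) ** 2 * e, t)  # variance (second central moment)
        return tau, var

    # Hypothetical response of one ideal mixed tank: E(t) = exp(-t/tau)/tau.
    t = np.linspace(0, 60, 6001)
    c = 5.0 * np.exp(-t / 4.0)             # arbitrary detector scaling
    tau, var = rtd_moments(t, c)
    print(round(tau, 2), round(var, 2))
    ```

    For an ideal mixed tank the variance equals the square of the mean residence time, which gives a quick consistency check on measured curves.
    
    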

  9. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength a statistical approach is necessary. The strength of ceramics and glass depends on the specimen size and on the size distribution of flaws in the material. The strength distribution of ductile materials is narrow and close to Gaussian, while the strength of brittle materials such as ceramics and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical criteria calculations supporting the strength analysis of silicon nitride material were performed using MATLAB. (author)
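
    The Weibull failure model applied above can be sketched in a few lines; the characteristic strength (600 MPa) and Weibull modulus (m = 10) used here are hypothetical placeholders, not the paper's fitted values for silicon nitride:

    ```python
    import numpy as np

    def weibull_failure_probability(sigma, sigma0, m):
        """Cumulative probability of failure at stress sigma for a brittle material.

        sigma0 : characteristic strength (63.2% failure probability)
        m      : Weibull modulus (larger m = narrower strength scatter)
        """
        return 1.0 - np.exp(-(np.asarray(sigma, dtype=float) / sigma0) ** m)

    # Hypothetical parameters: sigma0 = 600 MPa, modulus m = 10.
    stresses = np.array([300.0, 600.0, 900.0])
    pf = weibull_failure_probability(stresses, 600.0, 10)
    print(np.round(pf, 4))   # failure probability rises steeply around sigma0
    ```

    The reliability curve in the abstract is simply the complement, 1 - pf, and the modulus m is usually estimated from fracture data by a linear fit of ln(-ln(1-F)) against ln(sigma).
    
    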

  10. Database searching and accounting of multiplexed precursor and product ion spectra from the data independent analysis of simple and complex peptide mixtures.

    Science.gov (United States)

    Li, Guo-Zhong; Vissers, Johannes P C; Silva, Jeffrey C; Golick, Dan; Gorenstein, Marc V; Geromanos, Scott J

    2009-03-01

    A novel database search algorithm is presented for the qualitative identification of proteins over a wide dynamic range, both in simple and complex biological samples. The algorithm has been designed for the analysis of data originating from data independent acquisitions, whereby multiple precursor ions are fragmented simultaneously. Measurements used by the algorithm include retention time, ion intensities, charge state, and accurate masses on both precursor and product ions from LC-MS data. The search algorithm uses an iterative process whereby each iteration incrementally increases the selectivity, specificity, and sensitivity of the overall strategy. Increased specificity is obtained by utilizing a subset database search approach, whereby for each subsequent stage of the search, only those peptides from securely identified proteins are queried. Tentative peptide and protein identifications are ranked and scored by their relative correlation to a number of models of known and empirically derived physicochemical attributes of proteins and peptides. In addition, the algorithm utilizes decoy database techniques for automatically determining the false positive identification rates. The search algorithm has been tested by comparing the search results from a four-protein mixture, the same four-protein mixture spiked into a complex biological background, and a variety of other "system" type protein digest mixtures. The method was validated independently by data dependent methods, while concurrently relying on replication and selectivity. Comparisons were also performed with other commercially and publicly available peptide fragmentation search algorithms. The presented results demonstrate the ability to correctly identify peptides and proteins from data independent acquisition strategies with high sensitivity and specificity. 
They also illustrate a more comprehensive analysis of the samples studied, providing approximately 20% more protein identifications, compared to

  11. Modeling and analysis of solar distributed generation

    Science.gov (United States)

    Ortiz Rivera, Eduardo Ivan

    power point tracking algorithms. Finally, several PV power applications will be presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

  12. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  13. Analysis of the influence of two different milling processes in the properties of precursor powder and [Beta]-TCP cement

    International Nuclear Information System (INIS)

    Cardoso, H.A.I.; Pereira, C.H.R.; Zavaglia, C.A.C.; Motisuke, M.

    2011-01-01

    There are several characteristics that put calcium phosphate cements in evidence, such as their bioactivity and in vivo resorption. The influence of two milling processes on the morphological properties of the [beta]-tricalcium phosphate ([beta]-TCP) powder and on the mechanical properties of the cement was analyzed. The powder was obtained by solid state reaction of CaCO_3 and CaHPO_4 at 1050 °C. It showed high phase purity and absence of toxic elements. The powder was processed in a ball mill (A) and a high-energy vibratory mill (B), with subsequent analysis by SEM and particle size distribution. The powders showed different average grain sizes and size distributions. Finally, the cement obtained by process (B) showed values of axial tensile strength significantly greater than that obtained by process (A); the milling process (B) is much more efficient than process (A). (author)

  14. Analysis of Precursors Prior to Rock Burst in Granite Tunnel Using Acoustic Emission and Far Infrared Monitoring

    OpenAIRE

    Liang, Zhengzhao; Liu, Xiangxin; Zhang, Yanbo; Tang, Chunan

    2013-01-01

    To understand the physical mechanism of the anomalous behaviors observed prior to rock burst, the acoustic emission (AE) and far infrared (FIR) techniques were applied to monitor the progressive failure of a rock tunnel model subjected to biaxial stresses. Images of fracturing process, temperature changes of the tunnel, and spatiotemporal serials of acoustic emission were simultaneously recorded during deformation of the model. The b-value derived from the amplitude distribution data of AE wa...

  15. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography, etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to rank the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
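
    A minimal sketch of the fit-and-compare workflow described above, using SciPy's Kolmogorov-Smirnov statistic as the "distance" between data and candidate model; the synthetic data set and the two candidate distributions are invented for illustration:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    data = rng.normal(loc=10.0, scale=2.0, size=500)   # synthetic data set

    # Fit each candidate by maximum likelihood, then score it with the
    # Kolmogorov-Smirnov distance between data and fitted distribution.
    candidates = {"norm": stats.norm, "expon": stats.expon}
    results = {}
    for name, dist in candidates.items():
        params = dist.fit(data)
        results[name] = stats.kstest(data, name, args=params).statistic
        print(name, round(results[name], 3))
    ```

    Ranking the candidates by this statistic (smaller is better) reproduces the model-ordering idea from the abstract; Anderson-Darling and chi-squared scores can be added the same way for distributions SciPy supports.
    
    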

  16. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance bounds to zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.
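
    The altered null distribution noted above is the classical boundary effect: when a variance component is constrained to be non-negative, the likelihood-ratio statistic asymptotically follows a 50:50 mixture of a point mass at zero and chi-square(1), not a plain chi-square(1). A hedged sketch of the corresponding lod-score p-value follows; the mixture form is the standard asymptotic result, not a detail taken from this abstract:

    ```python
    import numpy as np
    from scipy import stats

    def lod_pvalue(lod):
        """P-value for a variance-component linkage lod score under a
        boundary null: 0.5 * P(chi2_1 > LRT), where LRT = 2 ln(10) * lod."""
        lrt = 2.0 * np.log(10.0) * lod
        return 0.5 * stats.chi2.sf(lrt, df=1)

    print(lod_pvalue(3.0))   # roughly half the naive chi-square(1) p-value
    ```

    Using the plain chi-square(1) tail here would double the p-value and make the test conservative, which is one reason empirical null distributions, as in the abstract, are worth checking.
    
    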

  17. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing and it is important that gas distributors design new distribution systems to accommodate this growth, considering the financial constraints of the company as well as local legislation and regulation. In this study some steps in developing a flexible system that attends to those needs will be described. The analysis of distribution requires geographically referenced data for the models as well as accurate connectivity and equipment attributes. GIS systems are often used as the central repository that holds the majority of this information, and they are constantly updated as distribution network equipment is modified. The distribution network model gathered from this system ensures that the model represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the distribution network. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  18. Simulation and analysis of the soot particle size distribution in a turbulent nonpremixed flame

    KAUST Repository

    Lucchesi, Marco; Abdelgadir, Ahmed Gamaleldin; Attili, Antonio; Bisetti, Fabrizio

    2017-01-01

    to the simulation of soot formation and growth in simplified configurations featuring a constant concentration of soot precursors and the evolution of the size distribution in time is found to depend on the intensity of the nucleation rate. Higher nucleation rates

  19. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  20. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  1. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  2. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. During the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as a being an art manual or recipe when constructing such a model....

  3. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented into clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with head and neck (H and N) or prostate tumours. Local quality assurance (QA) acceptance criteria based on the 'gamma distribution' for approving IMRT plans were developed and implemented in early 2007. A retrospective analysis of these criteria over 194 clinical cases will be presented. The RHH IMRT criteria were established on the assumption that the gamma distribution obtained through inter-comparison of 2D dose maps between planned and delivered doses is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck conforms well to the assumed distribution, particularly for prostate cases. The applied pass/fail criteria are not overly sensitive to 'false fails' but can be further tightened for smaller fields, while for the larger fields found in both H and N and prostate cases the criteria were correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are the two major factors behind variation in gamma distribution among clinical cases. These criteria, derived from clinical statistics, are superior to and more accurate than single-valued criteria for the IMRT QA acceptance procedure. (author)
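
    The half-normal curve-fitting idea can be sketched on synthetic per-detector gamma values; the scale parameter, sample size, and 95th-percentile threshold below are illustrative assumptions, not the RHH criteria:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Hypothetical gamma values from a planned-vs-measured dose comparison;
    # under the abstract's assumption they follow |N(0, s)|, a half-normal.
    gamma_vals = np.abs(rng.normal(0.0, 0.35, size=2000))

    # Fit a half-normal with the location fixed at zero, then derive a
    # statistically motivated pass threshold, e.g. its 95th percentile.
    loc, scale = stats.halfnorm.fit(gamma_vals, floc=0.0)
    threshold = stats.halfnorm.ppf(0.95, loc=loc, scale=scale)
    pass_rate = np.mean(gamma_vals <= 1.0)   # conventional gamma <= 1 criterion
    print(round(scale, 2), round(threshold, 2), round(pass_rate, 3))
    ```

    A plan whose fitted scale (or 95th percentile) drifts above the clinic's historical distribution would then be flagged, which is the statistical criterion the abstract contrasts with a single-valued pass rate.
    
    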

  4. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost effective transportation and logistics plays a vital role in the supply chains of the modern world’s manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate...... a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems....

  5. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  6. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  7. Detection of ULF electromagnetic emissions as a precursor to an earthquake in China with an improved polarization analysis

    Directory of Open Access Journals (Sweden)

    Y. Ida

    2008-07-01

    Full Text Available An improved analysis of polarization (defined as the ratio of the vertical magnetic field component to the horizontal one) has been developed, and applied to approximately four years of data (from 1 March 2003 to 31 December 2006) observed at Kashi station in China. It is concluded that the polarization ratio exhibited an apparent increase only just before the earthquake of 1 September 2003 (magnitude = 6.1, epicentral distance 116 km).
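
    A minimal sketch of a Z/H polarization-ratio calculation of the kind described; the ULF band limits, sampling rate, and synthetic signals below are assumptions for illustration, not the authors' processing chain:

    ```python
    import numpy as np

    def polarization_ratio(z, h, fs, band=(0.01, 0.05)):
        """Z/H polarization ratio: band-averaged spectral amplitude of the
        vertical component divided by that of the horizontal component."""
        freqs = np.fft.rfftfreq(len(z), d=1.0 / fs)
        sel = (freqs >= band[0]) & (freqs <= band[1])   # ULF band of interest
        sz = np.abs(np.fft.rfft(z))[sel].mean()
        sh = np.abs(np.fft.rfft(h))[sel].mean()
        return sz / sh

    # Synthetic one-hour record sampled at 1 Hz with a 0.02 Hz ULF line
    # that is twice as strong on the vertical component.
    fs = 1.0
    t = np.arange(3600.0)
    z = 2.0 * np.sin(2 * np.pi * 0.02 * t)
    h = 1.0 * np.sin(2 * np.pi * 0.02 * t)
    print(round(polarization_ratio(z, h, fs), 2))
    ```

    Tracking this ratio day by day, rather than the raw field amplitudes, is what helps suppress global geomagnetic activity in favour of local (possibly seismogenic) sources.
    
    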

  8. Characterization of Arabidopsis FPS isozymes and FPS gene expression analysis provide insight into the biosynthesis of isoprenoid precursors in seeds.

    Science.gov (United States)

    Keim, Verónica; Manzano, David; Fernández, Francisco J; Closa, Marta; Andrade, Paola; Caudepón, Daniel; Bortolotti, Cristina; Vega, M Cristina; Arró, Montserrat; Ferrer, Albert

    2012-01-01

    Arabidopsis thaliana contains two genes encoding farnesyl diphosphate (FPP) synthase (FPS), the prenyl diphosphate synthase that catalyzes the synthesis of FPP from isopentenyl diphosphate (IPP) and dimethylallyl diphosphate (DMAPP). In this study, we provide evidence that the two Arabidopsis short FPS isozymes FPS1S and FPS2 localize to the cytosol. Both enzymes were expressed in E. coli, purified and biochemically characterized. Although FPS1S and FPS2 share more than 90% amino acid sequence identity, FPS2 was found to be more efficient as a catalyst, more sensitive to the inhibitory effect of NaCl, and more resistant to thermal inactivation than FPS1S. Homology modelling for FPS1S and FPS2 and analysis of the amino acid differences between the two enzymes revealed an increase in surface polarity and a greater capacity to form surface salt bridges of FPS2 compared to FPS1S. These factors most likely account for the enhanced thermostability of FPS2. Expression analysis of FPS::GUS genes in seeds showed that FPS1 and FPS2 display complementary patterns of expression particularly at late stages of seed development, which suggests that Arabidopsis seeds have two spatially segregated sources of FPP. Functional complementation studies of the Arabidopsis fps2 knockout mutant seed phenotypes demonstrated that under normal conditions FPS1S and FPS2 are functionally interchangeable. A putative role for FPS2 in maintaining seed germination capacity under adverse environmental conditions is discussed.

  9. Does objective cluster analysis serve as a useful precursor to seasonal precipitation prediction at local scale? Application to western Ethiopia

    Science.gov (United States)

    Zhang, Ying; Moges, Semu; Block, Paul

    2018-01-01

    Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches makes clear advances in prediction skill and resolution, as compared with previous studies. The statistical model improves versus the non-clustered case or dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters having regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33 %, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to guarantee comparable skills.
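
    The ranked probability skill score (RPSS) quoted above can be illustrated with a toy tercile-forecast example; the forecasts and outcomes below are invented, and climatology is the standard equal-thirds reference:

    ```python
    import numpy as np

    def rps(forecast_probs, obs_category):
        """Ranked probability score for one categorical (e.g. tercile) forecast."""
        f_cum = np.cumsum(forecast_probs)
        o = np.zeros_like(f_cum)
        o[obs_category] = 1.0
        return np.sum((f_cum - np.cumsum(o)) ** 2)

    def rpss(forecasts, observations, climatology=(1 / 3, 1 / 3, 1 / 3)):
        """RPSS relative to a climatological forecast; 1 is perfect,
        0 means no better than climatology, negative means worse."""
        rps_f = np.mean([rps(f, o) for f, o in zip(forecasts, observations)])
        rps_c = np.mean([rps(np.array(climatology), o) for o in observations])
        return 1.0 - rps_f / rps_c

    # Toy example: three seasons, forecasts leaning toward the observed tercile.
    forecasts = [np.array([0.6, 0.3, 0.1]),
                 np.array([0.2, 0.6, 0.2]),
                 np.array([0.1, 0.3, 0.6])]
    observations = [0, 1, 2]
    print(round(rpss(forecasts, observations), 3))
    ```

    A cluster-average RPSS of 33 %, as reported above, means the forecasts reduce the ranked probability score by a third relative to always issuing climatology.
    
    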

  10. Characterization of Arabidopsis FPS isozymes and FPS gene expression analysis provide insight into the biosynthesis of isoprenoid precursors in seeds.

    Directory of Open Access Journals (Sweden)

    Verónica Keim

    Full Text Available Arabidopsis thaliana contains two genes encoding farnesyl diphosphate (FPP) synthase (FPS), the prenyl diphosphate synthase that catalyzes the synthesis of FPP from isopentenyl diphosphate (IPP) and dimethylallyl diphosphate (DMAPP). In this study, we provide evidence that the two Arabidopsis short FPS isozymes FPS1S and FPS2 localize to the cytosol. Both enzymes were expressed in E. coli, purified and biochemically characterized. Although FPS1S and FPS2 share more than 90% amino acid sequence identity, FPS2 was found to be more efficient as a catalyst, more sensitive to the inhibitory effect of NaCl, and more resistant to thermal inactivation than FPS1S. Homology modelling for FPS1S and FPS2 and analysis of the amino acid differences between the two enzymes revealed an increase in surface polarity and a greater capacity to form surface salt bridges of FPS2 compared to FPS1S. These factors most likely account for the enhanced thermostability of FPS2. Expression analysis of FPS::GUS genes in seeds showed that FPS1 and FPS2 display complementary patterns of expression particularly at late stages of seed development, which suggests that Arabidopsis seeds have two spatially segregated sources of FPP. Functional complementation studies of the Arabidopsis fps2 knockout mutant seed phenotypes demonstrated that under normal conditions FPS1S and FPS2 are functionally interchangeable. A putative role for FPS2 in maintaining seed germination capacity under adverse environmental conditions is discussed.

  11. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
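
    The idea behind polynomial-based methods can be sketched as repeated convolution: each atom contributes a polynomial whose coefficients are its isotope abundances, and the product of these polynomials gives the molecular distribution. This simplified coarse-grained (nominal-mass) illustration uses standard isotope abundances and is not MIDAs's actual algorithm:

    ```python
    import numpy as np

    ISOTOPES = {   # abundances at nominal mass offsets 0, 1, 2, ...
        "C": [0.9893, 0.0107],
        "H": [0.999885, 0.000115],
        "N": [0.99636, 0.00364],
        "O": [0.99757, 0.00038, 0.00205],
    }

    def isotopic_distribution(formula, n_peaks=5):
        """Coarse-grained isotopic envelope of a molecular formula
        (element -> atom count) by repeated polynomial multiplication."""
        dist = np.array([1.0])
        for elem, count in formula.items():
            for _ in range(count):
                dist = np.convolve(dist, ISOTOPES[elem])
        return dist[:n_peaks] / dist.sum()

    # Glycine, C2H5NO2: monoisotopic peak followed by the M+1, M+2... envelope.
    print(np.round(isotopic_distribution({"C": 2, "H": 5, "N": 1, "O": 2}), 4))
    ```

    Fine-grained (high-resolution) distributions require tracking exact isotope masses rather than nominal offsets, which is where the accuracy trade-offs benchmarked in the paper arise.
    
    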

  12. Understanding the Formation of Kinetically Stable Compounds and the Development of Thin Film Pair Distribution Function Analysis

    Science.gov (United States)

    Wood, Suzannah Rebecca

    Navigating the synthesis landscape poses many challenges when developing novel solid state materials. Advancements in both synthesis and characterization are necessary to facilitate the targeting of specific materials. This dissertation discusses the formation of chalcogenide heterostructures and their properties in the first part and the development of thin film pair distribution function analysis (tfPDF) in the second part. The heterostructures were formed by the self-assembly of designed precursors deposited by physical vapor deposition in a modulated elemental reactants approach, which provides control and predictability in synthesis. Specifically, a series of (BiSe)1+delta(TiSe2)n compounds, where n = 2, 3, and 4, were synthesized to explore the extent of charge transfer from the BiSe to the TiSe2 layers. To further explore the role Bi plays in charge donation, a family of structurally similar compounds, (BixSn1-xSe)1+delta(TiSe2), where 0 ≤ x ≤ 1, were synthesized and characterized. Electrical measurements show that doping efficiency decreases as x increases, correlated with the structural distortion and the formation of periodic antiphase boundaries containing Bi-Bi pairs. The first heterostructures composed of three unique structural types were synthesized, and Bi2Se3 layer thickness was used to tune electrical properties and further explore charge transfer. To better understand the potential energy landscape on which these kinetically stable compounds exist, two investigations were undertaken: the first a study of the formation and subsequent decomposition of [(BiSe)1+delta]n(TiSe2)n compounds, where n = 2 and 3, and the second an investigation of precursor structure for thermodynamically stable FeSb2 and kinetically stable FeSb3. The second section describes the development of thin film pair distribution function analysis, a technique in which total scattering data for pair distribution function (PDF) analysis are obtained from thin films, suitable for local structure analysis

  13. Does objective cluster analysis serve as a useful precursor to seasonal precipitation prediction at local scale? Application to western Ethiopia

    Directory of Open Access Journals (Sweden)

    Y. Zhang

    2018-01-01

    Full Text Available Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches makes clear advances in prediction skill and resolution, as compared with previous studies. The statistical model improves versus the non-clustered case or dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters having regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33 %, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to achieve comparable skill.
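
The ranked probability skill score used as a verification metric above can be sketched in a few lines. A minimal illustration (not the study's code), assuming tercile categories and an equal-odds climatological reference forecast:

```python
import numpy as np

def rps(forecast_probs, obs_category):
    """Ranked probability score for one categorical forecast.

    forecast_probs: probabilities for ordered categories (e.g. terciles).
    obs_category: index of the observed category.
    """
    cum_f = np.cumsum(forecast_probs)
    obs = np.zeros(len(forecast_probs))
    obs[obs_category] = 1.0
    cum_o = np.cumsum(obs)
    return np.sum((cum_f - cum_o) ** 2)

def rpss(forecasts, observations, climatology=None):
    """RPSS = 1 - RPS_forecast / RPS_climatology (1 = perfect, 0 = no skill)."""
    n_cat = forecasts.shape[1]
    if climatology is None:
        climatology = np.full(n_cat, 1.0 / n_cat)  # equal-odds reference
    rps_f = np.mean([rps(f, o) for f, o in zip(forecasts, observations)])
    rps_c = np.mean([rps(climatology, o) for o in observations])
    return 1.0 - rps_f / rps_c
```

A perfect categorical forecast yields RPSS = 1, while a forecast identical to climatology yields RPSS = 0, matching the convention of the percentage values quoted in the abstract.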

  14. Identification of key factors regulating self-renewal and differentiation in EML hematopoietic precursor cells by RNA-sequencing analysis.

    Science.gov (United States)

    Zong, Shan; Deng, Shuyun; Chen, Kenian; Wu, Jia Qian

    2014-11-11

    Hematopoietic stem cells (HSCs) are used clinically for transplantation treatment to rebuild a patient's hematopoietic system in many diseases such as leukemia and lymphoma. Elucidating the mechanisms controlling HSC self-renewal and differentiation is important for the application of HSCs in research and clinical uses. However, it is not possible to obtain large quantities of HSCs due to their inability to proliferate in vitro. To overcome this hurdle, we used a mouse bone marrow derived cell line, the EML (Erythroid, Myeloid, and Lymphocytic) cell line, as a model system for this study. RNA-sequencing (RNA-Seq) has been increasingly used to replace microarrays for gene expression studies. We report here a detailed method of using RNA-Seq technology to investigate the potential key factors in the regulation of EML cell self-renewal and differentiation. The protocol provided in this paper is divided into three parts. The first part explains how to culture EML cells and separate Lin-CD34+ and Lin-CD34- cells. The second part of the protocol offers detailed procedures for total RNA preparation and the subsequent library construction for high-throughput sequencing. The last part describes the method for RNA-Seq data analysis and explains how to use the data to identify differentially expressed transcription factors between Lin-CD34+ and Lin-CD34- cells. The most significantly differentially expressed transcription factors were identified as the potential key regulators controlling EML cell self-renewal and differentiation. In the discussion section of this paper, we highlight the key steps for successful performance of this experiment. In summary, this paper offers a method of using RNA-Seq technology to identify potential regulators of self-renewal and differentiation in EML cells. The key factors identified can then be subjected to downstream functional analysis in vitro and in vivo.
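
As a rough illustration of the final analysis step, a first-pass screen for differential expression between the two sorted populations can be sketched as below. This is a simplified stand-in for a full RNA-Seq statistical pipeline (counts-per-million normalization and ranking by log2 fold change); the function names are illustrative, not from the paper:

```python
import numpy as np

def cpm(counts):
    """Counts-per-million library-size normalization."""
    return counts / counts.sum() * 1e6

def rank_by_fold_change(genes, counts_a, counts_b, pseudo=1.0):
    """Rank genes by |log2 fold change| of CPM between two populations.

    A crude first-pass screen; a proper analysis would use a count-based
    statistical test with replicates."""
    lfc = np.log2((cpm(counts_a) + pseudo) / (cpm(counts_b) + pseudo))
    order = np.argsort(-np.abs(lfc))
    return [(genes[i], float(lfc[i])) for i in order]
```

Genes with the largest absolute log2 fold change between the Lin-CD34+ and Lin-CD34- libraries would surface at the top of the ranking as candidate regulators.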

  15. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced to reveal their regional differences. More than 100,000 poor villages and 592 national key poor counties were chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of central China. Furthermore, the fifth census data are overlaid onto those poor areas in order to capture the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of ethnic minorities, and larger family sizes.

  16. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Full Text Available Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals function only as parking areas and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the amount of freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The process begins with the identification of the important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is then determined by applying overlay analysis using ArcGIS 9.2. Four grid maps are produced to represent accessibility, cost, time, and environment as the important location factors. The result shows the ranking from best to worst: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  17. Silicon Bipolar Distributed Oscillator Design and Analysis | Aku ...

    African Journals Online (AJOL)

    The design and analysis of a high-frequency silicon bipolar oscillator using a common emitter (CE) configuration with distributed output is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillation were reviewed. Transmission line design was carried out using Butterworth LC ...

  18. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  19. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD) for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly-used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
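
Once each harmonic order has been propagated, the THD at a bus follows directly from its harmonic voltage magnitudes. A minimal sketch of that final step (not the paper's code):

```python
import math

def total_harmonic_distortion(v1, harmonic_mags):
    """Voltage THD in percent: RMS of the harmonic components (orders >= 2)
    relative to the fundamental magnitude v1."""
    return 100.0 * math.sqrt(sum(v * v for v in harmonic_mags)) / v1
```

For example, a 100 V fundamental with 3 V and 4 V harmonic components gives a THD of 5 %.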

  20. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminthofauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminthofauna is provided. As a result of studies of 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species, and 9 nematode species.

  1. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
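
A compact sketch of the kriging prediction step may help fix ideas. This is a minimal ordinary-kriging implementation that assumes a known exponential covariance model rather than one fitted from a variogram, as a full spatial analysis would require:

```python
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, rng=10.0):
    """Ordinary-kriging prediction at `target` assuming an exponential
    covariance model C(h) = sill * exp(-h / rng).

    Solves the kriging system with a Lagrange multiplier enforcing the
    unbiasedness constraint (weights sum to 1)."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    C = sill * np.exp(-d / rng)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = C
    A[n, n] = 0.0                         # Lagrange-multiplier corner
    d0 = np.linalg.norm(coords - target, axis=-1)
    b = np.append(sill * np.exp(-d0 / rng), 1.0)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ values)
```

With no nugget effect, the predictor is an exact interpolator: evaluated at an observation location it returns the observed value.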

  2. Data synthesis and display programs for wave distribution function analysis

    Science.gov (United States)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  3. Latitude-Time Total Electron Content Anomalies as Precursors to Japan's Large Earthquakes Associated with Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Jyh-Woei Lin

    2011-01-01

    Full Text Available The goal of this study is to determine whether principal component analysis (PCA) can be used to process latitude-time ionospheric TEC data on a monthly basis to identify earthquake-associated TEC anomalies. PCA is applied to latitude-time (mean-of-a-month) ionospheric total electron content (TEC) records collected from the Japan GEONET network to detect TEC anomalies associated with 18 earthquakes in Japan (M≥6.0) from 2000 to 2005. According to the results, PCA was able to discriminate clear TEC anomalies in the months when all 18 earthquakes occurred. After reviewing months when no M≥6.0 earthquakes occurred but geomagnetic storm activity was present, it is possible that the maximal principal eigenvalues PCA returned for these 18 earthquakes indicate earthquake-associated TEC anomalies. Previously PCA has been used to discriminate earthquake-associated TEC anomalies recognized by other researchers, who found that a statistical association between large earthquakes and TEC anomalies could be established in the 5 days before earthquake nucleation; however, since PCA uses the characteristics of principal eigenvalues to determine earthquake-related TEC anomalies, it is possible to show that such anomalies existed earlier than this 5-day statistical window.
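
The core PCA computation on a latitude-time TEC matrix can be sketched as follows; a minimal illustration (not the study's code) in which the principal eigenvalues are obtained from the covariance of the mean-centered data:

```python
import numpy as np

def principal_eigenvalues(tec, n_components=3):
    """PCA of a time x latitude TEC matrix: eigenvalues of the covariance
    of the centered data, sorted in descending order.

    A first eigenvalue that dwarfs the rest flags one dominant coherent
    mode of variability in the month analyzed."""
    X = tec - tec.mean(axis=0, keepdims=True)   # center each latitude series
    cov = np.cov(X, rowvar=False)
    vals = np.linalg.eigvalsh(cov)[::-1]        # eigvalsh returns ascending
    return vals[:n_components]
```

For a matrix dominated by a single spatial-temporal mode, the first eigenvalue carries essentially all of the variance, which is the signature the study associates with anomalous months.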

  4. Identifying regions of strong scattering at the core-mantle boundary from analysis of PKKP precursor energy

    Science.gov (United States)

    Rost, S.; Earle, P.S.

    2010-01-01

    We detect seismic scattering from the core-mantle boundary related to the phase PKKP (PK. KP) in data from small aperture seismic arrays in India and Canada. The detection of these scattered waves in data from small aperture arrays is new and allows a better characterization of the fine-scale structure of the deep Earth especially in the southern hemisphere. Their slowness vector is determined from array processing allowing location of the heterogeneities at the core-mantle boundary using back-projection techniques through 1D Earth models. We identify strong scattering at the core-mantle boundary (CMB) beneath the Caribbean, Patagonia and the Antarctic Peninsula as well as beneath southern Africa. An analysis of the scattering regions relative to sources and receivers indicates that these regions represent areas of increased scattering likely due to increased heterogeneities close to the CMB. The 1. Hz array data used in this study is most sensitive to heterogeneity with scale lengths of about 10. km. Given the small size of the scatterers, a chemical origin of the heterogeneities is likely. By comparing the location of the fine-scale heterogeneity to geodynamical models and tomographic images, we identify different scattering mechanisms in regions related to subduction (Caribbean and Patagonia) and dense thermo chemical piles (Southern Africa). ?? 2010 Elsevier B.V.

  5. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, from birth to death, of a product or system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation; these data may take the form of numbers, graphics, symbols, texts, and curves. Quantitative reliability assessment is the task of reliability data analysis: it provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds alongside the various stages of the product life cycle and its reliability activities. Reliability data of Systems, Structures and Components (SSCs) in nuclear power plants is a key input to probabilistic safety assessment (PSA), reliability-centered maintenance, and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis, and industrial engineering, for example to represent manufacturing and delivery times; it is commonly used to model time to fail, time to repair, and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in nuclear power plants, and an example is given to present the result of the new method. The Weibull distribution has a strong ability to fit reliability data of mechanical equipment in nuclear power plants and is a widely used mathematical model for reliability analysis. The methods currently in common use are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it can reflect the reliability characteristics of the equipment and is closer to the actual situation. (author)
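
As an illustration of the kind of fitting the paper discusses, a two-parameter Weibull maximum-likelihood fit can be sketched with a bisection on the profile likelihood equation for the shape parameter. This is a minimal stand-in, not the paper's improved method (which a three-parameter fit would extend with a location parameter):

```python
import numpy as np

def fit_weibull(x, k_lo=0.01, k_hi=50.0, iters=200):
    """Two-parameter Weibull MLE: shape k and scale lam for failure times x.

    The profile likelihood equation
        g(k) = sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0
    is monotone increasing in k, so bisection converges to the MLE."""
    x = np.asarray(x, dtype=float)
    lx = np.log(x)

    def g(k):
        xk = x ** k
        return (xk * lx).sum() / xk.sum() - 1.0 / k - lx.mean()

    for _ in range(iters):
        mid = 0.5 * (k_lo + k_hi)
        if g(mid) > 0:
            k_hi = mid
        else:
            k_lo = mid
    k = 0.5 * (k_lo + k_hi)
    lam = ((x ** k).mean()) ** (1.0 / k)   # scale from the fitted shape
    return k, lam
```

Fitting samples drawn from a known Weibull distribution recovers the shape and scale parameters to within sampling error, which is the basic check one would run before trusting the fit on plant failure data.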

  6. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE) which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of the exponential kernel has enabled the transformation of the non-Markovian model into a Markovian model in an extended state space. For the study of First Passage Time (FPT) with the exponential delay kernel, the model has been transformed into a system of coupled Stochastic Differential Equations (SDEs) in two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure with or without damped oscillatory behavior depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay with a memory kernel having the form of a Gamma distribution. In contrast to the fast decay of damped oscillations of the ISI distribution for the model with the weak delay kernel, the decay of damped oscillations is found to be slower for the model with the strong delay kernel.
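
The weak-delay case described above can be sketched in simulation. With an exponential kernel, an auxiliary filtered variable u makes the system Markovian in two dimensions, and Euler-Maruyama integration yields ISI statistics such as the coefficient of variation; the equations and parameter values here are illustrative, not taken from the paper:

```python
import numpy as np

def simulate_isi_cv(mu=1.5, tau=1.0, eta=0.5, sigma=0.5,
                    v_th=1.0, dt=1e-3, t_max=500.0, seed=0):
    """Euler-Maruyama simulation of an integrate-and-fire neuron whose leak
    acts through an exponentially distributed delay (weak-delay case):

        du = (v - u)/eta dt              # u = exponentially filtered memory of v
        dv = (mu - u/tau) dt + sigma dW

    Spikes at v >= v_th reset v and u to 0; returns (CV of ISIs, spike count)."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(n)
    v = u = 0.0
    last_spike, isis = 0.0, []
    for i in range(n):
        u += (v - u) / eta * dt
        v += (mu - u / tau) * dt + noise[i]
        if v >= v_th:                    # threshold crossing: spike and reset
            t = (i + 1) * dt
            isis.append(t - last_spike)
            last_spike, v, u = t, 0.0, 0.0
    isis = np.asarray(isis)
    return float(isis.std() / isis.mean()), len(isis)
```

Sweeping the kernel time constant eta or the noise intensity sigma in this kind of simulation is how the dependence of the ISI coefficient of variation on the memory parameters would be mapped out.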

  7. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    Full Text Available The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner, with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical-user-interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the DG optimal size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained, which are impressive and computationally efficient, were evaluated against an existing similar package cited in the literature.

  8. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium and low tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which restricted the analysis to a cross-section OLS estimation. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. However, the role of the numerous exogenous factors considered appears quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint [it

  9. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  10. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane I to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in I according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R2 into homogeneous components. The Poincare summation process, which consists in building au

  11. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès

    2003-06-01

    We analyze the volume distribution of natural rockfalls on different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10^2-10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns that are drastically different on the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes to possibly control the rockfall volumes. This way neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value of rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
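
The exponent of a cumulative power-law volume distribution, N(>V) ~ V^-b, can be estimated by maximum likelihood. A minimal sketch (an Aki-style MLE, not the authors' code):

```python
import numpy as np

def powerlaw_exponent(volumes, v_min):
    """MLE of the exponent b in N(>V) ~ V^-b for events with V >= v_min,
    with its approximate standard error b / sqrt(n)."""
    v = np.asarray([x for x in volumes if x >= v_min], dtype=float)
    n = len(v)
    b = n / np.log(v / v_min).sum()
    return float(b), float(b / np.sqrt(n))
```

On synthetic samples drawn from a pure power law the estimator recovers the true exponent, which is the kind of validation one would do before comparing exponents such as the 0.5 and 1.2 values quoted above.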

  12. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  13. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  15. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets, and it is efficient as it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. The computational efficiency of the algorithm is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate system inequalities, which is useful in the reliability estimation of capacitated networks.
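
The idea behind minimal cutset generation can be illustrated on a toy network by brute force (not the paper's efficient algorithm, which avoids full enumeration): every edge subset whose removal disconnects source from sink, and which contains no smaller such subset, is a minimal cutset.

```python
from itertools import combinations

def connected(edges, nodes, s, t):
    """Depth-first reachability from s to t over undirected edges."""
    adj = {n: [] for n in nodes}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    seen, stack = {s}, [s]
    while stack:
        n = stack.pop()
        if n == t:
            return True
        for m in adj[n]:
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return False

def minimal_cutsets(edges, nodes, s, t):
    """Enumerate minimal s-t cutsets by brute force; small networks only."""
    cuts = []
    for r in range(1, len(edges) + 1):   # increasing size guarantees minimality
        for subset in combinations(edges, r):
            rest = [e for e in edges if e not in subset]
            if connected(rest, nodes, s, t):
                continue
            if any(set(c) <= set(subset) for c in cuts):
                continue                 # contains a smaller cutset: not minimal
            cuts.append(subset)
    return cuts
```

For a triangle network s-a-t with a direct s-t edge, the minimal cutsets are {(s,a), (s,t)} and {(a,t), (s,t)}: the direct edge must fail together with one edge of the indirect path.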

  16. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities make space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  17. The EM Earthquake Precursor

    Science.gov (United States)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion, due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  18. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    Science.gov (United States)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates, which is already a difficult task for 20-atom molecules. The VEDA program reads the input data automatically from Gaussian output files and then automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements in each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of VEDA, absent from other programs performing PED analysis.

  19. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect of modern power system planning, design, and operation. The ascendance of the smart grid concept has raised high hopes of developing an intelligent network capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address these reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on the reliability of power distribution networks. The test system used in this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system, the measures being the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of the installation of Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution program developed by General Reliability Co.
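
The indices named above have simple definitional formulas (per IEEE Std 1366). A minimal sketch of how SAIFI and SAIDI could be computed from outage records; the function and the sample numbers are illustrative, not taken from the thesis:

```python
def reliability_indices(outages, customers_served):
    """Compute SAIFI and SAIDI from a list of outage events.

    outages: list of (customers_interrupted, duration_hours) tuples.
    SAIFI = total customer interruptions / customers served
    SAIDI = total customer-hours of interruption / customers served
    """
    saifi = sum(n for n, _ in outages) / customers_served
    saidi = sum(n * h for n, h in outages) / customers_served
    return saifi, saidi

# Example: two outages on a hypothetical 1000-customer feeder
saifi, saidi = reliability_indices([(100, 2.0), (50, 1.0)], 1000)
# saifi = 0.15 interruptions/customer, saidi = 0.25 h/customer
```

Placing a switching device that halves the customers affected by an event shows up directly as a reduction in both indices, which is the kind of comparison the thesis performs.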

  20. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and with regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms: improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is in question. Their purpose is to maintain supply to an area of the distribution network in the event of a failure somewhere else. Two phrases, 'potential customer outages' and 'in the event of a failure', identify uncertainty

  1. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rainfall gauge on its own as input carries great uncertainties regarding runoff estimation, especially when the area is large and the rainfall is measured and recorded at irregular spaced gauging stations. Hence spatial interpolation is the key to obtain continuous and orderly rainfall distribution at unknown points to be the input to the rainfall runoff processes for distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damages of the affected area along the Kelantan river. Thus, a good knowledge on rainfall distribution is essential in early flood prediction studies. Forty six rainfall stations and their daily time-series were used to interpolate gridded rainfall surfaces using inverse-distance weighting (IDW), inverse-distance and elevation weighting (IDEW) methods and average rainfall distribution. Sensitivity analysis for distance and elevation parameters were conducted to see the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.
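
Inverse-distance weighting, the first interpolation method named above, is straightforward to sketch. The following is a generic illustration, not the authors' implementation, and the elevation term of IDEW is omitted:

```python
import numpy as np

def idw(stations, values, target, power=2.0, eps=1e-12):
    """Inverse-distance-weighted rainfall estimate at a target point.

    stations: (n, 2) array-like of gauge coordinates; values: (n,) rainfall.
    """
    stations = np.asarray(stations, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(stations - np.asarray(target, dtype=float), axis=1)
    if d.min() < eps:                  # target coincides with a gauge
        return float(values[d.argmin()])
    w = 1.0 / d**power                 # closer gauges get larger weights
    return float(np.sum(w * values) / np.sum(w))

# A point equidistant from two gauges receives their simple average
estimate = idw([(0.0, 0.0), (2.0, 0.0)], [10.0, 20.0], (1.0, 0.0))
```

The `power` parameter is what the sensitivity analysis in the paper varies: larger exponents localize the estimate around the nearest gauges.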

  2. Size distribution measurements and chemical analysis of aerosol components

    Energy Technology Data Exchange (ETDEWEB)

    Pakkanen, T.A.

    1995-12-31

    The principal aims of this work were to improve the existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing the size distributions of the various aerosol components measured. Sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis was found to result in low blank values and good recoveries for several elements in atmospheric fine particle size fractions below 2 µm of equivalent aerodynamic particle diameter (EAD). Furthermore, it turned out that a substantial amount of analyte associated with insoluble material could be recovered, since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated bimodal size distributions for most components measured. The existence of the fine particle mode suggests that a substantial fraction of those elements with bimodal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine particle modes and one or two coarse particle modes. Atmospheric relative humidity values higher than 80% resulted in a significant increase of the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions determined, and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse particle sea salt, but reactions and/or adsorption of nitric acid with soil-derived particles also occurred. Chloride was depleted when acidic species reacted

  3. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Economists have long been interested in measuring the distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  4. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is the simplicity of installation, configuring and operation of various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  5. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies indeed depend...... on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC...... technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  6. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape-ecological analysis of the distribution of the helminth fauna is provided. As a result of studies on 99 individuals of the striped lizard, 14 helminth species in total were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  7. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of obtaining real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots developed with a strong orientation toward monitoring IP telephony traffic. IP telephony servers can easily be exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, together with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm that classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side to analyse the gathered data.

  8. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  10. Laser damage in optical components: metrology, statistical and photo-induced analysis of precursor centres

    Energy Technology Data Exchange (ETDEWEB)

    Gallais, L

    2002-11-15

    This thesis deals with laser damage phenomena under nanosecond pulses in optical components such as glasses and dielectric and metallic thin films. First, work is done on laser damage metrology, in order to obtain accurate and reliable measurements of laser-induced damage probabilities with rigorous control of the test parameters. Then, using a specific model, we find densities of laser damage precursors in bulk glasses (a few tens per (100 µm)³) and on glass surfaces (one precursor per µm³). Our analysis is combined with morphology studies by atomic force microscopy to discuss the precursor nature and the damage process. The influence of wavelength (from 355 to 1064 nm) and of cumulated shots is also studied. Simulations are performed to study initiation mechanisms on these inclusions. This work gives an estimation of the complex index and size of the precursors, which permits a discussion of their possible detection by non-destructive tools. (author)

  11. Mass defect filtering-oriented classification and precursor ions list-triggered high-resolution mass spectrometry analysis for the discovery of indole alkaloids from Uncaria sinensis.

    Science.gov (United States)

    Pan, Huiqin; Yang, Wenzhi; Yao, Changliang; Shen, Yao; Zhang, Yibei; Shi, Xiaojian; Yao, Shuai; Wu, Wanying; Guo, Dean

    2017-09-22

    Discovery of new natural compounds is becoming increasingly challenging because of the interference from known and abundant components. The aim of this study is to report a dereplication strategy, integrating mass defect filtering (MDF)-oriented novelty classification and precursor ions list (PIL)-triggered high-resolution mass spectrometry analysis, and to validate it by discovering new indole alkaloids from the medicinal herb Uncaria sinensis. Rapid chromatographic separation was achieved on a Kinetex® EVO C18 column (<16 min). An in-house MDF algorithm, developed based on the informed phytochemistry information and molecular design, could more exactly screen the target alkaloids and divide them into three novelty levels: Known (KN), Unknown-but-Predicted (UP), and Unexpected (UN). A hybrid data acquisition method, namely PIL-triggered collision-induced dissociation-MS2 and high-energy C-trap dissociation-MS3 with dynamic exclusion on a linear ion trap/Orbitrap mass spectrometer, facilitated the acquisition of diverse product ions sufficient for the structural elucidation of both the indole alkaloids and their N-oxides. Ultimately, 158 potentially new alkaloids, including 10 UP and 108 UN, were rapidly characterized from the stem, leaf, and flower of U. sinensis. Two new alkaloid compounds thereof were successfully isolated and identified by 1D and 2D NMR analyses. The varied ring E and the novel alkaloid-acylquinic acid conjugates are reported for the first time from the whole Uncaria genus. In conclusion, this is a practical chemical dereplication strategy that can enhance efficiency and has the potential to become a routine approach for the discovery of new natural compounds.
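
Mass defect filtering rests on a simple idea: structurally related compounds cluster in a narrow band of fractional mass. A toy illustration of the principle; the template defect, window, and m/z values are hypothetical, not the paper's in-house algorithm:

```python
def mass_defect(mz):
    """Fractional part of a measured m/z relative to the nearest integer."""
    return mz - round(mz)

def mass_defect_filter(mz_values, template_defect, window):
    """Keep ions whose mass defect lies within +/- window of a template."""
    return [mz for mz in mz_values
            if abs(mass_defect(mz) - template_defect) <= window]

# Hypothetical survey-scan m/z values; 0.18 mimics an alkaloid-like defect
hits = mass_defect_filter([369.1814, 385.1763, 511.3005], 0.18, 0.02)
```

Ions passing the filter would then be placed on the precursor ions list to trigger the MS2/MS3 scans described above.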

  12. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular welded surfaces and the weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined by using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement: the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, helping to develop more accurate numerical tools for fatigue life prediction.

  13. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular welded surfaces and the weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined by using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement: the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, helping to develop more accurate numerical tools for fatigue life prediction.

  14. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  15. Componential distribution analysis of food using near infrared ray image

    Science.gov (United States)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of food related to "deliciousness" are usually evaluated by componential analysis, which determines the content and type of the components in the food. However, componential analysis cannot resolve the spatial distribution of a component, and the measurement is time-consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near-infrared (NIR) image. The advantage of our method is the ability to visualize otherwise invisible components. Many components in food have characteristic absorption and reflection of light in the IR range, so the component content is measured using subtraction between images taken at two near-IR wavelengths. In this paper, we describe a method to measure food components using near-IR image processing, and we show an application visualizing the saccharose in a pumpkin.
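
The two-wavelength subtraction described above can be sketched in a few lines; the array shapes and intensity values here are purely illustrative:

```python
import numpy as np

def component_map(absorption_img, reference_img):
    """Estimate a component's spatial distribution by subtracting the
    absorption-wavelength NIR image from a reference-wavelength image:
    where the component absorbs, the absorption image is darker, so the
    difference is large."""
    diff = reference_img.astype(float) - absorption_img.astype(float)
    return np.clip(diff, 0.0, None)   # negative noise clipped to zero

# Toy 1x3 'images': the middle pixel absorbs strongly at the component band
ref = np.array([[200.0, 200.0, 200.0]])
absorb = np.array([[190.0, 120.0, 195.0]])
cmap = component_map(absorb, ref)
```

In practice the two wavelengths are chosen so that the target component (here saccharose) absorbs at one and is nearly transparent at the other.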

  16. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation of our analytical PCA approach. Our approach is able to use more of the variance information in the original data than the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.
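
For reference, classical numeric PCA via SVD is the baseline that the symbolic-data approach extends; this sketch implements only that baseline, not the distribution-valued variant derived in the paper:

```python
import numpy as np

def pca(X, k):
    """Classical PCA: return the first k scores and principal directions."""
    Xc = X - X.mean(axis=0)                        # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]                   # scores, loadings

# Synthetic data concentrated along one direction in the plane
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 1)) @ np.array([[3.0, 1.0]]) \
    + 0.1 * rng.normal(size=(200, 2))
scores, directions = pca(X, 2)
```

The symbolic extension in the paper replaces each numeric observation with a normal distribution and propagates its variance into the covariance structure, which classical PCA cannot do.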

  17. Growing axons analysis by using Granulometric Size Distribution

    International Nuclear Information System (INIS)

    Gonzalez, Mariela A; Ballarin, Virginia L; Rapacioli, Melina; CelIn, A R; Sanchez, V; Flores, V

    2011-01-01

    Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analyses of growing neurites are usually difficult because their thinness and low contrast usually prevent clear observation of their shape, number, length and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a relevant parameter indicative of the axonal growth spatial orientation, namely the angle of deviation of the growing direction. The developed algorithms automatically quantify this orientation, facilitating the analysis of these images, which is important given the large number of images that need to be processed in this type of study.
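
A granulometric size distribution (pattern spectrum) can be sketched with successive morphological openings of increasing size; this is a generic illustration of the technique, not the authors' algorithm:

```python
import numpy as np
from scipy import ndimage

def granulometry(binary_img, max_size):
    """Area surviving openings of increasing size. Thin structures
    (e.g. axon-like lines) vanish at small sizes; thick blobs persist."""
    areas = [int(binary_img.sum())]
    structure = ndimage.generate_binary_structure(2, 1)  # 4-connected cross
    for r in range(1, max_size + 1):
        opened = ndimage.binary_opening(binary_img, structure=structure,
                                        iterations=r)
        areas.append(int(opened.sum()))
    return areas

# Toy image: a 1-pixel-wide 'axon' plus a 5x5 blob
img = np.zeros((20, 20), dtype=bool)
img[2, 2:14] = True        # thin line, area 12
img[8:13, 8:13] = True     # blob, area 25
spectrum = granulometry(img, 3)
```

The drop in surviving area between consecutive sizes tells how much image content is made of structures of that width, which is what makes the spectrum useful for detecting thin growing axons. Orientation can then be probed by repeating the openings with directional (line-shaped) structuring elements.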

  18. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

    The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on the Box scheme, McCormack's method and the diffusive scheme. Each HM formulation, together with its relative advantages and disadvantages, is reviewed. The analyzed WDS has pipes with different lengths, diameters and wave speeds, so the Courant number differs in each pipe according to the adopted discretization. The HM results are compared with those obtained by the Method of Characteristics (MOC). Regarding numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, making them advisable for the analysis of water hammer in water distribution systems.
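
Independent of the numerical scheme chosen, the Joukowsky relation gives a quick upper-bound check on the surge any scheme should reproduce for instantaneous valve closure. A minimal helper, offered as a sanity check rather than as part of the paper's methods:

```python
def joukowsky_head_rise(wave_speed, delta_velocity, g=9.81):
    """Instantaneous head rise dH = a * dV / g for sudden closure.

    wave_speed: pressure-wave speed a (m/s); delta_velocity: change in
    flow velocity dV (m/s); returns head rise in metres of water column.
    """
    return wave_speed * delta_velocity / g

# e.g. a = 1000 m/s and dV = 1 m/s give roughly a 102 m head rise
dh = joukowsky_head_rise(1000.0, 1.0)
```

A transient solver (MOC or any of the hybrid schemes) whose first pressure peak noticeably exceeds this bound for sudden closure likely has a discretization error.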

  19. Distributed analysis using GANGA on the EGEE/LCG infrastructure

    International Nuclear Information System (INIS)

    Elmsheuser, J; Brochu, F; Harrison, K; Egede, U; Gaidioz, B; Liko, D; Maier, A; Moscicki, J; Muraru, A; Lee, H-C; Romanovsky, V; Soroko, A; Tan, C L

    2008-01-01

    The distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to facilitate access to the resources is very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without too much expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment and the EGEE/LCG infrastructure. The integration of the ATLAS data management system DQ2 into GANGA is a key functionality. In combination with the job splitting mechanism, large numbers of jobs can be sent to the locations of the data, following the ATLAS computing model. GANGA supports tasks of user analysis with reconstructed data and small-scale production of Monte Carlo data.

  20. Statistical analysis of the spatial distribution of galaxies and clusters

    International Nuclear Information System (INIS)

    Cappi, Alberto

    1993-01-01

    This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, aiming to describe the framework of the formation of structures, tracing the history of the Universe from the Planck time, t_P = 10^-43 s, at a temperature corresponding to 10^19 GeV, to the present epoch. The most usual statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four delineates some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters with different statistical tools, like correlations, percolation, the void probability function and counts in cells; the same scaling-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters, too, belong to the fundamental plane of elliptical galaxies, and gives a discussion of its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and I present some observational work on nearby and distant galaxy clusters. In particular, I show the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme 'A galaxy redshift survey in the south galactic pole region', to which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr]

  1. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. This paper analyzes some of the statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the Exponential distribution is found to be the best fit for modeling the failure rate.
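
    As a rough illustration of the distribution-fitting comparison described above, the stdlib sketch below generates synthetic failure times and fits both an exponential and a two-parameter Weibull by maximum likelihood, then compares log-likelihoods. The synthetic data and the bisection solver are illustrative stand-ins, not the Relex 2009 procedure used in the paper.

```python
import math
import random

def fit_exponential(times):
    """MLE for the exponential distribution: mean time to failure = 1/rate."""
    mean = sum(times) / len(times)
    loglik = -len(times) * math.log(mean) - len(times)
    return mean, loglik

def fit_weibull(times):
    """MLE for the two-parameter Weibull via bisection on the shape k."""
    n = len(times)
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / n

    def g(k):
        # Profile-likelihood equation for the shape; increasing in k.
        num = sum(t ** k * lt for t, lt in zip(times, logs))
        den = sum(t ** k for t in times)
        return num / den - 1.0 / k - mean_log

    lo, hi = 0.01, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = (sum(t ** k for t in times) / n) ** (1.0 / k)
    loglik = (n * math.log(k) - n * k * math.log(lam)
              + (k - 1.0) * sum(logs)
              - sum((t / lam) ** k for t in times))
    return k, lam, loglik

# Synthetic failure times from a Weibull with shape 2 (wear-out behaviour).
rng = random.Random(42)
data = [3.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(500)]

k, lam, ll_weibull = fit_weibull(data)
_, ll_expon = fit_exponential(data)
```

    Because the exponential is the Weibull with shape 1, the fitted Weibull log-likelihood is always at least as high; a real study would penalize the extra parameter (e.g. with AIC) before declaring the better fit.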

  2. Material State Awareness for Composites Part II: Precursor Damage Analysis and Quantification of Degraded Material Properties Using Quantitative Ultrasonic Image Correlation (QUIC)

    Science.gov (United States)

    Patra, Subir; Banerjee, Sourav

    2017-01-01

    Material state awareness of composites using conventional Nondestructive Evaluation (NDE) methods is limited to finding the size and the locations of cracks and delaminations in a composite structure. To aid progressive failure models using the slow growth criteria, awareness of the precursor damage state and quantification of the degraded material properties are necessary, which is challenging with current NDE methods. To quantify the material state, a new offline NDE method is reported herein. The new method, named Quantitative Ultrasonic Image Correlation (QUIC), is devised, in which the concept of microcontinuum mechanics is hybridized with experimentally measured ultrasonic wave parameters. This unique combination results in a parameter called Nonlocal Damage Entropy for precursor awareness. High-frequency (more than 25 MHz) scanning acoustic microscopy is employed for the proposed QUIC. Eight woven carbon-fiber-reinforced-plastic composite specimens were tested under fatigue up to 70% of their remaining useful life. During the first 30% of the life, the proposed nonlocal damage entropy is plotted to demonstrate the degradation of the material properties via awareness of the precursor damage state. Visual proofs of the precursor damage states are provided with digital images obtained from micro-optical microscopy, scanning acoustic microscopy and scanning electron microscopy. PMID:29258256

  3. Material State Awareness for Composites Part II: Precursor Damage Analysis and Quantification of Degraded Material Properties Using Quantitative Ultrasonic Image Correlation (QUIC)

    Directory of Open Access Journals (Sweden)

    Subir Patra

    2017-12-01

    Material state awareness of composites using conventional Nondestructive Evaluation (NDE) methods is limited to finding the size and the locations of cracks and delaminations in a composite structure. To aid progressive failure models using the slow growth criteria, awareness of the precursor damage state and quantification of the degraded material properties are necessary, which is challenging with current NDE methods. To quantify the material state, a new offline NDE method is reported herein. The new method, named Quantitative Ultrasonic Image Correlation (QUIC), is devised, in which the concept of microcontinuum mechanics is hybridized with experimentally measured ultrasonic wave parameters. This unique combination results in a parameter called Nonlocal Damage Entropy for precursor awareness. High-frequency (more than 25 MHz) scanning acoustic microscopy is employed for the proposed QUIC. Eight woven carbon-fiber-reinforced-plastic composite specimens were tested under fatigue up to 70% of their remaining useful life. During the first 30% of the life, the proposed nonlocal damage entropy is plotted to demonstrate the degradation of the material properties via awareness of the precursor damage state. Visual proofs of the precursor damage states are provided with digital images obtained from micro-optical microscopy, scanning acoustic microscopy and scanning electron microscopy.

  4. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
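
    The TAG-mediated selection pattern described above can be sketched in a few lines: cuts are evaluated on lightweight metadata records first, and only the surviving event IDs are fetched from the full event store. The tag fields, cut values and in-memory stores below are hypothetical stand-ins for the POOL-backed TAG database and the AOD/ESD data.

```python
# Hypothetical in-memory stand-ins for TAG records and full event data.
tags = [
    {"event_id": i, "n_jets": i % 5, "missing_et": 10.0 * (i % 7)}
    for i in range(1000)
]
full_events = {i: {"payload": "..."} for i in range(1000)}  # stands in for AOD/ESD

def select_event_ids(tags, cut):
    """Skimming step: evaluate the cut on the lightweight TAGs only."""
    return [t["event_id"] for t in tags if cut(t)]

# Illustrative cut: at least 3 jets and missing ET above 30.
selected = select_event_ids(
    tags, lambda t: t["n_jets"] >= 3 and t["missing_et"] > 30.0
)
# Only the selected events are then fetched for the full analysis pass.
subset = [full_events[i] for i in selected]
```

    The payoff measured in the paper comes from the second line touching far fewer events than a direct scan over the AOD/ESD data would.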

  5. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    The huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R and D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.
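
    A minimal sketch of the master-worker pattern DIANE is built around, using Python threads in one process in place of the framework's distributed workers; the squared-sum "analysis" is a stand-in for dynamically loaded user code, and nothing here reflects DIANE's actual APIs.

```python
import queue
import threading

def worker(task_queue, results, lock):
    """Worker: pull task chunks until the queue is drained."""
    while True:
        try:
            chunk = task_queue.get_nowait()
        except queue.Empty:
            return
        partial = sum(x * x for x in chunk)  # stands in for user analysis code
        with lock:
            results.append(partial)

def run_master(data, n_workers=4, chunk_size=100):
    """Master: split the input, hand chunks to workers, merge partial results."""
    task_queue = queue.Queue()
    for i in range(0, len(data), chunk_size):
        task_queue.put(data[i:i + chunk_size])
    results, lock = [], threading.Lock()
    threads = [threading.Thread(target=worker, args=(task_queue, results, lock))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(results)

total = run_master(list(range(1000)))
```

    The merge step works because the per-chunk results are associative; the same shape carries over when the workers are remote processes fed over a network.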

  6. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Studying the content of images is an important topic through which reasonable and accurate analysis of images can be achieved. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life; this crowding of the media with images highlights the research area of image analysis. In this paper, the implemented system proceeds through several steps to compute the statistical measures of standard deviation and mean values of both color and grey images, and the last step compares the results obtained in the different cases of the test phase. The statistical parameters are used to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean values with the implemented system. The major issue addressed in this work is brightness distribution, examined via statistical measures under different types of lighting.
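
    The core measures the paper relies on, the mean and standard deviation of pixel intensities, can be sketched as follows; the tiny synthetic grayscale images are illustrative only.

```python
import math

def brightness_stats(image):
    """Mean and standard deviation of pixel intensities for a grayscale
    image given as a list of rows."""
    pixels = [p for row in image for p in row]
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return mean, math.sqrt(var)

# Two small synthetic 4x4 grayscale images (0 = black, 255 = white):
# the same texture under dark and bright lighting.
dark = [[10, 20, 30, 40]] * 4
bright = [[210, 220, 230, 240]] * 4

mean_d, std_d = brightness_stats(dark)
mean_b, std_b = brightness_stats(bright)
```

    The means separate the two lighting conditions while the standard deviations coincide, which is the kind of behaviour that makes the pair useful for characterizing brightness distribution independently of contrast.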

  7. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane) currently held by CERN IT/API group is to create a generic, component-based framework for distributed, parallel data processing in master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows to easily replace them with modul...

  8. Aeroelastic Analysis of a Distributed Electric Propulsion Wing

    Science.gov (United States)

    Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer

    2017-01-01

    An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (Registered Trademark) doublet-lattice aerodynamics are compared to those based on FUN3D Reynolds-Averaged Navier-Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions, which were found to be well converged. It was found that no oscillatory instability existed, only divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.

  9. Lessons learned on probabilistic methodology for precursor analyses

    International Nuclear Information System (INIS)

    Babst, Siegfried; Wielenberg, Andreas; Gaenssmantel, Gerhard

    2016-01-01

    Based on its experience in precursor assessment of operating experience from German NPP and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  10. Lessons learned on probabilistic methodology for precursor analyses

    Energy Technology Data Exchange (ETDEWEB)

    Babst, Siegfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Berlin (Germany); Wielenberg, Andreas; Gaenssmantel, Gerhard [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    Based on its experience in precursor assessment of operating experience from German NPP and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  11. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. 
The antennae have mobility and observations were noted for

  12. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.
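
    A toy simulation of the dynamic-FRET idea behind generalized PDA: a molecule interconverting between two FRET states is time-averaged over each observation bin, so interconversion much faster than the bin time collapses the histogram toward a single averaged peak, while slow interconversion leaves two peaks. All efficiencies, rates and bin times below are invented for illustration and are not the paper's experimental values.

```python
import random

def simulate_fret_bins(e1, e2, k12, k21, bin_time, n_bins, dt=1e-5, seed=1):
    """Time-average the FRET efficiency of a two-state molecule per bin.

    e1, e2: FRET efficiencies of the two states; k12, k21: switching
    rates (1/s); bin_time: observation bin length (s).
    """
    rng = random.Random(seed)
    state, means = 0, []
    steps = round(bin_time / dt)
    for _ in range(n_bins):
        acc = 0.0
        for _ in range(steps):
            acc += e1 if state == 0 else e2
            rate = k12 if state == 0 else k21
            if rng.random() < rate * dt:  # stochastic state switch
                state = 1 - state
        means.append(acc / steps)
    return means

# Switching much faster than the bin time: one averaged peak near 0.5.
fast = simulate_fret_bins(0.2, 0.8, k12=5e4, k21=5e4, bin_time=1e-2, n_bins=50)
# Switching much slower than the bin time: bins stay near 0.2 or 0.8.
slow = simulate_fret_bins(0.2, 0.8, k12=1.0, k21=1.0, bin_time=1e-2, n_bins=50)
```

    Fitting the simulated histogram shape against data is what lets PDA recover the interconversion timescale.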

  13. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
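
    The pair correlations a PDF measurement probes can be illustrated on a toy cluster: the sketch below histograms all interatomic distances in a small simple-cubic cluster, a deliberately simple stand-in for a decahedral particle, for which one would evaluate the Debye scattering equation over the model coordinates as described above.

```python
import math
from collections import Counter

def pair_distances(coords):
    """All interatomic distances in a finite cluster (no periodicity)."""
    dists = []
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            dists.append(math.dist(coords[i], coords[j]))
    return dists

# A 3x3x3 simple-cubic cluster with unit spacing as a toy "nanoparticle".
coords = [(x, y, z) for x in range(3) for y in range(3) for z in range(3)]
dists = pair_distances(coords)

# Histogram of rounded distances: the discrete peaks a PDF would show,
# before instrumental broadening.
hist = Counter(round(d, 3) for d in dists)
```

    The first peak sits at the nearest-neighbour spacing and its height counts bonds, which is exactly the information used to discriminate the competing decahedron models.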

  14. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  15. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Van der Ster, Daniel

    2012-01-01

    The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users have submitted jobs in the year 2011 and a total of more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper will review the state of the DA tools and services, summarize the past year of distributed analysis activity, and present the directions for future improvements to the system.

  16. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  17. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Background: The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results: To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions: Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating the propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.

  18. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. 
Appendix A provides a technical report on the results of

  19. Analysis of mechanical properties of N2 in situ doped polycrystalline 3C-SiC thin films by chemical vapor deposition using single-precursor hexamethyldisilane

    International Nuclear Information System (INIS)

    Kim, Kang-San; Han, Ki-Bong; Chung, Gwiy-Sang

    2010-01-01

    This paper describes the mechanical properties of polycrystalline (poly) 3C-SiC thin films with N2 in situ doping. In this work, in situ doped poly 3C-SiC film was deposited by the atmospheric pressure chemical vapor deposition (APCVD) method at 1200 °C using the single precursor hexamethyldisilane, Si2(CH3)6 (HMDS), as the Si and C source, and 0∼100 sccm N2 as the dopant source gas. The mechanical properties of the doped poly 3C-SiC thin films were measured by nano-indentation. Young's modulus and hardness were measured to be 285 and 35 GPa, respectively, at 0 sccm N2, and both decreased with increasing N2 flow rate. Surface morphology was evaluated by atomic force microscopy (AFM) as a function of N2 flow rate.

  20. Molecular cloning and expression analysis of cDNAs encoding androgenic gland hormone precursors from two Porcellionidae species, Porcellio scaber and P. dilatatus

    OpenAIRE

    Ohira, Tsuyoshi; Hasegawa, Yuriko; Okuno, Atsuro; Nagasawa, Hiromichi

    2003-01-01

    Male sexual characteristics in Crustacea are induced by androgenic gland hormone (AGH), which is produced by the male-specific androgenic gland. Recently, AGH in the terrestrial isopod Armadillidium vulgare was characterized and its cDNA cloned, the first example in which the structure of AGH was elucidated. We report here the molecular cloning of cDNAs encoding AGH precursors from two additional terrestrial isopods, Porcellio scaber and P. dilatatus. cDNA fragments encoding Porcellio scaber ...

  1. Analysis of acidic properties of distribution transformer oil insulation ...

    African Journals Online (AJOL)

    This paper examined the acidic properties of distribution transformer oil insulation in service at the Jericho distribution network, Ibadan, Nigeria. Five oil samples each from six distribution transformers (DT1, DT2, DT3, DT4 and DT5), making a total of thirty samples, were taken from different installed distribution transformers all ...

  2. Reliability analysis of water distribution systems under uncertainty

    International Nuclear Information System (INIS)

    Kansal, M.L.; Kumar, Arun; Sharma, P.B.

    1995-01-01

    In most developing countries, Water Distribution Networks (WDN) are of the intermittent type because of the shortage of safe drinking water. Failure of a pipeline(s) in such cases will cause not only a fall in one or more nodal heads but also poor connectivity of the source with the various demand nodes of the system. Most previous works have used a two-step algorithm based on a pathset or cutset approach for connectivity analysis. The computations become more cumbersome when the connectivity of all demand nodes, taken together with that of the supply, is evaluated. In the present paper, network connectivity based on the concept of the Appended Spanning Tree (AST) is suggested to compute global network connectivity, which is defined as the probability of the source node being connected with all the demand nodes simultaneously. The concept of the AST has distinct advantages, as it attacks the problem directly rather than indirectly as most studies so far have done. Since a water distribution system is repairable, a general expression for pipeline availability using the failure/repair rate is considered. Furthermore, the sensitivity of the global reliability estimates to likely errors in the estimation of the failure/repair rates of the various pipelines is also studied.
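
    The global network connectivity defined above, the probability that the source is connected to all demand nodes simultaneously, can also be estimated by plain Monte Carlo as a cross-check on analytical approaches such as the AST; the toy looped network and pipe availabilities below are invented for illustration.

```python
import random

def global_connectivity(nodes, pipes, source, demand_nodes,
                        n_trials=20000, seed=7):
    """Monte Carlo estimate of P(source connected to ALL demand nodes).

    pipes: list of (node_a, node_b, availability).
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        # Sample which pipes are working in this trial.
        up = [(a, b) for a, b, p in pipes if rng.random() < p]
        adj = {n: [] for n in nodes}
        for a, b in up:
            adj[a].append(b)
            adj[b].append(a)
        # Depth-first search from the source over working pipes.
        seen, stack = {source}, [source]
        while stack:
            n = stack.pop()
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        if all(d in seen for d in demand_nodes):
            hits += 1
    return hits / n_trials

# Toy looped network: source S, demand nodes A and B, with a redundant loop.
nodes = ["S", "A", "B"]
pipes = [("S", "A", 0.95), ("S", "B", 0.90), ("A", "B", 0.85)]
rel = global_connectivity(nodes, pipes, "S", ["A", "B"])
```

    For this triangle the exact answer is the probability that at least two of the three pipes are up, about 0.974, so the estimate also serves as a sanity check on the sampler.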

  3. Harmonic Analysis of Electric Vehicle Loadings on Distribution System

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Yijun A [University of Southern California, Department of Electrical Engineering; Xu, Yunshan [University of Southern California, Department of Electrical Engineering; Chen, Zimin [University of Southern California, Department of Electrical Engineering; Peng, Fei [University of Southern California, Department of Electrical Engineering; Beshir, Mohammed [University of Southern California, Department of Electrical Engineering

    2014-12-01

    With the increasing number of Electric Vehicles (EVs), the power system is facing huge challenges from the high penetration rates of EV charging stations. Therefore, a technical study of the impact of EV charging on the distribution system is required. This paper uses the PSCAD software and aims to analyze the Total Harmonic Distortion (THD) brought by Electric Vehicle charging stations in power systems. The paper starts by choosing the IEEE 34-node test feeder as the distribution system, building an electric vehicle level-two charging battery model, and setting up four different testing scenarios: overhead transmission line versus underground cable, industrial area, transformer, and photovoltaic (PV) system. Statistical methods are then used to analyze the characteristics of the THD in the plug-in transient, plug-out transient and steady-state charging conditions associated with these four scenarios. Finally, the factors influencing the THD in the different scenarios are identified, leading to constructive suggestions for both Electric Vehicle charging station construction and customers' charging habits.
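
    The THD figure the paper analyzes can be computed from one period of a sampled waveform with a naive DFT, as sketched below; the harmonic mix is invented (a 5% third and 3% fifth harmonic, roughly the kind of odd-harmonic content a rectifier-fed charger injects), and PSCAD is not involved.

```python
import math

def thd(samples, n_harmonics=10):
    """Total harmonic distortion of one full period of a sampled waveform,
    via a naive DFT evaluated at the fundamental and its harmonics."""
    n = len(samples)

    def magnitude(k):
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        return 2.0 * math.hypot(re, im) / n

    fundamental = magnitude(1)
    harmonics = [magnitude(k) for k in range(2, n_harmonics + 2)]
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# One period of a distorted current wave with 5% third and 3% fifth harmonic.
n = 256
wave = [math.sin(2 * math.pi * i / n)
        + 0.05 * math.sin(2 * math.pi * 3 * i / n)
        + 0.03 * math.sin(2 * math.pi * 5 * i / n)
        for i in range(n)]
distortion = thd(wave)
```

    Sampling an exact integer number of periods keeps each harmonic in its own DFT bin, so the result matches the analytic value sqrt(0.05^2 + 0.03^2) without windowing.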

  4. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed by the volume of information streams generated every day, and comprehensive tools that can analyze these streams in real time are lacking. Document clustering plays an important role in improving the accuracy of information retrieval, but most clustering technologies can only be applied to static document collections because they normally require large amounts of computational resources and a long time to obtain accurate results; it is very difficult to cluster a dynamically changing text information stream on an individual computer. Our earlier research resulted in a dynamic reactive flock clustering algorithm that can continually refine the clustering result and quickly react to changes in document content. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  5. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by requiring only two common measurements of finished cells. The presented approaches can be especially helpful for ramping up production, but can also be applied to enhance established manufacturing.
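The Monte Carlo step of such an approach can be illustrated in a few lines: draw process parameters from their spread distributions and push each sample through a (meta)model of cell efficiency to obtain the efficiency distribution. The quadratic response surface and parameter names below are invented for illustration and have nothing to do with the authors' calibrated PERC metamodel.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples

# Toy metamodel (assumption: a made-up response surface): efficiency
# in % as a function of two normalized process-spread parameters.
r = rng.normal(0.0, 1.0, n)    # e.g. emitter sheet-resistance spread
tau = rng.normal(0.0, 1.0, n)  # e.g. bulk-lifetime spread
eta = 18.0 + 0.3 * tau - 0.1 * r - 0.05 * tau**2

# Resulting efficiency distribution, summarized as mean ± std.
print(eta.mean().round(2), eta.std().round(2))
```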

  6. Primitive Path Analysis and Stress Distribution in Highly Strained Macromolecules.

    Science.gov (United States)

    Hsu, Hsiao-Ping; Kremer, Kurt

    2018-01-16

    Polymer material properties are strongly affected by entanglement effects. For long polymer chains and composite materials, they are expected to be at the origin of many technically important phenomena, such as shear thinning or the Mullins effect, which microscopically can be related to topological constraints between chains. Starting from fully equilibrated highly entangled polymer melts, we investigate the effect of isochoric elongation on the entanglement structure and force distribution of such systems. Theoretically, the related viscoelastic response usually is discussed in terms of the tube model. We relate stress relaxation in the linear and nonlinear viscoelastic regimes to a primitive path analysis (PPA) and show that tension forces both along the original paths and along primitive paths, that is, the backbone of the tube, in the stretching direction correspond to each other. Unlike homogeneous relaxation along the chain contour, the PPA reveals a so far not observed long-lived clustering of topological constraints along the chains in the deformed state.

  7. Report on Fukushima Daiichi NPP precursor events

    International Nuclear Information System (INIS)

    2014-01-01

    The main questions to be answered by this report were: Could the Fukushima Daiichi NPP accident have been prevented? And can the next severe accident, if there is one, be prevented? To answer the first question, the report addressed several aspects. First, the report investigated whether precursors to the Fukushima Daiichi NPP accident existed in the operating experience; second, the reasons why these precursors did not evolve into a severe accident; third, whether lessons learned from these precursor events were adequately considered by member countries; and finally, whether the operating experience feedback system needs to be improved, based on the previous analysis. To address the second question, which is much more challenging, the report considered precursor events identified through a search and analysis of the IRS database and also precursor events based on risk significance. Both methods can point out areas where further work may be needed, even if this depends heavily on design and site-specific factors. From the operating experience side, more efforts are needed to ensure timely and full implementation of lessons learnt from precursor events. Concerning risk considerations, a combined use of risk precursors and operating experience may drive effective changes to plants to reduce risk. The report also contains a short description and evaluation of selected precursors that are related to the course of the Fukushima Daiichi NPP accident. The report addresses the question whether operating experience feedback can be effectively used to identify plant vulnerabilities and minimize the potential for severe core damage accidents. Based on several of the precursor events, national or international in-depth evaluations were started. The vulnerability of NPPs due to external and internal flooding has clearly been addressed. In addition to the IRS-based investigation, the WGRISK was asked to identify important precursor events based on risk significance. These precursors have

  8. Distributional patterns of Cecropia (Cecropiaceae): a panbiogeographic analysis

    Directory of Open Access Journals (Sweden)

    Franco Rosselli Pilar

    1997-06-01

    Full Text Available A panbiogeographic analysis of the distributional patterns of 60 species of Cecropia was carried out. Based on the distributional ranges of 36 species, we found eight generalized tracks for Cecropia species, whereas the distributional patterns of the other 24 species were uninformative for the analysis. The major concentration of species of Cecropia is in the Neotropical Andean region, where there are three generalized tracks and two nodes. The northern Andes in Colombia and Ecuador are richer than the Central Andes in Perú; they contain two generalized tracks, one to the west and another to the east, each formed by the individual tracks of eight species. There are four generalized tracks outside the Andean region: two in the Amazonian region, in Guayana-Pará and in Manaus, one in Roraima, one in Serra do Mar in the Atlantic forest of Brazil and one in Central America. Speciation in Cecropia may be related to the first uplift of the Andes. Con base en la distribución de 60 especies del género Cecropia, se hizo un análisis panbiogeográfico. Se construyeron 8 trazos generalizados con base en el patrón de distribución de 36 especies; la distribución de las demás especies no aportaba información para la definición de los trazos. La región andina tiene la mayor concentración de especies de Cecropia representada por la presencia de tres trazos generalizados y dos nodos; los dos trazos con mayor número de especies se localizan en su parte norte, en Colombia y Ecuador y el otro en los Andes centrales en Perú. Se encontraron además cuatro trazos extrandinos: dos en la región amazónica, en Pará-Guayana y en Manaus, uno en Roraima, uno en Serra do Mar en la Selva Atlántica del Brasil y uno en Centro América. La especiación en Cecropia parece estar relacionada con el primer levantamiento de los Andes.

  9. Dictionaries and distributions: Combining expert knowledge and large scale textual data content analysis: Distributed dictionary representation.

    Science.gov (United States)

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2018-02-01

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
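The core DDR operation is representing both a concept dictionary and a span of text as the average of their word vectors and then comparing the two by cosine similarity. A minimal sketch, using made-up 3-dimensional vectors in place of real trained embeddings such as word2vec or GloVe:

```python
import numpy as np

# Toy 3-d word vectors standing in for real embeddings; the values
# here are invented for illustration only.
emb = {
    "joy":   np.array([0.9, 0.1, 0.0]),
    "happy": np.array([0.8, 0.2, 0.0]),
    "party": np.array([0.6, 0.3, 0.1]),
    "tax":   np.array([0.0, -0.5, 0.8]),
}

def represent(words):
    """DDR-style representation: the mean vector of the known words."""
    return np.mean([emb[w] for w in words if w in emb], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

positivity = represent(["joy", "happy"])  # concept-dictionary vector
# "party" sits closer to the positivity dictionary than "tax" does.
print(cosine(positivity, represent(["party"])) >
      cosine(positivity, represent(["tax"])))  # True
```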

  10. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2013-05-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with some licensed primary users under an interference temperature constraint. We assume that the DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit error rate performance metrics. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an analysis for a random vector quantization design algorithm. Specifically, the approximate statistics functions of the squared inner product between the optimal and quantized vectors are derived. With these statistics, we analyze the outage performance. Furthermore, the effects of channel estimation error and number of primary users on the system performance are investigated. Finally, optimal power adaptation and cochannel interference are considered and analyzed. Numerical and simulation results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  11. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Full Text Available Modern power systems consist of many interconnected synchronous generators having different inertia constants, connected through a large transmission network, with an ever-increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications are stored in different formats in a heterogeneous environment, and the applications themselves have been developed and deployed on different platforms and in different language paradigms, so interoperability between power system applications becomes a major issue. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.

  12. Location and Size Planning of Distributed Photovoltaic Generation in Distribution Network Systems Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method, based on a data-driven K-means clustering algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, a Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and the solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
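Of the pieces in this pipeline, TOPSIS is the most self-contained: alternatives are scored by their relative closeness to an ideal (best on every criterion) point versus an anti-ideal point. A sketch with an invented 3-plan, 3-criterion decision matrix; the weights and scores are illustrative, not taken from the paper.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.

    scores:  (n_alternatives, n_criteria) decision matrix
    benefit: True for criteria to maximize, False for criteria to minimize
    """
    norm = scores / np.linalg.norm(scores, axis=0)  # vector normalization
    v = norm * weights                              # weighted matrix
    best = np.where(benefit, v.max(axis=0), v.min(axis=0))   # ideal
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))  # anti-ideal
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)  # higher = better

# Three hypothetical PV siting plans scored on (profit, losses, voltage
# offset); only profit is a benefit criterion.
scores = np.array([[5.0, 2.0, 0.3],
                   [4.0, 1.0, 0.1],
                   [3.0, 3.0, 0.5]])
closeness = topsis(scores, np.array([0.5, 0.3, 0.2]),
                   np.array([True, False, False]))
print(closeness.argmax())  # index of the top-ranked plan
```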

  13. Biochemical Removal of HAP Precursors from Coal

    Energy Technology Data Exchange (ETDEWEB)

    Olson, Gregory J

    1997-05-12

    Column biooxidation tests with Kentucky coal confirmed results of earlier shake flask tests showing significant removal from the coal of arsenic, selenium, cobalt, manganese, nickel and cadmium. Rates of pyrite biooxidation in Kentucky coal were only slightly more than half the rates found previously for Indiana and Pittsburgh coals. Removal of pyrite from Pittsburgh coal by ferric ion oxidation slows markedly as ferrous ions accumulate in solution, requiring maintenance of high redox potentials in processes designed for removal of pyrite and hazardous air pollutant (HAP) precursors by circulation of ferric solutions through coal. The pyrite oxidation rates obtained in these tests were used by Unifield Engineering to support the conceptual designs for alternative pyrite and HAP precursor bioleaching processes for the phase 2 pilot plant. Thermophilic microorganisms were tested to determine if mercury could be mobilized from coal under elevated growth temperatures. There was no evidence for mercury removal from coal under these conditions. However, the activity of the organisms may have liberated mercury physically. It is also possible that the organisms dissolved mercury and it readsorbed to the clay preferentially. Both of these possibilities are undergoing further testing. The Idaho National Engineering and Environmental Laboratory's (INEEL) slurry column reactor was operated and several batches of feed coal, product coal, waste solids and leach solutions were submitted to LBL for HAP precursor analysis. Results to date indicate significant removal of mercury, arsenic and other HAP precursors in the combined physical-biological process.

  14. Analysis of distribution systems with a high penetration of distributed generation

    DEFF Research Database (Denmark)

    Lund, Torsten

    Since the mid eighties, a large number of wind turbines and distributed combined heat and power plants (CHPs) have been connected to the Danish power system. Especially in the Western part, comprising Jutland and Funen, the penetration is high compared to the load demand; in some periods the wind power alone can cover the entire load demand. The objective of the work is to investigate the influence of wind power and distributed combined heat and power production on the operation of the distribution systems. Where other projects have focused on the modeling and control of the generators and prime movers, the focus of this project is on the operation of an entire distribution system with several wind farms and CHPs. Firstly, the subject of allocation of power system losses in a distribution system with distributed generation is treated. A new approach to loss allocation based on current injections...

  15. Finite element analysis of thermal stress distribution in different ...

    African Journals Online (AJOL)

    Nigerian Journal of Clinical Practice. Journal Home ... Von Mises and thermal stress distributions were evaluated. Results: In all ... distribution. Key words: Amalgam, finite element method, glass ionomer cement, resin composite, thermal stress ...

  16. Stability analysis of distributed order fractional Chen system.

    Science.gov (United States)

    Aminikhah, H; Refahi Sheikhani, A; Rezazadeh, H

    2013-01-01

    We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results.

  17. Stability Analysis of Distributed Order Fractional Chen System

    Science.gov (United States)

    Aminikhah, H.; Refahi Sheikhani, A.; Rezazadeh, H.

    2013-01-01

    We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results. PMID:24489508

  18. Factory Gate Pricing: An Analysis of the Dutch Retail Distribution

    NARCIS (Netherlands)

    H.M. le Blanc; F. Cruijssen (Frans); H.A. Fleuren; M.B.M. de Koster (René)

    2004-01-01

    textabstractFactory Gate Pricing (FGP) is a relatively new phenomenon in retail distribution. Under FGP, products are no longer delivered at the retailer distribution center, but collected by the retailer at the factory gates of the suppliers. Owing to both the asymmetry in the distribution networks

  19. Factory Gate Pricing : An Analysis of the Dutch Retail Distribution

    NARCIS (Netherlands)

    Le Blanc, H.M.; Cruijssen, F.C.A.M.; Fleuren, H.A.; de Koster, M.B.M.

    2004-01-01

    Factory Gate Pricing (FGP) is a relatively new phenomenon in retail distribution.Under FGP, products are no longer delivered at the retailer distribution center, but collected by the retailer at the factory gates of the suppliers.Owing to both the asymmetry in the distribution networks (the supplier

  20. Overproduction, purification, crystallization and preliminary X-ray analysis of human Fe65-PTB2 in complex with the amyloid precursor protein intracellular domain

    Energy Technology Data Exchange (ETDEWEB)

    Radzimanowski, Jens [Heidelberg University Biochemistry Center, INF328, D-69120 Heidelberg (Germany); Beyreuther, Konrad [Center for Molecular Biology, University Heidelberg, INF282, D-69120 Heidelberg (Germany); Sinning, Irmgard; Wild, Klemens, E-mail: klemens.wild@bzh.uni-heidelberg.de [Heidelberg University Biochemistry Center, INF328, D-69120 Heidelberg (Germany)

    2008-05-01

    Alzheimer’s disease is characterized by proteolytic processing of the amyloid precursor protein (APP), which releases the aggregation-prone amyloid-β (Aβ) peptide and liberates the intracellular domain (AICD) that interacts with various adaptor proteins. The crystallized AICD–Fe65-PTB2 complex is of central importance for APP translocation, nuclear signalling, processing and Aβ generation. Alzheimer’s disease is associated with typical brain deposits (senile plaques) that mainly contain the neurotoxic amyloid β peptide. This peptide results from proteolytic processing of the type I transmembrane protein amyloid precursor protein (APP). During this proteolytic pathway the APP intracellular domain (AICD) is released into the cytosol, where it associates with various adaptor proteins. The interaction of the AICD with the C-terminal phosphotyrosine-binding domain of Fe65 (Fe65-PTB2) regulates APP translocation, signalling and processing. Human AICD and Fe65-PTB2 have been cloned, overproduced and purified in large amounts in Escherichia coli. A complex of Fe65-PTB2 with the C-terminal 32 amino acids of the AICD gave well diffracting hexagonal crystals and data have been collected to 2.1 Å resolution. Initial phases obtained by the molecular-replacement method are of good quality and revealed well defined electron density for the substrate peptide.

  1. Formation and transformation of a short range ordered iron carbonate precursor

    DEFF Research Database (Denmark)

    Dideriksen, Knud; Frandsen, Cathrine; Bovet, Nicolas

    2015-01-01

    (II) with varying pH produced broad peaks in X-ray diffraction and contained dominantly Fe and CO3 when probed with X-ray photoelectron spectroscopy. Reduced pair distribution function (PDF) analysis shows only peaks corresponding to interatomic distances below 15 Å, reflecting a material with no long-range structural order. Moreover, PDF peak positions differ from those for known iron carbonates and hydroxides. Mössbauer spectra also deviate from those expected for known iron carbonates and suggest a less crystalline structure. These data show that a previously unidentified iron carbonate precursor phase formed. Its coherent scattering domains determined from PDF analysis are slightly larger than for amorphous calcium carbonate, suggesting that the precursor could be nanocrystalline. Replica exchange molecular dynamics simulations of Fe-carbonate polynuclear complexes yield PDF peak positions that agree...

  2. Metabolic Precursors to Amphetamine and Methamphetamine.

    Science.gov (United States)

    Cody, J D

    1993-12-01

    Analysis and interpretation of amphetamine results is a challenging process made difficult by a number of factors. One of the complications comes from determination of the origin of amphetamine or methamphetamine in a sample. Given the relatively rare occasions that either of these two drugs are prescribed, legal prescription of one of these drugs is seldom a reason for positive findings. A number of other precursor compounds are metabolized by the body to amphetamine or methamphetamine, many of which could be used for legitimate reasons. Fourteen different metabolic precursors of amphetamine or methamphetamine are included in this review. They are amphetaminil, benzphetamine, clobenzorex, deprenyl, dimethylamphetamine, ethylamphetamine, famprofazone, fencamine, fenethylline, fenproporex, furfenorex, mefenorex, mesocarb, and prenylamine. Medical use, metabolism, analysis, and interpretation are described to afford sufficient information to evaluate the possible involvement of these drugs in positive amphetamine or methamphetamine results. Copyright © 1993 Central Police University.

  3. Dynamic models for transient stability analysis of transmission and distribution systems with distributed generation : an overview

    NARCIS (Netherlands)

    Boemer, J.C.; Gibescu, M.; Kling, W.L.

    2009-01-01

    Distributed Generation is increasing in nowadays power systems. Small scale systems such as photovoltaic, biomass or small cogeneration plants are connected to the distribution level, while large wind farms will be connected to the transmission level. Both trends lead to a replacement of large

  4. Preparation of superconductor precursor powders

    Science.gov (United States)

    Bhattacharya, Raghunath

    1998-01-01

    A process for the preparation of a precursor metallic powder composition for use in the subsequent formation of a superconductor. The process comprises the steps of providing an electrodeposition bath comprising an electrolyte medium and a cathode substrate electrode, and providing to the bath one or more soluble salts of one or more respective metals which are capable of exhibiting superconductor properties upon subsequent appropriate treatment. The bath is continually energized to cause the metallic and/or reduced particles formed at the electrode to drop as a powder from the electrode into the bath, and this powder, which is a precursor powder for superconductor production, is recovered from the bath for subsequent treatment. The process permits direct inclusion of all metals in the preparation of the precursor powder, and yields an amorphous product mixed on an atomic scale to thereby impart inherent high reactivity. Superconductors which can be formed from the precursor powder include pellet and powder-in-tube products.

  5. Toward a theory of precursors

    International Nuclear Information System (INIS)

    Freivogel, Ben; Giddings, Steven B.; Lippert, Matthew

    2002-01-01

    To better understand the possible breakdown of locality in quantum gravitational systems, we pursue the identity of precursors in the context of the anti-de Sitter/conformal field theory correspondence. Holography implies a breakdown of standard bulk locality which we expect to occur only at extremely high energy. We consider precursors that encode bulk information causally disconnected from the boundary and whose measurement involves nonlocal bulk processes. We construct a toy model of holography which encapsulates the expected properties of precursors and compare it with previous such discussions. If these precursors can be identified in the gauge theory, they are almost certainly Wilson loops, perhaps with decorations, but the relevant information is encoded in the high-energy sector of the theory and should not be observable by low energy measurements. This would be in accord with the locality bound, which serves as a criterion for situations where breakdown of bulk locality is expected

  6. Rod internal pressure quantification and distribution analysis using Frapcon

    Energy Technology Data Exchange (ETDEWEB)

    Jessee, Matthew Anderson [ORNL; Wieselquist, William A [ORNL; Ivanov, Kostadin [Pennsylvania State University, University Park

    2015-09-01

    This report documents work performed supporting the Department of Energy (DOE) Office of Nuclear Energy (NE) Fuel Cycle Technologies Used Fuel Disposition Campaign (UFDC) under work breakdown structure element 1.02.08.10, ST Analysis. In particular, this report fulfills the M4 milestone M4FT-15OR0810036, Quantify effects of power uncertainty on fuel assembly characteristics, within work package FT-15OR081003 ST Analysis-ORNL. This research was also supported by the Consortium for Advanced Simulation of Light Water Reactors (http://www.casl.gov), an Energy Innovation Hub (http://www.energy.gov/hubs) for Modeling and Simulation of Nuclear Reactors under U.S. Department of Energy Contract No. DE-AC05-00OR22725. The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. A methodology is developed which tracks inter-cycle assembly movements and assembly batch fabrication information to build individual FRAPCON inputs for each evaluated WBN1 fuel rod. An alternate model for the amount of helium released from the zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layer is derived and applied to FRAPCON output data to quantify the RIP and CHS for these types of fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have RIP predictions that are elevated when compared to fuel rods without IFBA layers (i.e., standard rods) despite the fact that IFBA rods often have reduced fill pressures and annular fuel pellets. The primary contributor to elevated RIP predictions at burnups less than and greater than 30 GWd

  7. Performance Analysis of the Consensus-Based Distributed LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Gonzalo Mateos

    2009-01-01

    Full Text Available Low-cost estimation of stationary signals and reduced-complexity tracking of nonstationary processes are well-motivated tasks that can be accomplished using ad hoc wireless sensor networks (WSNs). To this end, a fully distributed least mean-square (D-LMS) algorithm is developed in this paper, in which sensors exchange messages with single-hop neighbors to consent on the network-wide estimates adaptively. The novel approach does not require a Hamiltonian cycle or a special bridge subset of sensors, while communications among sensors are allowed to be noisy. A mean-square error (MSE) performance analysis of D-LMS is conducted in the presence of a time-varying parameter vector, which adheres to a first-order autoregressive model. For sensor observations that are related to the parameter vector of interest via a linear Gaussian model, and after adopting simplifying independence assumptions, exact closed-form expressions are derived for the global and sensor-level MSE evolution as well as its steady-state (s.s.) values. Mean and MSE-sense stability of D-LMS are also established. Interestingly, extensive numerical tests demonstrate that for small step-sizes the results accurately extend to the pragmatic setting whereby sensors acquire temporally correlated, not necessarily Gaussian data.
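The flavor of a consensus-based distributed LMS recursion can be sketched as follows: each sensor takes a local LMS gradient step on its own data while nudging its estimate toward those of its single-hop neighbors. The ring topology, step sizes, and noise level below are arbitrary illustrative choices, and the update is a simplified variant rather than the paper's exact bridge-free D-LMS.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3                        # parameter dimension
w_true = rng.normal(size=p)  # common parameter all sensors estimate

# Ring WSN of 6 sensors: each node exchanges messages only with its
# two single-hop neighbors.
n = 6
neighbors = {j: [(j - 1) % n, (j + 1) % n] for j in range(n)}

w = np.zeros((n, p))         # local estimates
mu, c = 0.05, 0.5            # LMS step size and consensus gain

for _ in range(2000):
    for j in range(n):
        x = rng.normal(size=p)                 # local regressor
        d = x @ w_true + 0.01 * rng.normal()   # noisy local observation
        err = d - x @ w[j]                     # local LMS error
        consensus = sum(w[k] - w[j] for k in neighbors[j])
        w[j] = w[j] + mu * err * x + mu * c * consensus

print(np.max(np.abs(w - w_true)))  # all local estimates near w_true
```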

  8. The Ganga user interface for physics analysis and distributed resources

    CERN Document Server

    Soroko, A; Adams, D; Harrison, K; Charpentier, P; Maier, A; Mato, P; Moscicki, J T; Egede, U; Martyniak, J; Jones, R; Patrick, G N

    2004-01-01

    A physicist analysing data from the LHC experiments will have to deal with data and computing resources that are distributed across multiple locations and have different access methods. Ganga helps by providing a uniform high-level interface to the different low-level solutions for the required tasks, ranging from the specification of input data to the retrieval and post-processing of the output. For LHCb and ATLAS the goal is to assist in running jobs based on the Gaudi/Athena C++ framework. Ganga is written in Python and presents the user with a single GUI rather than a set of different applications. It uses pluggable modules to interact with external tools for operations such as querying metadata catalogues, job configuration and job submission. At start-up, the user is presented with a list of templates for common analysis tasks, and information about ongoing tasks is stored from one invocation to the next. Ganga can also be used through a command line interface. This closely mirrors the functionality of ...

  9. Distribution

    Science.gov (United States)

    John R. Jones

    1985-01-01

    Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....

  10. Detecting Ionospheric Precursors of a Deep Earthquake (378.8 km) on 7 July 2013, Mw = 7.2, in Papua New Guinea under a Geomagnetic Storm: Two-Dimensional Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Jyh-Woei Lin

    2013-07-01

    Two-dimensional ionospheric total electron content (TEC) data were collected during the time period from 00:00 UT on 2 July to 12:00 UT on 8 July 2013. This period spanned 5 days before to 1 day after a deep earthquake (378.8 km) in Papua New Guinea at 18:35:30 UT on 7 July 2013 (Mw = 7.2). Data were examined by two-dimensional principal component analysis (2DPCA) to detect TEC precursors related to the earthquake, because TEC precursors have usually appeared in earlier time periods (Liu et al. 2006). A TEC precursor was highly localized around the epicenter on 6 July for 5 minutes, from 06:00 to 06:05. Ionizing radiation from radon gas release could possibly have caused the anomalous TEC fluctuation through, for example, a density variance. The plasma might have experienced large damping, causing short-term TEC fluctuations, and the gas may have been released in a small amount over a short time period. 2DPCA can also identify short-term TEC fluctuations, but this fluctuation lasted for a considerable length of time. Other background TEC anomalies caused by the geomagnetic storm, small earthquakes and non-earthquake activities, e.g., the equatorial ionization anomaly (EIA), resulted in small principal eigenvalues. Therefore, the detection of TEC precursors through large eigenvalues was not due to these background TEC anomalies.

  11. Precursor effect on the property and catalytic behavior of Fe-TS-1 in butadiene epoxidation

    Science.gov (United States)

    Wu, Mei; Zhao, Huahua; Yang, Jian; Zhao, Jun; Song, Huanling; Chou, Lingjun

    2017-11-01

    The effect of the iron precursor on the properties and catalytic behavior of iron-modified titanium silicalite molecular sieve (Fe-TS-1) catalysts in butadiene selective epoxidation has been studied. Three Fe-TS-1 catalysts were prepared using iron nitrate, iron chloride and iron sulfate as precursors, which played an important role in adjusting the textural properties and chemical states of TS-1. Of the prepared Fe-TS-1 catalysts, those modified by iron nitrate (FN-TS-1) exhibited significantly enhanced performance in butadiene selective epoxidation compared to those derived from iron sulfate (FS-TS-1) or iron chloride (FC-TS-1) precursors. To obtain a deeper understanding of the structure-performance relationship, X-ray diffraction (XRD), X-ray photoelectron spectroscopy (XPS), temperature-programmed desorption of NH3 (NH3-TPD), diffuse reflectance UV-Vis spectroscopy (DR UV-Vis), Fourier transform infrared spectroscopy (FT-IR) and thermogravimetric analysis (TGA) were conducted to characterize the Fe-TS-1 catalysts. Experimental results indicated that the textural structures and acid sites of the modified catalysts, as well as the type of Fe species influenced by the precursors, were all responsible for the activity and product distribution.

  12. The analysis of annual dose distributions for radiation workers

    International Nuclear Information System (INIS)

    Mill, A.J.

    1984-05-01

    The system of dose limitation recommended by the ICRP includes the requirement that no worker shall exceed the current dose limit of 50 mSv/a. Continuous exposure at this limit corresponds to an annual death rate comparable with that of 'high-risk' industries. In practice, there is a distribution of doses with an arithmetic mean lower than the dose limit. In its 1977 report UNSCEAR defined a reference dose distribution for the purposes of comparison. However, this two-parameter distribution does not show the departure from log-normality typically observed in actual distributions at doses which are a significant proportion of the annual limit. In this report an alternative model is suggested, based on a three-parameter log-normal distribution. The third parameter is an 'effective dose limit', and such a model fits very well the departure from log-normality observed in actual dose distributions. (author)
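As an illustration of how a third "effective dose limit" parameter can bend a log-normal away from log-normality near the limit, the sketch below uses a hypothetical bounded transform of a normal variate; the report's actual parameterization may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

L = 50.0                       # hypothetical "effective dose limit" in mSv/a (third parameter)
mu, sigma = np.log(2.0), 1.0   # parameters of the underlying normal

# bounded transform: for small y, doses ~ exp(y) (ordinary log-normal);
# the upper tail is compressed so no dose can exceed L
y = rng.normal(mu, sigma, size=100_000)
doses = L * np.exp(y) / (L + np.exp(y))
```

The body of the distribution stays close to log-normal while the upper tail departs from it, which is the qualitative feature the report attributes to real dose distributions.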

  13. A Maximum Entropy Approach to Loss Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Marco Bee

    2013-03-01

    In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; therefore, this methodology is very general, as it nests most classical parametric approaches. Sampling the ME distribution is essential in many contexts, such as loss models constructed via compound distributions. Given the difficulties in carrying out exact simulation, we propose an innovative algorithm, obtained by means of an extension of Adaptive Importance Sampling (AIS), for the approximate simulation of the ME distribution. Several numerical experiments confirm that the AIS-based simulation technique works well, and an application to insurance data gives further insights into the usefulness of the method for modelling, estimating and simulating loss distributions.
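The core ME construction (maximize Shannon entropy subject to moment constraints) can be sketched on a discrete grid by solving the convex dual for the Lagrange multipliers; the grid, targets, and step size below are illustrative, and the paper's AIS sampler is not reproduced.

```python
import numpy as np

x = np.linspace(-5, 5, 401)            # discrete support grid
targets = np.array([0.5, 1.5])         # desired moments: E[x], E[x^2]
features = np.vstack([x, x**2])        # moment functions

# ME solution has the form p ∝ exp(lam · features); solve for lam
# by gradient descent on the (convex) dual problem
lam = np.zeros(2)
for _ in range(20_000):
    p = np.exp(lam @ features)
    p /= p.sum()
    moments = features @ p
    lam -= 0.05 * (moments - targets)  # step toward matching the constraints

# final density and its constraint residual; with only mean and
# second-moment constraints the ME solution is a (discretized) Gaussian
p = np.exp(lam @ features)
p /= p.sum()
moment_error = np.max(np.abs(features @ p - targets))
```

Adding higher-order moment constraints simply appends rows to `features`, which is why the ME family nests many classical parametric shapes.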

  14. A Distributional Analysis of the Gender Wage Gap in Bangladesh

    OpenAIRE

    Salma Ahmed; Pushkar Maitra

    2011-01-01

    This paper decomposes the gender wage gap along the entire wage distribution into an endowment effect and a discrimination effect, taking into account possible selection into full-time employment. Applying a new decomposition approach to the Bangladesh Labour Force Survey (LFS) data, we find that women are paid less than men everywhere on the wage distribution and that the gap is wider at the lower end of the distribution. Discrimination against women is the primary determinant of the wage gap. W...

  15. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility patterns. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analysis of the finish time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of partial finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners in eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition strengthens again in the last course of the middle stage, a transition from Gaussian back to log-normal occurs at the last stage. This study may enrich research on human mobility patterns and attract attention to the velocity features of human mobility.
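A minimal check of the log-normal hypothesis on synthetic finisher speeds: if speeds are log-normal, their logarithms are Gaussian, so the log-speeds should show near-zero skewness. All parameters below are invented for illustration, not fitted to the marathon datasets.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic finisher speeds (km/h), assumed log-normal as in the abstract
speeds = rng.lognormal(mean=np.log(10.0), sigma=0.15, size=50_000)

# under log-normality, log-speeds are Gaussian
logs = np.log(speeds)
mu_hat, sigma_hat = logs.mean(), logs.std()

# standardized third moment of the logs: ~0 if the log-normal model holds
skew = float(np.mean(((logs - mu_hat) / sigma_hat) ** 3))
```

On real split-time data the same statistic, computed per 5-km course, would reveal the log-normal/Gaussian transitions the paper reports.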

  16. Topographic precursors and geological structures of deep-seated catastrophic landslides caused by typhoon Talas, determined from the analysis of high-resolution DEMs

    Science.gov (United States)

    Chigira, Masahiro; Tsou, Ching-Ying; Matsushi, Yuki

    2013-04-01

    Typhoon Talas crossed the Japanese Islands between 2 and 5 September 2011, causing more than 70 deep-seated catastrophic landslides in a Jurassic to Paleogene-Early Miocene accretion complex. Detailed examination of the topographic features of 10 large landslides before the event, recorded on DEMs with a resolution of 1 m (based on airborne laser scanner surveys), showed that all of the landslides had small scarplets near their future crowns prior to the slide, and one landslide had linear depressions along its future crown as precursor topographic features. These scarplets and linear depressions were caused by gravitational slope deformation that preceded the catastrophic failure. Strains, defined by the ratio of the length of a scarplet to the length of the whole slope (as measured along the slope line), ranged from 5% to 21%, and are the first reliable numerical data relating to the topographic precursor features of large and catastrophic landslides. Careful examination of aerial photographs from another four large landslides, for which no high-resolution DEMs were available, suggested that they also developed scarplets at their heads beforehand, which are not precisely quantified. Twelve of the 14 landslides we surveyed in the field had sliding surfaces with wedge-shaped discontinuities that consisted of faults, shear surfaces that formed during accretion, and bedding, suggesting that the buildup of pore pressure occurs readily in a gravitationally deformed rock body containing wedge-shaped discontinuities. Other types of gravitational deformation were also active; e.g., flexural toppling and buckling were each observed to have preceded one landslide.

  17. Distributed Leadership in Drainage Basin Management: A Critical Analysis of ‘River Chief Policy’ from a Distributed Leadership Perspective

    Science.gov (United States)

    Zhang, Liuyi

    2018-02-01

    Water resources management has been more significant than ever since an official document stipulated 'three red lines' to scrupulously control water usage and water pollution, accelerating the promotion of the 'River Chief Policy' throughout China. The policy launches creative approaches to include people from different administrative levels in participation and distributes power to increase drainage basin management efficiency. Its execution resembles features of distributed leadership theory, a widely acknowledged western leadership theory with an innovative perspective and visions suited to the modern world. This paper analyses the policy from a distributed leadership perspective using Taylor's critical policy analysis framework.

  18. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  19. Distribution analysis of segmented wave sea clutter in littoral environments

    CSIR Research Space (South Africa)

    Strempel, MD

    2015-10-01

    are then fitted against the K-distribution. It is shown that the approach can accurately describe specific sections of the wave with a reduced error between actual and estimated distributions. The improved probability density function (PDF) representation...

  20. Global Profiling and Novel Structure Discovery Using Multiple Neutral Loss/Precursor Ion Scanning Combined with Substructure Recognition and Statistical Analysis (MNPSS): Characterization of Terpene-Conjugated Curcuminoids in Curcuma longa as a Case Study.

    Science.gov (United States)

    Qiao, Xue; Lin, Xiong-hao; Ji, Shuai; Zhang, Zheng-xiang; Bo, Tao; Guo, De-an; Ye, Min

    2016-01-05

    To fully understand the chemical diversity of an herbal medicine is challenging. In this work, we describe a new approach to globally profile and discover novel compounds from an herbal extract using multiple neutral loss/precursor ion scanning combined with substructure recognition and statistical analysis. Turmeric (the rhizomes of Curcuma longa L.) was used as an example. This approach consists of three steps: (i) multiple neutral loss/precursor ion scanning to obtain substructure information; (ii) targeted identification of new compounds by extracted ion current and substructure recognition; and (iii) untargeted identification using total ion current and multivariate statistical analysis to discover novel structures. Using this approach, 846 terpecurcumins (terpene-conjugated curcuminoids) were discovered from turmeric, including a number of potentially novel compounds. Furthermore, two unprecedented compounds (terpecurcumins X and Y) were purified, and their structures were identified by NMR spectroscopy. This study extended the application of mass spectrometry to global profiling of natural products in herbal medicines and could help chemists to rapidly discover novel compounds from a complex matrix.

  1. Similarity Analysis for Reactor Flow Distribution Test and Its Validation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Soon Joon; Ha, Jung Hui [Heungdeok IT Valley, Yongin (Korea, Republic of); Lee, Taehoo; Han, Ji Woong [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    facility. It was clearly found in Hong et al. In this study the feasibility of the similarity analysis of Hong et al. was examined. The similarity analysis was applied to SFR which has been designed in KAERI (Korea Atomic Energy Research Institute) in order to design the reactor flow distribution test. The length scale was assumed to be 1/5, and the velocity scale 1/2, which bounds the square root of the length scale (1/√5). The CFX calculations for both prototype and model were carried out and the flow field was compared.

  2. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. Entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation provides a new way for frequency analysis of hydrometeorological extremes.

  3. Skewness and kurtosis analysis for non-Gaussian distributions

    Science.gov (United States)

    Celikoglu, Ahmet; Tirnakli, Ugur

    2018-06-01

    In this paper we address a number of pitfalls regarding the use of kurtosis as a measure of deviations from the Gaussian. We treat kurtosis in both its standard definition and that which arises in q-statistics, namely q-kurtosis. We have recently shown that the relation proposed by Cristelli et al. (2012) between skewness and kurtosis can only be verified for relatively small data sets, independently of the type of statistics chosen; however it fails for sufficiently large data sets, if the fourth moment of the distribution is finite. For infinite fourth moments, kurtosis is not defined as the size of the data set tends to infinity. For distributions with finite fourth moments, the size, N, of the data set for which the standard kurtosis saturates to a fixed value, depends on the deviation of the original distribution from the Gaussian. Nevertheless, using kurtosis as a criterion for deciding which distribution deviates further from the Gaussian can be misleading for small data sets, even for finite fourth moment distributions. Going over to q-statistics, we find that although the value of q-kurtosis is finite in the range of 0 < q < 3, this quantity is not useful for comparing different non-Gaussian distributed data sets, unless the appropriate q value, which truly characterizes the data set of interest, is chosen. Finally, we propose a method to determine the correct q value and thereby to compute the q-kurtosis of q-Gaussian distributed data sets.
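The dependence of the sample kurtosis on data-set size N can be illustrated with a Student-t distribution, which has a finite fourth moment for more than four degrees of freedom; the degrees of freedom and sample sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def excess_kurtosis(x):
    """Standard sample excess kurtosis: E[z^4] - 3 for standardized data."""
    z = (x - x.mean()) / x.std()
    return float(np.mean(z**4) - 3.0)

# Student-t with nu = 9 has a finite fourth moment;
# its true excess kurtosis is 6 / (nu - 4) = 1.2
small = np.array([excess_kurtosis(rng.standard_t(9, size=100))
                  for _ in range(200)])
large = excess_kurtosis(rng.standard_t(9, size=2_000_000))

# small-N estimates scatter widely around the true value, while the
# large-N estimate has essentially saturated near 1.2
```

This is the pitfall the paper warns about: ranking distributions by kurtosis from small data sets compares noisy, biased estimates rather than the distributions themselves.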

  4. Analysis of a Pareto Mixture Distribution for Maritime Surveillance Radar

    Directory of Open Access Journals (Sweden)

    Graham V. Weinberg

    2012-01-01

    The Pareto distribution has been shown to be an excellent model for X-band high-resolution maritime surveillance radar clutter returns. Given the success of mixture distributions in radar, it is thus of interest to consider the effect of Pareto mixture models. This paper introduces a formulation of a Pareto intensity mixture distribution and investigates coherent multilook radar detector performance using this new clutter model. Clutter parameter estimates are derived from data sets produced by the Defence Science and Technology Organisation's Ingara maritime surveillance radar.
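A sketch of sampling a two-component Pareto intensity mixture and recovering the shape parameters by maximum likelihood; all parameter values are hypothetical, not the Ingara estimates, and for simplicity the component labels are assumed known.

```python
import numpy as np

rng = np.random.default_rng(5)

# two-component Pareto intensity mixture (hypothetical parameters)
alphas = np.array([3.0, 8.0])      # shape parameters
weights = np.array([0.3, 0.7])     # mixture weights
xm = 1.0                           # common scale (minimum intensity)

n = 100_000
comp = rng.choice(2, size=n, p=weights)
# inverse-CDF sampling: X = xm * U^(-1/alpha) for U uniform on (0, 1]
clutter = xm * (1.0 - rng.random(n)) ** (-1.0 / alphas[comp])

# with known labels, the Pareto shape MLE is n_k / sum(log(x / xm))
est = [len(x) / np.log(x / xm).sum()
       for x in (clutter[comp == 0], clutter[comp == 1])]
```

With unknown labels, an EM-style procedure over the mixture likelihood would replace the per-component MLE step.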

  5. The gluon distribution at small x - a phenomenological analysis

    International Nuclear Information System (INIS)

    Harriman, P.N.; Martin, A.D.; Stirling, W.J.; Roberts, R.G.

    1990-03-01

    The size of the gluon distribution at small x has important implications for phenomenology at future high energy hadron-hadron and lepton-hadron colliders. We extend a recent global parton distribution fit to investigate the constraints on the gluon from deep inelastic and prompt photon data. In particular, we estimate a band of allowed gluon distributions with qualitatively different small-x behaviour and study the implications of these on a variety of cross sections at high energy pp and ep colliders. (author)

  6. In situ analysis of elemental depth distributions in thin films by combined evaluation of synchrotron x-ray fluorescence and diffraction

    International Nuclear Information System (INIS)

    Mainz, R.; Klenk, R.

    2011-01-01

    In this work we present a method for the in situ analysis of elemental depth distributions in thin films using a combined evaluation of synchrotron x-ray fluorescence and energy-dispersive x-ray diffraction signals. We recorded diffraction and fluorescence signals simultaneously during the reactive annealing of thin films. By means of the observed diffraction signals, the time evolution of phases in the thin films during the annealing processes can be determined. We utilized this phase information to parameterize the depth distributions of the elements in the films. The time-dependent fluorescence signals were then taken to determine the parameters representing the parameterized depth distributions. For this latter step, we numerically calculated the fluorescence intensities for a given set of depth distributions. These calculations handle polychromatic excitation and arbitrary functions of depth distributions and take into account primary and secondary fluorescence. Influences of lateral non-uniformities of the films, as well as the accuracy limits of the method, are investigated. We apply the introduced method to analyze the evolution of elemental depth distributions and to quantify the kinetic parameters during a synthesis process of CuInS2 thin films via the reactive annealing of Cu-In precursors in a sulfur atmosphere.

  7. Fractal analysis of the spatial distribution of earthquakes along the Hellenic Subduction Zone

    Science.gov (United States)

    Papadakis, Giorgos; Vallianatos, Filippos; Sammonds, Peter

    2014-05-01

    The Hellenic Subduction Zone (HSZ) is the most seismically active region in Europe. Many destructive earthquakes have taken place along the HSZ in the past. The evolution of such active regions is expressed through seismicity and is characterized by complex phenomenology. Understanding the tectonic evolution process and the physical state of subducting regimes is crucial in earthquake prediction. In recent years, there has been growing interest in an approach to seismicity based on the science of complex systems (Papadakis et al., 2013; Vallianatos et al., 2012). In this study we calculate the fractal dimension of the spatial distribution of earthquakes along the HSZ, and we aim to understand the significance of the obtained values for the tectonic and geodynamic evolution of this area. We use the external seismic sources provided by Papaioannou and Papazachos (2000) to create a dataset regarding the subduction zone. Following those authors, we define five seismic zones. We then construct an earthquake dataset based on the updated and extended earthquake catalogue for Greece and the adjacent areas by Makropoulos et al. (2012), covering the period 1976-2009. The fractal dimension of the spatial distribution of earthquakes is calculated for each seismic zone and for the HSZ as a unified system using the box-counting method (Turcotte, 1997; Robertson et al., 1995; Caneva and Smirnov, 2004). Moreover, the variation of the fractal dimension is demonstrated in different time windows. These spatiotemporal variations could be used as an additional index of the physical state of each seismic zone. The use of the fractal dimension as a precursor in earthquake forecasting appears to be a very interesting direction for future work. Acknowledgements: Giorgos Papadakis wishes to acknowledge the Greek State Scholarships Foundation (IKY). References: Caneva, A., Smirnov, V., 2004. Using the fractal dimension of earthquake distributions and the
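The box-counting estimate of the fractal dimension cited above can be sketched as follows, using synthetic points and illustrative box sizes rather than the Greek catalogue data.

```python
import numpy as np

rng = np.random.default_rng(4)

def box_counting_dimension(points, sizes):
    """Slope of log N(s) versus log(1/s), where N(s) is the number of
    occupied boxes of side s covering the point set."""
    counts = [len(np.unique(np.floor(points / s), axis=0)) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# epicentres spread uniformly over a plane should give a dimension near 2;
# epicentres clustered on a lineament would give a value closer to 1
pts = rng.random((20_000, 2))
sizes = np.array([0.2, 0.1, 0.05, 0.025])
d = box_counting_dimension(pts, sizes)
```

Applied per seismic zone and per time window, the same estimator yields the spatiotemporal variations the study uses as an index of the physical state of each zone.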

  8. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, collecting close to 1 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first...

  9. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, and has so far collected over 5 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the...

  10. Analysis of acidic properties of distribution transformer oil insulation

    African Journals Online (AJOL)

    user

    The system detects when the acid- ... rated above 500 kVA are classed as power transformers. Transformers rated at ... generate great impact in safety, reliability and cost of the electric ... the primary voltage of the electric distribution system to.

  11. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  12. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River ... of taxa was recorded in marginal vegetation in the channels and lagoons, ... highlights the importance of maintaining a mosaic of aquatic habitats in the Delta.

  13. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

    Integration of distributed generation (DG) such as wind turbines into distribution systems is increasing all around the world because of its flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and as a result bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation of the influence of full-converter-based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as external grid short-circuit power capacity, WT integration location, and the connection type of the WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented...

  14. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River Delta, ... seasonally-flooded pools and temporary rain-filled pools in MGR and CI. ... biodiversity of the Okavango Delta, thereby contributing to its conservation.

  15. Distributed Multiscale Data Analysis and Processing for Sensor Networks

    National Research Council Canada - National Science Library

    Wagner, Raymond; Sarvotham, Shriram; Choi, Hyeokho; Baraniuk, Richard

    2005-01-01

    .... Second, the communication overhead of multiscale algorithms can become prohibitive. In this paper, we take a first step in addressing both shortcomings by introducing two new distributed multiresolution transforms...

  16. Tradespace Analysis Tool for Designing Earth Science Distributed Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — The ESTO 2030 Science Vision envisions the future of Earth Science to be characterized by 'many more distributed observations,' and 'formation-flying [missions that]...

  17. Data analysis and mapping of the mountain permafrost distribution

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2017-04-01

    the permafrost occurrence where it is unknown, the mentioned supervised learning techniques inferred a classification function from labelled training data (pixels of permafrost absence and presence). Particular attention was given to the pre-processing of the dataset, with a study of its complexity and of the relation between permafrost data and the employed environmental variables. The application of feature selection techniques completed this analysis and identified redundant or valueless predictors. Classification performance was assessed with the AUROC on independent validation sets (0.81 for LR, 0.85 with SVM and 0.88 with RF). At the micro scale, the obtained permafrost maps are consistent with field observations, thanks to the high resolution of the dataset (10 meters). Moreover, compared to classical models, the permafrost prediction is computed without resorting to altitude thresholds (above which permafrost may be found). Finally, as machine learning is a non-deterministic approach, the mountain permafrost distribution maps are presented and discussed together with corresponding uncertainty maps, which provide information on the quality of the results.

  18. Spatial Distribution Analysis of Scrub Typhus in Korea

    OpenAIRE

    Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

    2013-01-01

    Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: A spatial distribution of Orientia tsutsugamushi occurrence using a geographic information system (GIS) is presented, and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions of scrub typhus incidence. The land use change of districts does...

  19. Analysis of transverse field distributions in Porro prism resonators

    Science.gov (United States)

    Litvin, Igor A.; Burger, Liesl; Forbes, Andrew

    2007-05-01

A model to describe the transverse field distribution of the output beam from Porro prism resonators is proposed. The model allows the prediction of the output transverse field distribution by assuming that the main areas of loss are located at the apexes of the Porro prisms. Experimental work on a particular system showed some interesting correlations between the time-domain behavior of the resonator and the transverse field output. These findings are presented and discussed.

  20. Analysis of Strengthening Steel Distribution Channel in Domestic Automotive Industry

    OpenAIRE

    Pangraksa, Sugeng; Djajadiningrat, Surna Tjahja

    2013-01-01

Distribution plays a strategic role in moving product from the manufacturer to the end-user. The automotive industry needs a distribution channel with excellent data management, timely delivery management, excellent quality management, and competitive cost reduction. Krakatau Steel (KS) distributors have weaknesses in entering the current automotive market, which imposes tight prerequisites such as: consistency of product quality, good cooperation, close relationships, continuous cost reduction, wide spread to...

  1. Distributed generation: An empirical analysis of primary motivators

    International Nuclear Information System (INIS)

    Carley, Sanya

    2009-01-01

Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment.
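The "standard two-part model" mentioned in this abstract pairs a binary adoption equation with a continuous capacity equation estimated only on adopters. A minimal sketch on synthetic data, with entirely hypothetical covariates, looks like this:

```python
# A minimal two-part (hurdle) model sketch: logistic regression for the
# adoption decision, linear regression for capacity among adopters.
# Data and covariate effects are synthetic, not the paper's dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(1)
n = 1000
private = rng.integers(0, 2, n)          # 1 = private utility (hypothetical)
net_metering = rng.integers(0, 2, n)     # 1 = net metering program in place
X = np.column_stack([private, net_metering])

# Part 1: adoption decision (binary), more likely for private utilities
adopt_p = 1 / (1 + np.exp(-(-1.0 + 1.2 * private + 0.8 * net_metering)))
adopt = rng.random(n) < adopt_p

# Part 2: log capacity size (continuous), modelled only for adopters
log_cap = 2.0 + 0.5 * net_metering[adopt] + rng.normal(0, 0.3, adopt.sum())

part1 = LogisticRegression().fit(X, adopt)
part2 = LinearRegression().fit(X[adopt], log_cap)
print("adoption coefs:", part1.coef_)   # sign of the private-utility effect
print("capacity coefs:", part2.coef_)
```

The two parts answer distinct questions (whether to adopt at all, and how much capacity given adoption), which is why the abstract can report separate drivers for adoption and deployment.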

  2. Distributed generation: An empirical analysis of primary motivators

    Energy Technology Data Exchange (ETDEWEB)

    Carley, Sanya [Department of Public Policy and Center for Sustainable Energy, Environment, and Economic Development, University of North Carolina at Chapel Hill, CB3435, Chapel Hill, NC 27599 (United States)], E-mail: scarley@email.unc.edu

    2009-05-15

Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment.

  3. Distributed generation. An empirical analysis of primary motivators

    Energy Technology Data Exchange (ETDEWEB)

    Carley, Sanya [Department of Public Policy and Center for Sustainable Energy, Environment, and Economic Development, University of North Carolina at Chapel Hill, CB3435, Chapel Hill, NC 27599 (United States)

    2009-05-15

Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment. (author)

  4. Multi precursors analysis associated with the powerful Ecuador (MW = 7.8) earthquake of 16 April 2016 using Swarm satellites data in conjunction with other multi-platform satellite and ground data

    Science.gov (United States)

    Akhoondzadeh, Mehdi; De Santis, Angelo; Marchetti, Dedalo; Piscini, Alessandro; Cianchini, Gianfranco

    2018-01-01

After the DEMETER satellite mission (2004-2010), the launch of the Swarm satellites (Alpha (A), Bravo (B) and Charlie (C)) has created a new opportunity in the study of earthquake ionospheric precursors. Nowadays, there is no doubt that multi-precursor analysis is a necessary step to better understand the LAIC (Lithosphere Atmosphere Ionosphere Coupling) mechanism before large earthquakes. In this study, using the absolute scalar magnetometer, vector field magnetometer and electric field instrument on board the Swarm satellites, GPS (Global Positioning System) measurements, MODIS-Aqua satellite and ECMWF (European Centre for Medium-Range Weather Forecasts) data, the variations of the electron density and temperature, magnetic field, TEC (Total Electron Content), LST (Land Surface Temperature), AOD (Aerosol Optical Depth) and SKT (SKin Temperature) have been surveyed to find potential seismic anomalies around the strong Ecuador (Mw = 7.8) earthquake of 16 April 2016. Four solar and geomagnetic indices, F10.7, Dst, Kp and ap, were investigated to distinguish whether the preliminarily detected anomalies might be associated with solar-geomagnetic activity rather than with seismo-ionospheric effects. The Swarm satellite (A, B and C) data analysis indicates anomalies in the time series of electron density variations 7, 11 and 12 days before the event, and unusual variations in the time series of electron temperature 8 days preceding the earthquake; the analysis of the magnetic field scalar and vector data shows considerable anomalies 52, 48, 23, 16, 11, 9 and 7 days before the main shock. A striking anomaly is detected in the TEC variations 1 day before the earthquake at 9:00 UTC. The analysis of MODIS-Aqua night-time images shows that the LST increased unusually 11 days prior to the main shock. In addition, the AOD variations obtained from MODIS measurements reach their maximum value 10 days before the earthquake. The SKT around the epicentral region presents anomalously higher

  5. PanDA: distributed production and distributed analysis system for ATLAS

    International Nuclear Information System (INIS)

    Maeno, T

    2008-01-01

A new distributed software system was developed in the fall of 2005 for the ATLAS experiment at the LHC. This system, called PANDA, provides an integrated service architecture with late binding of jobs, maximal automation through layered services, tight binding with the ATLAS Distributed Data Management system [1], advanced error discovery and recovery procedures, and other features. In this talk, we will describe the PANDA software system. Special emphasis will be placed on the evolution of PANDA based on one and a half years of real experience in carrying out Computer System Commissioning data production [2] for ATLAS. The architecture of PANDA is well suited to the computing needs of the ATLAS experiment, which is expected to be one of the first HEP experiments to operate at the petabyte scale.

  6. The Aggregation of Individual Distributive Preferences through the Distributive Liberal Social Contract : Normative Analysis.

    OpenAIRE

    Jean Mercier-Ythier

    2010-01-01

We consider abstract social systems of private property, made of n individuals endowed with non-paternalistic interdependent preferences, who interact through exchanges on competitive markets and Pareto-efficient lump-sum transfers. The transfers follow from a distributive liberal social contract, defined as a redistribution of initial endowments such that the resulting market equilibrium allocation is both Pareto-efficient relative to individual interdependent preferences, and unanimously weak...

  7. Sample path analysis and distributions of boundary crossing times

    CERN Document Server

    Zacks, Shelemyahu

    2017-01-01

This monograph is focused on the derivations of exact distributions of first boundary crossing times of Poisson processes, compound Poisson processes, and more general renewal processes. The content is limited to the distributions of first boundary crossing times and their applications to various stochastic models. This book provides the theory and techniques for exact computations of distributions and moments of level crossing times. In addition, these techniques could replace simulations in many cases, thus providing more insight about the phenomena studied. This book takes a general approach to studying telegraph processes and is based on nearly thirty published papers by the author and collaborators over the past twenty-five years. No prior knowledge of advanced probability is required, making the book widely available to students and researchers in applied probability, operations research, applied physics, and applied mathematics.

  8. A digital elevation analysis: Spatially distributed flow apportioning algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sang-Hyun; Kim, Kyung-Hyun [Pusan National University, Pusan(Korea); Jung, Sun-Hee [Korea Environment Institute, (Korea)

    2001-06-30

A flow determination algorithm is proposed for a distributed hydrologic model. The advantages of single and multiple flow direction schemes are selectively combined to address the drawbacks of existing algorithms. A spatially varied flow apportioning factor is introduced in order to accommodate the accumulated area from upslope cells. The channel initiation threshold area (CIT) concept is expanded and integrated into the spatially distributed flow apportioning algorithm in order to delineate a realistic channel network. Application to a field example suggests that the linearly distributed flow apportioning scheme provides several advantages over existing approaches, such as the relaxation of over-dissipation problems near channel cells, the connectivity of river cells, the continuity of saturated areas, and the removal of the need to optimize the few parameters required by existing algorithms. The effects of grid size are explored spatially as well as statistically. (author). 28 refs., 7 figs.
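The apportioning idea in this record, passing each cell's accumulated upslope area to its downslope neighbours with a spatially varying factor, can be sketched on a toy DEM. The slope-proportional weighting below is a generic multiple-flow-direction choice, not the paper's exact factor:

```python
# Illustrative sketch of slope-proportional (multiple-flow-direction)
# apportioning of accumulated area on a tiny DEM grid.
import numpy as np

dem = np.array([[9., 8., 7.],
                [8., 6., 5.],
                [7., 5., 3.]])
rows, cols = dem.shape
acc = np.ones_like(dem)          # each cell contributes its own unit area

# visit cells from highest to lowest, so a cell's accumulated area is
# final before it is apportioned downslope
order = np.dstack(np.unravel_index(np.argsort(dem, axis=None)[::-1],
                                   dem.shape))[0]
for r, c in order:
    slopes = {}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            nr, nc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= nr < rows and 0 <= nc < cols:
                drop = dem[r, c] - dem[nr, nc]
                if drop > 0:
                    slopes[(nr, nc)] = drop / np.hypot(dr, dc)
    total = sum(slopes.values())
    for (nr, nc), s in slopes.items():   # apportion in proportion to slope
        acc[nr, nc] += acc[r, c] * s / total

print(acc)   # the sink cell collects ~9.0, the whole grid area
```

Because every apportioning factor set sums to one, total area is conserved and the single local minimum ends up with the entire grid's accumulated area, which is the sanity check a realistic channel-network delineation builds on.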

  9. Simulation and energy analysis of distributed electric heating system

    Science.gov (United States)

    Yu, Bo; Han, Shenchao; Yang, Yanchun; Liu, Mingyuan

    2018-02-01

A distributed electric heating system assists a solar heating system by using an air-source heat pump. The air-source heat pump, as an auxiliary heat source, can make up for the defects of a conventional solar thermal system and provide 24-hour high-efficiency operation. It has practical value in reducing emissions and promoting building energy efficiency. Using Polysun software, the system is simulated and compared with an ordinary electric boiler heating system. The simulation results show that, for the same energy request, 5844.5 kWh of fuel and energy consumption is saved and 3135 kg of carbon-dioxide emissions are avoided with the distributed electric heating system. The energy-saving and emission-reduction effect of the distributed electric heating system is thus very obvious.

  10. Interactive microbial distribution analysis using BioAtlas

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

    2017-01-01

... to analyze microbial distribution in a location-specific context. BioAtlas is an interactive web application that closes this gap between sequence databases, taxonomy profiling and geo/body-location information. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human body maps and (iii) user-defined maps. It further allows for (iv) uploading of own sample data, which can be placed on existing maps to (v) browse the distribution of the associated taxonomies. Finally, BioAtlas enables users to (vi) contribute custom maps (e.g. for plants or animals) and to map...

  11. Core Flow Distribution from Coupled Supercritical Water Reactor Analysis

    Directory of Open Access Journals (Sweden)

    Po Hu

    2014-01-01

Full Text Available This paper introduces an extended code package, PARCS/RELAP5, to analyze the steady state of the US reference SCWR design. An 8 × 8 quarter core model in PARCS and a reactor core model in RELAP5 are used to study the core flow distribution under various steady-state conditions. The possibility of moderator flow reversal is found in some hot moderator channels. Different moderator flow orifice strategies, both uniform across the core and nonuniform based on the power distribution, are explored with the goal of preventing the reversal.

  12. Energy efficiency analysis of reconfigured distribution system for practical loads

    Directory of Open Access Journals (Sweden)

    Pawan Kumar

    2016-09-01

Full Text Available In a deregulated rate structure, the performance evaluation of a distribution system for energy efficiency includes loss minimization, improved power quality, loadability limit, and reliability and availability of supply. Energy efficiency changes with variations in the loading pattern and load behaviour. Further, the nature of the load at each node is not explicitly of any one type; rather, its characteristics depend upon the node voltages. In most cases, the load is assumed to be constant power (real and reactive). In this paper, voltage-dependent practical loads are represented with a composite load model, and the energy efficiency performance of the distribution system for practical loads is evaluated in different configurations of a 33-node system.

  13. Analysis of 17 neurotransmitters, metabolites and precursors in zebrafish through the life cycle using ultrahigh performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Santos-Fandila, A; Vázquez, E; Barranco, A; Zafra-Gómez, A; Navalón, A; Rueda, R; Ramírez, M

    2015-09-15

An ultrahigh performance liquid chromatography-tandem mass spectrometry method for the identification and quantification of neurotransmitters, metabolites and precursors at different stages of zebrafish life was developed. Betaine, glutamine, glutamic acid, γ-aminobutyric acid, phosphocholine, glycerophosphocholine, cytidine 5'-diphosphocholine, choline, acetylcholine, dopamine, norepinephrine, serotonin, tyrosine, epinephrine, tryptophan, 5-hydroxyindolacetic acid and agmatine were selected as analytes. The method consisted of a simple deproteinization of samples using methanol and formic acid, subsequent injection onto the chromatographic equipment and quantification with a triple quadrupole mass spectrometer detector using an electrospray ionization interface in positive mode. Limits of detection ranged from 0.02 to 11 ng mL(-1) and limits of quantification from 0.1 to 38 ng mL(-1), depending on the analyte. The method was validated according to the US Food and Drug Administration (FDA) guideline for bioanalytical assays. Precision, expressed as relative standard deviation (%RSD), was lower than 15% in all cases, and the determination coefficient (R(2)) was equal to or higher than 99.0%, with a residual deviation for each calibration point lower than ±25%. Mean recoveries were between 85% and 115%. The method was applied to determine these compounds in zebrafish from early stages of development to adulthood and showed the time-course of neurotransmitters and other neurocompounds through the life cycle. The possibility of measuring up to 17 compounds related to the main neurotransmitter systems in a single analytical method will complement and reinforce the use of zebrafish in multiple applications in the field of neurosciences. The proposed method will facilitate future studies related to brain development. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Analysis of temperature distribution in a heat conducting fiber with ...

    African Journals Online (AJOL)

    The temperature distribution in a heat conducting fiber is computed using the Galerkin Finite Element Method in the present study. The weak form of the governing differential equation is obtained and nodal temperatures for linear and quadratic interpolation functions for different mesh densities are calculated for Neumann ...
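The Galerkin finite element procedure this record refers to can be illustrated for steady one-dimensional conduction with linear two-node elements; the equation, boundary conditions and parameter values below are illustrative assumptions, not taken from the paper:

```python
# Minimal Galerkin FEM sketch for steady 1-D heat conduction in a fiber:
# k*T'' + q = 0 on [0, L], with T(0) = 0 (Dirichlet) and an insulated tip
# T'(L) = 0 (natural/Neumann). Linear two-node elements.
import numpy as np

k, q, L, ne = 1.0, 5.0, 1.0, 8          # conductivity, source, length, elements
nn = ne + 1
h = L / ne
x = np.linspace(0.0, L, nn)

K = np.zeros((nn, nn))
f = np.zeros(nn)
ke = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness
fe = q * h / 2.0 * np.ones(2)                          # consistent load vector
for e in range(ne):
    idx = [e, e + 1]
    K[np.ix_(idx, idx)] += ke
    f[idx] += fe

# impose the Dirichlet BC at node 0; the Neumann BC at x = L is natural
# in the weak form and needs no explicit treatment
K[0, :] = 0.0
K[0, 0] = 1.0
f[0] = 0.0

T = np.linalg.solve(K, f)
T_exact = (q / k) * (L * x - x**2 / 2.0)   # analytic solution
print(np.max(np.abs(T - T_exact)))
```

For this 1-D problem with a uniform source and consistent load vector, linear elements reproduce the analytic solution exactly at the nodes, a classical check before moving to quadratic interpolation or finer meshes as the abstract describes.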

  15. Polybutadiene latex particle size distribution analysis utilizing a disk centrifuge

    NARCIS (Netherlands)

    Verdurmen, E.M.F.J.; Albers, J.G.; German, A.L.

    1994-01-01

Polybutadiene (I) latexes prepared by emulsifier-free emulsion polymerization, with particle diameters of 50-300 nm and both unimodal and bimodal particle size distributions, were analyzed by the line-start (LIST) method in a Brookhaven disk centrifuge photosedimentometer. A special spin fluid was designed to

  16. Resonance analysis in parallel voltage-controlled Distributed Generation inverters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Chen, Zhe

    2013-01-01

Thanks to the fast responses of the inner voltage and current control loops, the dynamic behavior of parallel voltage-controlled Distributed Generation (DG) inverters not only relies on the stability of load sharing among them, but is also subject to the interactions between the voltage control loops...

  17. Analysis of the Relationship between Shared Leadership and Distributed Leadership

    Science.gov (United States)

    Goksoy, Suleyman

    2016-01-01

    Problem Statement: The current study's purpose is: First, to examine the relationship between shared leadership and distributed leadership, which, despite having many similar aspects in theory and practice, are defined as separate concepts. Second, to compare the two approaches and dissipate the theoretical contradictions. In this sense, the main…

  18. Comparative Analysis of Possible Designs for Flexible Distribution System Operation

    DEFF Research Database (Denmark)

    Lin, Jeremy; Knezovic, Katarina

    2016-01-01

    for achieving the most efficient utilization of these resources while meeting the forecasted load. In this paper, we present possible system design frameworks proposed for flexible distribution system operation. Critical evaluations and comparison of these models are made based on a number of key attributes...

  19. A formal analysis of a dynamic distributed spanning tree algorithm

    NARCIS (Netherlands)

    Mooij, A.J.; Wesselink, J.W.

    2003-01-01

We analyze the spanning tree algorithm in the IEEE 1394.1 draft standard, whose correctness has not previously been proved. This algorithm is a fully-dynamic distributed graph algorithm, which, in general, is hard to develop. The approach we use is to formally develop an algorithm that is

  20. Analysis Of Rainfall Distribution In Owerri And Enugu, Nigeria Using ...

    African Journals Online (AJOL)

    The precipitation concentration index (PCI) of Owerri and Enugu for 1974 to 2011 was computed to characterise the rainfall distribution for both locations. The PCI was estimated on an annual and seasonal scale. The seasonal estimation was based on the categorisation of the seasons in eastern Nigeria into long wet ...

  1. Smart optimisation and sensitivity analysis in water distribution systems

    CSIR Research Space (South Africa)

    Page, Philip R

    2015-12-01

    Full Text Available optimisation of a water distribution system by keeping the average pressure unchanged as water demands change, by changing the speed of the pumps. Another application area considered, using the same mathematical notions, is the study of the sensitivity...

  2. Chemical bonding and charge density distribution analysis of ...

    Indian Academy of Sciences (India)

    tice and the electron density distributions in the unit cell of the samples were investigated. Structural ... titanium and oxygen ions and predominant ionic nature between barium and oxygen ions. Average grain sizes ... trations (at <1%) is responsible for the formation of .... indicated by dots and calculated powder patterns are.

  3. Bidirectional reflectance distribution function measurements and analysis of retroreflective materials.

    Science.gov (United States)

    Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure

    2014-12-01

We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how accurately they reproduce measured data of retroreflecting materials. We introduce a new parametrization, the back-vector parametrization, to analyze retroreflective data, and we show that this parametrization better preserves the isotropy of the data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data.

  4. Synthesis of labelled ecdysone precursors

    International Nuclear Information System (INIS)

    Haag, T.; Hetru, C.; Nakatani, Y.; Luu, B.; Meister, M.; Pichat, L.; Audinot, M.

    1985-01-01

    High specific activity tritiated 3β,14α-dihydroxy-5β-cholest-7-en-6-one, has been prepared using a precursor which permits rapid and easy labelling. This compound is converted to ecdysone under in vitro conditions by insect prothoracic glands, a well known site of ecdysone biosynthesis. (author)

  5. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

Full Text Available In this article, process capability and system availability analysis is discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma prior is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted to compare the different loss functions.
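The conjugacy this record relies on can be sketched directly. For the inverse Rayleigh density f(x; λ) = (2λ/x³)·exp(−λ/x²), a Gamma(a, b) prior on λ yields a Gamma(a + n, b + Σ 1/xᵢ²) posterior, so under squared-error loss the Bayes estimate is the posterior mean. The prior and parameter values below are illustrative, not the article's:

```python
# Conjugate Bayesian estimation sketch for the inverse Rayleigh parameter.
# f(x; lam) = (2*lam/x**3) * exp(-lam/x**2); Gamma(a, b) prior on lam gives
# a Gamma(a + n, b + sum(1/x_i**2)) posterior.
import numpy as np

rng = np.random.default_rng(42)
lam_true, n = 2.0, 5000
u = rng.random(n)
x = np.sqrt(-lam_true / np.log(u))   # inverse-CDF sampling: F(x) = exp(-lam/x^2)

a, b = 1.0, 1.0                      # Gamma prior (shape, rate), illustrative
S = np.sum(1.0 / x**2)
lam_bayes = (a + n) / (b + S)        # posterior mean = Bayes estimate (SE loss)
print(round(lam_bayes, 2))           # close to lam_true for large n
```

Other loss functions, as the abstract notes, simply replace the posterior mean with a different functional of the same Gamma posterior (e.g. the posterior mode or a weighted moment), so the comparison reduces to closed-form expressions in (a + n) and (b + S).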

  6. Deposition on disordered substrates with precursor layer diffusion

    Science.gov (United States)

    Filipe, J. A. N.; Rodgers, G. J.; Tavassoli, Z.

    1998-09-01

    Recently we introduced a one-dimensional accelerated random sequential adsorption process as a model for chemisorption with precursor layer diffusion. In this paper we consider this deposition process on disordered or impure substrates. The problem is solved exactly on both the lattice and continuum and for various impurity distributions. The results are compared with those from the standard random sequential adsorption model.
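A toy version of deposition on a disordered substrate can be simulated as lattice random sequential adsorption of dimers with quenched impurities; this illustrates only the standard RSA setting on an impure lattice, not the authors' accelerated precursor-diffusion model:

```python
# Toy lattice RSA of dimers on a substrate with quenched impurities.
# Sites: 0 = empty, 1 = occupied, 2 = impurity (never fillable).
import numpy as np

rng = np.random.default_rng(0)
N, p_imp = 10000, 0.1
site = np.zeros(N, dtype=int)
site[rng.random(N) < p_imp] = 2             # quenched impurity distribution

attempts = 0
while attempts < 50 * N:                    # run long enough to near jamming
    i = rng.integers(0, N - 1)
    if site[i] == 0 and site[i + 1] == 0:   # a dimer needs two adjacent empties
        site[i] = site[i + 1] = 1
    attempts += 1

coverage = np.mean(site == 1)
print(round(coverage, 3))
```

On a clean lattice, dimer RSA jams at the Flory coverage 1 − e⁻² ≈ 0.8647; impurities fragment the line into finite empty segments, so the jamming coverage drops below that value, which is the qualitative effect the exact lattice solutions in the paper quantify.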

  7. Dynamical Analysis of SIR Epidemic Models with Distributed Delay

    Directory of Open Access Journals (Sweden)

    Wencai Zhao

    2013-01-01

Full Text Available SIR epidemic models with distributed delay are proposed. Firstly, the dynamical behaviors of the model without vaccination are studied. Using the Jacobian matrix, the stability of the equilibrium points of the system without vaccination is analyzed, and the basic reproduction number R is obtained. In order to study the important role of vaccination in preventing diseases, the model with distributed delay under impulsive vaccination is formulated. The sufficient conditions for the global asymptotic stability of the "infection-free" periodic solution and for the permanence of the model are obtained by using Floquet's theorem, small-amplitude perturbation skills, and the comparison theorem. Lastly, numerical simulation is presented to illustrate our main conclusion that vaccination has significant effects on the dynamical behaviors of the model. The results can provide an effective tactical basis for practical infectious disease prevention.
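The impulsive-vaccination scheme described above, integrating the continuous dynamics between pulses and removing a fraction of susceptibles at each pulse, can be sketched for the classical SIR model (without the distributed delay); all parameter values are illustrative:

```python
# Classical SIR with impulsive vaccination: integrate the ODEs between
# pulses, then move a fraction theta of S into R at each pulse instant.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.4, 0.2        # transmission / recovery rates; R0 = beta/gamma
theta, period = 0.3, 10.0     # vaccinated fraction per pulse, pulse interval

def sir(t, y):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

y = [0.99, 0.01, 0.0]         # initial susceptible/infectious/removed fractions
t0 = 0.0
for _ in range(20):           # 20 vaccination pulses
    sol = solve_ivp(sir, (t0, t0 + period), y, rtol=1e-8)
    S, I, R = sol.y[:, -1]
    y = [(1 - theta) * S, I, R + theta * S]   # impulsive vaccination
    t0 += period

print("R0 =", beta / gamma, "; infectious fraction at t = 200:", round(y[1], 4))
```

With repeated pulses the infectious fraction is driven toward the infection-free periodic solution, the stability of which the paper characterizes via Floquet theory for the delayed model.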

  8. Analysis of the logistics processes in the wine distribution

    OpenAIRE

    Slavkovský, Matúš

    2011-01-01

The Master's thesis addresses the importance of logistics in the retail business and the importance of reducing logistics costs. It includes theoretical background as well as an analysis of the relevant markets that produce and consume wine in the largest quantities. The thesis focuses on an analysis of the logistics processes and costs of an e-shop. Based on this analysis, measures to improve the company's logistics processes are proposed. The goal of the Master's thesis is...

  9. Distributed Scheduling in Time Dependent Environments: Algorithms and Analysis

    OpenAIRE

    Shmuel, Ori; Cohen, Asaf; Gurewitz, Omer

    2017-01-01

Consider the problem of a multiple access channel in a time-dependent environment with a large number of users. In such a system, mostly due to practical constraints (e.g., decoding complexity), not all users can be scheduled together, and usually only one user may transmit at any given time. Assuming a distributed, opportunistic scheduling algorithm, we analyse the system's properties, such as delay, QoS and capacity scaling laws. Specifically, we start by analyzing the performance while ...

  10. Fully Stochastic Distributed Methodology for Multivariate Flood Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Flores-Montoya

    2016-05-01

Full Text Available An adequate estimation of the extreme behavior of the basin response is essential both for designing river structures and for evaluating their risk. The aim of this paper is to develop a new methodology to generate extreme hydrograph series spanning thousands of years using an event-based model. To this end, a spatial-temporal synthetic rainfall generator (RainSimV3) is combined with a distributed, physically-based rainfall–runoff event-based model (RIBS). The use of an event-based model allows simulating longer hydrograph series with fewer computational and data requirements, but requires characterization of the initial basin state, which depends on the initial basin moisture distribution. To overcome this problem, this paper proposes a probabilistic calibration–simulation approach, which considers the initial state and the model parameters as random variables characterized by a probability distribution through a Monte Carlo simulation. This approach is compared with two other approaches, the deterministic and the semi-deterministic approaches. Both use a unique initial state. The deterministic approach also uses a unique value of the model parameters, while the semi-deterministic approach obtains these values from their probability distribution through a Monte Carlo simulation, considering the basin variability. The methodology has been applied to the Corbès and Générargues basins, in the southeast of France. The results show that the probabilistic approach offers the best fit. This means that the proposed methodology can be successfully used to characterize the extreme behavior of the basin, considering the basin variability and overcoming the initial-state problem.
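The probabilistic calibration–simulation idea, sampling both the initial basin state and the model parameters in a Monte Carlo loop rather than fixing them, can be illustrated with a deliberately toy event model; the rainfall, moisture-deficit and runoff-coefficient assumptions below are invented for illustration only:

```python
# Toy Monte Carlo sketch: deterministic vs probabilistic treatment of the
# initial basin state (soil moisture deficit) and a model parameter
# (runoff coefficient) in a simple rainfall-runoff event model.
import numpy as np

rng = np.random.default_rng(7)
n_events = 10000
rain = rng.gamma(shape=2.0, scale=20.0, size=n_events)   # event rainfall, mm

def peak(rain, deficit, c):
    eff = np.maximum(rain - deficit, 0.0)   # rainfall left after the deficit
    return c * eff                          # toy linear response proxy

# deterministic approach: a single initial state and parameter value
q_det = peak(rain, deficit=15.0, c=0.5)

# probabilistic approach: sample state and parameter per event
deficit = rng.uniform(0.0, 30.0, n_events)
c = rng.normal(0.5, 0.1, n_events).clip(0.1, 0.9)
q_prob = peak(rain, deficit, c)

# the extreme (upper-quantile) behaviour differs between the approaches
print(np.quantile(q_det, 0.99), np.quantile(q_prob, 0.99))
```

Even in this toy, the tail quantiles of the two response distributions differ, which is the point of the comparison in the abstract: fixing the initial state biases the estimated extreme behaviour relative to sampling it.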

  11. HPC Performance Analysis of a Distributed Information Enterprise Simulation

    National Research Council Canada - National Science Library

    Hanna, James P; Walter, Martin J; Hillman, Robert G

    2004-01-01

    .... The analysis identified several performance limitations and bottlenecks. One critical limitation addressed and eliminated was simultaneously mixing a periodic process model with an event driven model causing rollbacks...

  12. Finite element analysis for temperature distributions in a cold forging

    International Nuclear Information System (INIS)

    Kim, Dong Bum; Lee, In Hwan; Cho, Hae Yong; Kim, Sung Wook; Song, In Chul; Jeon, Byung Cheol

    2013-01-01

    In this research, the finite element method is utilized to predict the temperature distributions in a cold-forging process for a cambolt. The cambolt is mainly used as a part of a suspension system of a vehicle. The cambolt has an off-centered lobe that manipulates the vertical position of the knuckle and wheel to a slight degree. The cambolt requires certain mechanical properties, such as strength and endurance limits. Moreover, temperature is also an important factor to realize mass production and improve efficiency. However, direct measurement of temperature in a forging process is infeasible with existing technology; therefore, there is a critical need for a new technique. Accordingly, in this study, a thermo-coupled finite element method is developed for predicting the temperature distribution. The rate of energy conversion to heat for the workpiece material is determined, and the temperature distribution is analyzed throughout the forging process for a cambolt. The temperatures associated with different punch speeds are also studied, as well as the relationships between load, temperature, and punch speed. Experimental verification of the technique is presented.

  13. Finite element analysis for temperature distributions in a cold forging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Bum; Lee, In Hwan; Cho, Hae Yong [Chungbuk National University, Cheongju (Korea, Republic of); Kim, Sung Wook [Yanbian National University, Yanbian (China); Song, In Chul; Jeon, Byung Cheol [Sunil dyfas, Jincheon (Korea, Republic of)

    2013-10-15

    In this research, the finite element method is utilized to predict the temperature distributions in a cold-forging process for a cambolt. The cambolt is mainly used as a part of a suspension system of a vehicle. The cambolt has an off-centered lobe that manipulates the vertical position of the knuckle and wheel to a slight degree. The cambolt requires certain mechanical properties, such as strength and endurance limits. Moreover, temperature is also an important factor to realize mass production and improve efficiency. However, direct measurement of temperature in a forging process is infeasible with existing technology; therefore, there is a critical need for a new technique. Accordingly, in this study, a thermo-coupled finite element method is developed for predicting the temperature distribution. The rate of energy conversion to heat for the workpiece material is determined, and the temperature distribution is analyzed throughout the forging process for a cambolt. The temperatures associated with different punch speeds are also studied, as well as the relationships between load, temperature, and punch speed. Experimental verification of the technique is presented.

  14. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  15. Analysis of magnetic electron lens with secant hyperbolic field distribution

    International Nuclear Information System (INIS)

    Pany, S.S.; Ahmed, Z.; Dubey, B.P.

    2014-01-01

Electron-optical imaging instruments like the Scanning Electron Microscope (SEM) and Transmission Electron Microscope (TEM) use specially designed solenoid electromagnets for focusing of the electron beam. Indicators of the imaging performance of these instruments, like spatial resolution, have a strong correlation with the focal characteristics of the magnetic lenses, which in turn have been shown to be sensitive to the details of the spatial distribution of the axial magnetic field. Owing to the complexity of designing practical lenses, empirical mathematical expressions are important to obtain the desired focal properties. Thus the degree of accuracy of such models in representing the actual field distribution determines the accuracy of the calculations and, ultimately, the performance of the lens. Historically, the mathematical models proposed by Glaser [1] and Ramberg [2] have been extensively used. In this paper the authors discuss another model with a secant-hyperbolic type magnetic field distribution function, and present a comparison between models, utilizing results from finite element-based field simulations as the reference for evaluating performance.
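As a sketch of the comparison described above, the secant-hyperbolic axial field model can be placed next to the classical Glaser bell-shaped distribution. The peak field B0 and half-width parameter a below are illustrative normalizations, not values from the paper:

```python
import math

def sech_field(z, b0=1.0, a=1.0):
    """Secant-hyperbolic axial field model: B(z) = B0 * sech(z/a)."""
    return b0 / math.cosh(z / a)

def glaser_field(z, b0=1.0, a=1.0):
    """Glaser bell-shaped model: B(z) = B0 / (1 + (z/a)^2)."""
    return b0 / (1.0 + (z / a) ** 2)

# Both models peak at B0 on the lens axis centre (z = 0); the sech
# profile decays exponentially in the tails, while the Glaser field
# falls off only as 1/z^2, so far from the gap sech_field is smaller.
b_axis_sech = sech_field(0.0)
b_axis_glaser = glaser_field(0.0)
```

The focal properties follow from integrating the paraxial ray equation through B(z), which is why the tail behaviour of the chosen model matters for the computed lens performance.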

  16. The Emotions of Abstract Words: A Distributional Semantic Analysis.

    Science.gov (United States)

    Lenci, Alessandro; Lebani, Gianluca E; Passaro, Lucia C

    2018-04-06

    Recent psycholinguistic and neuroscientific research has emphasized the crucial role of emotions for abstract words, which would be grounded by affective experience, instead of a sensorimotor one. The hypothesis of affective embodiment has been proposed as an alternative to the idea that abstract words are linguistically coded and that linguistic processing plays a key role in their acquisition and processing. In this paper, we use distributional semantic models to explore the complex interplay between linguistic and affective information in the representation of abstract words. Distributional analyses on Italian norming data show that abstract words have more affective content and tend to co-occur with contexts with higher emotive values, according to affective statistical indices estimated in terms of distributional similarity with a restricted number of seed words strongly associated with a set of basic emotions. Therefore, the strong affective content of abstract words might just be an indirect byproduct of co-occurrence statistics. This is consistent with a version of representational pluralism in which concepts that are fully embodied either at the sensorimotor or at the affective level live side-by-side with concepts only indirectly embodied via their linguistic associations with other embodied words. Copyright © 2018 Cognitive Science Society, Inc.
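The affective indices described above are estimated "in terms of distributional similarity with a restricted number of seed words". A minimal sketch of that idea follows; the 3-dimensional co-occurrence vectors and the seed set are toy, hypothetical values, not the Italian norming data:

```python
import math

def cosine(u, v):
    """Cosine similarity between two distributional vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def centroid(vectors):
    """Component-wise mean of a list of vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def affective_index(word_vec, seed_vecs):
    """Emotive value of a word: similarity to the centroid of seed
    words strongly associated with one basic emotion."""
    return cosine(word_vec, centroid(seed_vecs))

# Hypothetical seed vectors for a "joy"-like emotion
joy_seeds = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]
score = affective_index([0.7, 0.3, 0.0], joy_seeds)  # high similarity
```

A word scoring high against several emotion centroids would count as having strong affective content purely from its co-occurrence statistics, which is the paper's point about indirect grounding.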

  17. Simulation and analysis of the soot particle size distribution in a turbulent nonpremixed flame

    KAUST Repository

    Lucchesi, Marco

    2017-02-05

    A modeling framework based on Direct Simulation Monte Carlo (DSMC) is employed to simulate the evolution of the soot particle size distribution in turbulent sooting flames. The stochastic reactor describes the evolution of soot in fluid parcels following Lagrangian trajectories in a turbulent flow field. The trajectories are sampled from a Direct Numerical Simulation (DNS) of a n-heptane turbulent nonpremixed flame. The DSMC method is validated against experimentally measured size distributions in laminar premixed flames and found to reproduce quantitatively the experimental results, including the appearance of the second mode at large aggregate sizes and the presence of a trough at mobility diameters in the range 3–8 nm. The model is then applied to the simulation of soot formation and growth in simplified configurations featuring a constant concentration of soot precursors and the evolution of the size distribution in time is found to depend on the intensity of the nucleation rate. Higher nucleation rates lead to a higher peak in number density and to the size distribution attaining its second mode sooner. The ensemble-averaged PSDF in the turbulent flame is computed from individual samples of the PSDF from large sets of Lagrangian trajectories. This statistical measure is equivalent to time-averaged, scanning mobility particle size (SMPS) measurements in turbulent flames. Although individual trajectories display strong bimodality as in laminar flames, the ensemble-average PSDF possesses only one mode and a long, broad tail, which implies significant polydispersity induced by turbulence. Our results agree very well with SMPS measurements available in the literature. Conditioning on key features of the trajectory, such as mixture fraction or radial locations does not reduce the scatter in the size distributions and the ensemble-averaged PSDF remains broad. The results highlight and explain the important role of turbulence in broadening the size distribution of
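The ensemble-averaged PSDF construction described above can be sketched simply: each Lagrangian trajectory contributes one sample of the size distribution, and the samples are averaged bin-by-bin. The data below are toy values; the actual model evolves soot aggregates with DSMC along DNS trajectories:

```python
def trajectory_psdf(sizes, bins):
    """Normalized size histogram for one Lagrangian fluid parcel."""
    counts = [0] * (len(bins) - 1)
    for s in sizes:
        for i in range(len(bins) - 1):
            if bins[i] <= s < bins[i + 1]:
                counts[i] += 1
                break
    total = sum(counts) or 1
    return [c / total for c in counts]

def ensemble_psdf(trajectories, bins):
    """Bin-wise average of per-trajectory PSDFs; the analogue of a
    time-averaged SMPS measurement over many turbulent realizations."""
    samples = [trajectory_psdf(t, bins) for t in trajectories]
    return [sum(s[i] for s in samples) / len(samples)
            for i in range(len(bins) - 1)]

bins = [1, 3, 8, 20, 50]  # nominal diameter bin edges, nm (illustrative)
# Two toy "trajectories": one strongly bimodal, one nucleation-dominated
trajs = [[2, 2, 30, 35, 40], [2, 2, 2, 4, 5]]
avg = ensemble_psdf(trajs, bins)
```

Even though each individual sample here is sharply structured, the average is smoother and broader, which mirrors the paper's observation that the ensemble-averaged PSDF loses the bimodality visible in single trajectories.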

  18. Current issues and challenges in global analysis of parton distributions

    International Nuclear Information System (INIS)

    Tung, Wu-Ki

    2007-01-01

A new implementation of precise perturbative QCD calculation of deep inelastic scattering structure functions and cross sections, incorporating heavy quark mass effects, is applied to the global analysis of the full HERA I data sets on NC and CC cross sections, in conjunction with other experiments. Improved agreement between the NLO QCD theory and the global data sets is obtained. A comparison of the new results to those of the previous analysis, based on the conventional zero-mass parton formalism, is made. Exploratory work on the implications of new fixed-target neutrino scattering and Drell-Yan data for the global analysis is also discussed. (author)

  19. BioAtlas: Interactive web service for microbial distribution analysis

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

Massive amounts of 16S rRNA sequencing data have been stored in publicly accessible databases, such as GOLD, SILVA, GreenGenes (GG), and the Ribosomal Database Project (RDP). Many of these sequences are tagged with geo-locations. Nevertheless, researchers currently lack a user-friendly tool to analyze microbial distribution in a location-specific context. BioAtlas is an interactive web application that closes this gap between sequence databases, taxonomy profiling and geo/body-location information. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human...

  20. Analysis of the porosity distribution of mixed oxide pins

    International Nuclear Information System (INIS)

    Lieblich, M.; Lopez, J.

    1987-01-01

Within the framework of the Joint Irradiation Program IVO-FR2-Vg7 with the Karlsruhe Centre of Nuclear Research (KfK), the irradiation of 30 mixed-oxide fuel rods in the FR2 experimental reactor was carried out. The pins were located in 10 single-walled NaK capsules. The behaviour of the fuel during its burnup was studied, mainly the residual porosity and crack distribution in the pellet, partial densification, etc. In this work, 3 pins from capsule No. 165 were analyzed. The experimental results (pore and crack profiles) were interpreted with the fuel rod code SATURN. (Author) 20 refs

  1. Analysis and Synthesis of Distributed Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    like automotive electronics, real-time multimedia, avionics, medical equipment, and factory systems. The proposed analysis and synthesis techniques derive optimized implementations that fulfill the imposed design constraints. An important part of the implementation process is the synthesis...

  2. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

The development of a general-purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  3. Finite element analysis of thermal stress distribution in different ...

    African Journals Online (AJOL)

    Nigerian Journal of Clinical Practice • Jan-Feb 2016 • Vol 19 • Issue 1. Abstract ... Key words: Amalgam, finite element method, glass ionomer cement, resin composite, thermal stress ... applications for force analysis and assessment of different.

  4. Archiving, Distribution and Analysis of Solar-B Data

    Science.gov (United States)

    Shimojo, M.

    2007-10-01

The Solar-B Mission Operation and Data Analysis (MODA) working group has been discussing the data analysis system for Solar-B data since 2001. In this paper, based on the Solar-B MODA document and recent work in Japan, we introduce the dataflow from Solar-B to scientists, the data format and data levels of Solar-B data, and the data searching/providing system.

  5. Fuel distribution process risk analysis in East Borneo

    Directory of Open Access Journals (Sweden)

    Laksmita Raizsa

    2018-01-01

Fuel distribution is an important aspect of fulfilling customers' needs. The process is risky because delays can lead to fuel scarcity. Many risks occur during distribution, and the House of Risk method is used to mitigate them. The study identifies seven risk events and nine risk agents. An occurrence-severity matrix is used to screen out low-impact risks. House of Risk phase 1 is used to determine the Aggregate Risk Potential (ARP), and a Pareto diagram is applied to prioritize, based on ARP, the risks that must be mitigated by preventive actions. Four priority risks are identified, namely A8 (car trouble), A4 (human error), A3 (bank deposit errors and underpayment), and A6 (traffic accidents). House of Risk phase 2 maps preventive actions to risk agents and yields the effectiveness-to-difficulty (ETD) ratio for each mitigating action. Conducting a routine safety talk once every three days, with an ETD of 2088, is the primary preventive action.
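In the House of Risk method referenced above, the Aggregate Risk Potential of a risk agent combines its occurrence rating with the severities of the risk events it can trigger: ARP_j = O_j * sum_i(S_i * R_ij). A minimal sketch with hypothetical ratings (the event names and numbers are illustrative, not the study's data):

```python
def aggregate_risk_potential(occurrence, severities, correlations):
    """House of Risk phase 1: ARP_j = O_j * sum_i(S_i * R_ij).

    occurrence  : O_j, occurrence rating of risk agent j
    severities  : dict of risk event -> severity rating S_i
    correlations: dict of risk event -> correlation R_ij (typically 0, 1, 3, 9)
    """
    return occurrence * sum(severities[e] * correlations.get(e, 0)
                            for e in severities)

# Hypothetical ratings for a "car trouble" risk agent
sev = {"late delivery": 7, "fuel scarcity": 9}
corr = {"late delivery": 9, "fuel scarcity": 3}
arp_car_trouble = aggregate_risk_potential(6, sev, corr)  # 6*(7*9 + 9*3) = 540
```

Agents are then ranked by ARP and a Pareto cut selects the few agents (here four) that account for most of the aggregate risk.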

  6. Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

    Energy Technology Data Exchange (ETDEWEB)

    Henning, Maria Florencia [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Sanchez, Susana [Laboratory for Fluorescence Dynamics, University of California-Irvine, Irvine, CA (United States); Bakas, Laura, E-mail: lbakas@biol.unlp.edu.ar [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata (Argentina)

    2009-05-22

Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.
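The LAURDAN GP (generalized polarization) images mentioned above are computed pixel-by-pixel from two emission intensities, GP = (I_440 - I_490)/(I_440 + I_490); more ordered gel-phase domains give higher GP. A minimal sketch with illustrative intensity values:

```python
def laurdan_gp(i_blue, i_red):
    """LAURDAN generalized polarization:
    GP = (I_440 - I_490) / (I_440 + I_490), bounded in [-1, 1].
    Ordered (gel) lipid phases yield higher GP than disordered
    (liquid-crystalline) phases."""
    return (i_blue - i_red) / (i_blue + i_red)

# Illustrative per-pixel intensities (arbitrary counts, not real data)
gp_gel = laurdan_gp(800, 300)    # ordered domain: positive GP
gp_fluid = laurdan_gp(300, 700)  # disordered domain: negative GP
```

Applying this to every pixel of the two emission-channel images yields the GP map on which the selective partitioning of FITC-LPS into gel domains was judged.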

  7. Distributed sensing signal analysis of deformable plate/membrane mirrors

    Science.gov (United States)

    Lu, Yifan; Yue, Honghao; Deng, Zongquan; Tzou, Hornsen

    2017-11-01

    Deformable optical mirrors usually play key roles in aerospace and optical structural systems applied to space telescopes, radars, solar collectors, communication antennas, etc. Limited by the payload capacity of current launch vehicles, the deformable mirrors should be lightweight and are generally made of ultra-thin plates or even membranes. These plate/membrane mirrors are susceptible to external excitations and this may lead to surface inaccuracy and jeopardize relevant working performance. In order to investigate the modal vibration characteristics of the mirror, a piezoelectric layer is fully laminated on its non-reflective side to serve as sensors. The piezoelectric layer is segmented into infinitesimal elements so that microscopic distributed sensing signals can be explored. In this paper, the deformable mirror is modeled as a pre-tensioned plate and membrane respectively and sensing signal distributions of the two models are compared. Different pre-tensioning forces are also applied to reveal the tension effects on the mode shape and sensing signals of the mirror. Analytical results in this study could be used as guideline of optimal sensor/actuator placement for deformable space mirrors.
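The plate and membrane models compared above differ by a bending-stiffness term: for a pre-tensioned plate the dispersion relation can be written omega^2 = (D*k^4 + N*k^2)/(rho*h), and the membrane model is the D -> 0 limit. A sketch under that standard thin-structure assumption, with illustrative film properties (not the paper's mirror parameters):

```python
import math

def flexural_rigidity(E, h, nu):
    """Bending stiffness of a thin plate: D = E*h^3 / (12*(1 - nu^2))."""
    return E * h ** 3 / (12.0 * (1.0 - nu ** 2))

def natural_frequency(k, D, N, rho_h):
    """omega = sqrt((D*k^4 + N*k^2) / (rho*h)) for wavenumber k [1/m];
    setting D = 0 recovers the pre-tensioned membrane model."""
    return math.sqrt((D * k ** 4 + N * k ** 2) / rho_h)

# Illustrative polymer film: E = 2.5 GPa, h = 50 um, nu = 0.34
D = flexural_rigidity(2.5e9, 50e-6, 0.34)
rho_h = 1420.0 * 50e-6               # areal density [kg/m^2]
k = 2.0 * math.pi / 0.1              # mode with 0.1 m wavelength
f_plate = natural_frequency(k, D, 10.0, rho_h)      # tension N = 10 N/m
f_membrane = natural_frequency(k, 0.0, 10.0, rho_h)  # membrane limit
```

The bending term shifts the plate model's frequencies above the membrane model's, and raising the pre-tension N raises both, consistent with the tension effects on mode shape and sensing signal studied in the paper.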

  8. A Script Analysis of the Distribution of Counterfeit Alcohol Across Two European Jurisdictions

    OpenAIRE

    Lord, Nicholas; Spencer, Jonathan; Bellotti, Elisa; Benson, Katie

    2017-01-01

    This article presents a script analysis of the distribution of counterfeit alcohols across two European jurisdictions. Based on an analysis of case file data from a European regulator and interviews with investigators, the article deconstructs the organisation of the distribution of the alcohol across jurisdictions into five scenes (collection, logistics, delivery, disposal, proceeds/finance) and analyses the actual (or likely permutations of) behaviours within each scene. The analysis also i...

  9. Measurement based scenario analysis of short-range distribution system planning

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probability distributions of load demand and distributed generation are derived from historical measurement data and incorporated into the system planning. Simulations with various feasible scenarios are performed based on a local distribution system at Støvring in Denmark. Simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than when using the worst-case analysis, so that better planning can be achieved.
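A minimal sketch of the probabilistic idea above: build an empirical distribution from historical measurements and plan against a high quantile of load rather than the absolute worst case. The synthetic samples below are illustrative; the Støvring measurement data are not reproduced here:

```python
import random
import statistics

def empirical_quantile(samples, q):
    """q-th quantile (0 < q < 1) of an empirical distribution built
    directly from historical measurement data."""
    cuts = statistics.quantiles(samples, n=100, method='inclusive')
    return cuts[round(q * 100) - 1]

random.seed(1)
# Synthetic feeder load samples standing in for historical measurements
loads = [random.gauss(50.0, 10.0) for _ in range(1000)]

worst_case = max(loads)                  # deterministic worst-case value
p95 = empirical_quantile(loads, 0.95)    # probabilistic planning value
```

Sizing equipment for the 95th percentile instead of the worst case trades a small, quantified risk of overload for substantially lower reinforcement cost, which is the kind of insight the paper attributes to the probabilistic analysis.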

  10. The Innate Lymphoid Cell Precursor.

    Science.gov (United States)

    Ishizuka, Isabel E; Constantinides, Michael G; Gudjonson, Herman; Bendelac, Albert

    2016-05-20

    The discovery of tissue-resident innate lymphoid cell populations effecting different forms of type 1, 2, and 3 immunity; tissue repair; and immune regulation has transformed our understanding of mucosal immunity and allergy. The emerging complexity of these populations along with compounding issues of redundancy and plasticity raise intriguing questions about their precise lineage relationship. Here we review advances in mapping the emergence of these lineages from early lymphoid precursors. We discuss the identification of a common innate lymphoid cell precursor characterized by transient expression of the transcription factor PLZF, and the lineage relationships of innate lymphoid cells with conventional natural killer cells and lymphoid tissue inducer cells. We also review the rapidly growing understanding of the network of transcription factors that direct the development of these lineages.

  11. Precursor polymer compositions comprising polybenzimidazole

    Science.gov (United States)

    Klaehn, John R.; Peterson, Eric S.; Orme, Christopher J.

    2015-07-14

    Stable, high performance polymer compositions including polybenzimidazole (PBI) and a melamine-formaldehyde polymer, such as methylated, poly(melamine-co-formaldehyde), for forming structures such as films, fibers and bulky structures. The polymer compositions may be formed by combining polybenzimidazole with the melamine-formaldehyde polymer to form a precursor. The polybenzimidazole may be reacted and/or intertwined with the melamine-formaldehyde polymer to form the polymer composition. For example, a stable, free-standing film having a thickness of, for example, between about 5 .mu.m and about 30 .mu.m may be formed from the polymer composition. Such films may be used as gas separation membranes and may be submerged into water for extended periods without crazing and cracking. The polymer composition may also be used as a coating on substrates, such as metal and ceramics, or may be used for spinning fibers. Precursors for forming such polymer compositions are also disclosed.

  12. Demonstration of thin film pair distribution function analysis (tfPDF) for the study of local structure in amorphous and crystalline thin films

    Directory of Open Access Journals (Sweden)

    Kirsten M. Ø. Jensen

    2015-09-01

By means of normal-incidence, high-flux and high-energy X-rays, total scattering data for pair distribution function (PDF) analysis have been obtained from thin films (tf), suitable for local structure analysis. By using amorphous substrates as support for the films, the standard Rapid Acquisition PDF setup can be applied and the scattering signal from the film can be isolated from the total scattering data through subtraction of an independently measured background signal. No angular corrections to the data are needed, as would be the case for grazing-incidence measurements. The 'tfPDF' method is illustrated through studies of as-deposited (i.e., amorphous) and crystalline FeSb3 films, where the local structure analysis gives insight into the stabilization of the metastable skutterudite FeSb3 phase. The films were prepared by depositing ultra-thin alternating layers of Fe and Sb, which interdiffuse and, after annealing, crystallize to form the FeSb3 structure. The tfPDF data show that the amorphous precursor phase consists of corner-sharing FeSb6 octahedra with motifs highly resembling the local structure in crystalline FeSb3. Analysis of the amorphous structure allows the prediction of whether the final crystalline product will form the FeSb3 phase with or without excess Sb present. The study thus illustrates how analysis of the local structure in amorphous precursor films can help to understand crystallization processes of metastable phases, and opens up a range of new local structure studies of thin films.
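The key data-reduction steps of the tfPDF method described above are a simple background subtraction to isolate the film signal, followed by the standard sine Fourier transform of the reduced structure function to G(r). A schematic sketch, with placeholder arrays rather than real diffraction data:

```python
import math

def film_signal(total, substrate, scale=1.0):
    """Isolate the film scattering by subtracting the independently
    measured amorphous-substrate background from the total signal."""
    return [t - scale * s for t, s in zip(total, substrate)]

def g_of_r(q, fq, r):
    """Numerical sine transform G(r) = (2/pi) * sum F(Q) sin(Qr) dQ,
    where fq holds the reduced structure function F(Q) = Q[S(Q)-1]
    sampled on an evenly spaced grid q."""
    dq = q[1] - q[0]
    return (2.0 / math.pi) * sum(f * math.sin(qi * r)
                                 for qi, f in zip(q, fq)) * dq

# Placeholder example: two detector pixels, film = total - substrate
film = film_signal([5.0, 6.0], [1.0, 2.0])
```

Because the substrate is amorphous and measured independently, no angular (grazing-incidence) corrections enter this reduction, which is what keeps the standard Rapid Acquisition PDF pipeline applicable.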

  13. Analysis of the tropospheric water distribution during FIRE 2

    Science.gov (United States)

    Westphal, Douglas L.

    1993-01-01

The Penn State/NCAR mesoscale model, as adapted for use at ARC, was used as a testbed for the development and validation of cloud models for use in General Circulation Models (GCM's). This modeling approach also allows us to intercompare the predictions of the various cloud schemes within the same dynamical framework. The use of the PSU/NCAR mesoscale model also allows us to compare our results with FIRE-II (First International Satellite Cloud Climatology Project Regional Experiment) observations, instead of climate statistics. Though a promising approach, our work to date revealed several difficulties. First, the model by design is limited in spatial coverage and is only run for 12 to 48 hours at a time. Hence the quality of the simulation will depend heavily on the initial conditions. The poor quality of upper-tropospheric measurements of water vapor is well known, and the situation is particularly bad for mid-latitude winter since the coupling with the surface is less direct than in summer, so that relying on the model to spin up a reasonable moisture field is not always successful. Though one of the most common atmospheric constituents, water vapor is relatively difficult to measure accurately, especially operationally over large areas. The standard NWS sondes have little sensitivity at the low temperatures where cirrus form, and the data from the GOES 6.7 micron channel is difficult to quantify. For this reason, the goals of FIRE Cirrus II included characterizing the three-dimensional distribution of water vapor and clouds. In studying the data from FIRE Cirrus II, it was found that no single special observation technique provides accurate regional distributions of water vapor. The Raman lidar provides accurate measurements, but only at the Hub, for levels up to 10 km, and during nighttime hours. The CLASS sondes are more sensitive to moisture at low temperatures than are the NWS sondes, but the four stations only cover an area of two hundred kilometers on a side.

  14. Cluster analysis of Ukrainian regional distribution by level of innovation

    Directory of Open Access Journals (Sweden)

    Roman Shchur

    2016-07-01

A SWOT analysis of the threats and benefits of the innovation development strategy of the Ivano-Frankivsk region, in the context of financial support, was conducted. A methodical approach to determining the potential of public-private partnerships, a tool for financing innovative economic development, was identified. A cluster analysis of the possibilities of forming public-private partnerships in a particular region was carried out. An optimal set of problem areas that require urgent solutions and financial security is defined on the basis of the cluster approach. This will help to form practical recommendations for an effective financial mechanism in the regions of Ukraine. Key words: the mechanism of financial provision for innovation development, innovation development, public-private partnerships, cluster analysis, innovative development strategy.

  15. Quantitative analysis of distributed control paradigms of robot swarms

    DEFF Research Database (Denmark)

    Ngo, Trung Dung

    2010-01-01

Given a task of designing controller for mobile robots in swarms, one might wonder which distributed control paradigms should be selected. Until now, paradigms of robot controllers have been within either behaviour based control or neural network based control, which have been recognized as two mainstreams of controller design for mobile robots. However, in swarm robotics, it is not clear how to determine control paradigms. In this paper we study the two control paradigms with various experiments of swarm aggregation. First, we introduce the two control paradigms for mobile robots. Second, we describe the physical and simulated robots, experiment scenario, and experiment setup. Third, we present our robot controllers based on behaviour based and neural network based paradigms. Fourth, we graphically show their experiment results and quantitatively analyse the results in comparison of the two...

  16. BEANS - a software package for distributed Big Data analysis

    Science.gov (United States)

    Hypki, Arkadiusz

    2018-03-01

BEANS software is a web-based, easy to install and maintain, new tool to store and analyse a massive amount of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software is an answer to a growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and is ready to use in any other research field. It can also be used as a building block for other open source software.

  17. TCP isoeffect analysis using a heterogeneous distribution of radiosensitivity

    International Nuclear Information System (INIS)

    Carlone, Marco; Wilkins, David; Nyiri, Balazs; Raaphorst, Peter

    2004-01-01

    A formula for the α/β ratio is derived using the heterogeneous (population averaged) tumor control model. This formula is nearly identical to the formula obtained using the homogeneous (individual) tumor control model, but the new formula includes extra terms showing that the α/β ratio, the ratio of the mean value of α divided by the mean value of β that would be observed in a patient population, explicitly depends on the survival level and heterogeneity. The magnitude of this correction is estimated for prostate cancer, and this appears to raise the mean value of the ratio estimate by about 20%. The method also allows investigation of confidence limits for α/β based on a population distribution of radiosensitivity. For a widely heterogeneous population, the upper 95% confidence interval for the α/β ratio can be as high as 7.3 Gy, even though the population mean is between 2.3 and 2.6 Gy
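The population-averaged ("heterogeneous") tumor control model discussed above can be sketched by Monte Carlo: sample the radiosensitivity alpha from a population distribution and average the individual LQ/Poisson TCP. All parameter values below are illustrative assumptions, not the paper's fitted prostate values:

```python
import math
import random

def tcp_individual(alpha, beta, dose_per_fx, n_fx, clonogens=1e7):
    """LQ-based TCP for a single patient: Poisson probability that no
    clonogen survives n fractions of dose d."""
    d = dose_per_fx
    surviving = clonogens * math.exp(-n_fx * (alpha * d + beta * d * d))
    return math.exp(-surviving)

def tcp_population(alpha_mean, alpha_sd, beta, dose_per_fx, n_fx,
                   samples=2000):
    """Heterogeneous TCP: average the individual TCP over a (truncated
    normal) population distribution of alpha."""
    random.seed(0)  # deterministic sketch
    total = 0.0
    for _ in range(samples):
        a = max(random.gauss(alpha_mean, alpha_sd), 1e-6)
        total += tcp_individual(a, beta, dose_per_fx, n_fx)
    return total / samples

# Illustrative values: alpha = 0.15 1/Gy, alpha/beta = 3 Gy, 37 x 2 Gy
tcp_hom = tcp_individual(0.15, 0.05, 2.0, 37)         # homogeneous model
tcp_het = tcp_population(0.15, 0.05, 0.05, 2.0, 37)   # population average
```

The population average differs from the TCP evaluated at the mean alpha, which is the mechanism behind the paper's point that heterogeneity shifts the alpha/beta ratio inferred from isoeffect data.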

  18. Analysis and Optimization of Distributed Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    and scheduling policies. In this context, the task of designing such systems is becoming increasingly difficult. The success of new adequate design methods depends on the availability of efficient analysis as well as optimization techniques. In this paper, we present both analysis and optimization approaches...... characteristic to this class of systems: mapping of functionality, the optimization of the access to the communication channel, and the assignment of scheduling policies to processes. Optimization heuristics aiming at producing a schedulable system, with a given amount of resources, are presented....

  19. A data analysis expert system for large established distributed databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  20. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique, based on mass determination by X-ray absorption, allows fast analysis through automation and easy data handling, in addition to providing the accuracy required by quality control and research applications.
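Gravitational sedimentation analysis converts measured settling behaviour into particle size via Stokes' law, d = sqrt(18*eta*v / ((rho_p - rho_f)*g)). A minimal sketch of that conversion; the silica-in-water property values are illustrative:

```python
import math

def stokes_diameter(settling_velocity, fluid_viscosity, particle_density,
                    fluid_density, g=9.81):
    """Stokes-law equivalent spherical diameter [m] from the terminal
    settling velocity measured in a gravitational sedimentation run:
        d = sqrt(18 * eta * v / ((rho_p - rho_f) * g))
    Valid in the laminar (low Reynolds number) settling regime.
    """
    return math.sqrt(18.0 * fluid_viscosity * settling_velocity /
                     ((particle_density - fluid_density) * g))

# Illustrative: silica (2650 kg/m^3) in water (1000 kg/m^3, 1 mPa.s)
# settling at about 90 um/s corresponds to a ~10 um particle
d = stokes_diameter(9.0e-5, 1.0e-3, 2650.0, 1000.0)
```

An X-ray sedimentation instrument measures the mass concentration remaining in suspension versus time at a known depth, and this relation maps each elapsed time to the largest diameter still present, yielding the cumulative size distribution.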

  1. Distributed Robustness Analysis of Interconnected Uncertain Systems Using Chordal Decomposition

    DEFF Research Database (Denmark)

    Pakazad, Sina Khoshfetrat; Hansson, Anders; Andersen, Martin Skovgaard

    2014-01-01

    Large-scale interconnected uncertain systems commonly have large state and uncertainty dimensions. Aside from the heavy computational cost of performing robust stability analysis in a centralized manner, privacy requirements in the network can also introduce further issues. In this paper, we util...

  2. Channel flow analysis. [velocity distribution throughout blade flow field

    Science.gov (United States)

    Katsanis, T.

    1973-01-01

The design of a proper blade profile requires calculation of the blade row flow field in order to determine the velocities on the blade surfaces. The theory underlying several methods used for this calculation is presented, and the associated computer programs that were developed are discussed.

  3. Morphological analysis of polymer systems with broad particle size distribution

    Czech Academy of Sciences Publication Activity Database

    Šlouf, Miroslav; Ostafinska, Aleksandra; Nevoralová, Martina; Fortelný, Ivan

    2015-01-01

    Roč. 42, April (2015), s. 8-16 ISSN 0142-9418 R&D Projects: GA ČR(CZ) GA14-17921S Institutional support: RVO:61389013 Keywords : polymer blends * morphology * image analysis Subject RIV: JJ - Other Materials Impact factor: 2.350, year: 2015

  4. Silicon dioxide obtained by Polymeric Precursor Method

    International Nuclear Information System (INIS)

    Oliveira, C.T.; Granado, S.R.; Lopes, S.A.; Cavalheiro, A.A.

    2011-01-01

    The Polymeric Precursor Method is suitable for obtaining several types of oxide materials with high surface area, even when obtained in particle form. Several MO2-type oxides, such as those of titanium, silicon and zirconium, can be obtained by this methodology. In this work, the synthesis of silicon oxide was monitored by thermal analysis, XRD and surface area analysis in order to demonstrate the influence of several synthesis and calcining parameters. Surface area values as high as 370 m2/g, together with an increase in micropore volume, were obtained when the material was synthesized using ethylene glycol as the polymerizing agent. XRD analysis showed that the material is amorphous when calcined at 600°C regardless of calcining time, but the material morphology is strongly influenced by the polymeric resin composition. With glycerol as the polymerizing agent, the pore size increases and the surface area decreases with increasing decomposition time, compared to ethylene glycol. (author)

  5. Development of an accident sequence precursor methodology and its application to significant accident precursors

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung Hyun; Park, Sung Hyun; Jae, Moo Sung [Dept. of of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2017-03-15

    The systematic management of plant risk is crucial for enhancing the safety of nuclear power plants and for designing new nuclear power plants. Accident sequence precursor (ASP) analysis may provide the risk significance of operational experience by using probabilistic risk assessment to evaluate an operational event quantitatively in terms of its impact on core damage. In this study, an ASP methodology for two operation modes, full power and low power/shutdown operation, has been developed and applied to significant accident precursors that may occur during the operation of nuclear power plants. Two operational events, loss of feedwater and steam generator tube rupture, are identified as ASPs. Therefore, the ASP methodology developed in this study may contribute to identifying plant risk significance as well as to enhancing the safety of nuclear power plants by applying this methodology systematically.

  6. Enhanced detectability of fluorinated derivatives of N,N-dialkylamino alcohols and precursors of nitrogen mustards by gas chromatography coupled to Fourier transform infrared spectroscopy analysis for verification of chemical weapons convention.

    Science.gov (United States)

    Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K

    2009-11-06

    N,N-Dialkylamino alcohols, and N-methyldiethanolamine, N-ethyldiethanolamine and triethanolamine, are precursors of VX-type nerve agents and of three different nitrogen mustards, respectively. Their detection and identification are of paramount importance for verification analysis under the chemical weapons convention. GC-FTIR is used as a complementary technique to GC-MS analysis for identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to their high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents bearing trimethylsilyl, trifluoroacetyl and heptafluorobutyryl groups on an imidazole moiety were screened, and the derivatives formed were analyzed quantitatively by GC-FTIR. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity in GC-FTIR detection: sensitivity enhancements of 60-125-fold were observed for the analytes after HFBI derivatization. Absorbances due to the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ( [Formula: see text] ), assuming a uniform optical path length. The RSDs for intraday repeatability and interday reproducibility for the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. A limit of detection (LOD) of 10-15 ng was achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests.

  7. Advanced analysis of metal distributions in human hair

    International Nuclear Information System (INIS)

    Kempson, Ivan M.; Skinner, William M.

    2006-01-01

    A variety of techniques (scanning electron microscopy with energy-dispersive X-ray analysis, time-of-flight secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination occurring in hair from endogenous uptake, for an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little value as indicators of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function.

  8. Gaze distribution analysis and saliency prediction across age groups.

    Science.gov (United States)

    Krishna, Onkar; Helo, Andrea; Rämä, Pia; Aizawa, Kiyoharu

    2018-01-01

    Knowledge of the human visual system helps to develop better computational models of visual attention. State-of-the-art models have been developed to mimic the visual attention system of young adults but largely ignore the variations that occur with age. In this paper, we investigate how visual scene processing changes with age and propose an age-adapted framework that helps to develop a computational model that can predict saliency across different age groups. Our analysis uncovers how the explorativeness of an observer varies with age, how well the saliency maps of an age group agree with the fixation points of observers from the same or different age groups, and how age influences the center bias tendency. We analyzed the eye movement behavior of 82 observers belonging to four age groups while they explored visual scenes. Explorativeness was quantified in terms of the entropy of a saliency map, and the area under the curve (AUC) metric was used to quantify the agreement analysis and the center bias tendency. The analysis results were used to develop age-adapted saliency models. Our results suggest that the proposed age-adapted saliency model outperforms existing saliency models in predicting the regions of interest across age groups.
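
    The abstract quantifies explorativeness as the entropy of a saliency map. A minimal sketch of that quantity (the function name and toy maps are invented for illustration, not taken from the paper):

    ```python
    import numpy as np

    def saliency_entropy(smap):
        """Shannon entropy (bits) of a saliency map, a common proxy for how
        spread-out (explorative) the predicted attention is."""
        p = np.asarray(smap, dtype=float).ravel()
        p = p / p.sum()          # normalise to a probability distribution
        p = p[p > 0]             # ignore zero cells (0 * log 0 = 0)
        return float(-(p * np.log2(p)).sum())

    # A map concentrated on one cell has zero entropy; a uniform 8x8 map
    # attains the maximum, log2(64) = 6 bits.
    focused = np.zeros((8, 8)); focused[4, 4] = 1.0
    uniform = np.ones((8, 8))
    ```

    Higher entropy corresponds to fixations spread over more of the scene, which is how the paper compares exploration across age groups.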

  9. Distributed Finite Element Analysis Using a Transputer Network

    Science.gov (United States)

    Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

    1989-01-01

    The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.
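
    The quoted cost-performance figure follows directly from the timings and prices in the abstract:

    ```python
    # Numbers from the abstract: Space Shuttle main engine turbine blade model.
    cray_cost, transputer_cost = 15_000_000, 80_000   # USD
    cray_time, transputer_time = 23.9, 71.7           # seconds to solution

    speed_ratio = transputer_time / cray_time         # Cray is ~3x faster
    cost_ratio = cray_cost / transputer_cost          # Cray is 187.5x more expensive
    cost_performance_advantage = cost_ratio / speed_ratio
    print(cost_performance_advantage)                 # ~62, i.e. "about 60 times"
    ```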

  10. Analysis of fold distributions of segmented clover detectors

    International Nuclear Information System (INIS)

    Bhattacharya, Pintu; Kshetri, Ritesh

    2015-01-01

    We have studied the effect of segmentation on the full-energy deposition of a gamma-ray through studies of the fold distribution. The response of seven segmented TIGRESS detectors up to an energy of 8 MeV has been studied by utilizing standard sources of 152Eu and 56,60Co and a radioactive 11Be beam with an energy of 16.5 MeV. The experiment was performed at the ISAC-II facility at TRIUMF, using a thick gold target. The β- decay of 11Be (t1/2 = 13.81(8) s) produces high-energy gamma-rays up to 7974 keV. A 1 mm thick annular double-sided silicon detector of the BAMBINO setup was mounted 19.4 mm downstream of the target position and used for detection of the electrons in coincidence with the gamma-rays from the seven TIGRESS detectors. The master trigger allowed data to be collected either in Ge singles mode or with a Ge-Si coincidence condition. The standard sources of 152Eu and 56,60Co were also used to obtain low-energy data.

  11. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    International Nuclear Information System (INIS)

    Frome, E.L.; Watkins, J.P.; Hagemeyer, D.A.

    2009-01-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
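
    The indicators described here (MLE Weibull fit, 99th percentile, exceedance fraction) can be reproduced with SciPy; a minimal sketch on synthetic doses, where the shape, scale, sample size and the 5 mSv threshold are all invented for illustration:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    # Hypothetical individual annual doses (mSv); parameters are invented.
    doses = stats.weibull_min.rvs(0.7, scale=1.5, size=500, random_state=rng)

    # Maximum likelihood fit with the location fixed at zero, as for dose data.
    shape, loc, scale = stats.weibull_min.fit(doses, floc=0)

    p99 = stats.weibull_min.ppf(0.99, shape, scale=scale)       # 99th percentile
    exceedance = stats.weibull_min.sf(5.0, shape, scale=scale)  # fraction above 5 mSv
    ```

    A shallower fitted slope (smaller shape parameter) on a Weibull probability plot would indicate a wider spread of doses, which is what the paper uses to compare ALARA performance across sites.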

  12. Analysis of dose distribution in interventionist radiology professionals

    International Nuclear Information System (INIS)

    Mauricio, Claudia L.P.; Silva, Leonardo Peres; Canevaro, Lucia V.; Luz, Eara de Souza

    2005-01-01

    In this work, an evaluation was made of the distribution of doses received by professionals involved in some procedures of Interventional Radiology at hospitals and clinics in Rio de Janeiro, RJ, Brazil. For these measurements, thermoluminescent dosemeters (TLDs) of LiF:Mg,Ti (TLD-100) were used, positioned at different points of the body of the professionals: the hands, knees, neck, forehead and chest, inside and outside the lead apron. The measurements were made per procedure and/or per day of work, and the TLDs were calibrated in terms of the operational quantity personal dose equivalent, Hp(d), at different depths: 0.07 mm, 3 mm and 10 mm. In some procedures, physicians (staff physicians and residents) received significant doses. The results show the importance of appropriate training of these professionals and of the use of personal protective equipment (PPE), such as the thyroid shield, which is not always used. Based on these evaluations, some suggestions were made in order to optimize the dose values in these procedures, together with a discussion on the need for additional monitoring points.

  13. Distributed support modelling for vertical track dynamic analysis

    Science.gov (United States)

    Blanco, B.; Alonso, A.; Kari, L.; Gil-Negrete, N.; Giménez, J. G.

    2018-04-01

    The finite length nature of rail-pad supports is characterised by a Timoshenko beam element formulation over an elastic foundation, giving rise to the distributed support element. The new element is integrated into a vertical track model, which is solved in frequency and time domain. The developed formulation is obtained by solving the governing equations of a Timoshenko beam for this particular case. The interaction between sleeper and rail via the elastic connection is considered in an analytical, compact and efficient way. The modelling technique results in realistic amplitudes of the 'pinned-pinned' vibration mode and, additionally, it leads to a smooth evolution of the contact force temporal response and to reduced amplitudes of the rail vertical oscillation, as compared to the results from concentrated support models. Simulations are performed for both parametric and sinusoidal roughness excitation. The model of support proposed here is compared with a previous finite length model developed by other authors, coming to the conclusion that the proposed model gives accurate results at a reduced computational cost.
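
    A Timoshenko beam over an elastic (Winkler-type) foundation of stiffness per unit length k_f is typically governed by coupled equations of the following standard form; this is a generic sketch of that class of model, not necessarily the paper's exact formulation:

    ```latex
    \kappa G A \left( \frac{\partial^2 w}{\partial x^2} - \frac{\partial \phi}{\partial x} \right) - k_f\, w = \rho A \frac{\partial^2 w}{\partial t^2},
    \qquad
    E I \frac{\partial^2 \phi}{\partial x^2} + \kappa G A \left( \frac{\partial w}{\partial x} - \phi \right) = \rho I \frac{\partial^2 \phi}{\partial t^2}
    ```

    Here w is the vertical deflection, φ the cross-section rotation, κ the shear correction factor, EI the bending stiffness and ρA, ρI the translational and rotational inertia per unit length; the distributed support enters through the k_f w term acting over the finite pad length rather than at a point.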

  14. A framework for establishing the technical efficiency of Electricity Distribution Counties (EDCs) using Data Envelopment Analysis

    International Nuclear Information System (INIS)

    Mullarkey, Shane; Caulfield, Brian; McCormack, Sarah; Basu, Biswajit

    2015-01-01

    Highlights: • Six models are employed to establish the technical efficiency of Electricity Distribution Counties. • A diagnostic parameter is incorporated to account for differences across Electricity Distribution Counties. • The amalgamation of Electricity Distribution Counties leads to improved efficiency in the production of energy. - Abstract: European energy market liberalization has entailed the restructuring of electric power markets through the unbundling of electricity generation, transmission, distribution and supply activities, and the introduction of competition into electricity generation. Under these new electricity market regimes, it is important to have an evaluation tool capable of examining the impacts of these market changes. The adoption of Data Envelopment Analysis as a form of benchmarking for electricity distribution regulation is one method of conducting this analysis. This paper applies a Data Envelopment Analysis framework to the electricity distribution network in Ireland to explore the merits of this approach and to determine the technical efficiency and the potential scope for efficiency improvements through reorganization and amalgamation of the distribution network in Ireland. The results presented show that overall grid efficiency is improved through this restructuring. A diagnostic parameter is defined and pursued to account for aberrations across Electricity Distribution Counties, as opposed to the traditionally employed environmental variables. The adoption of this diagnostic parameter leads to a more intuitive understanding of Electricity Distribution Counties.
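
    The benchmarking step can be illustrated with a minimal input-oriented, constant-returns-to-scale (CCR) DEA score solved as a linear program. This is a generic sketch with invented toy data and names, not the paper's six-model framework:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """Input-oriented CCR (constant returns to scale) DEA score of DMU o.
        X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta in (0, 1]."""
        n = X.shape[0]
        # Decision variables: [theta, lambda_1 ... lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_j <= theta * x_o
        A_in = np.hstack([-X[[o]].T, X.T])
        # Outputs: sum_j lambda_j * y_j >= y_o
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    # Toy data: one input (e.g. network length), one output (energy delivered).
    X = np.array([[2.0], [4.0], [3.0]])
    Y = np.array([[2.0], [2.0], [3.0]])
    scores = [ccr_efficiency(X, Y, o) for o in range(3)]
    ```

    A score of 1 marks a unit on the efficiency frontier; a score of 0.5 means the unit could in principle produce its current output with half its inputs.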

  15. X-ray scattering study of thermal nanopore templating in hybrid films of organosilicate precursor and reactive four-armed porogen

    International Nuclear Information System (INIS)

    Yoon, Jinhwan; Heo, Kyuyoung; Oh, Weontae; Jin, Kyeong Sik; Jin, Sangwoo; Kim, Jehan; Kim, Kwang-Woo; Chang, Taihyun; Ree, Moonhor

    2006-01-01

    The miscibility and the mechanism of thermal nanopore templating in films prepared by spin-coating and subsequent drying of homogeneous solutions of a curable polymethylsilsesquioxane dielectric precursor and a thermally labile, reactive triethoxysilyl-terminated four-armed poly(ε-caprolactone) porogen were investigated in detail by in situ two-dimensional grazing incidence small-angle X-ray scattering analysis. The dielectric precursor and porogen components in the film were fully miscible. On heating, however, limited aggregation of the porogen took place in only a small temperature range of 100-140 °C as a result of phase separation induced by the competition of the curing and hybridization reactions of the dielectric precursor and porogen; higher porogen loading resulted in relatively large porogen aggregates and a broader size distribution. The developed porogen aggregates underwent thermal firing above 300 °C without further growth and movement, and ultimately left their individual footprints in the film as spherical nanopores.

  16. Analysis of distribution of PSL intensity recorded in imaging plate

    International Nuclear Information System (INIS)

    Oda, Keiji; Tsukahara, Kazutaka; Tada, Hidenori; Yamauchi, Tomoya

    2006-01-01

    Supplementary experiments and theoretical considerations have been carried out on a new method for particle identification with an imaging plate, which was proposed in a previous paper. The imaging plate was exposed to 137Cs γ-rays, 2 MeV protons accelerated by a tandem Van de Graaff, and X-rays emitted from a tube operated at 20-70 kV, as well as α- and β-rays. The frequency distribution of the PSL intensity in a pixel of 100 μm x 100 μm was measured, and the standard deviation was obtained by fitting to a Gaussian. It was confirmed that the relative standard deviation decreased with the average PSL intensity for every radiation species and that the curves were roughly divided into four groups: α-rays, protons, β-rays and photons. In the second step, these data were analyzed by plotting the square of the relative standard deviation against the average PSL intensity on a full-log scale, where the relation should be expressed by a straight line with a slope of -1 provided that the deviation is dominated only by statistical fluctuation. The data for α- and β-rays deviated from a straight line and approached saturated values as the average PSL intensity increased. This saturation was considered to be caused by inhomogeneity in the source intensity. It was also pointed out that the value of the intercept on the full-log plot carries important information about the PSL reading efficiency, one of the characteristic parameters of an imaging plate. (author)
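
    The slope-of-minus-one diagnostic can be checked numerically: for a purely Poisson signal, (σ/μ)² = 1/μ, so the squared relative standard deviation falls by one decade per decade of mean intensity. A generic simulation (not the imaging-plate data):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    for mean in (10, 100, 1000):
        # Simulated per-pixel signals dominated by counting (Poisson) statistics.
        pixels = rng.poisson(mean, size=200_000)
        rel_var = pixels.var() / pixels.mean() ** 2
        # For a Poisson process, (sigma/mu)^2 = 1/mu, i.e. slope -1 in full-log scale.
        print(mean, rel_var)
    ```

    Any flattening of this line at high intensity, as observed for the α- and β-ray data, signals an extra non-statistical variance source such as source inhomogeneity.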

  17. Photoelastic analysis of stress distribution in oral rehabilitation.

    Science.gov (United States)

    Turcio, Karina Helga Leal; Goiato, Marcelo Coelho; Gennari Filho, Humberto; dos Santos, Daniela Micheline

    2009-03-01

    The purpose of this study was to present a literature review of photoelasticity, a laboratory method for evaluating the behavior of implant prostheses. Fixed or removable prostheses function as levers on supporting teeth, allowing forces to cause tooth movement if not carefully planned. Hence, during treatment planning, the dentist must be aware of the biomechanics involved and prevent movement of supporting teeth by decreasing the lever-type forces generated by these prostheses. Photoelastic analysis has great applicability in restorative dentistry, as it allows critical biomechanical points to be predicted and minimized through modifications in treatment planning.

  18. Lognormal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of α-Particle Track Autoradiography

    Science.gov (United States)

    Neti, Prasad V.S.V.; Howell, Roger W.

    2010-01-01

    Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown, with the aid of autoradiography, to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049-1058). To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
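
    The Poisson-lognormal structure tested by the authors can be sketched by compounding Poisson counting with log-normally distributed cellular activities. All parameter values below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical per-cell activities: log-normally distributed across the population.
    activity = rng.lognormal(mean=1.0, sigma=0.8, size=50_000)
    # Observed alpha-track counts: Poisson counting on top of each cell's activity.
    tracks = rng.poisson(activity)

    # A pure Poisson model would give variance ~ mean; the LN mixing inflates it.
    print(tracks.mean(), tracks.var())
    ```

    The overdispersion (variance well above the mean) is the signature that distinguishes the compound P-LN model from a plain Poisson fit to track counts.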

  19. Impact analysis of automotive structures with distributed smart material systems

    Science.gov (United States)

    Peelamedu, Saravanan M.; Naganathan, Ganapathy; Buckley, Stephen J.

    1999-06-01

    A new class of automobiles has structural skins that are quite different from current designs. In particular, new families of composite skins are being developed with new injection molding processes. While these skins support the concept of lighter vehicles of the future, they are also susceptible to damage upon impact. It is important that their design be based on a better understanding of the types of impact loads and the resulting strains and damage. It is possible that these skins can be integrally designed with active materials to counter damage. This paper presents a preliminary analysis of a new class of automotive skins, using piezoceramics as the smart material. The main objective is to consider the complex system with the skin modeled as a layered plate structure involving a lightweight foam material with active materials embedded in it. To begin with, a cantilever beam structure is subjected to a load through a piezoceramic, and the resulting strain at the active material site is predicted, accounting for the material properties, piezoceramic thickness and adhesive thickness, including the effect of the adhesive. A finite element analysis is carried out for comparison with the experimental work. Further work in this direction would provide an analytical tool and the basis for algorithms to predict and counter impacts on future classes of automobiles.

  20. Analysis of the international distribution of per capita CO2 emissions using the polarization concept

    International Nuclear Information System (INIS)

    Duro, Juan Antonio; Padilla, Emilio

    2008-01-01

    The concept of polarization is linked to the extent to which a given distribution leads to the formation of homogeneous groups with opposing interests. This concept, which is fundamentally different from the traditional concept of inequality, is related to the level of potential conflict inherent in a distribution. The polarization approach has been widely applied in the analysis of income distribution. Extending this approach to the analysis of the international distribution of CO2 emissions is quite useful, as it provides a powerful informative instrument for characterizing the state and evolution of the international distribution of emissions and its possible political consequences in terms of tensions and the probability of achieving agreements. In this paper we analyze the international distribution of per capita CO2 emissions between 1971 and 2001 through an adaptation of polarization concepts and measures. We find that the most interesting grouped description derived from the analysis is one with two groups, which broadly coincide with the Annex B and non-Annex B countries of the Kyoto Protocol; this shows the power of polarization analysis for explaining the formation of groups in the real world. The analysis also shows a significant reduction in the international polarization of per capita CO2 emissions between 1971 and 1995, but little change since 1995, which might indicate that the polarized distribution of emissions is still one of the important factors making it difficult to achieve agreements on reducing global emissions. (author)
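
    A common polarization measure in this literature is the Esteban-Ray index. A minimal sketch, with the normalization constant taken as 1 and all data invented, showing how a two-bloc distribution scores higher than the same range spread over many small groups:

    ```python
    import numpy as np

    def esteban_ray(shares, y, alpha=1.0):
        """Esteban-Ray polarization index:
        P = sum_ij pi_i^(1+alpha) * pi_j * |y_i - y_j|,
        with pi_i the population shares and y_i per capita emissions."""
        pi = np.asarray(shares, dtype=float)
        y = np.asarray(y, dtype=float)
        return float(np.sum(pi[:, None] ** (1 + alpha) * pi[None, :]
                            * np.abs(y[:, None] - y[None, :])))

    # Two equal blocs with opposed emission levels are strongly polarized...
    bimodal = esteban_ray([0.5, 0.5], [1.0, 10.0])
    # ...while spreading the same range over many small groups lowers the index.
    spread = esteban_ray(np.full(10, 0.1), np.linspace(1.0, 10.0, 10))
    ```

    Unlike an inequality index, the measure rewards within-group homogeneity and between-group distance, which is what makes it suitable for detecting the Annex B / non-Annex B split described in the abstract.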

  1. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    Science.gov (United States)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including dynamic discovery of new services as they were added. A prototype was built and, while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty of calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued so that the community could take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. Collaboration with the VHO team helped integrate the new architecture into the VHO; this allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second demonstration of the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  2. Chromosome aberration analysis based on a beta-binomial distribution

    International Nuclear Information System (INIS)

    Otake, Masanori; Prentice, R.L.

    1983-10-01

    The analyses carried out here generalize earlier studies of chromosomal aberrations in the populations of Hiroshima and Nagasaki by allowing extra-binomial variation in aberrant cell counts, corresponding to within-subject correlations in cell aberrations. Strong within-subject correlations were detected, with corresponding standard errors for the average number of aberrant cells that were often substantially larger than previously assumed. The extra-binomial variation is accommodated in the analysis in the present report, as described in the section on dose-response models, by using a beta-binomial (B-B) variance structure. It is emphasized that there is generally satisfactory agreement between the observed and the B-B fitted frequencies by city-dose category. The chromosomal aberration data considered here are not extensive enough to allow a precise discrimination between competing dose-response models; however, a quadratic gamma-ray and linear neutron model most closely fits the chromosome data. (author)
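
    The beta-binomial variance structure used to accommodate extra-binomial variation can be illustrated with SciPy's `betabinom` distribution; the cell count and Beta parameters below are invented for illustration:

    ```python
    from scipy import stats

    n = 100                  # cells scored per subject
    a, b = 0.5, 49.5         # illustrative Beta parameters (mean aberrant fraction 1%)

    # Beta-binomial: the per-subject aberration probability is itself random,
    # which models within-subject correlation among the scored cells.
    bb_var = stats.betabinom(n, a, b).var()
    # Plain binomial with the same mean fraction, for comparison.
    bin_var = stats.binom(n, a / (a + b)).var()
    print(bin_var, bb_var)   # the beta-binomial variance is inflated
    ```

    The inflation factor, (a + b + n) / (a + b + 1), is what drives the larger standard errors for the average number of aberrant cells reported in the abstract.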

  3. Movement of Fuel Ashore: Storage, Capacity, Throughput, and Distribution Analysis

    Science.gov (United States)

    2015-12-01

    in MPEM, as the basis for analysis. In keeping with the spirit of EF21 and 33 seabasing concepts, this approach assumes that all other combat...

  4. Fault Diagnosis for Electrical Distribution Systems using Structural Analysis

    DEFF Research Database (Denmark)

    Knüppel, Thyge; Blanke, Mogens; Østergaard, Jacob

    2014-01-01

    When the topology of the network changes, analytic redundancy relations (ARR) are likely to change. The algorithms used for diagnosis may need to change accordingly, and finding efficient methods for ARR generation is essential to employ fault-tolerant methods in the grid. Structural analysis (SA) is based on graph-theoretical results that offer to find analytic redundancies in large sets of equations from the structure (topology) of the equations alone. A salient feature is automated generation of redundancy relations. The method is indeed feasible in electrical networks, where circuit theory and network topology together formulate the constraints that define a structure graph. This paper shows how three-phase networks are modelled and analysed using structural methods, and it extends earlier results by showing how physical faults can be identified such that adequate remedial actions can be taken. The paper illustrates a feasible modelling technique for structural analysis.

  5. Radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Planinic, J.; Radolic, V.; Vukovic, B.

    2004-01-01

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined

  6. Radon as an earthquake precursor

    Energy Technology Data Exchange (ETDEWEB)

    Planinic, J. E-mail: planinic@pedos.hr; Radolic, V.; Vukovic, B

    2004-09-11

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined.

  7. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are found, and their relationships with tectonics, site conditions and basins are analyzed. In this paper, the influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the anomalous damage distribution for seismic zonation, anti-earthquake design, earthquake prediction and earthquake emergency response are discussed.

  8. Fluorinated Phenylalanine Precursor Resistance in Yeast

    Directory of Open Access Journals (Sweden)

    Ian S. Murdoch

    2018-06-01

    Full Text Available Development of a counter-selection method for phenylalanine auxotrophy could be a useful tool in the repertoire of yeast genetics. Fluorinated and sulfurated precursors of phenylalanine were tested for toxicity in Saccharomyces cerevisiae. One such precursor, 4-fluorophenylpyruvate (FPP), was found to be toxic to several strains from the Saccharomyces and Candida genera. Toxicity was partially dependent on ARO8 and ARO9, and correlated with a strain’s ability to convert FPP into 4-fluorophenylalanine (FPA). Thus, strains with deletions in ARO8 and ARO9, having a mild phenylalanine auxotrophy, could be separated from a culture of wild-type strains using FPP. Tetrad analysis suggests FPP resistance in one strain is due to two genes. Strains resistant to FPA have previously been shown to exhibit increased phenylethanol production. However, FPP resistant isolates did not follow this trend. These results suggest that FPP could effectively be used for counter-selection but not for enhanced phenylethanol production.

  9. Distributional Benefit Analysis of a National Air Quality Rule

    Directory of Open Access Journals (Sweden)

    Jin Huang

    2011-06-01

    Full Text Available Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA) must perform environmental justice (EJ) reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA’s Heavy Duty Diesel (HDD) rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks depend also on subgroups’ baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well.
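The within- versus between-subgroup comparison relies on a decomposable inequality index. The sketch below uses the Theil T index as a stand-in (an assumption; the abstract does not name the specific index), applied to hypothetical exposure data for two subgroups with similar means but wide spread inside each group:

```python
import numpy as np

def theil_decomposition(x, groups):
    """Theil T index split exactly into between- and within-group parts."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    T_total = np.mean(x / mu * np.log(x / mu))
    between = within = 0.0
    for g in np.unique(groups):
        xg = x[groups == g]
        share, mug = len(xg) / len(x), xg.mean()
        # between-group part: inequality of the group means
        between += share * (mug / mu) * np.log(mug / mu)
        # within-group part: each group's own Theil index, weighted
        within += share * (mug / mu) * np.mean(xg / mug * np.log(xg / mug))
    return T_total, between, within

rng = np.random.default_rng(0)
# Hypothetical exposure estimates for two subgroups: nearly equal means,
# large variation within each subgroup
exposure = np.concatenate([rng.lognormal(0.0, 0.6, 5000),
                           rng.lognormal(0.1, 0.6, 5000)])
groups = np.repeat([0, 1], 5000)
T, between, within = theil_decomposition(exposure, groups)
```

The decomposition is exact (total = between + within), which is what makes the "within far outweighs between" comparison well defined.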

  10. Sensitivity Analysis of Dynamic Tariff Method for Congestion Management in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Liu, Zhaoxi

    2015-01-01

    The dynamic tariff (DT) method is designed for the distribution system operator (DSO) to alleviate congestions that might occur in a distribution network with high penetration of distributed energy resources (DERs). Sensitivity analysis of the DT method is crucial because of its decentralized control manner. The sensitivity analysis can obtain the changes of the optimal energy planning, and thereby the line loading profiles, over infinitely small changes of parameters by differentiating the KKT conditions of the convex quadratic programming over which the DT method is formed. Three case...

  11. Parametric distribution approach for flow availability in small hydro potential analysis

    Science.gov (United States)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

    Small hydro systems are an important source of renewable energy and are recognized worldwide as a clean energy source. Small hydropower generation, which uses the potential energy of flowing water to produce electricity, is often questioned because the power generated is inconsistent and intermittent. Potential analysis of a small hydro system, which depends mainly on the availability of water, requires knowledge of the stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution of a small hydro system. Considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the defining characteristic of the Pearson system: the direct relation between the first four statistical moments of the distribution. The advantage of applying these statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.
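The Pearson system selects a distribution family from the first four moments alone. A minimal sketch of that moment computation and the classical κ criterion follows; the function name and demo data are illustrative, not the paper's flow record.

```python
import numpy as np

def pearson_criterion(x):
    """First four central moments and the Pearson criterion kappa.

    (beta1, beta2) = (squared skewness, kurtosis) place a distribution in
    the Pearson plane; kappa picks the family (kappa < 0: Type I,
    0 < kappa < 1: Type IV, kappa > 1: Type VI).
    """
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    d = x - mu
    m2, m3, m4 = (d ** 2).mean(), (d ** 3).mean(), (d ** 4).mean()
    beta1 = m3 ** 2 / m2 ** 3          # squared skewness
    beta2 = m4 / m2 ** 2               # kurtosis
    kappa = (beta1 * (beta2 + 3.0) ** 2
             / (4.0 * (4.0 * beta2 - 3.0 * beta1)
                * (2.0 * beta2 - 3.0 * beta1 - 6.0)))
    return mu, m2, beta1, beta2, kappa

# Tiny illustrative "flow record"; a symmetric sample gives beta1 = kappa = 0
mu, m2, b1, b2, kap = pearson_criterion([1.0, 2.0, 3.0, 4.0, 5.0])
```

In a real analysis `x` would be the observed stream flow series, and the resulting Pearson type would determine which parametric form to fit for the flow availability curve.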

  12. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    Science.gov (United States)

    Singh, R.; Percivall, G.

    2009-12-01

    Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows and geo-processing and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of Climate Change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia, and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. AIP-2: Architecture Implementation Pilot, Phase 2 CCIP: Climate Challenge Integration Plugfest GEO: Group on Earth Observations GEOSS: Global Earth Observing System of Systems GCOS: Global Climate Observing System OGC: Open Geospatial Consortium SOS: Sensor Observation Service WCS: Web Coverage Service WCPS: Web Coverage Processing Service WFS: Web Feature Service WMS: Web Mapping Service

  13. The influence of different matrices on the nature and content of haloacetic acids precursors in ozonized water

    Directory of Open Access Journals (Sweden)

    Molnar Jelena J.

    2012-01-01

    Full Text Available This paper investigates the influence of different matrices (groundwater, a realistic natural matrix, and a commercial humic acid solution, a synthetic matrix) on the nature and content of haloacetic acid (HAA) precursors in ozonized water (0.4 to 3.0 mg O3/mg DOC; pH 6). Natural organic matter (NOM) characterization of the natural matrix showed it was largely of hydrophobic character (65% fulvic and 14% humic acids), with the hydrophilic fractions HPIA and HPI-NA at 12% and 9%, respectively. At approximately the same dissolved organic carbon (DOC) content of the investigated matrices (~10 mg/L), a greater degree of hydrophobicity was seen in the humic acid solution than in the natural matrix, resulting in a higher content of HAA precursors (559 ± 21 μg/L in the synthetic matrix compared to 309 ± 15 μg/L in the natural matrix). By applying different ozone doses (0.4 to 3.0 mg O3/mg DOC), the DOC content of the studied matrices was reduced by 6-22%, with maximum process efficacy achieved at 3.0 mg O3/mg DOC. Ozonation also led to changes in the NOM structure, i.e. complete oxidation of the humic acid fractions in both investigated matrices. After oxidation, hydrophilic structures dominate the natural water matrix (65%), whereas the synthetic matrix has an equal distribution of hydrophobic and hydrophilic fractions (~50%). Changes in the content and structure of NOM during ozonation resulted in a reduction of the total HAA precursor content (63-85%) at 3.0 mg O3/mg DOC. Detailed analysis of the reactivity of the residual HAA precursor material shows that ozonation at 3.0 mg O3/mg DOC reduced the reactivity of the NOM fractions in comparison to the raw water. By contrast, the HAA precursor material present in the commercial HA solution was transformed by ozonation into other reactive compounds, i.e. precursors originating from the fulvic acid and hydrophilic fractions. The results of the laboratory testing indicate that the

  14. THE COMPARATIVE ANALYSIS OF TWO DIFFERENT STATISTICAL DISTRIBUTIONS USED TO ESTIMATE THE WIND ENERGY POTENTIAL

    Directory of Open Access Journals (Sweden)

    Mehmet KURBAN

    2007-01-01

    Full Text Available In this paper, the wind energy potential of the region is analyzed with the Weibull and Rayleigh statistical distribution functions, using wind speed data measured every 15 seconds in July, August, September, and October of 2005 at a height of 10 m on the 30-m observation pole of the wind observation station constructed within the scientific research project "The Construction of a Hybrid (Wind-Solar) Power Plant Model by Determining the Wind and Solar Potential in the Iki Eylul Campus of A.U." supported by Anadolu University. The maximum likelihood method is used to estimate the parameters of these distributions. The analysis for the months considered shows that the Weibull distribution models the wind speeds better than the Rayleigh distribution. Furthermore, the error in the monthly power density values computed using the Weibull distribution is smaller than for the Rayleigh distribution.
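The parameter estimation step can be sketched as follows: a maximum likelihood fit of the Weibull shape and scale (Newton iteration on the profile likelihood equation) compared against the one-parameter Rayleigh fit by log-likelihood. The synthetic wind speed data and all numerical choices below are assumptions for illustration, not the station's measurements.

```python
import numpy as np

def weibull_mle(x, iters=30):
    """Maximum likelihood fit of Weibull shape k and scale c.

    Newton iteration on the profile equation for k:
    sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0.
    """
    lx = np.log(x)
    k = 1.0
    for _ in range(iters):
        xk = x ** k
        A = np.sum(xk * lx) / np.sum(xk)
        B = np.sum(xk * lx * lx) / np.sum(xk)
        f = A - 1.0 / k - lx.mean()
        fp = B - A * A + 1.0 / k ** 2   # derivative of f w.r.t. k (> 0)
        k -= f / fp
    c = np.mean(x ** k) ** (1.0 / k)
    return k, c

def weibull_loglik(x, k, c):
    return np.sum(np.log(k) - k * np.log(c)
                  + (k - 1) * np.log(x) - (x / c) ** k)

def rayleigh_loglik(x):
    s2 = np.mean(x ** 2) / 2.0          # Rayleigh MLE of sigma^2
    return np.sum(np.log(x) - np.log(s2) - x ** 2 / (2.0 * s2))

# Synthetic wind speeds: Weibull with shape 1.8 and scale 8 m/s (assumed)
rng = np.random.default_rng(1)
v = 8.0 * rng.weibull(1.8, 5000)
k_hat, c_hat = weibull_mle(v)
weibull_better = weibull_loglik(v, k_hat, c_hat) > rayleigh_loglik(v)
```

Because the Rayleigh distribution is a Weibull with shape fixed at 2, the two-parameter Weibull fit can only do at least as well in likelihood, which mirrors the paper's finding that Weibull models the measured speeds better.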

  15. Making distributed ALICE analysis simple using the GRID plug-in

    International Nuclear Information System (INIS)

    Gheata, A; Gheata, M

    2012-01-01

    We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end-specific parameters from a single interface and to run the same custom user analysis unchanged in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is now used extensively in the ALICE collaboration for both end-user analysis and large-scale productions.

  16. Fluorescing macerals from wood precursors

    Energy Technology Data Exchange (ETDEWEB)

    Stout, S A; Bensley, D F

    1987-01-01

    A preliminary investigation into the origin of wood-derived macerals has established the existence of autofluorescent maceral precursors in the secondary xylem of swamp-inhabiting plant species. The optical character and fluorescent properties of microtomed thin sections of modern woods from the Florida Everglades and the Okefenokee Swamp, Georgia, are compared to the character and properties of their peatified equivalents from various Everglades and Okefenokee peat horizons and their lignitic equivalents from the Brandon lignite of Vermont and the Trail Ridge lignitic peat from northern Florida. The inherent fluorescence of woody cell walls is believed to be caused by lignin, though other cell wall components may contribute. The fluorescence spectra for several wood and cell types had a λmax of 452 nm and a Q value of 0.00. The color as observed in blue light, and the spectral geometry as measured in UV light, of peatified and lignitic woody cell walls (potential textinites) may change progressively during early coalification. Cell wall-derived maceral material is shown to maintain its fluorescing properties after being converted to a structureless material, perhaps a corpohuminite or humodetrinite precursor. Fluorescing xylem cell contents, such as condensed tannins or essential oils, can maintain their fluorescent character through early coalification. Xylem cell walls and xylem cell contents are shown to provide fluorescing progenitor materials which would not require subsequent infusion with 'lipid' materials to account for their fluorescence as phytoclast material or as macerals in coal. 35 references.

  17. Time–frequency analysis of nonstationary complex magneto-hydro-dynamics in fusion plasma signals using the Choi–Williams distribution

    International Nuclear Information System (INIS)

    Xu, L.Q.; Hu, L.Q.; Chen, K.Y.; Li, E.Z.

    2013-01-01

    Highlights: • The Choi–Williams distribution yields excellent time–frequency resolution for discrete signals. • The CWD method provides clear time–frequency pictures of fast MHD events in EAST and HT-7. • The CWD method has advantages over the wavelet transform scalogram and the short-time Fourier transform spectrogram. • We discuss how to choose the window and the free parameter of the CWD method. -- Abstract: The Choi–Williams distribution is applied to the time–frequency analysis of signals describing rapid magneto-hydro-dynamic (MHD) modes and events in tokamak plasmas. A comparison made on Soft X-ray (SXR) signals as well as a Mirnov signal shows the advantages of the Choi–Williams distribution over both the continuous wavelet transform scalogram and the short-time Fourier transform spectrogram. Examples of MHD activity in the HT-7 and EAST tokamaks are shown, namely the onset of coupled tearing modes, high-frequency precursors of sawteeth, and low-frequency MHD instabilities in the ELM-free phase of H-mode discharges (ELM: edge localized mode).
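A discrete Choi–Williams distribution can be sketched in the time-lag smoothing form: the instantaneous autocorrelation is smoothed in time by a Gaussian whose width grows with the lag (this is what suppresses cross-terms), then Fourier-transformed over lag. The grid sizes, kernel truncation at 3|m|, and σ below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def choi_williams(x, n_lags=64, sigma=1.0):
    """Discrete Choi-Williams distribution, time-lag smoothing form.

    Returns a (len(x), n_lags) array; row n is the spectrum at time n,
    with bin k corresponding to normalized frequency k / (2 * n_lags).
    """
    N = len(x)
    half = n_lags // 2
    n_idx = np.arange(N)
    R = np.zeros((N, n_lags), dtype=complex)   # smoothed autocorrelation
    for m in range(-half, half):
        if m == 0:
            offs = np.array([0])
            w = np.array([1.0])                # kernel degenerates to a delta
        else:
            offs = np.arange(-3 * abs(m), 3 * abs(m) + 1)
            w = np.exp(-sigma * offs ** 2 / (4.0 * m * m))
            w /= w.sum()                       # Gaussian widens with |m|
        acc = np.zeros(N, dtype=complex)
        for o, wt in zip(offs, w):
            a, b = n_idx + o + m, n_idx + o - m
            ok = (a >= 0) & (a < N) & (b >= 0) & (b < N)
            term = np.zeros(N, dtype=complex)
            term[ok] = x[a[ok]] * np.conj(x[b[ok]])
            acc += wt * term
        R[:, m % n_lags] = acc
    return np.fft.fft(R, axis=1)               # lag -> frequency

# Sanity check: a pure tone at normalized frequency 0.125 should peak at
# bin 0.125 * 2 * 64 = 16 in the middle of the record
x = np.exp(2j * np.pi * 0.125 * np.arange(512))
tfr = choi_williams(x)
peak_bin = int(np.argmax(np.abs(tfr[256])))
```

For MHD signals one would pass the band-passed Mirnov or SXR trace (made analytic first) as `x`; σ trades cross-term suppression against autoterm resolution, which is the free-parameter choice the paper discusses.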

  18. Simultaneous analysis of (13)C-glutathione as its dimeric form GSSG and its precursor [1-(13)C]glycine using liquid chromatography/isotope ratio mass spectrometry

    NARCIS (Netherlands)

    Schierbeek, Henk; Rook, Denise; te Braake, Frans W. J.; Dorst, Kristien Y.; Voortman, Gardi; Godin, Jean-Philippe; Fay, Laurent-Bernard; van Goudoever, Johannes B.

    2009-01-01

    Determination of glutathione kinetics using stable isotopes requires accurate measurement of the tracers and tracees. Previously, the precursor and synthesized product were measured with two separate techniques, liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) and gas

  19. Hydrogen distribution analysis for CANDU 6 containment using the GOTHIC containment analysis code

    International Nuclear Information System (INIS)

    Nguyen, T.H.; Collins, W.M.

    1995-01-01

    Hydrogen may be generated in the reactor core by the zircaloy-steam reaction in a postulated loss of coolant accident (LOCA) scenario with loss of emergency core cooling (ECC). It is important to predict the hydrogen distribution within containment in order to determine whether flammable mixtures exist. This information is required to determine the best locations in containment for the placement of mitigation devices such as igniters and recombiners. For large break loss of coolant accidents, hydrogen is released after the break flow has subsided. Following this period of high discharge, the flow in the containment building undergoes a transition from forced flow to buoyancy-driven flow (particularly when local air coolers (LACS) are not credited). One-dimensional (lumped parameter) computer codes are applicable during the initial period, when a high degree of mixing occurs due to the forced flow generated by the break. However, during the post-blowdown phase the assumption of homogeneity becomes less accurate, and it is necessary to employ three-dimensional codes to capture local effects. This is particularly important for purely buoyant flows, which may exhibit stratification effects. In the present analysis a three-dimensional model of CANDU 6 containment was constructed with the GOTHIC computer code, using a relatively coarse mesh adequate to capture the salient features of the flow during the blowdown and hydrogen release periods. A 3D grid representation was employed for the portion of containment in which the primary flow (LOCA and post-LOCA) was deemed to occur; the remainder of containment was represented by lumped nodes. The results of the analysis indicate that flammable concentrations exist for several minutes in the vicinity of the break and in the steam generator enclosure. This is because the hydrogen released from the break is primarily directed upwards into the steam generator enclosure by buoyancy effects. Once hydrogen production ends

  20. Analysis, distribution, and dietary exposure of glyoxal and methylglyoxal in cookies and their relationship with other heat-induced contaminants.

    Science.gov (United States)

    Arribas-Lorenzo, Gema; Morales, Francisco J

    2010-03-10

    Thermal processing of food leads to the formation of dicarbonyls such as glyoxal (GO) and methylglyoxal (MGO), which are potentially harmful because they are precursors of advanced glycation end products (AGEs). GO and MGO formation was examined during the baking of cookies, as cookies are a widely distributed food commodity in Western diets. GO and MGO were chromatographically analyzed after an improved derivatization with orthophenylenediamine to produce stable quinoxaline derivatives. Sample extraction, cleanup, and chromatographic conditions were evaluated to provide an in-house validated procedure for GO and MGO analysis in cookies. Quantification limits were set at 1.5 and 2 mg/kg for GO and MGO, respectively, with an average recovery of 103% and a calculated precision lower than 7%. Studies were carried out on laboratory-scale cookies baked under controlled conditions as well as on commercial samples. GO and MGO values in commercial cookies ranged from 4.8 to 26.0 mg/kg and from 3.7 to 81.4 mg/kg, respectively. Commercial cookies made with ammonium bicarbonate and fructose showed the highest levels of MGO. Dicarbonyls were rapidly formed on the upper side of the cookie regardless of the shape or thickness of the samples, confirming a surface effect. Under controlled baking conditions, the formation of GO and MGO was linearly correlated with baking time. MGO formation was related to acrylamide, a heat-processing contaminant, in commercial cookies, but this relationship was not observed for 5-hydroxymethylfurfural. Dietary exposure of the Spanish population to GO and MGO from cookies was estimated at 213 and 216 μg/person/day, respectively.

  1. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation, is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution than the Rosin-Rammler distribution, which is generally fitted only to the lump coke. A statistical analysis of a large number of experiments in a pilot-scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low-rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.

  2. Hydrokinetic simulations of nanoscopic precursor films in rough channels

    International Nuclear Information System (INIS)

    Chibbaro, S; Biferale, L; Binder, K; Milchev, A; Dimitrov, D; Diotallevi, F; Succi, S

    2009-01-01

    We report on simulations of capillary filling of highly wetting fluids in nanochannels with and without obstacles. We use atomistic (molecular dynamics) and hydrokinetic (lattice Boltzmann; LB) approaches, which indicate clear evidence of the formation of thin precursor films moving ahead of the main capillary front. The dynamics of the precursor films is found to obey a square-root law like that obeyed by the main capillary front, z²(t) ∝ t, although with a larger prefactor, which we find to take the same value for the different geometries (2D–3D) under inspection. The two methods show a quantitative agreement which indicates that the formation and propagation of thin precursors can be handled at a mesoscopic/hydrokinetic level. This can be considered a validation of the LB method and opens the possibility of using hydrokinetic methods to explore space–time scales and complex geometries of direct experimental relevance. The LB approach is then used to study the fluid behaviour in a nanochannel when the precursor film encounters a square obstacle. A complete parametric analysis is performed, which suggests that thin-film precursors may have an important influence on the efficiency of nanochannel-coating strategies.

  3. Fluid Mechanics of Lean Blowout Precursors in Gas Turbine Combustors

    Directory of Open Access Journals (Sweden)

    T. M. Muruganandam

    2012-03-01

    Full Text Available Understanding of the lean blowout (LBO) phenomenon, along with sensing and control strategies, could enable gas turbine combustor designers to design combustors with wider operability regimes. Sensing of precursor events (temporary extinction-reignition events) based on chemiluminescence emissions from the combustor, assessing the proximity to LBO, and using those data for control of LBO has already been achieved. This work describes the fluid mechanic details of the precursor dynamics and the blowout process based on detailed analysis of near-blowout flame behavior, using simultaneous chemiluminescence and droplet scatter observations. The droplet scatter method reveals the regions of cold reactants and thus helps track unburnt mixtures. During a precursor event, it was observed that the flow pattern changes significantly, with a large region of unburnt mixture in the combustor, which subsequently vanishes when a double/single helical vortex structure brings the hot products back to the inlet of the combustor. This helical pattern is shown to be characteristic of the next stable flame mode in the longer combustor, stabilized by a double helical vortex breakdown (VBD) mode. It is proposed that random heat release fluctuations near blowout cause the VBD-based stabilization to shift VBD modes, causing the observed precursor dynamics in the combustor. A complete description of the evolution of the flame near the blowout limit is presented. The description is consistent with all the earlier observations by the authors of precursor and blowout events.

  4. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Lucchese, R.R.; Montuoro, R.; Grum-Grzhimailo, A.N.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    The analysis of the molecular-frame photoelectron angular distributions (MFPADs) is discussed within the dipole approximation. The general expressions are reviewed and strategies for extracting the maximum amount of information from different types of experimental measurements are considered. The analysis of the N 1s photoionization of NO is given to illustrate the method

  5. Decomposition and Projection Methods for Distributed Robustness Analysis of Interconnected Uncertain Systems

    DEFF Research Database (Denmark)

    Pakazad, Sina Khoshfetrat; Hansson, Anders; Andersen, Martin Skovgaard

    2013-01-01

    We consider a class of convex feasibility problems where the constraints that describe the feasible set are loosely coupled. These problems arise in robust stability analysis of large, weakly interconnected uncertain systems. To facilitate distributed implementation of robust stability analysis o...

  6. Analysis of the influences of grid-connected PV power system on distribution grids

    Directory of Open Access Journals (Sweden)

    Dumitru Popandron

    2013-12-01

    Full Text Available This paper presents the analysis of producing 2.8 MW of electric power using a solar photovoltaic plant. The PV plant will be connected to the distribution network. The study focuses on the influences of connecting a photovoltaic system to the grid, using modern software for analysis, modeling and simulation of power systems.

  7. Modelling earth current precursors in earthquake prediction

    Directory of Open Access Journals (Sweden)

    R. Di Maio

    1997-06-01

    Full Text Available This paper deals with the theory of earth current precursors of earthquakes. A dilatancy-diffusion-polarization model is proposed to explain the anomalies of the electric potential which are observed on the ground surface prior to some earthquakes. The electric polarization is believed to be an electrokinetic effect due to the invasion of fluids into new pores opened inside a stressed, dilated rock body. The time and space variation of the distribution of the electric potential in a layered earth, as well as in a faulted half-space, is studied in detail. The surface response is found to depend on the underground conductivity distribution and on the relative disposition of the measuring dipole with respect to the buried bipole source. A field procedure based on an areal layout of the recording sites is proposed, in order to obtain the most complete information on the time and space evolution of the precursory phenomena in any given seismic region.

  8. Development of precursors recognition methods in vector signals

    Science.gov (United States)

    Kapralov, V. G.; Elagin, V. V.; Kaveeva, E. G.; Stankevich, L. A.; Dremin, M. M.; Krylov, S. V.; Borovov, A. E.; Harfush, H. A.; Sedov, K. S.

    2017-10-01

    Precursor recognition methods for vector signals of plasma diagnostics are presented, together with their requirements and possible options for their development. In particular, variants of using symbolic regression for building a plasma disruption prediction system are discussed, as is the initial data preparation using correlation analysis and symbolic regression. Special attention is paid to the possibility of using the algorithms in real time.

  9. Combining Static Analysis and Runtime Checking in Security Aspects for Distributed Tuple Spaces

    DEFF Research Database (Denmark)

    Yang, Fan; Aotani, Tomoyuki; Masuhara, Hidehiko

    2011-01-01

    Enforcing security policies in distributed systems is difficult, in particular in a system containing untrusted components. We designed AspectKE*, an aspect-oriented programming language based on distributed tuple spaces, to tackle this issue. One of the key features of AspectKE* is the program analysis predicates and functions that provide information on the future behavior of a program. With a dual value evaluation mechanism that handles results of static analysis and runtime values at the same time, these functions and predicates enable users to specify security policies in a uniform manner. Our two-staged implementation strategy gathers fundamental static analysis information at load time, so as to avoid performing all analysis at runtime. We built a compiler for AspectKE*, and successfully implemented security aspects for a distributed chat system and an electronic healthcare record...

  10. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    Science.gov (United States)

    Clough, Emily; Bell, Derek

    2016-02-01

    This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education, and we compare before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Community Survey and combine those data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those further away, before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. However, our analysis is consistent with the claim that there is benefit sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells

  11. Spherical Harmonic Analysis of Particle Velocity Distribution Function: Comparison of Moments and Anisotropies using Cluster Data

    Science.gov (United States)

    Gurgiolo, Chris; Vinas, Adolfo F.

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
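The coefficient extraction step can be illustrated with a pure-numpy sketch: real spherical harmonics up to l = 1 fitted by least squares to a synthetic angular distribution with a dipole anisotropy. The paper works with nine spectral coefficients and measured Cluster/PEACE distributions; everything below (basis truncation, grid, test function) is an illustrative assumption.

```python
import numpy as np

def real_sph_basis(theta, phi):
    """Real spherical harmonics up to l = 1, evaluated on flat arrays."""
    c1 = np.sqrt(3.0 / (4.0 * np.pi))
    Y00 = 0.5 * np.sqrt(1.0 / np.pi) * np.ones_like(theta)
    Y1m1 = c1 * np.sin(theta) * np.sin(phi)
    Y10 = c1 * np.cos(theta)
    Y1p1 = c1 * np.sin(theta) * np.cos(phi)
    return np.stack([Y00, Y1m1, Y10, Y1p1], axis=-1)

# Synthetic "distribution function" over the sphere: isotropic part plus a
# dipole (field-aligned) anisotropy along theta = 0
th, ph = np.meshgrid(np.linspace(0.05, np.pi - 0.05, 30),
                     np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False))
f = 1.0 + 0.3 * np.cos(th)

# Least-squares projection onto the basis recovers the spectral coefficients;
# coef[0] is the isotropic term, coef[2] the cos(theta) dipole term
A = real_sph_basis(th.ravel(), ph.ravel())
coef, *_ = np.linalg.lstsq(A, f.ravel(), rcond=None)
```

Because the test function lies exactly in the span of the basis, the fit recovers the coefficients to machine precision; on real data the truncated coefficient set acts as the compressed representation from which moments and anisotropies are rebuilt.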

  12. Precursors to suicidality and violence on antidepressants

    DEFF Research Database (Denmark)

    Bielefeldt, Andreas Ø; Danborg, Pia B; Gøtzsche, Peter C

    2016-01-01

    OBJECTIVE: To quantify the risk of suicidality and violence when selective serotonin and serotonin-norepinephrine reuptake inhibitors are given to adult healthy volunteers with no signs of a mental disorder. DESIGN: Systematic review and meta-analysis. MAIN OUTCOME MEASURE: Harms related to suicidality, hostility, activation events, psychotic events and mood disturbances. SETTING: Published trials identified by searching PubMed and Embase, and clinical study reports obtained from the European and UK drug regulators. PARTICIPANTS: Double-blind, placebo-controlled trials in adult healthy volunteers that reported on suicidality or violence or precursor events to suicidality or violence. RESULTS: A total of 5787 publications were screened and 130 trials fulfilled our inclusion criteria. The trials were generally uninformative; 97 trials did not report the randomisation method, 75 trials did not report any...

  13. Synthesis and Mechanism of Tetracalcium Phosphate from Nanocrystalline Precursor

    Directory of Open Access Journals (Sweden)

    Jianguo Liao

    2014-01-01

    Full Text Available Tetracalcium phosphate (TTCP, Ca4(PO4)2O) was prepared by calcination of a coprecipitated mixture of nanoscale hydroxyapatite (HA, Ca10(PO4)6(OH)2) and calcium carbonate crystals (CaCO3), followed by cooling in air or in the furnace. The effect of calcination temperature on the crystal structure and phase composition of the coprecipitated mixture was characterized by transmission electron microscopy (TEM), differential thermal analysis-thermogravimetry (DTA-TG), X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FT-IR), and Raman spectroscopy (RS). The results indicated that the synthesized mixture consisted of nanoscale HA and CaCO3 distributed uniformly throughout the composite. TTCP was observed in the air-quenched samples when the calcination temperature was above 1185°C. With increasing calcination temperature, the amount of intermediate products in the air-quenched samples decreased, and they could no longer be detected when the calcination temperature reached 1450°C. Unexpectedly, a mixture of HA and calcium oxide was observed in the furnace-cooled samples. Clearly, the calcination temperature and cooling method are critical for the synthesis of high-purity TTCP. The results also indicate that the nanosize of the precursors lowers the required calcination temperature, so TTCP can be synthesized at a lower temperature.

  14. Precursor conditions related to Zimbabwe's summer droughts

    Science.gov (United States)

    Nangombe, Shingirai; Madyiwa, Simon; Wang, Jianhong

    2018-01-01

    Despite the increasing severity of droughts and their effects on Zimbabwe's agriculture, there are few tools available for predicting these droughts in advance. Consequently, communities and farmers are more exposed, and policy makers are ill prepared for them. This study sought to investigate possible cycles and precursor meteorological conditions prior to drought seasons that could be used to predict impending droughts in Zimbabwe. The Single Z-Index was used to identify and grade drought years between 1951 and 2010 according to rainfall severity. Spectral analysis was used to reveal drought cycles for possible use in drought prediction. Composite analysis was used to investigate circulation and temperature anomalies associated with severe and extreme drought years. Results indicate that severe droughts are more highly correlated with circulation patterns and embedded weather systems in the Indian Ocean and equatorial Pacific Ocean than in any other area. This study identified sea surface temperature averaged over June to August, geopotential height and wind vector in the July to September period, and air temperature in the September to November period as precursors that can be used to predict a drought occurrence several months in advance. Therefore, in addition to sea surface temperature, which was identified through previous research for predicting Zimbabwean droughts, the other parameters identified in this study can aid in drought prediction. Drought cycles were established at 20-, 12.5-, 3.2-, and 2.7-year periods. The 12.5-, 3.2-, and 2.7-year spectral peaks have timescales similar to the luni-solar tide, the El Niño-Southern Oscillation, and the Quasi-Biennial Oscillation, respectively; hence, the occurrence of these phenomena may indicate when the next drought is likely.

  15. Elements of the tsunami precursors' detection physics

    Science.gov (United States)

    Novik, Oleg; Ruzhin, Yuri; Ershov, Sergey; Volgin, Max; Smirnov, Fedor

    ionosphere from the buoy, balloon and satellite complexes. The balloon and buoy complexes will transmit data to a shore station over a satellite link. The frequency ranges and sensitivity thresholds of all of the sensors of the LOAMS will be adapted to the characteristics of the expected seismic signals according to the numerical research above. Computational methods and statistical analysis (e.g., seismically induced changes in the coherence of spatially distributed sensors of different types) of the recorded multidimensional time series will be used for prognostic interpretation. The multilevel recordings will provide stable detection of seismic events against noise (e.g., ionospheric Pc pulsations, heavy seas, industry). The intensive heat flow typical of tectonically active lithosphere zones may be considered an energy source for advanced modifications of the LOAMS. The latter may be used as a warning system for continental and marine technologies, e.g., seabed geothermal energy production. Indeed, the seismic destruction of the Fukushima I nuclear power station demonstrates that such technology is hardly able to solve the energy problems of seismically active regions. On the other hand, the LOAMS may be considered a scientific observatory for developing the physics of seaquake/tsunami precursors, i.e., seismo-hydro-electromagnetics.

  16. An approach to prospective consequential life cycle assessment and net energy analysis of distributed electricity generation

    International Nuclear Information System (INIS)

    Jones, Christopher; Gilbert, Paul; Raugei, Marco; Mander, Sarah; Leccisi, Enrica

    2017-01-01

    Increasing distributed renewable electricity generation is one of a number of technology pathways available to policy makers to meet environmental and other sustainability goals. Determining the efficacy of such a pathway for a national electricity system implies evaluating whole system change in future scenarios. Life cycle assessment (LCA) and net energy analysis (NEA) are two methodologies suitable for prospective and consequential analysis of energy performance and associated impacts. This paper discusses the benefits and limitations of prospective and consequential LCA and NEA analysis of distributed generation. It concludes that a combined LCA and NEA approach is a valuable tool for decision makers if a number of recommendations are addressed. Static and dynamic temporal allocation are both needed for a fair comparison of distributed renewables with thermal power stations to account for their different impact profiles over time. The trade-offs between comprehensiveness and uncertainty in consequential analysis should be acknowledged, with system boundary expansion and system simulation models limited to those clearly justified by the research goal. The results of this approach are explorative, rather than for accounting purposes; this interpretive remit, and the assumptions in scenarios and system models on which results are contingent, must be clear to end users. - Highlights: • A common LCA and NEA framework for prospective, consequential analysis is discussed. • Approach to combined LCA and NEA of distributed generation scenarios is proposed. • Static and dynamic temporal allocation needed to assess distributed generation uptake.

  17. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure utilizes a metric distance between cumulative distribution functions (cdfs). The measure is evaluated for two cases: one in which the cdf is given by a known analytical distribution, and one in which it is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases, and the results are compared with those of three existing methods. The present approach is a useful cdf-based measure of uncertainty importance; it is simple, and uncertainty importance can be calculated without any complex process. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance.
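    A minimal sketch of a cdf-distance importance measure, assuming a sup-norm (Kolmogorov-type) distance between empirical cdfs and a toy two-input model; this illustrates the general idea, not the paper's exact formulation:

    ```python
    import numpy as np

    # Rank input importance by how far the output CDF moves when one input is
    # pinned to its median, measured with an empirical sup-norm distance.
    rng = np.random.default_rng(0)

    def model(x1, x2):
        return x1 + 0.1 * x2          # toy model: x1 dominates the output uncertainty

    def sup_cdf_distance(a, b):
        grid = np.sort(np.concatenate([a, b]))
        Fa = np.searchsorted(np.sort(a), grid, side="right") / a.size
        Fb = np.searchsorted(np.sort(b), grid, side="right") / b.size
        return np.max(np.abs(Fa - Fb))

    N = 20000
    x1, x2 = rng.normal(0, 1, N), rng.normal(0, 1, N)
    base = model(x1, x2)
    d1 = sup_cdf_distance(base, model(np.zeros(N), x2))  # fix x1 at its median
    d2 = sup_cdf_distance(base, model(x1, np.zeros(N)))  # fix x2 at its median
    print(d1, d2)
    ```

    Pinning the dominant input x1 moves the output cdf far more (d1 >> d2), so x1 is flagged as the more important source of uncertainty.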

  18. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E. [Alliance for Residential Building Innovation, Davis, CA (United States); Hoeschele, E. [Alliance for Residential Building Innovation, Davis, CA (United States)

    2014-09-01

    A growing body of work collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system model, developed in the Transient System Simulation Tool (TRNSYS), has been validated using field monitoring data and then exercised in a number of climates to understand the impact of climate on performance. In this study, the Building America team built upon previous modeling work to evaluate different distribution systems and the sensitivity of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful for informing future water heating best practices guides, as well as more accurate (and computationally efficient) distribution models for annual whole-house simulation programs.

  19. Improving precursor adsorption characteristics in ATR-FTIR spectroscopy with a ZrO{sub 2} nanoparticle coating

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaeseo [Korea Research Institute of Standards and Science, Center for Vacuum Technology (Korea, Republic of); Mun, Jihun [University of Science and Technology, Department of Advanced Device Technology (Korea, Republic of); Shin, Jae-Soo; Kim, Jongho; Park, Hee Jung [Daejeon University, Department of Advanced Materials Engineering (Korea, Republic of); Kang, Sang-Woo, E-mail: swkang@kriss.re.kr [Korea Research Institute of Standards and Science, Center for Vacuum Technology (Korea, Republic of)

    2017-02-15

    Nanoparticles were applied to a crystal surface to increase its precursor adsorption efficiency in an attenuated total reflection Fourier transform infrared (ATR-FTIR) spectrometer. Nanoparticles with varying dispersion stabilities were employed and the resulting precursor adsorption characteristics were assessed. The size of the nanoparticles was <100 nm (TEM). In order to vary the dispersion stability, ZrO{sub 2} nanoparticles were dispersed in aqueous solutions of different pH. The ZrO{sub 2} dispersion solutions were analyzed using scanning electron microscopy (SEM) while particle distribution measurements were analyzed using electrophoretic light scattering (ELS) and dynamic light scattering (DLS) techniques. ZrO{sub 2} nanoparticles dispersed in solutions of pH 3 and 11 exhibited the most stable zeta potentials (≥+30 or ≤−30 mV); these observations were confirmed by SEM analysis and particle distribution measurements. Hexamethyldisilazane (HMDS) was used as a precursor for ATR-FTIR spectroscopy. Consequently, when ZrO{sub 2} nanoparticle solutions with the best dispersion stabilities (pH 3 and 11) were applied to the adsorption crystal surface, the measurement efficiency of ATR-FTIR spectroscopy improved by ∼200 and 300%, respectively.

  20. Nonlinear analysis of field distribution in electric motor with periodicity conditions

    Energy Technology Data Exchange (ETDEWEB)

    Stabrowski, M M; Sikora, J

    1981-01-01

    Numerical analysis of electromagnetic field distribution in linear motion tubular electric motor has been performed with the aid of finite element method. Two Fortran programmes for the solution of DBBF and BF large linear symmetric equation systems have been developed for purposes of this analysis. A new iterative algorithm, taking into account iron nonlinearity and periodicity conditions, has been introduced. Final results of the analysis in the form of induction diagrammes and motor driving force are directly useful for motor designers.

  1. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Grum-Grzhimailo, A.N.; Lucchese, R.R.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    A projection method is developed for extracting the nondipole contribution from the molecular frame photoelectron angular distributions of linear molecules. A corresponding convenient parametric form for the angular distributions is derived. The analysis was performed for the N 1s photoionization of the NO molecule a few eV above the ionization threshold. No detectable nondipole contribution was found for the photon energy of 412 eV

  2. Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems

    OpenAIRE

    Herrera, Manuel; Meniconi, Silvia; Alvisi, Stefano; Izquierdo, Joaquin

    2018-01-01

    This document is intended to be a presentation of the Special Issue “Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems”. The final aim of this Special Issue is to propose a suitable framework supporting insightful hydraulic mechanisms to aid the decision-making processes of water utility managers and practitioners. Its 18 peer-reviewed articles present as varied topics as: water distribution system design, optimization of network perf...

  3. The unequal distribution of unequal pay - An empirical analysis of the gender wage gap in Switzerland

    OpenAIRE

    Dorothe Bonjour; Michael Gerfin

    2001-01-01

    In this paper we analyze the distribution of the gender wage gap. Using microdata for Switzerland we estimate conditional wage distribution functions and find that the total wage gap and its discrimination component are not constant over the range of wages. At low wages an overproportional part of the wage gap is due to discrimination. In a further analysis of specific individuals we examine the wage gap at different quantiles and propose a new measure to assess equal earnings opportunities. ...

  4. Vibration analysis of continuous maglev guideways with a moving distributed load model

    International Nuclear Information System (INIS)

    Teng, N G; Qiao, B P

    2008-01-01

    A moving distributed load model with constant speed is established for vertical vibration analysis of a continuous guideway in a maglev transportation system. The guideway is treated as a continuous structural system, and the action of maglev vehicles on the guideway is modeled as a moving distributed load. Vibration of the continuous guideways used in the Shanghai maglev line is analyzed with this model. The factors that affect the vibration of the guideways, such as speed, guideway span, frequency, and damping, are discussed

  5. The Analysis of Tree Species Distribution Information Extraction and Landscape Pattern Based on Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Yi Zeng

    2017-08-01

    Full Text Available The forest ecosystem is the largest terrestrial vegetation type and plays an irreplaceable role of unique value. At the landscape scale, research on forest landscape pattern has become a current focus, and within it the study of forest canopy structure is particularly important: canopy structure determines the process and strength of forest energy flow, which in turn influences how the ecosystem adjusts to climate and affects species diversity. Extracting the factors that influence canopy structure and analyzing the vegetation distribution pattern are therefore essential. To address these problems, remote sensing technology, which is superior to other means because of its timeliness and large-scale monitoring capability, is applied in this study. Taking Lingkong Mountain as the study area, the paper uses remote sensing imagery to analyze the forest distribution pattern and obtain the spatial characteristics of the canopy structure distribution, with DEM data as the basis for extracting the factors that influence canopy structure. The distribution pattern of trees is further analyzed using terrain parameters, spatial analysis tools, and quantitative simulation of surface processes. The Hydrological Analysis tool is used to build a distributed hydrological model, and the corresponding algorithms are applied to determine surface water flow paths, the river network, and basin boundaries. Results show that the distribution of dominant tree species presents a patchy pattern at the landscape scale, with spatial heterogeneity closely related to terrain factors. Overlay analysis of aspect, slope, and the forest distribution pattern identifies the areas most suitable for stand growth and the better growing conditions.

  6. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...

  7. Determinants of the distribution and concentration of biogas production in Germany. A spatial econometric analysis

    International Nuclear Information System (INIS)

    Scholz, Lukas

    2015-01-01

    The biogas production in Germany is characterized by a heterogeneous distribution and the formation of regional centers. In the present study the determinants of the spatial distribution and concentration are analyzed with methods of spatial statistics and spatial econometrics. In addition to the consideration of ''classic'' site factors of agricultural production, the analysis here focuses on the possible relevance of agglomeration effects. The results of the work contribute to a better understanding of the regional distribution and concentration of the biogas production in Germany. [de]

  8. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Science.gov (United States)

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
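    For intuition, the continuous maximum-likelihood estimator that underlies this kind of fitting (the Clauset-Shalizi-Newman estimator, which packages like powerlaw build on) can be sketched in plain NumPy, so the example does not depend on the package being installed:

    ```python
    import numpy as np

    # Draw a synthetic power-law sample by inverse-CDF sampling, then recover
    # the exponent with the continuous MLE: alpha = 1 + n / sum(ln(x / xmin)).
    rng = np.random.default_rng(1)

    alpha_true, xmin = 2.5, 1.0
    u = rng.uniform(size=50000)
    x = xmin * (1 - u) ** (-1 / (alpha_true - 1))       # inverse-CDF sampling

    tail = x[x >= xmin]
    alpha_hat = 1 + tail.size / np.log(tail / xmin).sum()  # MLE for the exponent
    se = (alpha_hat - 1) / np.sqrt(tail.size)              # asymptotic std. error
    print(alpha_hat, se)
    ```

    The powerlaw package adds the pieces this sketch omits: automatic xmin selection, goodness-of-fit, and likelihood-ratio comparisons against alternative distributions.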

  9. Power distribution, the environment, and public health. A state-level analysis

    International Nuclear Information System (INIS)

    Boyce, James K.; Klemer, Andrew R.; Templet, Paul H.; Willis, Cleve E.

    1999-01-01

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.

  10. Power distribution, the environment, and public health. A state-level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, James K. [Department of Economics, University of Massachusetts, Amherst, MA 01003 (United States); Klemer, Andrew R. [Department of Biology, University of Minnesota, Duluth, MN (United States); Templet, Paul H. [Institute of Environmental Studies, Louisiana State University, Baton Rouge, LA (United States); Willis, Cleve E. [Department of Resource Economics, University of Massachusetts, Amherst, MA 01003 (United States)

    1999-04-15

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.

  12. Geographic distribution of suicide and railway suicide in Belgium, 2008-2013: a principal component analysis.

    Science.gov (United States)

    Strale, Mathieu; Krysinska, Karolina; Overmeiren, Gaëtan Van; Andriessen, Karl

    2017-06-01

    This study investigated the geographic distribution of suicide and railway suicide in Belgium over 2008-2013 at the local (i.e., district or arrondissement) level. There were differences in the regional distribution of suicides and railway suicides in Belgium over the study period. Principal component analysis identified three groups of correlations among population variables and socio-economic indicators, such as population density, unemployment, and age group distribution, on two components that helped explain the variance of railway suicide at the local (arrondissement) level. This information is of particular importance for preventing suicides in high-risk areas of the Belgian railway network.
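    A PCA of standardized area-level indicators reduces to an SVD; the sketch below uses synthetic district data driven by a single hypothetical "urbanness" factor, not the Belgian dataset:

    ```python
    import numpy as np

    # PCA via SVD on standardized indicators (e.g. population density,
    # unemployment, age-group share) for a set of districts.
    rng = np.random.default_rng(2)

    n = 43                                   # e.g. number of arrondissements
    latent = rng.normal(size=(n, 1))         # one underlying common factor
    X = np.hstack([latent + 0.1 * rng.normal(size=(n, 1)) for _ in range(3)])

    Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each indicator
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    explained = s**2 / (s**2).sum()           # variance share per component
    scores = Z @ Vt.T                         # component scores per district
    print(explained)
    ```

    Because the three indicators share one latent driver, the first component captures most of the variance; in an application the scores per district would then be correlated with the outcome (here, railway suicide rates).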

  13. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Directory of Open Access Journals (Sweden)

    Jeff Alstott

    Full Text Available Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.

  14. Development of neural network for analysis of local power distributions in BWR fuel bundles

    International Nuclear Information System (INIS)

    Tanabe, Akira; Yamamoto, Toru; Shinfuku, Kimihiro; Nakamae, Takuji.

    1993-01-01

    A neural network model has been developed to learn the local power distributions in a BWR fuel bundle. A two-layer neural network with 128 elements in total is used for this model. The neural network learns 33 cases of local power peaking factors of fuel rods for given enrichment distributions as the teacher signals; these were calculated by a fuel bundle nuclear analysis code based on precise physical models. The neural network reproduced the teacher signals within 1% error. It is also able to calculate the local power distributions within several percent error for enrichment distributions different from the teacher signals when the average enrichment is close to 2%. This neural network is simple, and its computing speed is 300 times faster than that of the precise nuclear analysis code. The model was applied to survey enrichment distributions that meet a target local power distribution in a fuel bundle, and an enrichment distribution with a flat power shape was obtained within a short computing time. (author)
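    As a toy version of the idea (not the actual bundle code or data), a small two-layer network with a tanh hidden layer can learn a smooth vector-to-scalar mapping by plain gradient descent; the 33 "cases" and the synthetic target below are hypothetical stand-ins for enrichment distributions and peaking factors:

    ```python
    import numpy as np

    # 33 cases of 8 enrichment-like inputs mapped to a smooth synthetic scalar.
    rng = np.random.default_rng(3)
    X = rng.uniform(1.5, 2.5, size=(33, 8))
    Y = 0.5 * (np.sin(X).sum(axis=1, keepdims=True) - 6.0)
    Xc = X - X.mean(axis=0)                  # center inputs for stable training

    W1 = rng.normal(0.0, 0.3, (8, 16)); b1 = np.zeros(16)
    W2 = rng.normal(0.0, 0.3, (16, 1)); b2 = np.zeros(1)

    def forward(Z):
        H = np.tanh(Z @ W1 + b1)
        return H, H @ W2 + b2

    mse0 = float(((forward(Xc)[1] - Y) ** 2).mean())   # loss before training

    lr = 0.02
    for _ in range(4000):                    # full-batch gradient descent
        H, P = forward(Xc)
        G = 2.0 * (P - Y) / len(Xc)          # d(MSE)/dP
        GH = (G @ W2.T) * (1.0 - H**2)       # back-propagate through tanh
        W2 -= lr * H.T @ G;  b2 -= lr * G.sum(axis=0)
        W1 -= lr * Xc.T @ GH; b1 -= lr * GH.sum(axis=0)

    mse = float(((forward(Xc)[1] - Y) ** 2).mean())
    print(mse0, mse)
    ```

    After training, the fit error is well below the variance of the target, i.e. the network has learned more than the mean; the bundle application works the same way but with rod-wise outputs.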

  15. Fluorescence Imaging Analysis of Upstream Regulators and Downstream Targets of STAT3 in Melanoma Precursor Lesions Obtained from Patients Before and After Systemic Low-Dose Interferon-α Treatment

    Directory of Open Access Journals (Sweden)

    Amanda Pfaff Smith

    2003-01-01

    Full Text Available Atypical nevi are the precursors and risk markers of melanoma. Apart from persistently monitoring these nevocytic lesions and resecting them at the earliest signs of clinical changes, there is as yet no systemic clinical treatment available to interfere with their progression to melanoma. To explore clinical treatments that might interfere with and possibly prevent atypical nevus progression, a previous study documented that 3 months systemic low-dose interferon-α (IFN-α treatment of patients with a clinical history of melanoma and numerous atypical nevi, led to inactivation of the STAT1 and STAT3 transcription factors in atypical nevi. Based upon this finding, we initiated a second study to determine whether systemic low-dose IFN-α treatment also impairs the expression of upstream regulators and downstream targets of STAT1 and STAT3 in atypical nevi. Using cyanine dye-conjugated antibodies, fluorescence imaging analysis revealed expression of JAK2, JNK1, AKT1, NF-κB, and IFN-αβ receptor in benign and atypical nevi, and early- and advanced-stage melanomas. To determine possible changes in the level of expression of these molecules in atypical nevi, excised before and after 3 months of systemic low-dose IFN-α treatment, newly designed optical imaging software was used to quantitate the captured fluorescent hybridization signals on a cell-by-cell basis and across an entire nevus section. The results of this analysis did not provide evidence that systemic low-dose IFN-α treatment alters the level of expression of upstream regulators or downstream targets of STAT1 and STAT3.

  16. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the
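    The Box-Cox step can be sketched by maximizing the profile log-likelihood over the power parameter λ in pure NumPy. A synthetic right-skewed sample stands in for survey MUAC data; the formal tests used in the paper (e.g. Shapiro-Wilk, D'Agostino) live in scipy.stats and are omitted here:

    ```python
    import numpy as np

    # Hypothetical skewed sample standing in for MUAC measurements (arbitrary units).
    rng = np.random.default_rng(4)
    x = rng.lognormal(mean=2.6, sigma=0.25, size=900)

    def boxcox(x, lam):
        # Box-Cox power transform; the lam -> 0 limit is log(x).
        return np.log(x) if abs(lam) < 1e-12 else (x**lam - 1.0) / lam

    def profile_loglik(x, lam):
        # Normal profile log-likelihood of the transformed data plus the Jacobian term.
        y = boxcox(x, lam)
        return -0.5 * x.size * np.log(y.var()) + (lam - 1.0) * np.log(x).sum()

    lams = np.linspace(-2.0, 2.0, 401)
    lam_hat = lams[np.argmax([profile_loglik(x, l) for l in lams])]

    def skewness(v):
        z = (v - v.mean()) / v.std()
        return float((z**3).mean())

    print(lam_hat, skewness(x), skewness(boxcox(x, lam_hat)))
    ```

    For log-normal data the fitted λ lands near 0 (a log transform), and the transformed sample is markedly less skewed than the raw one, which is the "normalising" effect the analysis relies on.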

  17. Estimation of monthly solar radiation distribution for solar energy system analysis

    International Nuclear Information System (INIS)

    Coskun, C.; Oktay, Z.; Dincer, I.

    2011-01-01

    The concept of probability density frequency, which is successfully used for analyses of wind speed and outdoor temperature distributions, is now modified and proposed for estimating solar radiation distributions for the design and analysis of solar energy systems. In this study, the global solar radiation distribution is comprehensively analyzed for photovoltaic (PV) panel and thermal collector systems. In this regard, a case study is conducted with actual global solar irradiation data of the last 15 years recorded by the Turkish State Meteorological Service. It is found that the intensity of global solar irradiance greatly affects energy and exergy efficiencies and hence the performance of collectors. -- Research highlights: → The first study to apply the global solar radiation distribution to solar energy system analyses. → The first study presenting the global solar radiation distribution as a function of solar irradiance intensity. → The time probability frequency and the probability power distribution do not have similar patterns in each month. → There is no relation between the distribution of annual time lapse and solar energy on the one hand and the intensity of solar irradiance on the other.
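    The distinction between the time-frequency and power distributions can be illustrated with synthetic hourly irradiance values (hypothetical numbers, not the Turkish measurement record): the share of hours spent in each intensity band differs from the share of energy delivered in it.

    ```python
    import numpy as np

    # Synthetic daylight-hour irradiance in W/m^2, binned into 100 W/m^2 bands.
    rng = np.random.default_rng(5)
    G = np.clip(rng.gamma(shape=2.0, scale=180.0, size=5000), 0, 1000)

    bins = np.arange(0, 1100, 100)
    time_freq, _ = np.histogram(G, bins=bins)
    time_frac = time_freq / time_freq.sum()          # share of hours per band

    energy, _ = np.histogram(G, bins=bins, weights=G)
    energy_frac = energy / energy.sum()              # share of energy per band

    print(np.round(time_frac, 3))
    print(np.round(energy_frac, 3))
    ```

    Low-intensity bands occupy many hours but deliver little energy, while high-intensity bands do the opposite, which is why collector performance must be weighed by the power distribution rather than by time alone.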

  18. Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

    Science.gov (United States)

    Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

    2012-07-01

    We use the Fourier transform based Warren-Averbach (WA) analysis to separate the contributions of X-ray diffraction (XRD) profile broadening due to crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles which are compared with size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution of crystallites obtained from WA analysis are explained based on the experimental parameters employed in preparation of these magnetic iron oxide nanoparticles. The variation of volume weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core shell model wherein it is known that Ms=Mbulk(1-6g/Dv) with Mbulk as bulk magnetization of iron oxide and g as magnetic shell disorder thickness.
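    Since the quoted core-shell relation Ms = Mbulk(1 - 6g/Dv) is linear in 1/Dv, Mbulk and the magnetic shell disorder thickness g can be recovered by a straight-line fit; the (Dv, Ms) pairs below are synthetic stand-ins with assumed values, not the paper's measurements:

    ```python
    import numpy as np

    # Fit Ms against 1/Dv: intercept -> Mbulk, slope -> -6*g*Mbulk.
    rng = np.random.default_rng(6)

    Mbulk, g = 90.0, 0.6                     # assumed: emu/g and nm
    Dv = np.array([8.0, 10.0, 12.0, 15.0, 20.0])   # volume-weighted diameters, nm
    Ms = Mbulk * (1 - 6 * g / Dv) + rng.normal(0, 0.2, Dv.size)  # add noise

    slope, intercept = np.polyfit(1 / Dv, Ms, 1)
    Mbulk_hat = intercept                    # Ms -> Mbulk as 1/Dv -> 0
    g_hat = -slope / (6 * Mbulk_hat)
    print(Mbulk_hat, g_hat)
    ```

    The intercept of the fitted line gives the bulk magnetization recovered in the large-particle limit, and the slope yields the disordered shell thickness.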

  19. Preparación y caracterización de la zeolita MCM-22 y de su precursor laminar

    Directory of Open Access Journals (Sweden)

    Pergher Sibele B. C.

    2003-01-01

    Full Text Available The layered precursor of MCM-22 was prepared with different Si/Al ratios: 15, 25, 50, 100 and ∞. Upon heat treatment these precursors form MCM-22 zeolite. Both the layered precursor and MCM-22 zeolite were characterized by several techniques: chemical analysis by Atomic Absorption Spectroscopy (AAS), X-Ray Diffraction (XRD), Thermogravimetric Analysis (TGA), pore analysis by N2 and Ar adsorption, Scanning Electron Microscopy (SEM), Infrared Spectroscopy (IR) and Temperature Programmed Desorption of ammonium (TPD).

  20. A subchannel and CFD analysis of void distribution for the BWR fuel bundle test benchmark

    International Nuclear Information System (INIS)

    In, Wang-Kee; Hwang, Dae-Hyun; Jeong, Jae Jun

    2013-01-01

    Highlights: ► We analyzed subchannel void distributions using subchannel, system and CFD codes. ► The mean error and standard deviation at steady states were compared. ► The deviation of the CFD simulation was greater than those of the others. ► The large deviation of the CFD prediction is due to interface model uncertainties. -- Abstract: The subchannel-grade and microscopic void distributions in the NUPEC (Nuclear Power Engineering Corporation) BFBT (BWR Full-Size Fine-Mesh Bundle Tests) facility have been evaluated with the subchannel analysis code MATRA, the system code MARS and the CFD code CFX-10. Sixteen test series from five different test bundles were selected for the analysis of the steady-state subchannel void distributions. Four test cases for a high-burn-up 8 × 8 fuel bundle with a single water rod were simulated using CFX-10 for the microscopic void distribution benchmark. Two transient cases, a turbine trip without bypass as a typical power transient and a re-circulation pump trip as a flow transient, were also chosen for this analysis. It was found that the steady-state void distributions calculated by both the MATRA and MARS codes coincided well with the measured data in the range of thermodynamic qualities from 5 to 25%. The results of the transient calculations were also similar to each other and very reasonable. The CFD simulation reproduced the overall radial void distribution trend, with less vapor in the central part of the bundle and more vapor in the periphery. However, the predicted variation of the void distribution inside the subchannels is small, while the measured one is large, showing a very high concentration in the center of the subchannels. The variations of the void distribution between the center of the subchannels and the subchannel gap are estimated to be about 5–10% for the CFD prediction and more than 20% for the experiment.

  1. A density distribution algorithm for bone incorporating local orthotropy, modal analysis and theories of cellular solids.

    Science.gov (United States)

    Impelluso, Thomas J

    2003-06-01

    An algorithm for bone remodeling is presented which allows for both a redistribution of density and a continuous change of principal material directions for the orthotropic material properties of bone. It employs a modal analysis to add density for growth and a local effective strain based analysis to redistribute density. General re-distribution functions are presented. The model utilizes theories of cellular solids to relate density and strength. The code predicts the same general density distributions and local orthotropy as observed in reality.

  2. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The precision of reliability worth can be affected greatly by the customer interruption cost model used, and the choice of cost model can change system and load point reliability indices.... In this study, a cascade correlation neural network is adopted to further develop two cost models comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  3. The Analysis of the Strength, Distribution and Direction for the EEG Phase Synchronization by Musical Stimulus

    Science.gov (United States)

    Ogawa, Yutaro; Ikeda, Akira; Kotani, Kiyoshi; Jimbo, Yasuhiko

    In this study, we propose an EEG phase synchronization analysis that includes not only the average strength of synchronization but also its distribution and directions, under conditions in which emotion is evoked by musical stimuli. The experiment is performed with two different musical stimuli that evoke happiness or sadness for 150 seconds. It is found that while the average strength of synchronization indicates no difference between the right and left sides of the frontal lobe during the happy stimulus, the distribution and directions indicate significant differences. Therefore, the proposed analysis is useful for detecting emotional condition because it provides information that cannot be obtained from the average strength of synchronization alone.

  4. Laser microdissection of sensory organ precursor cells of Drosophila microchaetes.

    Directory of Open Access Journals (Sweden)

    Eulalie Buffin

    Full Text Available BACKGROUND: In Drosophila, each external sensory organ originates from the division of a unique precursor cell (the sensory organ precursor cell, or SOP). Each SOP is specified from a cluster of equivalent cells, called a proneural cluster, all of them competent to become the SOP. Although it is well known how SOP cells are selected from proneural clusters, little is known about the downstream genes that are regulated during SOP fate specification. METHODOLOGY/PRINCIPAL FINDINGS: In order to better understand the mechanism involved in the specification of these precursor cells, we combined laser microdissection, to isolate SOP cells, with transcriptome analysis, to study their RNA profile. Using this procedure, we found that genes that exhibit a 2-fold or greater expression in SOPs versus epithelial cells were mainly associated with Gene Ontology (GO) terms related to cell fate determination and sensory organ specification. Furthermore, we found that several genes such as pebbled/hindsight, scabrous, miranda, senseless, or cut, known to be expressed in SOP cells by independent procedures, are particularly detected in laser-microdissected SOP cells rather than in epithelial cells. CONCLUSIONS/SIGNIFICANCE: These results confirm the feasibility and the specificity of our laser microdissection based procedure. We anticipate that this analysis will give new insight into the selection and specification of neural precursor cells.

  5. Rapid synthesis of macrocycles from diol precursors

    DEFF Research Database (Denmark)

    Wingstrand, Magnus; Madsen, Charlotte Marie; Clausen, Mads Hartvig

    2009-01-01

    A method for the formation of synthetic macrocycles with different ring sizes from diols is presented. Reacting a simple diol precursor with electrophilic reagents leads to a cyclic carbonate, sulfite or phosphate in a single step in 25-60% yield. Converting the cyclization precursor to a bis-ele...

  6. Precursors in photonic crystals - art. no. 618218

    NARCIS (Netherlands)

    Uitham, R.; Hoenders, B. J.; DeLaRue, RM; Viktorovitch, P; Lopez, C; Midrio, M

    2006-01-01

    We derive the Sommerfeld precursor and present the first calculations for the Brillouin precursor that result from the transmission of a pulse through a photonic crystal. The photonic crystal is modelled by a one-dimensional N-layer medium and the pulse is a generic electromagnetic plane wave packet

  7. The Sommerfeld precursor in photonic crystals

    NARCIS (Netherlands)

    Uitham, R; Hoenders, BJ

    2006-01-01

    We calculate the Sommerfeld precursor that results after transmission of a generic electromagnetic plane wave pulse with transverse electric polarization, through a one-dimensional rectangular N-layer photonic crystal with two slabs per layer. The shape of this precursor equals the shape of the

  8. Bioinspired magnetite synthesis via solid precursor phases

    NARCIS (Netherlands)

    Lenders, J.J.M.; Mirabello, G.; Sommerdijk, N.A.J.M.

    2016-01-01

    Living organisms often exploit solid but poorly ordered mineral phases as precursors in the biomineralization of their inorganic body parts. Generally speaking, such precursor-based approaches allow the organisms-without the need of high supersaturation levels-to accumulate significant quantities of

  9. The distribution of InClx compounds in model polymeric LEDs: a combined low- and high-energy ion beam analysis study

    CERN Document Server

    Reijme, M A; Simons, D P L; Schok, M; Ijzendoorn, L J V; Brongersma, H H; De Voigt, M J A

    2002-01-01

    A combination of low- and high-energy ion beam analysis techniques was used to determine the distribution of indium chloride compounds in model polymeric light-emitting diodes (p-LEDs). Parts of polymeric LEDs (polydialkoxyphenylenevinylene (OC1C10-PPV) on indium-tin-oxide (ITO) substrates) were exposed to a HCl/Ar flow to simulate the processes occurring during conversion of precursor PPVs and acid treatment of polymers. Samples with variable exposure times as well as pristine samples were studied with Rutherford backscattering spectrometry (RBS), low energy ion scattering (LEIS), X-ray photoelectron spectroscopy (XPS) and particle induced X-ray emission (PIXE). The RBS measurements show that after HCl exposure indium is distributed throughout the OC1C10-PPV layer. LEIS and XPS measurements indicate that the indium and chlorine are present at the outermost surface of the OC1C10-PPV layer. PIXE measurements in combination with the RBS data demonstrate that th...

  10. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i) programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii) they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i) and (ii) for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.

  11. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Uncertainty evaluation with a statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data; a reliable uncertainty result is then obtained by analyzing the results of the numerous transport calculations. One known problem of the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross sections were sampled from both the normal and lognormal distributions, and the uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem noted in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
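
The record's central point, that lognormal sampling preserves positivity where normal sampling with a large relative standard deviation does not, can be illustrated with a short sketch. The cross-section value and its relative uncertainty here are hypothetical, not figures from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
mean, rel_std = 1.0, 0.5            # hypothetical cross section (barns), 50% rel. std. dev.

# Normal sampling: a large standard deviation yields unphysical negatives.
normal_samples = rng.normal(mean, mean * rel_std, size=100_000)

# Lognormal sampling with moment-matched parameters stays strictly positive.
sigma2 = np.log(1.0 + rel_std**2)   # log-space variance matching the relative std. dev.
mu = np.log(mean) - 0.5 * sigma2    # log-space mean matching the arithmetic mean
lognormal_samples = rng.lognormal(mu, np.sqrt(sigma2), size=100_000)

print("negative fraction (normal):   ", (normal_samples < 0).mean())
print("negative fraction (lognormal):", (lognormal_samples <= 0).mean())
```

The moment-matching step keeps the sampled mean and spread comparable between the two schemes, so only the tail behavior differs.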

  12. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    Uncertainty evaluation with a statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data; a reliable uncertainty result is then obtained by analyzing the results of the numerous transport calculations. One known problem of the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross sections were sampled from both the normal and lognormal distributions, and the uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem noted in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  13. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
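
The 10 to 120 hour figure is easy to sanity-check: a yearlong run at 1-second resolution means roughly 31.5 million sequential power flow solutions. A back-of-envelope sketch, where the per-solve time is an assumed figure (not from the report) and real feeders vary widely:

```python
# Number of sequential power flow solutions in a yearlong 1-second QSTS run.
steps = 365 * 24 * 3600            # 31,536,000 time steps

# Assumed solve time per unbalanced power flow (illustrative only).
ms_per_solve = 2.0
total_hours = steps * ms_per_solve / 1000.0 / 3600.0
print(f"{steps:,} power flows -> ~{total_hours:.1f} hours")
```

Even a couple of milliseconds per solve lands inside the report's quoted range, which is why reducing either the solve time or the number of required solves is the crux of fast QSTS.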

  14. Real-Time Analysis and Forecasting of Multisite River Flow Using a Distributed Hydrological Model

    Directory of Open Access Journals (Sweden)

    Mingdong Sun

    2014-01-01

    Full Text Available A spatially distributed hydrological forecasting system was developed to support the analysis of river flow dynamics in a large basin. This research presents the real-time analysis and forecasting of multisite river flow in the Nakdong River Basin using a distributed hydrological model with radar rainfall forecast data. A real-time calibration algorithm for the distributed hydrological model is proposed to investigate the relationship between water storage and basin discharge, and the approach couples multisite river flow simulation with real-time calibration and forecasting driven by radar rainfall forecasts. The hydrographs show that the calibrated flow simulations closely match the flow observations at all sites and that forecasting accuracy gradually decreases as the lead time extends from 1 hr to 3 hrs. The flow forecasts are lower than the observations, which is likely caused by underestimation in the radar rainfall forecasts. The research demonstrates that the distributed hydrological model is readily applicable to multisite real-time river flow analysis and forecasting in a large basin.

  15. Residual stress distribution analysis of heat treated APS TBC using image based modelling.

    Science.gov (United States)

    Li, Chun; Zhang, Xun; Chen, Ying; Carr, James; Jacques, Simon; Behnsen, Julia; di Michiel, Marco; Xiao, Ping; Cernik, Robert

    2017-08-01

    We carried out a residual stress distribution analysis in an APS TBC throughout the depth of the coating. The samples were heat treated at 1150 °C for 190 h, and the data analysis used image-based modelling built on real 3D images measured by computed tomography (CT). The stress distribution in several 2D slices from the 3D model is included in this paper, as well as the stress distribution along several paths shown on the slices. Our analysis can explain the occurrence of the "jump" features near the interface between the top coat and the bond coat. These features in the residual stress distribution trend were measured (as a function of depth) by high-energy synchrotron XRD, as shown in our related research article entitled 'Understanding the Residual Stress Distribution through the Thickness of Atmosphere Plasma Sprayed (APS) Thermal Barrier Coatings (TBCs) by high energy Synchrotron XRD; Digital Image Correlation (DIC) and Image Based Modelling' (Li et al., 2017) [1].

  16. A New Wind Turbine Generating System Model for Balanced and Unbalanced Distribution Systems Load Flow Analysis

    Directory of Open Access Journals (Sweden)

    Ahmet Koksoy

    2018-03-01

    Full Text Available Wind turbine generating systems (WTGSs, which are conventionally connected to high voltage transmission networks, have frequently been employed as distributed generation units in today’s distribution networks. In practice, the distribution networks always have unbalanced bus voltages and line currents due to uneven distribution of single or double phase loads over three phases and asymmetry of the lines, etc. Accordingly, in this study, for the load flow analysis of the distribution networks, Conventional Fixed speed Induction Generator (CFIG based WTGS, one of the most widely used WTGS types, is modelled under unbalanced voltage conditions. The Developed model has active and reactive power expressions in terms of induction machine impedance parameters, terminal voltages and input power. The validity of the Developed model is confirmed with the experimental results obtained in a test system. The results of the slip calculation based phase-domain model (SCP Model, which was previously proposed in the literature for CFIG based WTGSs under unbalanced voltages, are also given for the comparison. Finally, the Developed model and the SCP model are implemented in the load flow analysis of the IEEE 34 bus test system with the CFIG based WTGSs and unbalanced loads. Thus, it is clearly pointed out that the results of the load flow analysis implemented with both models are very close to each other, and the Developed model is computationally more efficient than the SCP model.

  17. The interrelationships of mathematical precursors in kindergarten.

    Science.gov (United States)

    Cirino, Paul T

    2011-04-01

    This study evaluated the interrelations among cognitive precursors across quantitative, linguistic, and spatial attention domains that have been implicated for math achievement in young children. The dimensionality of the quantity precursors was evaluated in 286 kindergarteners via latent variable techniques, and the contribution of precursors from each domain was established for small sums addition. Results showed a five-factor structure for the quantity precursors, with the major distinction being between nonsymbolic and symbolic tasks. The overall model demonstrated good fit and strong predictive power (R² = 55%) for addition number combinations. Linguistic and spatial attention domains showed indirect relationships with outcomes, with their effects mediated by symbolic quantity measures. These results have implications for the measurement of mathematical precursors and yield promise for predicting future math performance. Copyright © 2010 Elsevier Inc. All rights reserved.

  18. Evaluation of a post-analysis method for cumulative dose distribution in stereotactic body radiotherapy

    International Nuclear Information System (INIS)

    Imae, Toshikazu; Takenaka, Shigeharu; Saotome, Naoya

    2016-01-01

    The purpose of this study was to evaluate a post-analysis method for the cumulative dose distribution in stereotactic body radiotherapy (SBRT) using volumetric modulated arc therapy (VMAT). VMAT is capable of acquiring respiratory signals derived from projection images and machine parameters based on machine logs during VMAT delivery. Dose distributions were reconstructed from the respiratory signals and machine parameters for respiratory signals without division and divided into 4 and 10 phases. The dose distribution of each respiratory phase was calculated on the planned four-dimensional CT (4DCT). Summation of the dose distributions was carried out using deformable image registration (DIR), and the cumulative dose distributions were compared with those of the corresponding plans. Without division, dose differences between the cumulative distribution and the plan were not significant. When the respiratory signals were divided, dose differences were observed, with overdosage in the cranial region and underdosage in the caudal region of the planning target volume (PTV). Differences between 4 and 10 phases were not significant. The present method was feasible for evaluating the cumulative dose distribution in VMAT-SBRT using 4DCT and DIR. (author)

  19. Use of finite mixture distribution models in the analysis of wind energy in the Canarian Archipelago

    International Nuclear Information System (INIS)

    Carta, Jose Antonio; Ramirez, Penelope

    2007-01-01

    The statistical characteristics of hourly mean wind speed data recorded at 16 weather stations located in the Canarian Archipelago are analyzed in this paper. As a result of this analysis we see that the typical two-parameter Weibull wind speed distribution (W-pdf) does not accurately represent all wind regimes observed in that region. However, a Singly Truncated from below Normal Weibull mixture distribution (TNW-pdf) and a two-component mixture Weibull distribution (WW-pdf) developed here do provide very good fits for both unimodal and bimodal wind speed frequency distributions observed in that region, and they give smaller relative errors in determining the annual mean wind power density. The parameters of the distributions are estimated using the least squares method, which is resolved in this paper using the Levenberg-Marquardt algorithm. The suitability of the distributions is judged from the probability plot correlation coefficient R², adjusted for degrees of freedom. Based on the results obtained, we conclude that the two mixture distributions proposed here provide very flexible models for wind speed studies and can be applied in a widespread manner to represent the wind regimes in the Canarian Archipelago and in other regions with similar characteristics. The TNW-pdf takes into account the frequency of null winds, whereas the WW-pdf and W-pdf do not. It can, therefore, better represent wind regimes with high percentages of null wind speeds. However, calculation of the TNW-pdf is markedly slower.
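
A two-component Weibull mixture of the kind the record describes can be fitted by nonlinear least squares along the lines it outlines. The sketch below fits synthetic bimodal wind-speed data with SciPy's curve_fit (whose unbounded default solver is Levenberg-Marquardt; with bounds, as used here, it switches to a trust-region variant). All parameter values are illustrative, not taken from the Canarian data:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import weibull_min

def ww_pdf(v, w, k1, c1, k2, c2):
    """Two-component Weibull mixture (WW-pdf): w*W(k1,c1) + (1-w)*W(k2,c2)."""
    return (w * weibull_min.pdf(v, k1, scale=c1)
            + (1.0 - w) * weibull_min.pdf(v, k2, scale=c2))

# Synthetic bimodal wind-speed sample (m/s); "true" parameters are invented.
rng = np.random.default_rng(1)
v = np.concatenate([weibull_min.rvs(2.0, scale=4.0, size=6000, random_state=rng),
                    weibull_min.rvs(3.0, scale=10.0, size=4000, random_state=rng)])

# Bin the observations into an empirical density, then least-squares fit.
freq, edges = np.histogram(v, bins=40, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
p0 = [0.5, 2.0, 5.0, 2.0, 9.0]       # initial guess: weight, shapes, scales
popt, _ = curve_fit(ww_pdf, centers, freq, p0=p0,
                    bounds=([0, 0.1, 0.1, 0.1, 0.1], [1, 10, 30, 10, 30]))
print("fitted [w, k1, c1, k2, c2]:", np.round(popt, 2))
```

Fitting to binned frequencies, as here, mirrors the record's least-squares approach; maximum-likelihood estimation on the raw sample is a common alternative.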

  20. Geographic distribution of hospital beds throughout China: a county-level econometric analysis.

    Science.gov (United States)

    Pan, Jay; Shallcross, David

    2016-11-08

    Geographical distribution of healthcare resources is an important dimension of healthcare access. Little work has been published on healthcare resource allocation patterns in China, despite public equity concerns. Using national data from 2043 counties, this paper investigates the geographic distribution of hospital beds at the county level in China. We performed Gini coefficient analysis to measure inequalities and ordinary least squares regression with fixed provincial effects and additional spatial specifications to assess key determinants. We found that provinces in west China have the least equitable resource distribution. We also found that the distribution of hospital beds is highly spatially clustered. Finally, we found that both county-level savings and government revenue show a strong positive relationship with county level hospital bed density. We argue for more widespread use of disaggregated, geographical data in health policy-making in China to support the rational allocation of healthcare resources, thus promoting efficiency and equity.
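
A Gini coefficient of the kind used in the record can be computed directly from county-level bed densities. A minimal sketch using the standard sorted-values identity; the densities below are invented for illustration, not Chinese county data:

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative 1-D array (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # Sorted-values identity: G = 2*sum(i*x_i) / (n*sum(x)) - (n+1)/n.
    return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1.0) / n

# Hypothetical hospital-bed densities (beds per 1000 people) across counties.
equal = np.full(100, 3.0)                                       # perfectly even
skewed = np.concatenate([np.full(90, 1.0), np.full(10, 20.0)])  # concentrated

print(f"Gini (equal):  {gini(equal):.3f}")
print(f"Gini (skewed): {gini(skewed):.3f}")
```

The second example, where 10% of counties hold about 70% of the beds, yields a coefficient near 0.6, the kind of pronounced inequality the paper reports for western provinces.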

  1. A MODEL OF HETEROGENEOUS DISTRIBUTED SYSTEM FOR FOREIGN EXCHANGE PORTFOLIO ANALYSIS

    Directory of Open Access Journals (Sweden)

    Dragutin Kermek

    2006-06-01

    Full Text Available The paper investigates the design of a heterogeneous distributed system for foreign exchange portfolio analysis. The proposed model includes a few separate, dislocated but connected parts linked through distribution mechanisms. Making the system distributed brings new perspectives to performance boosting, where a software-based load balancer plays a very important role. The desired system should spread over multiple, heterogeneous platforms in order to fulfil the open-platform goal. Building such a model incorporates different patterns, from GOF design patterns, business patterns, J2EE patterns, integration patterns, enterprise patterns and distributed design patterns to Web services patterns. The authors try to find as many appropriate patterns as possible for the planned tasks in order to capture the best modelling and programming practices.

  2. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data; to analyze the data automatically, statistically and graphically; and to study and share the data. Methods: Based on previously obtained data, the analysis software was written using VC++.NET. The software first transfers data from EXCEL into a database, and it provides a data-append function so that operators can easily add new monitoring data. Results: Once the monitoring data, saved as EXCEL files by the original researchers, are turned into a database, they can be accessed easily, and the software provides a tool for distribution analysis of tritium. Conclusion: This software is a first attempt at analyzing data on tritium levels in food and environmental water in China. With the software, data retrieval, searching and analysis become easy and direct. (authors)

  3. Earth Observing System precursor data sets

    Science.gov (United States)

    Mah, Grant R.; Eidenshink, Jeff C.; Sheffield, K. W.; Myers, Jeffrey S.

    1993-08-01

    The Land Processes Distributed Active Archive Center (DAAC) is archiving and processing precursor data from airborne and spaceborne instruments such as the thermal infrared multispectral scanner (TIMS), the NS-001 and thematic mapper simulators (TMS), and the advanced very high resolution radiometer (AVHRR). The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the advanced spaceborne thermal emission and reflection radiometer (ASTER) and the moderate resolution imaging spectrometer (MODIS) flight instruments scheduled to be flown on the EOS-AM spacecraft. Ames Research Center has developed and is flying a MODIS airborne simulator (MAS), which provides coverage in both MODIS and ASTER bands. A simulation of an ASTER data set over Death Valley, California has been constructed using a combination of TMS and TIMS data, along with existing digital elevation models that were used to develop the topographic information. MODIS data sets are being simulated by using MAS for full-band site coverage at high resolution and AVHRR for global coverage at 1 km resolution.

  4. Potential Precursor Compounds for Chlorohydrocarbons Detected in Gale Crater, Mars, by the SAM Instrument Suite on the Curiosity Rover

    Science.gov (United States)

    Miller, Kristen E.; Eigenbrode, Jennifer L.; Freissinet, Caroline; Glavin, Daniel P.; Kotrc, Benjamin; Francois, Pascaline; Summons, Roger E.

    2016-01-01

    The detection of chlorinated organic compounds in near-surface sedimentary rocks by the Sample Analysis at Mars (SAM) instrument suite aboard the Mars Science Laboratory Curiosity rover represents an important step toward characterizing habitable environments on Mars. However, this discovery also raises questions about the identity and source of their precursor compounds and the processes by which they become chlorinated. Here we present the results of analog experiments, conducted under conditions similar to SAM gas chromatography-mass spectrometry analyses, in which we pyrolyzed potential precursor compounds in the presence of various Cl salts and Fe oxides that have been identified in Martian sediments. While chloromethanes could not be unambiguously identified, 1,2-dichloropropane (1,2-DCP), which is one of the chlorinated compounds identified in SAM data, is formed from the chlorination of aliphatic precursors. Additionally, propanol produced more 1,2-DCP than nonfunctionalized aliphatics such as propane or hexanes. Chlorinated benzenes ranging from chlorobenzene to hexachlorobenzene were identified in experiments with benzene carboxylic acids but not with benzene or toluene. Lastly, the distribution of chlorinated benzenes depended on both the substrate species and the nature and concentration of the Cl salt. Ca and Mg perchlorate, both of which release O2 in addition to Cl2 and HCl upon pyrolysis, formed less chlorobenzene relative to the sum of all chlorinated benzenes than in experiments with ferric chloride. FeCl3, a Lewis acid, catalyzes chlorination but does not aid combustion. Accordingly, both the precursor chemistry and sample mineralogy exert important controls on the distribution of chlorinated organics.

  5. Distributed activation energy model for kinetic analysis of multi-stage hydropyrolysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X.; Li, W.; Wang, N.; Li, B. [Chinese Academy of Sciences, Taiyuan (China). Inst. of Coal Chemistry

    2003-07-01

    Based on a new analysis of the distributed activation energy model, a bicentral distribution model was introduced for the analysis of multi-stage hydropyrolysis of coal. Hydropyrolysis under linear temperature programming with and without a holding stage was mathematically described and the corresponding kinetic expressions were derived. Based on these kinetics, the hydropyrolysis (HyPy) and multi-stage hydropyrolysis (MHyPy) of Xundian brown coal were simulated. The results show that both the Mo catalyst and two-stage holding can lower the apparent activation energy of hydropyrolysis and narrow the activation energy distribution. Besides, there exists an optimum Mo loading of 0.2% for HyPy of Xundian lignite. 10 refs.
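The single-Gaussian distributed activation energy model that the bicentral model extends can be evaluated numerically. The sketch below assumes a Gaussian activation-energy distribution, a linear heating program, and illustrative kinetic parameters (not the fitted values for Xundian coal):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def daem_conversion(T_end, k0=1e13, beta=0.167, E_mean=220e3, E_sigma=30e3,
                    n_E=200, n_T=200):
    """Converted fraction under a single-Gaussian DAEM with linear heating.

    k0 (1/s), E_mean and E_sigma (J/mol) are illustrative values; beta is
    the heating rate in K/s (0.167 K/s = 10 K/min). Integrals are evaluated
    with simple midpoint/rectangle rules.
    """
    T0 = 300.0
    dT = (T_end - T0) / n_T
    Es = [E_mean + E_sigma * (-4 + 8 * i / (n_E - 1)) for i in range(n_E)]
    dE = Es[1] - Es[0]
    unreacted = 0.0
    for E in Es:
        # Inner temperature integral of the Arrhenius rate along the ramp.
        integ = sum(math.exp(-E / (R * (T0 + (j + 0.5) * dT)))
                    for j in range(n_T)) * dT
        survival = math.exp(-(k0 / beta) * integ)
        weight = (math.exp(-0.5 * ((E - E_mean) / E_sigma) ** 2)
                  / (E_sigma * math.sqrt(2 * math.pi)))
        unreacted += weight * survival * dE
    return 1.0 - unreacted

x_low = daem_conversion(600.0)    # early in the ramp: little conversion
x_high = daem_conversion(1100.0)  # late in the ramp: nearly complete
```

A bicentral variant would simply replace the single Gaussian weight with a two-component mixture.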

  6. Analysis of the melanin distribution in different ethnic groups by in vivo laser scanning microscopy

    International Nuclear Information System (INIS)

    Antoniou, C; Lademann, J; Richter, H; Patzelt, A; Sterry, W; Astner, S; Zastrow, L; Koch, S

    2009-01-01

    The aim of this study was to determine whether laser scanning confocal microscopy (LSM) is able to visualize differences in melanin content and distribution among different skin phototypes. The investigations were carried out on six healthy volunteers with skin phototypes II, IV, and VI. Representative skin samples of skin phototypes II, V, and VI were obtained for histological analysis from remaining tissue of skin grafts and were used for LSM-pathology correlation. LSM evaluation showed significant differences in melanin distribution among skin phototypes II, IV, and VI. Based on the differences in overall reflectivity and image brightness, a visual evaluation scheme showed increasing brightness of the basal and suprabasal layers with increasing skin phototype. The findings correlated well with the histological analysis. The results demonstrate that LSM may serve as a promising adjunctive tool for real-time assessment of melanin content and distribution in human skin, with numerous clinical applications and therapeutic and preventive implications

  7. Strategic Sequencing for State Distributed PV Policies: A Quantitative Analysis of Policy Impacts and Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Doris, E.; Krasko, V.A.

    2012-10-01

    State and local policymakers show increasing interest in spurring the development of customer-sited distributed generation (DG), in particular solar photovoltaic (PV) markets. Prompted by that interest, this analysis examines the use of state policy as a tool to support the development of a robust private investment market. This analysis builds on previous studies that focus on government subsidies to reduce installation costs of individual projects and provides an evaluation of the impacts of policies on stimulating private market development.

  8. Catalytic Hydrodechlorination of Trichlorobenzenes with Pd(Phen)Cl2 as Catalyst Precursor

    Directory of Open Access Journals (Sweden)

    Guanlin Zhang

    2015-01-01

    We report the catalytic hydrodechlorination (HDC) of trichlorobenzenes over the organometallic compound Pd(Phen)Cl2 as a catalyst precursor. The catalyst precursor was prepared by a chemical coordination reaction and characterized by FTIR and 1H NMR techniques. The HDC performance of Pd(Phen)Cl2 as catalyst precursor was evaluated on 1,2,3-, 1,2,4-, and 1,3,5-trichlorobenzenes (TCBs). All TCBs could be converted to dechlorination products with high conversion. The product distribution was closely related to the substrate structures and C-Cl bond energies. A plausible reaction mechanism is also proposed.

  9. Identification of a novel biomarker candidate, a 4.8-kDa peptide fragment from a neurosecretory protein VGF precursor, by proteomic analysis of cerebrospinal fluid from children with acute encephalopathy using SELDI-TOF-MS

    Directory of Open Access Journals (Sweden)

    Fujino Osamu

    2011-08-01

    Background: Acute encephalopathy involves rapid deterioration and has a poor prognosis. Early intervention is essential to prevent progression of the disease and subsequent neurologic complications. However, in the acute period, true encephalopathy cannot easily be differentiated from febrile seizures, especially febrile seizures of the complex type. Thus, an early diagnostic marker has been sought in order to enable early intervention. The purpose of this study was to identify a novel marker candidate protein differentially expressed in the cerebrospinal fluid (CSF) of children with encephalopathy using proteomic analysis. Methods: For detection of biomarkers, CSF samples were obtained from 13 children with acute encephalopathy and 42 children with febrile seizures. Mass spectral data were generated by surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF MS) technology, which is currently applied in many fields of biological and medical sciences. Diagnosis was made by at least two pediatric neurologists based on the clinical findings and routine examinations. All specimens were collected for diagnostic tests and the remaining portions of the specimens were used for the SELDI-TOF MS investigations. Results: In experiment 1, CSF from patients with febrile seizures (n = 28) and patients with encephalopathy (n = 8), including influenza encephalopathy (n = 3), encephalopathy due to rotavirus (n = 1), and human herpes virus 6 (n = 1), were used for the SELDI analysis. In experiment 2, SELDI analysis was performed on CSF from a second set of febrile seizure patients (n = 14) and encephalopathy patients (n = 5). We found that the peak with an m/z of 4810 contributed the most to the separation of the two groups. After purification and identification of the 4.8-kDa protein, a 4.8-kDa proteolytic peptide fragment from the neurosecretory protein VGF precursor (VGF4.8) was identified as a novel biomarker for encephalopathy.

  10. Genome-wide association scan meta-analysis identifies three loci influencing adiposity and fat distribution

    NARCIS (Netherlands)

    C.M. Lindgren (Cecilia); I.M. Heid (Iris); J.C. Randall (Joshua); C. Lamina (Claudia); V. Steinthorsdottir (Valgerdur); L. Qi (Lu); E.K. Speliotes (Elizabeth); G. Thorleifsson (Gudmar); C.J. Willer (Cristen); B.M. Herrera (Blanca); A.U. Jackson (Anne); N. Lim (Noha); P. Scheet (Paul); N. Soranzo (Nicole); N. Amin (Najaf); Y.S. Aulchenko (Yurii); J.C. Chambers (John); A. Drong (Alexander); J. Luan; H.N. Lyon (Helen); F. Rivadeneira Ramirez (Fernando); S. Sanna (Serena); N.J. Timpson (Nicholas); M.C. Zillikens (Carola); H.Z. Jing; P. Almgren (Peter); S. Bandinelli (Stefania); A.J. Bennett (Amanda); R.N. Bergman (Richard); L.L. Bonnycastle (Lori); S. Bumpstead (Suzannah); S.J. Chanock (Stephen); L. Cherkas (Lynn); P.S. Chines (Peter); L. Coin (Lachlan); C. Cooper (Charles); G. Crawford (Gabe); A. Doering (Angela); A. Dominiczak (Anna); A.S.F. Doney (Alex); S. Ebrahim (Shanil); P. Elliott (Paul); M.R. Erdos (Michael); K. Estrada Gil (Karol); L. Ferrucci (Luigi); G. Fischer (Guido); N.G. Forouhi (Nita); C. Gieger (Christian); H. Grallert (Harald); C.J. Groves (Christopher); S.M. Grundy (Scott); C. Guiducci (Candace); D. Hadley (David); A. Hamsten (Anders); A.S. Havulinna (Aki); A. Hofman (Albert); R. Holle (Rolf); J.W. Holloway (John); T. Illig (Thomas); B. Isomaa (Bo); L.C. Jacobs (Leonie); K. Jameson (Karen); P. Jousilahti (Pekka); F. Karpe (Fredrik); J. Kuusisto (Johanna); J. Laitinen (Jaana); G.M. Lathrop (Mark); D.A. Lawlor (Debbie); M. Mangino (Massimo); W.L. McArdle (Wendy); T. Meitinger (Thomas); M.A. Morken (Mario); A.P. Morris (Andrew); P. Munroe (Patricia); N. Narisu (Narisu); A. Nordström (Anna); B.A. Oostra (Ben); C.N.A. Palmer (Colin); F. Payne (Felicity); J. Peden (John); I. Prokopenko (Inga); F. Renström (Frida); A. Ruokonen (Aimo); V. Salomaa (Veikko); M.S. Sandhu (Manjinder); L.J. Scott (Laura); A. Scuteri (Angelo); K. Silander (Kaisa); K. Song (Kijoung); X. Yuan (Xin); H.M. Stringham (Heather); A.J. Swift (Amy); T. Tuomi (Tiinamaija); M. 
Uda (Manuela); P. Vollenweider (Peter); G. Waeber (Gérard); C. Wallace (Chris); G.B. Walters (Bragi); M.N. Weedon (Michael); J.C.M. Witteman (Jacqueline); C. Zhang (Cuilin); M. Caulfield (Mark); F.S. Collins (Francis); G.D. Smith; I.N.M. Day (Ian); P.W. Franks (Paul); A.T. Hattersley (Andrew); F.B. Hu (Frank); M.-R. Jarvelin (Marjo-Riitta); A. Kong (Augustine); J.S. Kooner (Jaspal); M. Laakso (Markku); E. Lakatta (Edward); V. Mooser (Vincent); L. Peltonen (Leena Johanna); N.J. Samani (Nilesh); T.D. Spector (Timothy); D.P. Strachan (David); T. Tanaka (Toshiko); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); P. Tikka-Kleemola (Päivi); N.J. Wareham (Nick); H. Watkins (Hugh); D. Waterworth (Dawn); M. Boehnke (Michael); P. Deloukas (Panagiotis); L. Groop (Leif); D.J. Hunter (David); U. Thorsteinsdottir (Unnur); D. Schlessinger (David); H.E. Wichmann (Erich); T.M. Frayling (Timothy); G.R. Abecasis (Gonçalo); J.N. Hirschhorn (Joel); R.J.F. Loos (Ruth); J-A. Zwart (John-Anker); K.L. Mohlke (Karen); I.E. Barroso (Inês); M.I. McCarthy (Mark)

    2009-01-01

    textabstractTo identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the

  11. Scenario analysis to account for photovoltaic generation uncertainty in distribution grid reconfiguration

    DEFF Research Database (Denmark)

    Chittur Ramaswamy, Parvathy; Deconinck, Geert; Pillai, Jayakrishnan Radhakrishna

    2013-01-01

    This paper considers hourly reconfiguration of a low voltage distribution network with the objectives of minimizing power loss and voltage deviation. The uncertainty in photovoltaic (PV) generation which in turn will affect the optimum configuration is tackled with the help of scenario analysis. ......-dominated solutions, demonstrating their trade-offs. Finally, the best compromise solution can be selected depending on the decision maker's requirement....

  12. Analysis of Faraday Mirror in Auto-Compensating Quantum Key Distribution

    International Nuclear Information System (INIS)

    Wei Ke-Jin; Ma Hai-Qiang; Li Rui-Xue; Zhu Wu; Liu Hong-Wei; Zhang Yong; Jiao Rong-Zhen

    2015-01-01

    The ‘plug and play’ quantum key distribution system is the most stable and the earliest commercial system in the quantum communication field. The Jones matrix and Jones calculus are widely used in the analysis of this system and of its improved version, the auto-compensating quantum key distribution system. Unfortunately, the existing analysis has two drawbacks: only the auto-compensating process is analyzed, and existing treatments do not fully consider the laser phase affected by a Faraday mirror (FM). In this work, we present a detailed analysis, by Jones calculus, of the output of a light pulse transmitting through a plug and play quantum key distribution system that contains only an FM. A similar analysis is made of a home-made auto-compensating system which contains two FMs to compensate for environmental effects. More importantly, we show that theoretical and experimental results differ in the plug and play interferometric setup because the conventional Jones matrix of an FM neglects an additional phase π on one polarization direction. To resolve this problem, we give a new Jones matrix of an FM based on coordinate rotation. This new Jones matrix not only resolves the above contradiction in the plug and play interferometric setup, but is also suitable for previous analyses of auto-compensating quantum key distribution. (paper)
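The compensation property at stake can be checked directly with Jones calculus. The sketch below uses the textbook antisymmetric FM matrix to illustrate the role of the extra π phase (a sign flip on one polarization axis); the paper's own matrices and conventions may differ:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fiber_jones(rng):
    # Random SU(2) matrix: a lossless birefringent fiber segment.
    a, b = rng.normal(size=2) + 1j * rng.normal(size=2)
    n = np.sqrt(abs(a) ** 2 + abs(b) ** 2)
    a, b = a / n, b / n
    return np.array([[a, -np.conj(b)], [b, np.conj(a)]])

J = random_fiber_jones(rng)

# "Conventional" FM matrix (plain swap of polarization axes) and the
# phase-corrected one, which differs by a sign (a pi phase) on one axis.
M_conv = np.array([[0, 1], [1, 0]], dtype=complex)
M_corr = np.array([[0, 1], [-1, 0]], dtype=complex)

# Double pass: forward through the fiber, reflect at the FM, return.
# For a reciprocal medium the return matrix is the transpose of the
# forward one, so the round trip is J^T @ M @ J.
round_conv = J.T @ M_conv @ J
round_corr = J.T @ M_corr @ J

# With the antisymmetric matrix, J^T @ M @ J = det(J) * M for ANY J,
# so the fiber birefringence cancels exactly (auto-compensation).
compensated = np.allclose(round_corr, np.linalg.det(J) * M_corr)
uncompensated = not np.allclose(round_conv, np.linalg.det(J) * M_conv)
```

The identity used here (J^T Ω J = det(J) Ω for the antisymmetric Ω) holds for any 2×2 matrix, which is why the compensation is independent of the fiber's birefringence.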

  13. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    Science.gov (United States)

    Diwakar, Rekha

    2017-01-01

    Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data that do not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…
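The mismatch the abstract describes can be illustrated numerically: strictly positive, right-skewed data fail a simple symmetry check that the log-transformed data pass. The sketch below uses synthetic lognormal data and a moment-based skewness estimate (sample size and parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

def skewness(x):
    # Standardized third central moment (population form).
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s = x.std()
    return ((x - m) ** 3).mean() / s ** 3

# Strictly positive, right-skewed data (e.g. incomes, reaction times):
# poorly described by a normal model, well described by a lognormal one.
data = rng.lognormal(mean=0.0, sigma=1.0, size=500)

skew_raw = skewness(data)          # strongly right-skewed
skew_log = skewness(np.log(data))  # near zero: the log of the data is ~normal
```

Inference that assumes normality is therefore often applied to the log of such data rather than to the raw values.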

  14. Using Raman spectroscopic imaging for non-destructive analysis of filler distribution in chalk filled polypropylene

    DEFF Research Database (Denmark)

    Boros, Evelin; Porse, Peter Bak; Nielsen, Inga

    2016-01-01

    A feasibility study on using Raman spectral imaging for visualization and analysis of filler distribution in chalk filled polypropylene samples has been carried out. The spectral images were acquired using a Raman spectrometer with a 785 nm light source. Eight injection-molded samples with concentr...

  15. An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests

    Science.gov (United States)

    Attali, Yigal

    2010-01-01

    Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…

  16. Stability analysis of DC microgrids with constant power load under distributed control methods

    DEFF Research Database (Denmark)

    Liu, Zhangjie; Su, Mei; Sun, Yao

    2018-01-01

    of distributed controller is investigated. The small-signal model is established to predict the system qualitative behavior around equilibrium. The stability conditions of the system with time delay are derived based on the equivalent linearized model. Additionally, eigenvalue analysis based on inertia theorem...

  17. Distributed Analysis Experience using Ganga on an ATLAS Tier2 infrastructure

    International Nuclear Information System (INIS)

    Fassi, F.; Cabrera, S.; Vives, R.; Fernandez, A.; Gonzalez de la Hoz, S.; Sanchez, J.; March, L.; Salt, J.; Kaci, M.; Lamas, A.; Amoros, G.

    2007-01-01

    The ATLAS detector will explore the high-energy frontier of particle physics, collecting the proton-proton collisions delivered by the LHC (Large Hadron Collider). Starting in spring 2008, the LHC will produce more than 10 petabytes of data per year. The tiered hierarchy adopted for the LHC computing model comprises Tier-0 (CERN) and Tier-1 and Tier-2 centres distributed around the world. The ATLAS Distributed Analysis (DA) system has the goal of enabling physicists to perform Grid-based analysis on distributed data using distributed computing resources. The IFIC Tier-2 facility is participating in several aspects of DA. In support of the ATLAS DA activities a prototype is being tested, deployed and integrated. The analysis data processing applications are based on the Athena framework. GANGA, developed by the LHCb and ATLAS experiments, allows simple switching between testing on a local batch system and large-scale processing on the Grid, hiding Grid complexities. GANGA provides physicists with an integrated environment for job preparation, bookkeeping and archiving, and job splitting and merging. The experience with the deployment, configuration and operation of the DA prototype will be presented. Experience gained in using the DA system and GANGA in top physics analysis will be described. (Author)

  18. AspectKE*: Security Aspects with Program Analysis for Distributed Systems

    DEFF Research Database (Denmark)

    2010-01-01

    AspectKE* is the first distributed AOP language based on a tuple space system. It is designed to enforce security policies to applications containing untrusted processes. One of the key features is the high-level predicates that extract results of static program analysis. These predicates provide...

  19. Uncertainty Visualization Using Copula-Based Analysis in Mixed Distribution Models.

    Science.gov (United States)

    Hazarika, Subhashis; Biswas, Ayan; Shen, Han-Wei

    2018-01-01

    Distributions are often used to model uncertainty in many scientific datasets. To preserve the correlation among the spatially sampled grid locations in a dataset, various standard multivariate distribution models have been proposed in the visualization literature. These models treat each grid location as a univariate random variable which models the uncertainty at that location. Standard multivariate distributions (both parametric and nonparametric) assume that all the univariate marginals are of the same type/family of distribution. But in reality, different grid locations show different statistical behavior, which may not be modeled best by the same type of distribution. In this paper, we propose a new multivariate uncertainty modeling strategy to address the needs of uncertainty modeling in scientific datasets. Our proposed method is based on a statistically sound multivariate technique called copula, which makes it possible to separate the process of estimating the univariate marginals from the process of modeling dependency, unlike the standard multivariate distributions. The modeling flexibility offered by our proposed method makes it possible to design distribution fields which can have different types of distribution (Gaussian, histogram, KDE, etc.) at the grid locations, while maintaining the correlation structure at the same time. Depending on the results of various standard statistical tests, we can choose an optimal distribution representation at each location, resulting in more cost-efficient modeling without significantly sacrificing analysis quality. To demonstrate the efficacy of our proposed modeling strategy, we extract and visualize uncertain features like isocontours and vortices in various real-world datasets. We also study various modeling criteria to help users in the task of univariate model selection.
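The separation of marginals from dependency that copulas provide can be sketched with a Gaussian copula: sample correlated normals, push them through the normal CDF to get correlated uniforms, then apply different inverse marginal CDFs. The marginal choices below (exponential and uniform) are illustrative, not the paper's:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def phi(z):
    """Standard normal CDF, elementwise."""
    return np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in z])

def gaussian_copula_samples(n, rho, rng):
    """Draw n pairs whose dependence is a Gaussian copula with correlation
    rho, but whose marginals differ: an Exponential(1) and a Uniform(0,1)."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u0, u1 = phi(z[:, 0]), phi(z[:, 1])  # correlated uniforms on (0, 1)
    x = -np.log(1.0 - u0)                # inverse-CDF: Exponential(1) marginal
    y = u1                               # Uniform(0, 1) marginal
    return x, y

x, y = gaussian_copula_samples(5000, 0.8, rng)

# Rank (Spearman-like) correlation survives the change of marginals,
# which is exactly the property the copula construction guarantees.
rank_corr = np.corrcoef(np.argsort(np.argsort(x)),
                        np.argsort(np.argsort(y)))[0, 1]
```

Swapping in a histogram or KDE marginal at a given grid location only changes the inverse-CDF step; the dependency model is untouched.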

  20. Application of «Sensor signal analysis network» complex for distributed, time synchronized analysis of electromagnetic radiation

    Science.gov (United States)

    Mochalov, Vladimir; Mochalova, Anastasia

    2017-10-01

    The paper describes a software-hardware complex, «Sensor signal analysis network», under development for distributed, time-synchronized analysis of electromagnetic radiation. The areas of application and the main features of the complex are described. An example of applying the complex to monitor natural electromagnetic radiation sources, based on data recorded in the VLF range, is considered. A generalized functional scheme for stream analysis of signals by a complex functional node is suggested, and its application to stream detection of atmospherics, whistlers and tweaks is considered.

  1. Designing Sustainable Systems for Urban Freight Distribution through techniques of Multicriteria Decision Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Muerza, V.; Larrode, E.; Moreno- Jimenez, J.M.

    2016-07-01

    This paper focuses on the analysis and selection of the parameters that have a major influence on the optimization of an urban freight distribution system using sustainable means of transport, such as electric vehicles. In addition, a procedure has been studied to identify the alternatives that may exist for establishing the best urban freight distribution system for the scenario under consideration, using the most appropriate means of transportation available. To do this, the Analytic Hierarchy Process, one of the tools of multicriteria decision analysis, was used. In order to establish adequate planning of an urban freight distribution system using electric vehicles, three hypotheses are necessary: (i) strategic planning of the distribution process must be established by defining the relative importance of the strategic objectives of urban freight distribution, in economic, technical, social and environmental terms; (ii) operational planning must be established to allow the achievement of the strategic objectives with the most optimized allocation of available resources; and (iii) the optimal vehicle architecture must be determined, best suited to the operating conditions in which it will work and ensuring optimum energy efficiency in operation. (Author)
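The prioritization step of the Analytic Hierarchy Process reduces to extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency. A minimal sketch follows; the pairwise judgments and criteria names are hypothetical, not taken from the paper:

```python
import numpy as np

def ahp_weights(A, iters=100):
    """Priority weights from a pairwise comparison matrix via power
    iteration on the principal eigenvector, plus Saaty's consistency
    ratio (CR < 0.1 is conventionally acceptable)."""
    n = A.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = A @ w
        w = w / w.sum()
    lam = (A @ w / w).mean()  # estimate of the largest eigenvalue
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random index
    CR = (lam - n) / (n - 1) / RI[n] if RI.get(n, 0) > 0 else 0.0
    return w, CR

# Hypothetical judgments for three criteria of an urban freight system:
# cost vs. emissions vs. delivery time (1-9 Saaty scale, reciprocal matrix).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 3.0],
              [1/5.0, 1/3.0, 1.0]])
weights, CR = ahp_weights(A)
```

Here the first criterion dominates, and the judgments are consistent enough (CR below 0.1) for the weights to be usable.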

  2. Electrical properties of a novel lead alkoxide precursor: Lead glycolate

    International Nuclear Information System (INIS)

    Tangboriboon, Nuchnapa; Pakdeewanishsukho, Kittikhun; Jamieson, Alexander; Sirivat, Anuvat; Wongkasemjit, Sujitra

    2006-01-01

    The reaction of lead acetate trihydrate Pb(CH 3 COO) 2 .3H 2 O and ethylene glycol, using triethylenetetramine (TETA) as a catalyst, provides in one step access to a polymer-like precursor of lead glycolate [-PbOCH 2 CH 2 O-]. On the basis of high-resolution mass spectroscopy, chemical analysis composition, FTIR, 13 C-solid state NMR and TGA, the lead glycolate precursor can be identified as a trimer structure. The FTIR spectrum demonstrates the characteristics of lead glycolate; the peaks at 1086 and 1042 cm -1 can be assigned to the C-O-Pb stretchings. The 13 C-solid state NMR spectrum gives notably only one peak at 68.639 ppm belonging to the ethylene glycol ligand. The phase transformations of lead glycolate and lead acetate trihydrate to lead oxide, their microstructures, and electrical properties were found to vary with increasing temperature. The lead glycolate precursor has superior electrical properties relative to those of lead acetate trihydrate, suggesting that the lead glycolate precursor can possibly be used as a starting material for producing electrical and semiconducting ceramics, viz. ferroelectric, anti-ferroelectric, and piezoelectric materials

  3. Study of Solid State Drives performance in PROOF distributed analysis system

    Science.gov (United States)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid State Drives (SSD) is a promising storage technology for High Energy Physics parallel analysis farms. Its combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. It also has lower energy consumption and higher vibration tolerance than Hard Disk Drive (HDD) which makes it an attractive choice in many applications raging from personal laptops to large analysis farms. The Parallel ROOT Facility - PROOF is a distributed analysis system which allows to exploit inherent event level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in PROOF environment. We will compare performance of HDD with SSD in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with a number of simultaneously running analysis jobs.

  4. Fractal analysis of the dark matter and gas distributions in the Mare-Nostrum universe

    International Nuclear Information System (INIS)

    Gaite, José

    2010-01-01

    We develop a method of multifractal analysis of N-body cosmological simulations that improves on the customary counts-in-cells method by taking special care of the effects of discreteness and large scale homogeneity. The analysis of the Mare-Nostrum simulation with our method provides strong evidence of self-similar multifractal distributions of dark matter and gas, with a halo mass function that is of Press-Schechter type but has a power-law exponent -2, as corresponds to a multifractal. Furthermore, our analysis shows that the dark matter and gas distributions are indistinguishable as multifractals. To determine if there is any gas biasing, we calculate the cross-correlation coefficient, with negative but inconclusive results. Hence, we develop an effective Bayesian analysis connected with information theory, which clearly demonstrates that the gas is biased in a long range of scales, up to the scale of homogeneity. However, entropic measures related to the Bayesian analysis show that this gas bias is small (in a precise sense) and is such that the fractal singularities of both distributions coincide and are identical. We conclude that this common multifractal cosmic web structure is determined by the dynamics and is independent of the initial conditions
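As a minimal illustration of the counts-in-cells idea underlying such multifractal analysis (not the authors' discreteness-corrected estimator), the box-counting dimension of a 2-D point set can be estimated from how the number of occupied cells scales with resolution:

```python
import numpy as np

rng = np.random.default_rng(7)

def box_counting_dimension(points, scales=(4, 8, 16, 32)):
    """Counts-in-cells estimate of the fractal dimension of points in the
    unit square: slope of log N(occupied cells) vs. log(cells per side)."""
    logN, logS = [], []
    for s in scales:
        cells = set(map(tuple, np.floor(points * s).astype(int)))
        logN.append(np.log(len(cells)))
        logS.append(np.log(s))
    slope, _ = np.polyfit(logS, logN, 1)
    return slope

uniform = rng.random((20000, 2))        # space-filling set: dimension ~ 2
t = rng.random(20000)
line = np.column_stack([t, 0.5 * t])    # points on a line: dimension ~ 1

d_uniform = box_counting_dimension(uniform)
d_line = box_counting_dimension(line)
```

A full multifractal analysis generalizes this by weighting cells by occupation number to obtain a spectrum of dimensions rather than a single slope.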

  5. Comparison of photon correlation spectroscopy with photosedimentation analysis for the determination of aqueous colloid size distributions

    Science.gov (United States)

    Rees, Terry F.

    1990-01-01

    Colloidal materials, dispersed phases with dimensions between 0.001 and 1 μm, are potential transport media for a variety of contaminants in surface and ground water. Characterization of these colloids, and identification of the parameters that control their movement, are necessary before transport simulations can be attempted. Two techniques that can be used to determine the particle-size distribution of colloidal materials suspended in natural waters are compared. Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, their important underlying assumptions, and their limitations is given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS.
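The size calculation behind PCS rests on the Stokes-Einstein relation, which converts the diffusion coefficient inferred from the Doppler-broadened autocorrelation into a hydrodynamic diameter. A minimal sketch, assuming spherical particles in water at 25 °C:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein relation as used in PCS: sphere diameter (m) from
    the translational diffusion coefficient D (m^2/s) at temperature T (K)
    in a fluid of viscosity eta (Pa s; default ~water at 25 C)."""
    return K_B * T / (3.0 * math.pi * eta * D)

# A diffusion coefficient of ~4.9e-12 m^2/s corresponds to a ~100 nm sphere.
d = hydrodynamic_diameter(4.9e-12)
```

The inverse dependence on D is why slow-diffusing (large) colloids dominate the low-frequency part of the PCS signal.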

  6. Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E.; Hoeschele, M.

    2014-09-01

    A body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. This study builds upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations of climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of the total water heating energy use and water use efficiency ranges from 11% to 22%. The base case, an uninsulated trunk-and-branch system, sees the most improvement in energy consumption from insulating and locating the water heater central to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk-and-branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole-house simulation programs.

  7. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    investigates whether the need for knowledge on climate changes in the most vulnerable regions of the world is met by the supply of knowledge measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates...... the poorer, fragile and more vulnerable regions of the world. A quantitative keywords analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change...... the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographical, demographical, economical and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge...

  8. Distributed resistance model for the analysis of wire-wrapped rod bundles

    International Nuclear Information System (INIS)

    Ha, K. S.; Jung, H. Y.; Kwon, Y. M.; Jang, W. P.; Lee, Y. B.

    2003-01-01

    A partial flow blockage within a fuel assembly in a liquid metal reactor may result in localized boiling or a failure of the fuel cladding. Thus, precise analysis of this phenomenon is required for a safe design of an LMR. The MATRA-LMR code developed by KAERI models the flow distribution in an assembly by using the wire forcing function to account for the effects of wire-wrap spacers, which is important for the analysis of flow blockage. However, the wire forcing function is not capable of analyzing a flow blockage, and thus this model was replaced with the distributed resistance model; the validation calculation was carried out against the FFM 2A experiment.

  9. CMS distributed analysis infrastructure and operations: experience with the first LHC data

    International Nuclear Information System (INIS)

    Vaandering, E W

    2011-01-01

The CMS distributed analysis infrastructure represents a heterogeneous pool of resources distributed across several continents. The resources are harnessed using gLite- and glidein-based workload management systems (WMS). We provide the operational experience of the analysis workflows using CRAB-based servers interfaced with the underlying WMS. The automated interaction of the server with the WMS provides a successful analysis workflow. We present the operational experience as well as the methods used in CMS to analyze the LHC data. The interaction with the CMS Run Registry for run and luminosity block selection via CRAB is discussed. The variations of different workflows during the LHC data-taking period and the lessons drawn from this experience are also outlined.

  10. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back-to-back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.
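
The paper finds the optimal SOP operating point with an improved Powell's Direct Set method; as an illustration only, the sketch below runs a minimal pure-Python coordinate (Powell-style) line search over a hypothetical two-terminal SOP. The loss proxy, converter rating, and all coefficients are invented assumptions, not taken from the paper.

```python
import math

def loss_proxy(p, q1, q2):
    """Placeholder objective: a quadratic proxy for feeder losses plus a
    ~2% SOP conversion-loss term. All coefficients are invented for
    illustration, not taken from the paper."""
    return ((p - 300.0) ** 2 + (q1 - 50.0) ** 2 + (q2 + 80.0) ** 2) / 1e3 + 0.02 * abs(p)

def within_rating(p, q, s_rating=500.0):
    """Apparent-power limit of one back-to-back converter (kVA, assumed)."""
    return math.hypot(p, q) <= s_rating

def coordinate_search(x0, f, step=64.0, tol=1e-3):
    """Powell-style coordinate line search with step halving (a simple
    stand-in for the paper's improved Powell's Direct Set method)."""
    x = list(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (+step, -step):
                trial = list(x)
                trial[i] += d
                # both converters of the two-terminal SOP must stay within rating
                feasible = within_rating(trial[0], trial[1]) and within_rating(-trial[0], trial[2])
                if feasible and f(*trial) < f(*x):
                    x, improved = trial, True
        if not improved:
            step /= 2.0
    return x

# decision variables: active transfer P and the two reactive injections Q1, Q2
best = coordinate_search([0.0, 0.0, 0.0], loss_proxy)
```

The conversion-loss term pulls the optimal active transfer slightly below the unconstrained quadratic minimum, which is the qualitative trade-off a real SOP dispatch faces.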

  11. Variable Frame Rate and Length Analysis for Data Compression in Distributed Speech Recognition

    DEFF Research Database (Denmark)

    Kraljevski, Ivan; Tan, Zheng-Hua

    2014-01-01

    This paper addresses the issue of data compression in distributed speech recognition on the basis of a variable frame rate and length analysis method. The method first conducts frame selection by using a posteriori signal-to-noise ratio weighted energy distance to find the right time resolution...... length for steady regions. The method is applied to scalable source coding in distributed speech recognition where the target bitrate is met by adjusting the frame rate. Speech recognition results show that the proposed approach outperforms other compression methods in terms of recognition accuracy...... for noisy speech while achieving higher compression rates....
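
As a rough illustration of the selection step described above (not the authors' implementation), the sketch below accumulates an a posteriori SNR-weighted log-energy distance between consecutive frames and keeps a frame once the running distance crosses a threshold. The weighting heuristic, threshold, and noise-energy estimate are all assumptions.

```python
import math

def select_frames(frames, noise_energy=1.0, threshold=2.0):
    """Simplified frame selection: accumulate an a posteriori SNR-weighted
    log-energy distance between consecutive frames and keep a frame when
    the running distance crosses `threshold` (weights are heuristic)."""
    selected = [0]                       # always keep the first frame
    acc = 0.0
    prev_e = sum(x * x for x in frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        e = sum(x * x for x in frame)
        snr_db = 10.0 * math.log10(max(e / noise_energy, 1e-10))  # a posteriori SNR
        weight = max(snr_db, 0.0) / 20.0 + 0.5                    # assumed weighting
        acc += weight * abs(math.log(e + 1e-10) - math.log(prev_e + 1e-10))
        if acc >= threshold:
            selected.append(i)
            acc = 0.0
        prev_e = e
    return selected
```

Steady regions accumulate distance slowly and are decimated, while transient regions are kept, which is how a variable frame rate trades bitrate against time resolution.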

  12. Mathematical Modeling and Numerical Analysis of Thermal Distribution in Arch Dams considering Solar Radiation Effect

    Science.gov (United States)

    Mirzabozorg, H.; Hariri-Ardebili, M. A.; Shirkhan, M.; Seyed-Kolbadi, S. M.

    2014-01-01

The effect of solar radiation on thermal distribution in thin high arch dams is investigated. The differential equation governing the thermal behavior of mass concrete in three-dimensional space is solved by applying appropriate boundary conditions. Solar radiation is implemented considering the dam face direction relative to the sun, the slope relative to the horizon, the regional cloud cover, and the surrounding topography. It has been observed that solar radiation changes the surface temperature drastically and leads to a nonuniform temperature distribution. Solar radiation effects should be considered in the thermal transient analysis of thin arch dams. PMID:24695817
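
The governing differential equation referred to above is the standard transient heat-conduction equation with a convective-radiative face condition; a generic sketch (symbols are ours, not taken from the paper):

```latex
% Transient heat conduction in the dam body (T: temperature, alpha: diffusivity):
\frac{\partial T}{\partial t}
  = \alpha \left( \frac{\partial^2 T}{\partial x^2}
                + \frac{\partial^2 T}{\partial y^2}
                + \frac{\partial^2 T}{\partial z^2} \right)

% On an exposed face (outward normal n): convection to air at temperature T_a
% plus an absorbed solar flux I_s, which depends on face orientation, slope
% relative to the horizon, and cloud cover (a: absorptivity, h: film coefficient):
-k \, \frac{\partial T}{\partial n} = h \,( T - T_a ) - a \, I_s
```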

  13. Cost Benefit and Alternatives Analysis of Distribution Systems with Energy Storage Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Tom; Nagarajan, Adarsh; Baggu, Murali; Bialek, Tom

    2017-06-27

This paper explores monetized and non-monetized benefits from storage interconnected to the distribution system, through use cases illustrating potential applications for energy storage in California's electric utility system. This work supports SDG&E in its efforts to quantify, summarize, and compare the cost and benefit streams related to the implementation and operation of energy storage on its distribution feeders. This effort develops a cost-benefit and alternatives analysis platform, integrated with QSTS feeder simulation capability, and analyzes use cases to explore the costs and benefits of implementing and operating energy storage for feeder support and market participation.

  14. Vibration analysis of continuous maglev guideways with a moving distributed load model

    Energy Technology Data Exchange (ETDEWEB)

    Teng, N G; Qiao, B P [Department of Civil Engineering, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai, 200240 (China)

    2008-02-15

A moving distributed load model with constant speed is established for vertical vibration analysis of a continuous guideway in a maglev transportation system. The guideway is considered as a continuous structural system, and the action of maglev vehicles on the guideway is modeled as a moving distributed load. Vibration of the continuous guideways used in the Shanghai maglev line is analyzed with this model. The factors that affect the vibration of the guideways, such as speed, guideway span, frequency and damping, are discussed.
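
A common form of the model described above, sketched with generic symbols (the paper's exact formulation may differ): an Euler-Bernoulli guideway under a distributed load of finite length moving at constant speed.

```latex
% Guideway of flexural rigidity EI, mass per unit length m, damping c, under
% a distributed load of intensity q_0 and length L_v (the vehicle) moving at
% constant speed v; H is the Heaviside step function:
EI \, \frac{\partial^4 w}{\partial x^4}
  + c \, \frac{\partial w}{\partial t}
  + m \, \frac{\partial^2 w}{\partial t^2}
  = q_0 \left[ H\big(x - vt + L_v\big) - H\big(x - vt\big) \right]
```

The bracketed term is 1 only under the vehicle footprint, i.e. for vt - L_v < x < vt, which is what distinguishes a moving distributed load from the classical moving point load.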

  15. Nonlocal approach to the analysis of the stress distribution in granular systems. I. Theoretical framework

    Science.gov (United States)

    Kenkre, V. M.; Scott, J. E.; Pease, E. A.; Hurd, A. J.

    1998-05-01

    A theoretical framework for the analysis of the stress distribution in granular materials is presented. It makes use of a transformation of the vertical spatial coordinate into a formal time variable and the subsequent study of a generally non-Markoffian, i.e., memory-possessing (nonlocal) propagation equation. Previous treatments are obtained as particular cases corresponding to, respectively, wavelike and diffusive limits of the general evolution. Calculations are presented for stress propagation in bounded and unbounded media. They can be used to obtain desired features such as a prescribed stress distribution within the compact.
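
The wavelike and diffusive limits mentioned above can be illustrated with a generic memory-possessing propagation equation; this is a sketch with generic symbols (t standing for the transformed vertical coordinate), not the paper's exact formulation:

```latex
% Generic non-Markoffian (memory) propagation equation with kernel phi:
\frac{\partial P(x,t)}{\partial t}
  = \int_0^{t} \phi(t - t') \, \frac{\partial^2 P(x,t')}{\partial x^2} \, dt'

% phi(t) = D \, \delta(t)      -> diffusive (Markoffian) limit:  P_t = D P_{xx}
% phi(t) = c^2 \;(\text{constant}) -> wavelike limit: differentiating in t
%                                     gives  P_{tt} = c^2 P_{xx}
```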

  16. Frequency distribution analysis of the long-lived beta-activity of air dust

    International Nuclear Information System (INIS)

    Bunzl, K.; Hoetzl, H.; Winkler, R.

    1977-01-01

In order to compare the average annual beta activities of air dust, a frequency distribution analysis was carried out to select a representative quantity for the average value of each data group. The data were found to be consistent with a log-normal frequency distribution; the representative average for each year was therefore taken as the median of the beta activity, calculated as the antilog of the arithmetic mean of the logarithms, log x, of the analytical values x. The 95% confidence limits were also obtained. The quantities thus calculated are summarized in tabular form. (U.K.)
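
The averaging procedure described above (median as the antilog of the mean of the logarithms, with 95% confidence limits on the log scale) can be sketched as follows; this is an illustrative reimplementation, not the authors' code.

```python
import math
import statistics

def lognormal_median(samples, z=1.96):
    """Median (geometric mean) of log-normally distributed activity values:
    the antilog of the arithmetic mean of log10(x), with approximate 95%
    confidence limits computed on the log scale and transformed back."""
    logs = [math.log10(x) for x in samples]
    m = statistics.mean(logs)
    half = z * statistics.stdev(logs) / math.sqrt(len(logs))
    return 10.0 ** m, (10.0 ** (m - half), 10.0 ** (m + half))
```

For log-normal data this median is a better representative average than the arithmetic mean, which is inflated by the long upper tail.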

  17. Mathematical modeling and numerical analysis of thermal distribution in arch dams considering solar radiation effect.

    Science.gov (United States)

    Mirzabozorg, H; Hariri-Ardebili, M A; Shirkhan, M; Seyed-Kolbadi, S M

    2014-01-01

The effect of solar radiation on thermal distribution in thin high arch dams is investigated. The differential equation governing the thermal behavior of mass concrete in three-dimensional space is solved by applying appropriate boundary conditions. Solar radiation is implemented considering the dam face direction relative to the sun, the slope relative to the horizon, the regional cloud cover, and the surrounding topography. It has been observed that solar radiation changes the surface temperature drastically and leads to a nonuniform temperature distribution. Solar radiation effects should be considered in the thermal transient analysis of thin arch dams.

  18. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Emma [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McParland, Charles [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Roberts, Ciaran [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-07-01

This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. These data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid’s increasingly complex loads, which include features such as large volumes of distributed generation (DG). Large volumes of DG lead to concerns about the continued reliable operation of the grid, due to changing power-flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of measurement accuracy required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; and standardization of sensor-data communication platforms in planning and application tools would allow integration of sensors and advanced measurement devices from different vendors. In addition, data from advanced sources such as µPMUs could be used to validate models to improve
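
As one example of what phase-angle data between two points enables, the sketch below estimates the active power flow over a mostly reactive line from two hypothetical µPMU readings. This is the textbook two-bus approximation, not a capability of any specific tool named above.

```python
import math

def power_flow_from_angles(v1, v2, delta1_deg, delta2_deg, x_ohm):
    """Estimate active power flow (W) across a line of reactance x_ohm from
    voltage magnitudes (V) and the measured phase-angle difference (deg),
    using the lossless-line approximation P = V1*V2*sin(d1 - d2)/X."""
    delta = math.radians(delta1_deg - delta2_deg)
    return v1 * v2 * math.sin(delta) / x_ohm
```

Even a fraction-of-a-degree angle error shifts the estimate noticeably, which is why the report stresses how accurately tools must handle phase-angle measurements.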

  19. Distribution transformer lifetime analysis in the presence of demand response and rooftop PV integration

    Directory of Open Access Journals (Sweden)

    Behi Behnaz

    2017-01-01

Many distribution transformers in the infrastructure of Western Power, the electric distribution company supplying the southwest of Western Australia, have already exceeded half of their expected service life of 35 years. Therefore, a large investment in transformer replacement is anticipated in the near future. However, high renewable integration and demand response (DR) are promising resources to defer investment in infrastructure upgrades and extend the lifetime of transformers. This paper investigates the impact of rooftop photovoltaic (PV) integration and customer engagement through DR on the lifetime of transformers in electric distribution networks. To this aim, time-series models of load, DR and PV are first applied for each year over a planning period. This load model is applied to a typical distribution transformer, for which the hot-spot temperature rise is modelled based on the relevant standard. Using this calculation platform, the loss of life and the actual age of the distribution transformer are obtained. Then, various scenarios, including different levels of PV penetration and DR contribution, are examined, and their impacts on the age of the transformer are reported. Finally, the equivalent loss of net present value of the distribution transformer is formulated and discussed. This formulation gives distribution network planners a way to analyse the contribution of PV and DR to lifetime extension of the distribution transformer. In addition, the model can be used in optimal investment analysis to find the best time for transformer replacement and the associated cost, considering PV penetration and DR. The simulation results show that integration of PV and DR within a feeder can significantly extend the lifetime of transformers.
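
The hot-spot-based loss-of-life calculation described above can be sketched in a few lines. The paper cites only "the relevant standard", so the choice of the IEC 60076-7 relative ageing rate for non-thermally-upgraded paper here is an assumption.

```python
def loss_of_life_hours(hotspot_temps_c, interval_h=1.0):
    """Loss of life over a series of hot-spot temperature readings, using the
    IEC 60076-7 relative ageing rate for non-thermally-upgraded paper,
    V = 2**((theta_h - 98) / 6): ageing doubles for every 6 C above 98 C.
    Returns the ageing-rate-weighted sum of the time intervals."""
    return sum(2.0 ** ((t - 98.0) / 6.0) * interval_h for t in hotspot_temps_c)
```

This is why PV (which trims midday loading) and DR (which clips evening peaks) can both slow transformer ageing: the exponential ageing rate is dominated by the hottest hours.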

  20. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-01-01

The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of the fluorescently stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. An automated image segmentation of DAPI-stained nuclei was developed to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density was calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.
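
The normalized spatial-density measurement described above can be sketched under a strong simplifying assumption of a circular nucleus (the actual analysis works on segmented 3D confocal nuclei): bin bright-feature positions by normalized distance from the perimeter (0) to the center (1) and divide each count by its annulus area.

```python
import math

def radial_density(features, center, radius, nbins=5):
    """Bin bright-feature (x, y) positions by normalised distance from the
    nuclear perimeter (0) to the centre (1), then divide each bin count by
    its annulus area. Assumes a circular nucleus -- an illustrative
    simplification, not the paper's segmentation-based method."""
    counts = [0] * nbins
    for x, y in features:
        r = math.hypot(x - center[0], y - center[1]) / radius
        if r <= 1.0:
            d = 1.0 - r                       # 0 = perimeter, 1 = centre
            counts[min(int(d * nbins), nbins - 1)] += 1
    densities = []
    for i in range(nbins):
        r_outer = 1.0 - i / nbins             # bin i spans d in [i/n, (i+1)/n]
        r_inner = 1.0 - (i + 1) / nbins
        area = math.pi * (r_outer ** 2 - r_inner ** 2) * radius ** 2
        densities.append(counts[i] / area)
    return densities
```

Normalizing by annulus area matters because outer shells occupy far more area than inner ones; raw counts alone would bias the profile toward the perimeter.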